Deep Dreaming On a Mac

Deep Dreaming in Andernach

Make Google’s Deep Dream Code run on your Mac

From the moment I first saw these spectacular images of neural networks running hot, I wanted to get the software and run it on my own computer. The only thing that kept me from doing so was the sheer number of libraries and tools required. Although I have been doing this kind of software development for many years, I hesitated, because I didn’t know a single one of the tools involved. Until last week. It took me four evenings, my Mac, and a bit of patience and stubbornness. The results are by all means rewarding: of course there are ready-to-go solutions using virtual machines, even online services, but if you are a bit like me, you are looking for insight, not just quick satisfaction.

Inceptionism

Things started with an exciting article on the Google Research Blog. Impressive image galleries followed quickly, presenting a lot of funky images. A new term for this kind of imaging was born: inceptionism, meaning that special kind of ‘tuning’ of neural networks which makes them see things that aren’t there. News magazines and websites were filled with these images, but there is also a list of software to be installed: NumPy, SciPy, PIL, IPython (or a scientific Python distribution such as Anaconda or Canopy), and finally the Caffe deep learning framework. (And Google’s protobuf library, for completeness.) So let’s get started! My working environment is a Mac mini running Mavericks with enough RAM (16 GB) and disk space (1 TB). A short warning up front: this isn’t a step-by-step guide. I have found a lot of those, and many of them are misleading. Installing complex software requires you to think for yourself, and to research solutions to the inevitable problems you are going to encounter. But the good news is: this task is feasible.

Two essential tools

Deep Dreaming in Andernach II

There are two essential tools you should have installed on your machine before proceeding: Xcode, Apple’s software development package, and a decent package manager. There are some good package managers in the wild, but for installing the packages mentioned above you’ll want Homebrew. Of course, you can do it the hard way and install all these packages manually, but then there are many dependencies you’ll have to satisfy as well, and that’s not for the faint of heart. I really recommend using Homebrew.
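In case Homebrew isn’t on your machine yet: at the time of writing, the installer was a one-liner taken from brew.sh (check that page for the current command before pasting anything):

    ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
    brew doctor   # let Homebrew check your setup before you install anything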

Oh, and there’s one more tool: you shouldn’t be afraid of terminal applications. There are a lot of commands you will have to run on the command line. And things get _a lot_ easier if you stick with bash instead of some other (probably cooler) shell.

Sourced

The complete Python-based source code of deepdream – that’s the official project name – can be found on GitHub. The Readme isn’t as interesting as the Python notebook: the notebook is runnable code, comments on that code, and a tutorial, all in one file. That is a lot of convenience for a single file, the sort of convenience that can make you feel a bit intimidated. At least I felt that way at first, but don’t let that put you off.

So there is this Python scientific stack: NumPy, SciPy, PIL, and IPython. You need these for executing the Python code and for all things science and imaging. Do yourself a favour and install Anaconda. I didn’t try Canopy, so your mileage may vary if you go with that. All the required scientific libraries are already included with Anaconda and Canopy, or can easily be installed from within these environments.

Before we install Caffe, let’s first install Google’s protobuf library. If you want to take the easy way, just use Homebrew: brew install protobuf – and you are done. On protobuf’s GitHub Readme page you can find install notes for the Mac, but I wouldn’t use them unless you know what you’re doing. There will be enough work for the compiler when we install Caffe, so enjoy a little convenience while you can.

Caffe

Everything installed without fatal errors so far? Good, then let’s have a look at Caffe! Caffe is a deep learning framework, the neural network backend of the Google code (to put it in a nutshell). And it’s a beast. It already starts with the homepage, which is missing a navigation bar: if you find something interesting there, make sure you bookmark it, or you might have trouble finding it again. Before we even look at its inner workings, let’s check its install requirements first. You will have to install

  • CUDA – a parallel computing library and GPU (graphics processing unit) support,
  • BLAS – a math library for vectors, matrices and stuff,
  • Boost – even more math and algorithms,
  • OpenCV – a computer vision library (hey, it even has a Clojure API!),
  • protobuf, glog, gflags – at least one of these is already installed,
  • various IO libraries: hdf5, leveldb, snappy, lmdb.

After these preliminaries we can think about installing Caffe itself. Don’t panic, brew will help us with that. At this point we should probably reread the comment in deepdream’s Python notebook: “This notebook is designed to have as few dependencies as possible.” Things could be worse, no? Have a look at Caffe’s prerequisites page; it’s all listed there. The folks at BVLC also provide detailed instructions on how to install Caffe on a Mac: we are using that document as our roadmap.

CUDA

Deep Dreaming in Andernach III

So, CUDA first: even though the Mac mini doesn’t have an NVIDIA GPU, we need this library for compiling Caffe. The download is rather straightforward; the DMG file provides an installer. Thanks, NVIDIA. Don’t let the install instructions discourage you: there is no real need for a GPU, at least not for the deepdream project. Pay attention to the environment variable $DYLD_LIBRARY_PATH, as described on page 5: the BVLC instructions advise against setting it. Instead, set the variable $DYLD_FALLBACK_LIBRARY_PATH. On my system I set it to roughly the following:
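    # along the lines of the BVLC Mac notes; adjust the paths to your CUDA and Anaconda installs
    export DYLD_FALLBACK_LIBRARY_PATH=/usr/local/cuda/lib:$HOME/anaconda/lib:/usr/local/lib:/usr/lib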

Check whether these paths are valid for you. The entries in /usr/local/cuda are symbolic links pointing to /Developer/NVIDIA/CUDA-7.0. My Anaconda install lives in ~/anaconda, which might differ from your install settings.

Miscellaneous Libraries

Following the Caffe Mac install instructions, you can now use brew to install the remaining libraries, including OpenCV:
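    # roughly the commands from the Caffe OS X guide at the time; package names and taps may have changed since
    brew install -vd snappy leveldb gflags glog szip lmdb
    brew tap homebrew/science
    brew install hdf5 opencv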

If you have Anaconda installed, 1) omit the hdf5 install step, it’s already included (we will come back to this later), and 2) run brew edit opencv before installing OpenCV and change the two lines that set the Python-related CMake arguments, so that OpenCV links against the Anaconda Python. On my system (OS X 10.9, brew 0.9.5) I changed them to:
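    # a sketch of the edited formula lines – the Python paths point at my Anaconda install;
    # your OpenCV formula version and Anaconda location may differ
    -DPYTHON_LIBRARY=#{ENV['HOME']}/anaconda/lib/libpython2.7.dylib
    -DPYTHON_INCLUDE_DIR=#{ENV['HOME']}/anaconda/include/python2.7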

– which differs a bit from the Caffe instructions. Your experience may vary, so think before you type. After that you can install OpenCV.

BLAS

This library is needed for vector and matrix computations. You have the choice between ATLAS (the default), MKL (expensive!), and OpenBLAS. I ran into trouble with ATLAS: it comes with OS X (via the Accelerate framework), but somehow didn’t want to compile with Caffe. So I installed OpenBLAS instead, which is easily done with brew install openblas. We will have to tell Caffe about this in its Makefile later.

And so on…

There are further Python libs such as numpy, boost.python, and pandas to be installed. But if you went for Anaconda, you can skip this step. The installation of protobuf is also already done (see above). If everything installed without errors, you should take a break now; you’ve earned it.

At last: Caffe

Did you already download it? Either run git clone https://github.com/BVLC/caffe or just download the .zip file and extract it. Copy Makefile.config.example to Makefile.config and open the latter in an editor. Adjust the settings to your needs, save, compile. Oh wait, there are some caveats! Let’s have a look at that file:
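    # excerpt (sketch) – only the settings discussed below, with the values for a
    # CPU-only build against Homebrew's OpenBLAS and a local Anaconda install
    CPU_ONLY := 1
    CUDA_DIR := /usr/local/cuda
    BLAS := open
    BLAS_INCLUDE := $(shell brew --prefix openblas)/include
    BLAS_LIB := $(shell brew --prefix openblas)/lib
    ANACONDA_HOME := $(HOME)/anaconda
    PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
            $(ANACONDA_HOME)/include/python2.7 \
            $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include
    PYTHON_LIB := $(ANACONDA_HOME)/lib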

There are obviously some settings that deserve attention; the line numbers below refer to the Makefile.config.example shipped with Caffe at the time of writing. Starting at line 8, we enable CPU-only mode, because the Mac mini doesn’t have a usable GPU; CUDA is still needed for compilation, though. Check the CUDA_DIR entry (line 15). For the reasons mentioned above we use OpenBLAS, which is set in line 33; uncomment lines 41 and 42 if you installed OpenBLAS with brew (like we did). Check the paths in ANACONDA_HOME and PYTHON_INCLUDE as well as PYTHON_LIB: since we are using Anaconda, use line 62 instead of line 61. I didn’t have to uncomment lines 65 and 66; check these if you run into errors during compilation. Uncomment lines 76 and 77. Hope for the best, save. Now run make all.

If you installed all prerequisites without errors and you didn’t make any mistakes in Makefile.config, this should run without errors. After that, do make test, then make runtest. That last one is rather exciting, for it runs 1500+ tests against the freshly compiled Caffe framework. Issue a final make pycaffe (for the Python wrappers) and you’re done for the moment. If all this finishes without complaints, you can pat yourself on the back: you’ve earned it!
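For reference, the whole build sequence in one place:

    cd caffe
    cp Makefile.config.example Makefile.config   # then edit it as described above
    make all        # build the framework
    make test       # build the test binaries
    make runtest    # run the 1500+ unit tests
    make pycaffe    # build the Python wrappers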

Excursion

Before proceeding it’s a good idea to play a bit with Caffe. Running the LeNet model for MNIST handwritten digit classification is explained in a short tutorial that touches every component (including protobuf) we have installed so far. It’s a real, practical test.
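If memory serves, the tutorial boils down to three scripts that ship with Caffe (for a CPU-only build, set solver_mode: CPU in examples/mnist/lenet_solver.prototxt first):

    cd caffe
    ./data/mnist/get_mnist.sh          # download the MNIST data set
    ./examples/mnist/create_mnist.sh   # convert it into lmdb format
    ./examples/mnist/train_lenet.sh    # train LeNet, slow without a GPU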

Getting Real

We have now fulfilled all the preconditions, so we can have a first go at deepdream. Finally! But don’t expect too much at this point; there is still some work to do. First, download deepdream from GitHub. Now start an IPython notebook: either by starting the Anaconda Launcher in $(ANACONDA_HOME)/bin (Launcher.app), or by opening a terminal window and running ipython_mac.command (which is nothing but a small script that calls “ipython notebook”). Your browser opens a new tab and you see the notebook dashboard.

Navigate to your deepdream folder, look for dream.ipynb and click on it. You have finally launched the deepdream notebook. Here you can see for the first time whether the deepdream code runs without problems. Unfortunately, the web is full of cries for help from people who got stuck at this point. So did I.

My first problem was that Python wasn’t able to import caffe, as described in this discussion. The error was “Library not loaded: libhdf5_hl.10.dylib”, and the reason was that I had erroneously installed hdf5 with brew earlier. If you followed this text from the beginning, you shouldn’t run into this error. If you did, then follow creynold’s advice and uninstall hdf5 (brew uninstall hdf5), because Anaconda already ships with hdf5.

My next problem was a SEGFAULT that crashed Python when trying to import caffe. There was still something wrong. It obviously wasn’t Caffe’s fault, because all its tests had run successfully, but some dependencies seemed to be resolved the wrong way at import time. And indeed: after installing Anaconda I had three different Pythons, and thus different versions of the libraries, on my system. To prevent old or wrong libraries from sneaking in during the linking phase, you have to uninstall or at least disable them. Start by updating the locate database (which tells you which files live where), then let locate libpython2.7 show you all the places where Python 2.7 libraries can be found on your system. There were plenty on mine, and especially the ones in /System/Library/Frameworks/Python.framework/Versions/2.7/lib/ and /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/ were too much. I simply renamed those Python.framework folders to Python.framework.bak. Not the prettiest solution, but an effective one: the segfault disappeared and I was finally able to import caffe. This discussion helped me a lot in overcoming the problem.
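The hunt for stray libraries, in terminal form (the updatedb path is the OS X one; the rename is shown for the MacPorts framework as an example, and you should know what you’re doing before touching system folders):

    sudo /usr/libexec/locate.updatedb    # refresh the locate database on OS X
    locate libpython2.7                  # list every Python 2.7 library on the system
    # disable a stray framework by renaming it (example: the MacPorts one)
    sudo mv /opt/local/Library/Frameworks/Python.framework \
            /opt/local/Library/Frameworks/Python.framework.bak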

Another issue might emerge when Python complains about missing modules. These can be installed with the conda tool; it’s explained here, and the Anaconda FAQ has some useful hints, too. There’s also a page explaining the IPython notebook basics. I recommend reading it, because those are the steps you’ll need after solving the last problem.
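Installing a missing module is a one-liner; protobuf here is just a stand-in for whatever module Python complains about:

    conda install protobuf   # substitute the name of the missing module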

Are we finished yet? Almost. Open the deepdream notebook, click on the first cell and run it. It should work now. Click on the second cell and run it. Oh no! Now you need a trained model. Don’t panic: as explained here, you can find it there. (Ahh, another download!) Put it into your caffe/models/bvlc_googlenet/ directory, adjust the model_path variable in cell #2 and re-run it. Then go through every cell and execute it; it gets more and more exciting. Congrats! You should be able to process your own images now. Play with the code and its parameters: the fun has just begun!
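By the way, Caffe ships with a small download helper that should be able to fetch the GoogLeNet model for you, assuming the script is present in your checkout:

    cd caffe
    # drops bvlc_googlenet.caffemodel into models/bvlc_googlenet/
    ./scripts/download_model_binary.py models/bvlc_googlenet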

Deep Dreaming On Pluto

The next task on my list is training a network with my own photos. That requires a basic understanding of Caffe, and it will take some time. But thinking back to the beginning of this undertaking: that is exactly what I was looking for. Within a few weeks the web was flooded with deep dream images, all showing processed pictures with similar features: looking closely, you can find dogs, birds, people, strange buildings and the like. It would be a bit boring to stop at this point. So the next step is to actually use the tools we have installed here. It’s all about insight, no?

About Manfred Berndtgen

Manfred Berndtgen, maintainer of this site, is a part-time researcher with enough spare time for doing useless things and sharing them with the rest of the world. His main photographic subjects are made of plants or stones, and since he's learning Haskell everything seems functional to him.