Google recently published a mind-blowing paper exploring what neural networks interpret when processing an image. The results are a dreamscape of hallucinations and patterns that pretty much answers the question of whether androids dream of electric sheep.
You can feed your own images to the network and see the results using the IPython Notebook Google released, too. I found setting up all the dependencies on a Mac particularly painful, so after some tinkering, I got it working in a Docker container, directly on the command line. All you need to do to process an image is run the container:
docker run -i -t -e INPUT=your_file.png -v /path/to/your/folder:/data herval/deepdream
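If you have a whole folder of images, a small wrapper can run them through the container one by one. This is just a sketch built on the command above, assuming the `herval/deepdream` image and its `INPUT`/`-v /data` convention; the `deepdream_batch` function name is my own.

```shell
# Process every PNG in a folder through the deepdream container.
# Assumes the herval/deepdream image shown above; deepdream_batch is
# a hypothetical helper name, not part of the project.
deepdream_batch() {
  dir="$1"
  for img in "$dir"/*.png; do
    # Each file is mounted via the folder and selected with INPUT
    docker run -i -t -e "INPUT=$(basename "$img")" -v "$dir":/data herval/deepdream
  done
}
```

Call it with the folder holding your images, e.g. `deepdream_batch /path/to/your/folder`.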
The source is on GitHub, in case you feel like turning it into a SaaS or adding GPU support!
If you prefer running the original IPython Notebook, someone released another Docker container that works beautifully too - there’s also an attempt to “dreamify” Youtube videos in the works. Looking forward to someone combining this with the Oculus Rift and a live video feed!
Enjoy - and sweet #deepdreams