Frameworks and Libraries for Deep Learning



Deep Learning is a form of Machine Learning and Artificial Intelligence that stacks multiple hidden layers of a Neural Network on top of each other in an attempt to form a deeper “understanding” of the data.

Recently, deep neural networks have had a lot of exposure on the web in the form of “Deep Dreams”, or “Inceptionism” as it was called in the original Google Research article.

In this article we’ll go over several frameworks, libraries and tools for Deep Learning and the purposes each serves best.

Deep Learning in Python


Theano

Homepage: http://deeplearning.net/software/theano/
Github URL: https://github.com/Theano/Theano

Theano is not only the backbone of many of the other frameworks discussed in this article, but also a great library to use on its own, in almost any situation where you’d want to perform anything from a simple logistic regression all the way to modeling and generating sequences of polyphonic music, or using a long short-term memory network to classify movie reviews.

Theano compiles your computation graphs to optimized native C code, which provides a large speed boost compared to running plain interpreted Python. On top of that, a lot of optimizations have been built into Theano to streamline the flow of your computations and keep your runtime to a bare minimum.

If that speed boost isn’t quite enough for you yet, Theano also has built-in support for performing all those time-consuming calculations on your GPU using CUDA. Switching between the CPU and the GPU requires nothing more than flipping a single configuration flag, with no changes to your code.

Also note that even though Theano relies on generated C and CUDA code for its large performance boosts, you can define almost any type of neural network architecture using only Python code.
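To get a feel for the workflow, here is a minimal sketch of a Theano script: you build a symbolic expression graph, compile it, and only then feed it data. The THEANO_FLAGS invocation in the final comment is the configuration switch mentioned above.

```python
import numpy as np
import theano
import theano.tensor as T

# Declare a symbolic input; no actual data is involved yet.
x = T.matrix('x')

# Build a symbolic expression graph on top of it.
y = T.nnet.sigmoid(T.dot(x, x.T))

# Compile the graph down to fast native code.
f = theano.function([x], y)

print(f(np.random.randn(3, 3).astype(theano.config.floatX)))

# The exact same script runs on the GPU when launched as:
#   THEANO_FLAGS=device=gpu,floatX=float32 python script.py
```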


Pylearn2

Homepage: http://deeplearning.net/software/pylearn2/
Github URL: https://github.com/lisa-lab/pylearn2

Made by the same developers as Theano, Pylearn2 is a library that wraps many of the models and training algorithms commonly used in Deep Learning and AI research, such as Stochastic Gradient Descent, into a single package to experiment with.

You can also quite easily write wrappers around your own classes and algorithms to plug them into Pylearn2, and configure the parameters of your entire neural network model from a single YAML configuration file, as sketched below.

Aside from that, it also ships with a lot of datasets and their preparation code built into the package, so you can start experimenting with the MNIST dataset straight away!
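Putting the two together, here is a rough sketch of a YAML spec loaded and run from Python. It assumes the MNIST data has already been downloaded and the PYLEARN2_DATA_PATH environment variable points to it; the hyperparameter values are arbitrary.

```python
from pylearn2.config import yaml_parse

# Every !obj: tag names a Pylearn2 class that gets instantiated with
# the given keyword arguments -- dataset, model and algorithm alike.
yaml_spec = """
!obj:pylearn2.train.Train {
    dataset: !obj:pylearn2.datasets.mnist.MNIST { which_set: 'train' },
    model: !obj:pylearn2.models.softmax_regression.SoftmaxRegression {
        nvis: 784,
        n_classes: 10,
        irange: 0.05,
    },
    algorithm: !obj:pylearn2.training_algorithms.sgd.SGD {
        batch_size: 100,
        learning_rate: 0.01,
        termination_criterion: !obj:pylearn2.termination_criteria.EpochCounter {
            max_epochs: 5,
        },
    },
}
"""

train = yaml_parse.load(yaml_spec)  # instantiate every !obj: entry
train.main_loop()                   # run the full training loop
```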


Blocks

Github URL: https://github.com/mila-udem/blocks

Blocks is a very modular framework that helps you build neural network models on top of Theano. Currently it supports and provides:



  • Constructing parametrized Theano operations, called “bricks” (see the sketch after this list)
  • Pattern matching to select variables and bricks in large models
  • Algorithms to optimize your model
  • Saving and resuming of training
  • Monitoring and analyzing values during training progress (on the training set as well as on test sets)
  • Application of graph transformations, such as dropout
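As a minimal sketch of the first point, a brick is configured once and then applied to symbolic Theano variables to grow the computation graph (the layer sizes here are arbitrary):

```python
import theano.tensor as T
from blocks.bricks import Linear, Rectifier, Softmax
from blocks.initialization import IsotropicGaussian, Constant

x = T.matrix('features')

# Each brick is a parametrized Theano operation, configured up front.
hidden = Linear(name='hidden', input_dim=784, output_dim=256,
                weights_init=IsotropicGaussian(0.01), biases_init=Constant(0))
output = Linear(name='output', input_dim=256, output_dim=10,
                weights_init=IsotropicGaussian(0.01), biases_init=Constant(0))

# Applying a brick to a symbolic variable extends the graph.
h = Rectifier().apply(hidden.apply(x))
y_hat = Softmax().apply(output.apply(h))

# Allocate and initialize the parameters of each brick.
hidden.initialize()
output.initialize()
```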

Keras

Homepage: http://keras.io/
Github URL: https://github.com/fchollet/keras

Keras is a minimalist, highly modular neural network library in the spirit of Torch, written in Python, that uses Theano under the hood for optimized tensor manipulation on GPU and CPU. It was developed with a focus on enabling fast experimentation and creation of new Deep Learning models.

Use Keras if you need a deep learning library that:

  • allows for easy and fast prototyping (through total modularity, minimalism, and extensibility).
  • supports both convolutional networks and recurrent networks, as well as combinations of the two.
  • supports arbitrary connectivity schemes (including multi-input and multi-output training).

What separates Keras from the other libraries that use Theano is its very minimalist and clean style: all the essentials are wrapped up in small classes that can easily be pieced together to create entirely new models.
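For instance, a small feed-forward classifier comes together in a handful of lines. This is only a sketch using the Sequential API; exact argument names have shifted a bit between Keras versions, and X_train / y_train are assumed to exist.

```python
from keras.models import Sequential
from keras.layers import Dense

# Stack layers one by one; Keras handles the tensor plumbing.
model = Sequential()
model.add(Dense(256, activation='relu', input_dim=784))
model.add(Dense(10, activation='softmax'))

model.compile(loss='categorical_crossentropy', optimizer='sgd',
              metrics=['accuracy'])

# X_train and y_train are assumed to be numpy arrays of shape
# (n_samples, 784) and (n_samples, 10) respectively.
model.fit(X_train, y_train, batch_size=128, epochs=10)
```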


Lasagne

Github URL: https://github.com/Lasagne/Lasagne

Not just a very tasty Italian dish, but also a Deep Learning library with roughly the same feature set as Blocks and Keras, though it follows a somewhat different design.

Here are some of Lasagne’s design goals:

  • Simplicity: it should be easy to use and extend the library. Whenever a feature is added, the effect on both of these should be considered. Every added abstraction should be carefully scrutinized, to determine whether the added complexity is justified.
  • Small interfaces: as few classes and methods as possible. Try to rely on Theano’s functionality and data types where possible, and follow Theano’s conventions. Don’t wrap things in classes if it is not strictly necessary. This should make it easier to both use the library and extend it (less cognitive overhead).
  • Don’t get in the way: unused features should be invisible, the user should not have to take into account a feature that they do not use. It should be possible to use each component of the library in isolation from the others.
  • Transparency: don’t try to hide Theano behind abstractions. Functions and methods should return Theano expressions and standard Python / numpy data types where possible (illustrated in the sketch after this list).
  • Focus: follow the Unix philosophy of “do one thing and do it well”, with a strong focus on feed-forward neural networks.
  • Pragmatism: making common use cases easy is more important than supporting every possible use case out of the box.
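Here is a minimal sketch of those goals in practice. Note how little machinery there is: layers are thin wrappers, and the result is a plain Theano expression.

```python
import theano.tensor as T
import lasagne

x = T.matrix('x')

# Layers are thin, composable wrappers; sizes here are arbitrary.
l_in = lasagne.layers.InputLayer(shape=(None, 784), input_var=x)
l_hid = lasagne.layers.DenseLayer(
    l_in, num_units=256, nonlinearity=lasagne.nonlinearities.rectify)
l_out = lasagne.layers.DenseLayer(
    l_hid, num_units=10, nonlinearity=lasagne.nonlinearities.softmax)

# True to the "transparency" goal: this is an ordinary Theano
# expression, ready to be compiled or differentiated.
prediction = lasagne.layers.get_output(l_out)
```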

Chainer

Homepage: http://chainer.org/
Github URL: https://github.com/pfnet/chainer

Chainer is relatively new compared to some of the other libraries on this list, but it is buzzing with development activity.

This library also supports CUDA for running your models on the GPU, but it does not rely on Theano to do so; it ships with its own implementation instead.

Chainer supports various network architectures including feed-forward nets, convnets, recurrent nets and recursive nets. It also supports per-batch architectures.

Chainer looks extremely promising due to the way it builds its computational graph: instead of defining the graph up front and then running it, as you would in e.g. Theano or Torch7, Chainer records the graph on the fly while the forward computation runs (a “define-by-run” approach). This strategy also makes it easy to write multi-GPU parallelization, since the logic comes closer to network manipulation.
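A minimal sketch of that define-by-run style, assuming a recent Chainer release (early versions spelled the model definition slightly differently):

```python
import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L

class MLP(chainer.Chain):
    def __init__(self):
        super(MLP, self).__init__()
        with self.init_scope():
            self.l1 = L.Linear(784, 256)
            self.l2 = L.Linear(256, 10)

    def __call__(self, x):
        # The graph is recorded while this code runs, so ordinary
        # Python control flow (if/for) can reshape the network
        # from one batch to the next.
        h = F.relu(self.l1(x))
        return self.l2(h)

model = MLP()
x = np.zeros((1, 784), dtype=np.float32)
y = model(x)  # defining and running the graph happen together
```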


Know of any other Deep Learning Libraries in Python or have you created your own? Be sure to drop a line in the comments!
