**By Dan Clark, KDnuggets.**

Python continues to lead the way in Machine Learning, AI, Deep Learning, and Data Science tasks. According to builtwith.com, 45% of technology companies prefer Python for implementing AI and Machine Learning.

Because of this, we’ve decided to start a series investigating the top Python libraries across several categories:

**Top 8 Python Machine Learning Libraries ✅**

**Top 13 Python Deep Learning Libraries ✅** – this post

Top X Python Reinforcement Learning and Evolutionary Computation Libraries – COMING SOON!

Top X Python Data Science Libraries – COMING SOON!

Of course, these lists are entirely subjective as many libraries could easily place in multiple categories. For example, TensorFlow is included in this list but Keras has been omitted and features in the Machine Learning library collection instead. This is because Keras is more of an ‘end-user’ library like SKLearn, as opposed to TensorFlow which appeals more to researchers and Machine Learning engineer types.

As always, please feel free to vent your frustrations/disagreements/annoyance in the comments section below!

**Fig. 1: Top 13 Python Deep Learning Libraries, by Commits and Contributors.** Circle size is proportional to number of stars.

Now, let’s get onto the list (GitHub figures correct as of October 23rd, 2018):

**1. TensorFlow (Contributors – 1700, Commits – 42256, Stars – 112591)**

“**TensorFlow** is an open source software library for numerical computation using data flow graphs. The graph nodes represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) that flow between them. This flexible architecture enables you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device without rewriting code.”
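
The "data flow graph" idea in that quote can be sketched in a few lines of plain Python. This is an illustrative toy, not the TensorFlow API: nodes hold operations, and edges carry the values that flow between them.

```python
# Toy data-flow graph (illustrative only; not the TensorFlow API).
# Nodes hold operations; a node's inputs are the incoming edges.

class Node:
    def __init__(self, op, inputs=()):
        self.op = op          # function computing this node's value
        self.inputs = inputs  # upstream nodes (incoming edges)

    def evaluate(self):
        # Recursively evaluate upstream nodes, then apply this node's op.
        return self.op(*(n.evaluate() for n in self.inputs))

def constant(value):
    return Node(lambda: value)

# Build the graph for (2 + 3) * 4, then run it.
a, b, c = constant(2.0), constant(3.0), constant(4.0)
add = Node(lambda x, y: x + y, (a, b))
mul = Node(lambda x, y: x * y, (add, c))

print(mul.evaluate())  # 20.0
```

Separating graph construction from evaluation is what lets a framework like TensorFlow optimise the graph and place pieces of it on different devices before anything runs.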

**2. PyTorch (Contributors – 806, Commits – 14022, Stars – 20243)**

“PyTorch is a Python package that provides two high-level features:

- Tensor computation (like NumPy) with strong GPU acceleration
- Deep neural networks built on a tape-based autograd system

You can reuse your favorite Python packages such as NumPy, SciPy and Cython to extend PyTorch when needed.”
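
The "tape-based autograd system" mentioned above can be illustrated with a toy in plain Python (this is not PyTorch's actual implementation): each operation records a backward step on a tape, and calling backward() replays the tape in reverse to accumulate gradients.

```python
# Toy tape-based autograd (illustrative only; not PyTorch internals).

class Var:
    _tape = []  # global tape recording backward steps in forward order

    def __init__(self, value):
        self.value = value
        self.grad = 0.0

    def __add__(self, other):
        out = Var(self.value + other.value)
        def backward():
            self.grad += out.grad    # d(a+b)/da = 1
            other.grad += out.grad   # d(a+b)/db = 1
        Var._tape.append(backward)
        return out

    def __mul__(self, other):
        out = Var(self.value * other.value)
        def backward():
            self.grad += other.value * out.grad  # d(a*b)/da = b
            other.grad += self.value * out.grad  # d(a*b)/db = a
        Var._tape.append(backward)
        return out

    def backward(self):
        self.grad = 1.0
        for step in reversed(Var._tape):  # replay the tape in reverse
            step()

x = Var(3.0)
y = Var(4.0)
z = x * y + x       # z = x*y + x
z.backward()
print(x.grad, y.grad)  # 5.0 3.0  (dz/dx = y + 1, dz/dy = x)
```

Because the tape is recorded as the code runs, ordinary Python control flow (loops, conditionals) works naturally, which is a large part of PyTorch's appeal.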

**3. Apache MXNet (Contributors – 628, Commits – 8723, Stars – 15447)**

“Apache MXNet (incubating) is a deep learning framework designed for both *efficiency* and *flexibility*. It allows you to **mix** symbolic and imperative programming to **maximize** efficiency and productivity. At its core, MXNet contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly.”
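
The symbolic/imperative mix can be sketched in plain Python (a toy, not the MXNet API): the same model function either runs eagerly on numbers, or builds a deferred graph when handed a symbolic placeholder.

```python
# Toy mix of imperative and symbolic styles (not the MXNet API).

class Symbol:
    def __init__(self, fn):
        self.fn = fn  # deferred computation: env -> value

    def __add__(self, other):
        return Symbol(lambda env: self.fn(env) + _fn(other)(env))

    def __mul__(self, other):
        return Symbol(lambda env: self.fn(env) * _fn(other)(env))

    __radd__, __rmul__ = __add__, __mul__  # allow 2 * x, 2 + x

    def bind(self, **env):
        return self.fn(env)  # execute the deferred graph

def _fn(v):
    return v.fn if isinstance(v, Symbol) else (lambda env: v)

def variable(name):
    return Symbol(lambda env: env[name])

def net(x):
    return x * x + 2 * x  # one definition, usable in both styles

print(net(3.0))                # imperative, runs eagerly: 15.0
graph = net(variable("x"))     # symbolic: builds a deferred graph
print(graph.bind(x=3.0))       # 15.0
```

The symbolic form is what gives a scheduler the whole computation up front, so it can reorder and parallelize operations as the quote describes.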

**4. Theano (Contributors – 329, Commits – 28033, Stars – 8536)**

“Theano is a Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. It can use GPUs and perform efficient symbolic differentiation.”
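
Symbolic differentiation, as opposed to the numeric kind, means producing a new expression for the derivative. A toy sketch in plain Python (illustrative only; nothing like Theano's internals), with expressions as nested tuples:

```python
# Toy symbolic differentiation over tuple expressions
# (illustrative only; not Theano's implementation).

def diff(expr, var):
    if isinstance(expr, (int, float)):
        return 0                       # constants differentiate to 0
    if isinstance(expr, str):
        return 1 if expr == var else 0 # d(var)/d(var) = 1
    op, a, b = expr
    if op == "+":
        return ("+", diff(a, var), diff(b, var))
    if op == "*":                      # product rule
        return ("+", ("*", diff(a, var), b), ("*", a, diff(b, var)))
    raise ValueError(op)

def evaluate(expr, env):
    if isinstance(expr, (int, float)):
        return expr
    if isinstance(expr, str):
        return env[expr]
    op, a, b = expr
    x, y = evaluate(a, env), evaluate(b, env)
    return x + y if op == "+" else x * y

f = ("+", ("*", "x", "x"), ("*", 3, "x"))  # f(x) = x*x + 3x
df = diff(f, "x")                          # derivative as a new expression
print(evaluate(df, {"x": 2.0}))            # 7.0  (2x + 3 at x = 2)
```

Because the derivative is itself an expression, Theano can simplify and compile it for the GPU before evaluation, which is the "optimize" step the quote mentions.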

**5. Caffe (Contributors – 270, Commits – 4152, Stars – 25927)**

“Caffe is a deep learning framework made with expression, speed, and modularity in mind. It is developed by Berkeley AI Research (BAIR)/The Berkeley Vision and Learning Center (BVLC) and community contributors.”

**6. fast.ai (Contributors – 226, Commits – 2237, Stars – 8872)**

“The fastai library simplifies training fast and accurate neural nets using modern best practices. See the fastai website to get started. The library is based on research into deep learning best practices undertaken at fast.ai, and includes "out of the box" support for vision, text, tabular, and collab (collaborative filtering) models.”

**7. CNTK (Contributors – 189, Commits – 15979, Stars – 15281)**

“The Microsoft Cognitive Toolkit (https://cntk.ai) is a unified deep learning toolkit that describes neural networks as a series of computational steps via a directed graph. In this directed graph, leaf nodes represent input values or network parameters, while other nodes represent matrix operations upon their inputs. CNTK allows users to easily realize and combine popular model types such as feed-forward DNNs, convolutional nets (CNNs), and recurrent networks (RNNs/LSTMs).”

**8. TFLearn (Contributors – 118, Commits – 599, Stars – 8632)**

“TFlearn is a modular and transparent deep learning library built on top of Tensorflow. It was designed to provide a higher-level API to TensorFlow in order to facilitate and speed up experimentation, while remaining fully transparent and compatible with it.”

**9. Lasagne (Contributors – 64, Commits – 1157, Stars – 3534)**

“Lasagne is a lightweight library to build and train neural networks in Theano. It supports feed-forward networks such as Convolutional Neural Networks (CNNs), recurrent networks including Long Short-Term Memory (LSTM), and any combination thereof.”

**10. nolearn (Contributors – 14, Commits – 389, Stars – 909)**

“*nolearn* contains a number of wrappers and abstractions around existing neural network libraries, most notably Lasagne, along with a few machine learning utility modules. All code is written to be compatible with scikit-learn.”
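
"Compatible with scikit-learn" means exposing the estimator protocol (fit/predict plus get_params/set_params), so scikit-learn tools such as grid search can drive the wrapped network. A minimal sketch with a hypothetical estimator, in plain Python:

```python
# Hypothetical estimator following the scikit-learn protocol
# (a sketch of the convention, not nolearn's actual classes).

class MeanRegressor:
    """Trivial estimator: predicts the mean of the training targets."""

    def __init__(self, offset=0.0):
        self.offset = offset

    def get_params(self, deep=True):      # required by scikit-learn tools
        return {"offset": self.offset}

    def set_params(self, **params):
        for key, value in params.items():
            setattr(self, key, value)
        return self

    def fit(self, X, y):
        self.mean_ = sum(y) / len(y)      # learned attributes end in "_"
        return self                       # fit returns self by convention

    def predict(self, X):
        return [self.mean_ + self.offset for _ in X]

model = MeanRegressor().fit([[0], [1], [2]], [1.0, 2.0, 3.0])
print(model.predict([[5]]))  # [2.0]
```

Following this protocol is what lets a wrapped Lasagne network drop into scikit-learn pipelines and cross-validation unchanged.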

**11. Elephas (Contributors – 13, Commits – 249, Stars – 1046)**

“Elephas is an extension of Keras, which allows you to run distributed deep learning models at scale with Spark. Elephas currently supports a number of applications, including:

- Data-parallel training of deep learning models
- Distributed hyper-parameter optimization
- Distributed training of ensemble models”
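
Data-parallel training, the first item above, can be sketched in plain Python (a toy, not the Elephas API): each worker trains a copy of the model on its own shard of the data, and the driver averages the resulting weights.

```python
# Toy data-parallel training with weight averaging
# (illustrative only; Elephas runs the workers on Spark executors).

def sgd_step(w, batch, lr=0.01):
    # One gradient step for the 1-D least-squares model y_hat = w * x.
    grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
    return w - lr * grad

def train_shard(w, shard, epochs=100):
    for _ in range(epochs):
        w = sgd_step(w, shard)
    return w

data = [(x, 2.0 * x) for x in range(1, 9)]   # ground truth: w = 2
shards = [data[:4], data[4:]]                # one shard per "worker"

# Each worker trains independently; the driver averages the weights.
weights = [train_shard(0.0, shard) for shard in shards]
w = sum(weights) / len(weights)
print(round(w, 3))  # 2.0
```

Real systems average (or otherwise merge) weights repeatedly during training rather than once at the end, but the shard-train-average cycle is the core of the data-parallel approach.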

**12. spark-deep-learning (Contributors – 12, Commits – 83, Stars – 1131)**

“Deep Learning Pipelines provides high-level APIs for scalable deep learning in Python with Apache Spark. The library comes from Databricks and leverages Spark for its two strongest facets:

- In the spirit of Spark and Spark MLlib, it provides easy-to-use APIs that enable deep learning in very few lines of code.
- It uses Spark's powerful distributed engine to scale out deep learning on massive datasets.”

**13. Distributed Keras (Contributors – 5, Commits – 1125, Stars – 523)**

“Distributed Keras is a distributed deep learning framework built on top of Apache Spark and Keras, with a focus on "state-of-the-art" distributed optimization algorithms. We designed the framework in such a way that a new distributed optimizer could be implemented with ease, thus enabling a person to focus on research.”

Keep an eye out for the next part of this series - which focuses on Reinforcement Learning and evolutionary computation libraries - that will be published over the next few weeks!

**Related:**

- Top 8 Python Machine Learning Libraries
- A Deep Look at Deep Learning: Understanding The Basics of How (and Why) it Works
- Introduction to Deep Learning with Keras