Top /r/MachineLearning Posts, November: StarCraft II for AI Research; Google AI Experiments Website; Google in Montreal

DeepMind and Blizzard to release StarCraft II as an AI research environment; Google AI Experiments Website; Google opens new Montreal-based AI research lab; Lip Reading Sentences in the Wild; Clean implementations of machine learning algorithms
By Matthew Mayo, KDnuggets.

In November on /r/MachineLearning, it's almost all about Google: DeepMind and Blizzard open StarCraft II to AI research, Google launches an A.I. Experiments website, Google lands in Montreal with a new research lab, a lip reading system delivers promising results in the wild, and a collection of minimal and clean implementations of machine learning algorithms rounds things out. Let's get to it.

The top 5 /r/MachineLearning posts of the past month are:

1. DeepMind and Blizzard to release StarCraft II as an AI research environment

This blog post from DeepMind outlines its new initiative, in conjunction with Blizzard Entertainment, to open up StarCraft II to artificial intelligence and machine learning researchers worldwide. The post touts both StarCraft's enduring appeal and DeepMind's recent research successes, and looks ahead to the research gains it anticipates from working with StarCraft II. A particularly interesting excerpt from the post is as follows:

We are also working with Blizzard to create “curriculum” scenarios, which present increasingly complex tasks to allow researchers of any level to get an agent up and running, and benchmark different algorithms and advances. Researchers will also have full flexibility and control to create their own tasks using the existing StarCraft II editing tools.

DeepMind Atari and Labyrinth

Be on the lookout for forthcoming additional information on DeepMind and Blizzard's collaborative effort.
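To make the quoted "curriculum" idea a bit more concrete, here is a purely hypothetical sketch of curriculum-style training: an agent is moved to a harder scenario once it clears a score threshold on the current one. The scenario names, the run_episode stub, and the threshold are all invented stand-ins for illustration; the StarCraft II research environment and its API had not been released at the time of writing, so none of this is DeepMind or Blizzard code.

    import random

    # Hypothetical scenario names, ordered simple-to-complex; invented stand-ins,
    # not actual StarCraft II tasks or API calls.
    CURRICULUM = ["move_to_beacon", "collect_resources", "build_army", "full_game"]

    def run_episode(task, policy):
        """Placeholder for one environment rollout; returns a score in [0, 1]."""
        return random.random()

    def train_with_curriculum(policy, threshold=0.8, max_episodes=1000):
        """Advance to the next, harder scenario once the agent clears the threshold."""
        for task in CURRICULUM:
            for episode in range(max_episodes):
                if run_episode(task, policy) >= threshold:
                    print(f"'{task}' cleared after {episode + 1} episodes; moving on.")
                    break
            else:
                print(f"'{task}' not cleared within {max_episodes} episodes; stopping.")
                return

    train_with_curriculum(policy=None)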

2. Google's new A.I. experiments website

This explanation comes directly from the website:

Making it easier for anyone to start exploring A.I.

With all the exciting A.I. stuff happening, there are lots of people eager to start tinkering with machine learning technology. A.I. Experiments is a showcase for simple experiments that let anyone play with this technology in hands-on ways, through pictures, drawings, language, music, and more.

Google AI Experiments

Simple enough. Some of the prominent projects include: Thing Translator, AI Duet, Infinite Drum Machine, and Quick, Draw! You can also submit your own:

We want to make it easier for any coder – whether you have a machine learning background or not – to create your own experiments. This site includes open-source code and resources to help you get started. If you make something you’d like to share, we’d love to see it and possibly add it to the showcase.

3. Google opens new AI lab and invests $3.4M in Montreal-based AI research

Montreal has been transforming into a recognized deep learning and artificial intelligence research hub for a number of years, thanks in part to the presence of prominent researchers and world class supporting institutions. This recognition only deepened with September's news that the Canadian government was awarding the Université de Montréal $93,562,000 for deep learning and optimization research.

Well, add another feather to Montreal's cap. Google is setting up shop locally, and investing in deep learning research in the city. Read the article, which outlines the specifics and notes how big names like Yoshua Bengio and Hugo Larochelle, along with a pair of world class research institutions, are involved.

There is little doubt that Montreal is fast becoming THE global locale for cutting edge deep learning and artificial intelligence research.

4. Lip Reading Sentences in the Wild

The video linked in the Reddit post is the same one embedded below. It demonstrates the system from the paper "Lip Reading Sentences in the Wild," by Joon Son Chung, Andrew Senior, Oriol Vinyals, and Andrew Zisserman. The reported results are promising, which is nice to hear; seeing the video demonstration, however, places the accomplishments in a tangible and (dare I say) fantastic light.

Have a watch and be sufficiently wowed.

5. A collection of minimal and clean implementations of machine learning algorithms

This collection of machine learning algorithms is implemented in Python, and targets those interested in understanding how the algorithms work under the hood.

Currently implemented algorithms include:

  • Deep learning (MLP, CNN, RNN, LSTM)
  • Linear regression, logistic regression
  • Random Forests
  • SVM with kernels (Linear, Poly, RBF)
  • K-Means
  • Gaussian Mixture Model
  • K-nearest neighbors
  • Naive Bayes
  • PCA
  • Factorization machines
  • Gradient Boosting trees (also known as GBDT, GBRT, GBM, XGBoost)

There is a list of outstanding algorithms the author intends to implement, including t-SNE, MCMC, Word2vec, Adaboost, HMM, and Restricted Boltzmann machine.

I can vouch for the clean implementation of the code, and believe this would be a great learning resource for anyone interested in algorithm design.
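To give a flavor of what "minimal and clean" looks like in practice, here is a short k-nearest neighbors classifier in the same spirit. This is my own illustrative sketch, not code taken from the repository, and the class and method names are my own.

    import numpy as np

    class KNNClassifier:
        """Minimal k-nearest neighbors: Euclidean distance, majority vote."""

        def __init__(self, k=3):
            self.k = k

        def fit(self, X, y):
            # k-NN is a lazy learner: training just memorizes the data.
            self.X = np.asarray(X, dtype=float)
            self.y = np.asarray(y)
            return self

        def predict(self, X):
            predictions = []
            for x in np.asarray(X, dtype=float):
                # Distance from the query point to every training point.
                distances = np.linalg.norm(self.X - x, axis=1)
                # Labels of the k closest training points, then a majority vote.
                nearest = self.y[np.argsort(distances)[: self.k]]
                labels, counts = np.unique(nearest, return_counts=True)
                predictions.append(labels[np.argmax(counts)])
            return np.array(predictions)

    # Tiny usage example with two well-separated clusters.
    clf = KNNClassifier(k=3).fit([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]],
                                 [0, 0, 0, 1, 1, 1])
    print(clf.predict([[0.5, 0.5], [5.5, 5.5]]))  # -> [0 1]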

Related:

  • Top /r/MachineLearning Posts, October: NSFW Image Recognition, Differentiable Neural Computers, Hinton on Coursera
  • Top /r/MachineLearning Posts, September: Open Images Dataset; Whopping Deep Learning Grant; Advanced ML Courseware
  • Top /r/MachineLearning Posts, August: Google Brain AMA, Image Completion with TensorFlow, Japanese Cucumber Farming