Facebook Shares Deep Learning Tools
Written by Alex Armstrong   
Thursday, 22 January 2015

Facebook AI Research (FAIR) has announced that it is open sourcing the deep learning modules it uses to train larger neural nets in less time than is possible with publicly available alternatives.

 


 

Since Yann LeCun was recruited in 2013 to head Facebook's newly founded AI group, FAIR, it has grown into a team of 36 people and made great strides forward.

Fortunately for the field of deep learning and convolutional nets, it believes that:

Progress in science and technology accelerates when scientists share not just their results, but also their tools and methods.

Hence its decision to do just that with a set of tools that give up to a 23.5x speed-up over publicly available convolutional layer code, together with the ability to parallelize neural network training over multiple GPU cards.

The tools are being made available for Torch, an open source development environment for numerics, machine learning and computer vision that is widely used in academic labs as well as at Google/DeepMind, Twitter, NVIDIA, AMD, Intel and many other companies.

The following fast nn modules for ConvNets, and for neural networks in general, are provided as a plug-in to the Torch-7 framework:

  • Fast spatial convolution modules that use FFTs to accelerate convolutions; a conference paper gives the details.

  • Fast temporal convolutions that are 1.5x to 10x faster than Torch's cunn implementations.

  • nn.DataParallel and nn.ModelParallel containers. Plug your model into them and see it accelerate over multiple GPUs, as the sketch after this list illustrates.

  • Wrappers to use FFT/IFFT as nn modules.

  • A fast LookupTable for neural language models and word embeddings, much faster than the one in torch/nn.

  • A hierarchical SoftMax module that makes classifying over 1 million classes a practically viable strategy.

  • LP and Max Pooling over feature maps (usable for MaxOut).
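
To give a sense of how the parallel containers are meant to be used, here is a minimal, illustrative Lua/Torch sketch. It assumes the modules are loaded with require 'fbcunn' and that nn.DataParallel splits each minibatch along the batch dimension and takes one replica per GPU via :add(), in the manner of later Torch parallel containers; the exact constructor arguments should be checked against the fbcunn documentation.

require 'cutorch'
require 'cunn'
require 'fbcunn'   -- the released modules (assumed package name)

-- A small convnet built from standard Torch nn modules
local net = nn.Sequential()
net:add(nn.SpatialConvolution(3, 16, 5, 5))      -- 3 input planes -> 16, 5x5 kernels
net:add(nn.ReLU())
net:add(nn.SpatialMaxPooling(2, 2, 2, 2))
net:add(nn.View(-1):setNumInputDims(3))          -- flatten each example's feature maps
net:add(nn.Linear(16 * 14 * 14, 10))
net:add(nn.LogSoftMax())

-- Wrap the model in the DataParallel container so each minibatch is split
-- across the available GPUs (splitting on dimension 1 is an assumption here)
local dp = nn.DataParallel(1)
for gpu = 1, cutorch.getDeviceCount() do
   dp:add(net:clone())
end
dp:cuda()

-- One forward pass over a random batch of 128 RGB images of size 32x32
local input = torch.CudaTensor(128, 3, 32, 32):normal()
local output = dp:forward(input)
print(output:size())    -- expected: 128 x 10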

To use these packages with Torch, visit the fbcunn page, which has installation instructions, documentation and examples for training classifiers over ImageNet.
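
As a rough indication of what the FFT-based convolutions look like in practice, the sketch below swaps a standard cunn spatial convolution for the FFT version. The module name nn.SpatialConvolutionCuFFT and its argument order are assumptions to be verified against the fbcunn page; note that the FFT approach trades extra GPU memory for speed and is best suited to stride-1 convolutions.

require 'cutorch'
require 'cunn'
require 'fbcunn'

-- A standard cunn convolution: 64 input planes, 128 output planes, 3x3 kernel, stride 1
local conv    = nn.SpatialConvolution(64, 128, 3, 3, 1, 1):cuda()

-- The FFT-based drop-in replacement from fbcunn (module name assumed; see the docs)
local fftConv = nn.SpatialConvolutionCuFFT(64, 128, 3, 3, 1, 1):cuda()

-- Both layers accept the same batched input and produce the same output shape
local input = torch.CudaTensor(32, 64, 27, 27):normal()
print(conv:forward(input):size())       -- 32 x 128 x 25 x 25
print(fftConv:forward(input):size())    -- 32 x 128 x 25 x 25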

Facebook has also recently released iTorch, an iPython-based interface for Torch with visualization and plotting, and has previously made available fbnn (extensions to torch/nn), fbcuda (extensions to CUDA) and fblualib (libraries and utilities for Lua).

Concluding the announcement on the FAIR Blog, Soumith Chintala notes:

We hope that these high-quality code releases will be a catalyst to the research community and we will continue to update them from time to time.
