Apache MXNet Deep Learning Adds Julia API
Written by Kay Ewbank
Monday, 11 March 2019
An updated version of an Apache deep learning library has been released. Improvements in MXNet 1.4.0 include Java bindings for inference and Julia bindings.

MXNet is an open-source deep learning framework used to train and deploy deep neural networks. It is scalable, allowing for fast model training, and supports a flexible programming model and multiple languages (C++, Python, Julia, Clojure, JavaScript, R and Scala).

In deep neural networks, researchers arrange artificial neurons into layers, and the neurons in one layer take their input from the neurons in the layer below them. Acceleration libraries like MXNet are designed to make it easier for developers to make full use of GPUs and cloud computing when developing such systems.

The developers of MXNet say that other well-known scientific computing stacks such as Matlab, R, or NumPy & SciPy lack an easy way to make full use of distributed resources. By contrast, MXNet has been designed to support device placement, so you can specify where data structures should be stored on a distributed system. It also supports multi-GPU training and provides predefined layers that are optimized for speed. The library also offers automatic differentiation, which means MXNet automates the derivative calculations. Both device placement and automatic differentiation are sketched in the Python snippets below.

The new release of MXNet (which is still an Apache Incubating project) adds a new high-level Java Inference API for making predictions in Java with deep learning models trained using MXNet. This simplifies production deployment of Apache MXNet models in enterprise systems that run on Java.

The new release also includes a Julia API that provides efficient tensor computation across multiple devices, including multiple CPUs, GPUs and distributed server nodes.

Other improvements include control flow operators that can be used to turn dynamic neural network graphs into optimized static computation graphs. Automated JVM memory management has also been included, and Apache MXNet now supports distributed training using the Horovod framework, an open source distributed training framework created at Uber.

A new Subgraph API has also been added, meaning that MXNet can integrate different kinds of backend libraries such as TVM, MKLDNN, TensorRT and Intel nGraph.
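As a minimal illustration of the device placement described above, the following sketch uses MXNet's Python API to create arrays on a specific device and move data between contexts. The array values and the device index are purely illustrative, not taken from the release.

```python
import mxnet as mx

# Arrays live on a context; the default is the CPU.
a = mx.nd.array([1.0, 2.0, 3.0])                 # stored on cpu(0)

# If a GPU is present, place a second array directly on it
# and copy the first one over so both operands share a context.
if mx.context.num_gpus() > 0:
    b = mx.nd.array([4.0, 5.0, 6.0], ctx=mx.gpu(0))
    c = a.as_in_context(mx.gpu(0)) + b
    print(c.context)                             # gpu(0)
```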
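The automatic differentiation mentioned above is exposed through MXNet's autograd package in Python. The sketch below records a simple computation and then asks MXNet for the gradient; the function and values are just an example.

```python
import mxnet as mx
from mxnet import autograd

x = mx.nd.array([1.0, 2.0, 3.0])
x.attach_grad()                    # allocate storage for dy/dx

with autograd.record():            # record operations on x
    y = 2 * x * x                  # y = 2x^2

y.backward()                       # run the backward pass
print(x.grad)                      # dy/dx = 4x -> [4, 8, 12]
```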
More Information

Related Articles

NVIDIA Updates Free Deep Learning Software

Microsoft and Amazon Announce Gluon

ONNX For AI Model Interoperability

Microsoft Cognitive Toolkit Version 2.0