PyTorch 1.10 Focuses on Improved Training and Performance
Written by Kay Ewbank   
Thursday, 02 December 2021

PyTorch, Facebook's open-source deep-learning framework, has been updated with integration for the CUDA Graphs API. The new version also has better performance thanks to JIT compiler updates, as well as beta support for the Android Neural Networks API (NNAPI). New releases of the TorchVision and TorchAudio libraries accompany the update.

 


The support for Android NNAPI means Android apps can use hardware accelerators such as GPUs and neural processing units. It is now included as a beta feature, with improvements including the ability to run tests on mobile hosts and support for flexible tensor shapes. The CUDA Graphs API integration is another beta inclusion, and is designed to improve runtime performance for workloads whose GPU use is held back by CPU overhead. The way this works is that a sequence of GPU operations is captured once as a graph and then replayed at runtime, rather than being launched kernel by kernel from the CPU. This loses some flexibility, but removes most of the launch overhead and gives more acceptable performance.
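As a minimal sketch of this capture-and-replay pattern using the torch.cuda.CUDAGraph and torch.cuda.graph APIs added in 1.10 (the linear model, tensor shapes and warm-up count below are illustrative assumptions, not taken from the release notes):

# Minimal sketch of CUDA Graphs capture/replay in PyTorch 1.10+ (needs a CUDA GPU).
# The model, shapes and warm-up loop are placeholder choices for illustration.
import torch

device = torch.device("cuda")
model = torch.nn.Linear(64, 64).to(device).eval()

# Replay reuses fixed memory addresses, so inputs/outputs must be "static" tensors.
static_input = torch.randn(32, 64, device=device)

with torch.no_grad():
    # Warm up on a side stream before capture, as the PyTorch docs recommend.
    s = torch.cuda.Stream()
    s.wait_stream(torch.cuda.current_stream())
    with torch.cuda.stream(s):
        for _ in range(3):
            model(static_input)
    torch.cuda.current_stream().wait_stream(s)

    # Capture the forward pass once into a graph...
    g = torch.cuda.CUDAGraph()
    with torch.cuda.graph(g):
        static_output = model(static_input)

# ...then replay it for new data: copy into the static input and call replay().
static_input.copy_(torch.randn(32, 64, device=device))
g.replay()                      # relaunches the captured kernels without per-kernel CPU launch cost
print(static_output[0, :4])     # static_output now holds the results for the new input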

A number of features have been moved from beta to stable in the new release, including the module for remote communication (RemoteModule), the memory-saving ZeroRedundancyOptimizer, and the hooks that control distributed data parallel (DDP) gradient communication.

The FX module is also now marked as stable. It consists of a symbolic tracer, an intermediate representation, and a Python code generator, and gives developers the tools to create their own custom transformations. You can take a module and convert it to a graph representation that can be modified in code, and the resulting graph can then be converted back to PyTorch-compatible Python source code.
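As a minimal sketch of this trace-modify-regenerate workflow (the toy module and the ReLU-to-GELU swap are illustrative choices, not taken from the release notes):

# Minimal sketch of an FX transformation: trace a module, edit its graph, regenerate code.
import torch
import torch.fx

class Small(torch.nn.Module):
    def forward(self, x):
        return torch.relu(x + 1.0)

traced = torch.fx.symbolic_trace(Small())       # symbolic tracer -> graph IR (a GraphModule)

for node in traced.graph.nodes:
    # call_function nodes wrap free functions such as torch.relu
    if node.op == "call_function" and node.target is torch.relu:
        node.target = torch.nn.functional.gelu  # rewrite the op in place

traced.graph.lint()      # sanity-check the modified graph
traced.recompile()       # regenerate Python source from the graph
print(traced.code)       # the generated forward() code
print(traced(torch.randn(3)))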


More Information

PyTorch Website

PyTorch On GitHub

Related Articles

PyTorch 1.8 Improves FFT Support

PyTorch Adds New APIs

PyTorch Scholarship Challenge

PyTorch Developer Day Updates

PyTorch Adds TorchScript API

PyTorch 1.5 Updates C++ API 

 

