LG Auptimizer Open Sourced
Written by Kay Ewbank   
Friday, 15 November 2019

An optimization tool for Machine Learning (ML) that automates many of the tedious parts of the model building process has been released in an open source version by LG. Auptimizer is designed to run and record sophisticated hyperparameter optimization (HPO) experiments to assist with fine-tuning ML models.

The developers of Auptimizer say that they created it because it is difficult to tune machine learning models at scale, especially the task of finding the right hyperparameter values.


The current release of Auptimizer is simple to use, and it provides ways to run and record sophisticated hyperparameter optimization (HPO) experiments for you. It can help make use of the compute resources at your disposal, whether those are a couple of GPUs or AWS. Auptimizer provides a single access point to HPO algorithms, including Bayesian optimization and multi-armed bandit.

Future versions of Auptimizer will support end-to-end model building for edge devices, including model compression and neural architecture search. The currently supported techniques include Random, Grid, Hyperband, Hyperopt, Spearmint, EAS (experimental) and Passive.
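To illustrate what two of the simpler listed techniques do, here is a minimal, generic sketch in plain Python (this is not Auptimizer's API; the function and parameter names are invented for illustration). Grid search would enumerate every combination in the search space, while random search, shown here, samples combinations independently:

```python
import random

# Hypothetical search space: each hyperparameter has a list of candidate values.
search_space = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [16, 32, 64],
}

def train(params):
    # Stand-in for a real training run; returns a score to maximize.
    return 1.0 / (params["learning_rate"] * params["batch_size"])

def random_search(space, n_trials, seed=0):
    """Sample hyperparameter combinations at random and keep the best."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {k: rng.choice(v) for k, v in space.items()}
        score = train(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, score = random_search(search_space, n_trials=10)
```

More sophisticated techniques such as Hyperband or Bayesian optimization use feedback from earlier trials to decide what to try next, rather than sampling blindly as above.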

The way Auptimizer works is that researchers call it with a few lines of code, identifying the hyperparameters to be investigated. Auptimizer converts this into a training script for the model, which can then be used to try different hyperparameter algorithms to see which works best. Auptimizer can also be used to manage available computing resources. Once the tests have been run, Auptimizer records the experiment results, and it can also map and save hyperparameter values.

While Auptimizer will handle resources automatically, users can also control resources directly. It can also be used with resource management tools such as Boto.

The developers say they gave Auptimizer a user-friendly interface so that users can easily adopt it in their workflows and researchers can quickly implement novel HPO algorithms. All implemented HPO algorithms share the same interface, so users can switch between different algorithms without changing their code. Auptimizer is also designed to be scalable: you can deploy it to a pool of computing resources to automatically scale out an experiment. New HPO algorithms can be easily integrated into the Auptimizer framework, which the researchers say makes Auptimizer a universal platform for developing new algorithms efficiently.
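The shared-interface idea can be sketched as follows. The class and method names here are purely illustrative, not Auptimizer's actual classes: the point is that when every algorithm exposes the same propose/report methods, the experiment loop never needs to change when the algorithm does:

```python
import random
from itertools import product

class RandomProposer:
    """Proposes hyperparameters by sampling the space at random."""
    def __init__(self, space, seed=0):
        self.space = space
        self.rng = random.Random(seed)
    def get(self):
        return {k: self.rng.choice(v) for k, v in self.space.items()}
    def update(self, params, score):
        pass  # random search ignores feedback from finished trials

class GridProposer:
    """Proposes hyperparameters by cycling through every combination."""
    def __init__(self, space):
        keys = list(space)
        self.trials = [dict(zip(keys, vals)) for vals in product(*space.values())]
        self.i = 0
    def get(self):
        params = self.trials[self.i % len(self.trials)]
        self.i += 1
        return params
    def update(self, params, score):
        pass  # grid search also ignores feedback

def run_experiment(proposer, train_fn, n_trials):
    """Algorithm-agnostic loop: works with any proposer exposing get()/update()."""
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = proposer.get()
        score = train_fn(params)
        proposer.update(params, score)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

A feedback-driven method such as Bayesian optimization would fit the same mould, with `update` actually using the reported scores to steer the next `get`.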


More Information

Research paper on Auptimizer

Auptimizer Developer Guide

Auptimizer On GitHub

Related Articles

Machine Learning With App Inventor

Facebook Open Sources Natural Language Processing Model

Microsoft Open Sources Natural Language Processing Tool


