Amazon Creating AI Chips For Alexa
Written by Harry Fairhead   
Sunday, 18 February 2018

To maintain the advantage established with Alexa, Amazon appears to be moving towards developing its own artificial intelligence chips to be used in Echo devices and other hardware.

Despite being outperformed by the Google Assistant in terms of its ability to answer questions, Amazon's Alexa is established as the leader in the voice assistant market.

Currently, the Echo has a relatively simple chip whose job is to pick up the wake word “Alexa.” Once the wake word is detected, whatever a person asks Alexa is sent to Amazon’s servers in the cloud to be processed. As we've repeatedly pointed out, this places a heavy, possibly unsustainable, burden on cloud services as well as introducing a noticeable and sometimes unacceptable delay in receiving a response. An AI chip could go some way towards improving this state of affairs, and such a chip may already be under development.
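To make that division of labour concrete, here is a minimal, purely illustrative Python sketch of the pipeline just described. The wake_word_detector and cloud_client objects, and their methods, are hypothetical stand-ins rather than real Amazon or Alexa APIs; the point is simply that only the wake-word check runs on the device, while everything else incurs a network round trip.

import time

def handle_utterance(audio_frames, wake_word_detector, cloud_client):
    """Process one captured utterance the way today's Echo does."""
    # Step 1: a cheap, always-on local check for the wake word ("Alexa").
    # This is all the current on-device chip has to do.
    if not wake_word_detector.detect(audio_frames):
        return None  # no wake word, so the audio never leaves the device

    # Step 2: everything after the wake word is shipped to the cloud
    # for speech recognition and intent handling.
    start = time.monotonic()
    response = cloud_client.process(audio_frames)   # network round trip
    latency = time.monotonic() - start              # the lag the user notices

    return response, latency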


According to a report in The Information:

Amazon has quietly developed chip-making capabilities over the past two years, through both acquisitions and hiring. It started in 2015 through the $350 million acquisition of Annapurna Labs, a secretive Israeli chipmaker. Amazon said nothing about its plans for Annapurna at the time, although Annapurna said in 2016 it was making a line of chips, called Alpine, for data storage gear, WiFi routers, smart home devices and media streaming devices.

But Annapurna is now working on the AI chip for Alexa-powered devices, said the person familiar with Amazon’s plans.

Incorporating a chip that could run algorithms to answer some users' questions directly would cut the lag typically experienced between asking Alexa a question and receiving an answer, by reducing the amount of traffic referred to the cloud. Of course, cloud services would still be needed to deliver music, news and other externally sourced information, but eliminating the requests that could be handled locally would also improve the latency of those that still had to be handled remotely.
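A hedged sketch of that hybrid approach, again using hypothetical names rather than any published Amazon API, shows how such routing could cut cloud traffic: requests an on-device model can answer never leave the device, and only the remainder are sent on.

def answer(request, local_model, cloud_client):
    """Prefer on-device inference; fall back to the cloud only when needed."""
    # Requests a local AI chip could handle (timers, smart home commands, ...)
    # are answered without a network round trip.
    if local_model.can_handle(request):
        return local_model.infer(request)
    # Music, news and other externally sourced content still needs the cloud,
    # but the cloud now sees less traffic and can respond faster.
    return cloud_client.process(request)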

It was Reuters that initially drew attention to Amazon's latest chip-technology acquisition, reporting last week that in December 2017 Amazon had paid about $90 million to acquire home security camera maker Blink. At the time this appeared to be related to Amazon Key, the program whereby customers set up a smart lock and surveillance camera so couriers can make deliveries inside their homes while they are away. According to Reuters, however, the deal may have hinged more on Blink’s proprietary chip technology. Blink's energy-efficient chips lengthen the battery life of devices and could enable the Echo, which currently needs a plug-in power source, to run on AA lithium batteries. Possibly more important is Blink's chip design expertise: the CEO and two of the co-founders of its owner, Immedia Semiconductor, came from Sand Video, which designed chips in the early 2000s to decode a new and improved video standard.

The Information reports that including Blink, Amazon has 449 people with chip expertise already on the payroll in the United States, with another 377 job openings in the field. Amazon has hired from all over, particularly Intel, Microsoft, Apple and Qualcomm. These numbers don’t include Annapurna, which has about 120 employees mostly in Israel and is also hiring more.

Amazon is also hiring chip engineers in its Amazon Web Services unit, and industry executives say there are indications it may be designing an AI chip for servers in AWS data centers. AWS has hired at least nine former engineers from the defunct chip startup Tabula, which worked on a new design of field-programmable gate array (FPGA) that can be reprogrammed on the fly. If Amazon does develop an AI chip for its data centers, it would be following in the footsteps of Google, which in 2016 unveiled its Tensor Processing Unit, designed to run the deep learning algorithms behind services like Search, Street View, Photos and Translate.

AI chips in both the Echo devices and the AWS data centers would seem to solve the dilemma posed by the success of the Echo: how can Amazon make a profit from such a data-hungry device, or rather, how can it avoid making a loss on giving away so many hours of compute time?

 

 

More Information 

Annapurna Labs

Related Articles

Amazon Alexa Extending Its Influence

The State Of Voice As UI

New Alexa Skills Kit Developer Console 

Alexa For Business - The Big Shake Up

Amazon Starting To Monetize Alexa Skills

 

 

To be informed about new articles on I Programmer, sign up for our weekly newsletter, subscribe to the RSS feed and follow us on Twitter, Facebook or Linkedin.






Last Updated ( Monday, 19 February 2018 )