DeepMind and StarCraft II For Reinforcement Learning
Written by Sue Gee   
Wednesday, 09 November 2016

After Go, where is Google's DeepMind going next? One answer is into virtual warfare with the real-time strategy game StarCraft II, a galactic, interspecies struggle for dominance.


The announcement of DeepMind's collaboration with Blizzard Entertainment to open up StarCraft II to AI and machine learning researchers around the world came at BlizzCon 2016. The idea is that the game, in which players fight one another by gathering resources to pay for defensive and offensive units, involves enough planning and decision making to provide a useful scenario for reinforcement learning.

According to the DeepMind blog:

DeepMind is on a scientific mission to push the boundaries of AI, developing programs that can learn to solve any complex problem without needing to be told how. Games are the perfect environment in which to do this, allowing us to develop and test smarter, more flexible AI algorithms quickly and efficiently, and also providing instant feedback on how we’re doing through scores.

StarCraft is an interesting testing environment for current AI research because it provides a useful bridge to the messiness of the real-world. The skills required for an agent to progress through the environment and play StarCraft well could ultimately transfer to real-world tasks.

The blog post explains how, as a real-time strategy game, StarCraft is a suitable vehicle for AI researchers:

An agent that can play StarCraft will need to demonstrate effective use of memory, an ability to plan over a long time, and the capacity to adapt plans based on new information. Computers are capable of extremely fast control, but that doesn’t necessarily demonstrate intelligence, so agents must interact with the game within limits of human dexterity in terms of “Actions Per Minute”. StarCraft’s high-dimensional action space is quite different from those previously investigated in reinforcement learning research; to execute something as simple as “expand your base to some location”, one must coordinate mouse clicks, camera, and available resources.
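The "Actions Per Minute" cap mentioned above could be enforced with a simple sliding-window rate limiter. The sketch below is purely illustrative — the class name, the cap value and the interface are assumptions, not part of any announced DeepMind or Blizzard API:

```python
import time
from collections import deque

class APMLimiter:
    """Hypothetical sliding-window limiter capping an agent's
    Actions Per Minute to keep it within human dexterity limits."""

    def __init__(self, max_apm=180):
        self.max_apm = max_apm
        self.timestamps = deque()  # times of actions in the last 60s

    def allow(self, now=None):
        """Return True if another action may be issued at time `now`."""
        now = time.monotonic() if now is None else now
        # Discard actions that have aged out of the 60-second window.
        while self.timestamps and now - self.timestamps[0] >= 60.0:
            self.timestamps.popleft()
        if len(self.timestamps) < self.max_apm:
            self.timestamps.append(now)
            return True
        return False

limiter = APMLimiter(max_apm=2)
assert limiter.allow(now=0.0)
assert limiter.allow(now=1.0)
assert not limiter.allow(now=2.0)   # cap reached within the window
assert limiter.allow(now=61.0)      # oldest action has expired
```

An agent wrapper would simply drop or queue actions whenever `allow()` returns False, which is one plausible way of making fast machine control comparable to human play.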

The DeepMind and StarCraft II teams have collaborated to develop an API that allows programmatic control of individual units and access to the full game state:

Ultimately agents will play directly from pixels, so to get us there, we’ve developed a new image-based interface that outputs simplified low resolution RGB image data for map & minimap, and the option to break out features into separate “layers”, like terrain heightfield, unit type, unit health etc.
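An observation in this style might look like the following sketch. Since the interface had not been released at the time of writing, every name, shape and dtype here is an illustrative assumption, not the actual API:

```python
import numpy as np

# Hypothetical map resolution; real sizes were not announced.
MAP_SIZE = (64, 64)

# A plausible observation: low-res RGB views plus per-feature "layers",
# each layer a 2-D grid over the map, as the blog post describes.
observation = {
    "rgb_screen":  np.zeros(MAP_SIZE + (3,), dtype=np.uint8),  # main map view
    "rgb_minimap": np.zeros((32, 32, 3),     dtype=np.uint8),
    "height_map":  np.zeros(MAP_SIZE, dtype=np.uint8),   # terrain heightfield
    "unit_type":   np.zeros(MAP_SIZE, dtype=np.int32),   # unit type id per cell
    "unit_health": np.zeros(MAP_SIZE, dtype=np.int32),   # hit points per cell
}

# An agent could consume the scalar layers like image channels, e.g.
# stacking them into one tensor for a convolutional network:
stacked = np.stack(
    [observation["height_map"],
     observation["unit_type"].astype(np.uint8),
     np.clip(observation["unit_health"], 0, 255).astype(np.uint8)],
    axis=-1)
assert stacked.shape == (64, 64, 3)
```

The point of breaking features into separate layers is that an agent can learn from clean, structured inputs first, before tackling the raw-pixel version of the same problem.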

A video clip accompanying the announcement gives an example of what the feature layer API will look like.

The post also says:

We are also working with Blizzard to create “curriculum” scenarios, which present increasingly complex tasks to allow researchers of any level to get an agent up and running, and benchmark different algorithms and advances.

While this post doesn't go into exactly how external AI researchers will gain access to the StarCraft II environment, it indicates that it will be part of the forthcoming DeepMind Labyrinth, which is to be released under an open source license.


More Information

DeepMind and Blizzard to release StarCraft II as an AI research environment

Formation of Partnership On AI 

Related Articles

AlphaGo Changed Everything

AlphaGo Beats Lee Sedol Final Score 4-1

DeepMind's Differentiable Neural Network Thinks Deeply 

 
