A new version of the Google Prediction API has been released with extra features for working with data models, and more samples.
The API is designed to give developers a way to use a cloud-based pattern recognition engine for applications that make use of artificial intelligence. It exposes Google’s machine learning algorithms as a RESTful API that you can use in your Web applications.
The API recognizes historical patterns in data and uses those patterns to make predictions about patterns in new data. Some suggested uses are spam detection and automatic tagging of unstructured data based on how your users have tagged similar content in the past.
As we reported on I Programmer when the API was first announced, the Prediction API works in three steps: you upload data, which can be unstructured text or numeric values, to Google Storage for Developers; you then run a supervised learning stage to train a model; and finally you use the trained model to classify new data.
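As a rough illustration of that workflow, the sketch below builds the JSON request bodies for the training and prediction calls. The endpoint URL, field names (`storageDataLocation`, `csvInstance`) and the example model id and bucket path are assumptions based on the v1.5 REST interface, not code from Google's samples.

```python
import json

# Assumed v1.5 endpoint root (for illustration only).
API_ROOT = "https://www.googleapis.com/prediction/v1.5/trainedmodels"

def insert_body(model_id, storage_location):
    # Body for a trainedmodels.insert call: names the new model and
    # points the API at training data already uploaded to Google Storage.
    return {"id": model_id, "storageDataLocation": storage_location}

def predict_body(features):
    # Body for a trainedmodels.predict call: csvInstance holds one row
    # of features, in the same column order as the training data.
    return {"input": {"csvInstance": features}}

# Hypothetical example: train a spam classifier, then classify a message.
train_request = json.dumps(insert_body("spam-model", "mybucket/spam-training.csv"))
predict_request = json.dumps(predict_body(["Buy cheap watches now!!!"]))
```

Both bodies would then be POSTed with OAuth-authorized requests, the first to the endpoint root and the second to the model's `/predict` resource.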
With version 1.5, you can now retrieve all your trained models in a single list, or page through them iteratively. A simplified method returns model metadata, including timestamps for when the model was inserted and when its training completed, so you can keep track of your models more easily.
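The paging behaviour can be sketched as a simple loop that keeps requesting pages until no continuation token remains. Here `fetch_page` is a hypothetical stand-in for an authorized `trainedmodels.list` call, and the `items`/`nextPageToken` field names are assumed from the style of Google's paged list APIs.

```python
def list_all_models(fetch_page):
    """Collect every model across pages.

    fetch_page(token) stands in for a trainedmodels.list request; it is
    assumed to return a dict with an 'items' list and, while further
    pages remain, a 'nextPageToken' string.
    """
    models, token = [], None
    while True:
        page = fetch_page(token)
        models.extend(page.get("items", []))
        token = page.get("nextPageToken")
        if not token:            # no token means this was the last page
            return models
```

The same loop covers both new behaviours: stopping after the first page gives page-by-page iteration, while running it to completion yields the full list.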
Model analysis has also been improved, with the ability to obtain more detailed information about your data and models. Issuing a trainedmodels.analyze request returns information about the trained model's output values, its features, its confusion matrix, and more.
There are also two new sample apps, one in Python and one in Java, showing how to use the Prediction API from Google App Engine. The samples demonstrate how to create and manage shared server OAuth 2.0 credentials, and how to make predictions on behalf of site visitors using those shared credentials.