New Hadoop connectors
Written by Kay Ewbank   
Wednesday, 17 August 2011

Microsoft plans to release a community technology preview (CTP) of two new Hadoop connectors, one for SQL Server and one for the Parallel Data Warehouse (PDW) solution.

SQL Server 2008 R2 Parallel Data Warehouse (PDW) is described by Microsoft as a complete, high-scale data warehousing solution that enterprises can use to manage and enable self-service business intelligence with SQL Server.

In addition to the Hadoop connectors, Microsoft has released a second Appliance Update (AU2) for PDW that adds more programming options, along with four new connectors for heterogeneous BI (Business Intelligence) and ETL (Extract, Transform and Load) environments.

Appliance updates are similar to service packs, but contain both hardware and software updates. The new appliance update adds features whose absence users had complained about, including multi-statement batches, T-SQL variables, temporary tables, conditional logic and control flow statements. The new connectors mean PDW can be used with SAP Business Objects and Informatica.
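To give a feel for what AU2 enables, here is a minimal sketch combining the newly supported constructs in one batch: a T-SQL variable, a temporary table, conditional logic and multiple statements. The table and column names (Sales, SaleID, Amount) are hypothetical, chosen purely for illustration.

```sql
-- Hypothetical example of constructs newly supported in PDW AU2
DECLARE @threshold MONEY = 1000;           -- T-SQL variable

CREATE TABLE #RecentSales (                -- temporary table
    SaleID INT,
    Amount MONEY
);

INSERT INTO #RecentSales (SaleID, Amount)
SELECT SaleID, Amount
FROM Sales
WHERE Amount > @threshold;

IF (SELECT COUNT(*) FROM #RecentSales) > 0 -- conditional logic
    SELECT * FROM #RecentSales;
ELSE
    PRINT 'No qualifying sales';

DROP TABLE #RecentSales;
```

Before AU2, each of these statements would have had to be issued separately, with no variables or temporary tables to carry state between them.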


The Hadoop connectors will let customers with unstructured data stored in Hadoop analyze it alongside structured relational data, by enabling data transfer between Hadoop and SQL Server or PDW.

You can read more about the appliance update and the Hadoop connectors on the SQL Server Team Blog.

Further Interest

Hadoop in Action

Hadoop: The Definitive Guide

Pro Hadoop

Project Daytona


