JetBrains Adds Claude Support To AI Assistant
Written by Kay Ewbank   
Wednesday, 12 February 2025

JetBrains has added support for Anthropic's Claude 3.5 Sonnet and Claude 3.5 Haiku and OpenAI's o1, o1-mini, and o3-mini models to AI Assistant, its AI-powered coding tool, along with local LLM support via LM Studio, which lets users connect the AI chat to locally hosted models.

AI Assistant is an upgrade to an earlier profiling tool, with extra features including an integrated AI chat, code explanations, automated documentation generation, name suggestions, and commit message generation.


AI Assistant chooses the optimal model for each task. The JetBrains AI service behind the tool connects users to different LLMs and providers so it can use the best-performing and most cost-efficient models. When AI Assistant was released, JetBrains said this architecture means future models can be adopted without users needing to switch AI providers, and the support for the new models follows from that design.

This latest update adds support for the most recent Claude models, Claude 3.5 Sonnet and Claude 3.5 Haiku, provisioned in Amazon Bedrock. The use of Bedrock for provisioning means the models are available to AI Assistant users globally thanks to Amazon Bedrock's coverage across 17 regions worldwide. JetBrains says this ensures consistent access to AI services while adhering to data residency requirements and complying with local regulations. The company also highlights cross-region inference, which automatically distributes workloads during high traffic, minimizing latency and ensuring smooth operation even under heavy demand.

Claude 3.5 Sonnet was the first model released in the 3.5 line, and Anthropic says it sets new industry benchmarks for graduate-level reasoning (GPQA), undergraduate-level knowledge (MMLU), and coding proficiency (HumanEval), and shows marked improvement in grasping nuance, humor, and complex instructions.

The addition of support for OpenAI's o3-mini and o1-mini models is aimed at users who want faster and more cost-efficient reasoning capabilities. These compact OpenAI models offer faster processing than o1 and are tailored for coding, scientific, and mathematical tasks.

The other new option is the ability to connect AI chat to locally hosted models using LM Studio. The LM Studio platform provides an interface for managing and running AI models on your local machine, and JetBrains says this increases data privacy and allows you to customize your environment to meet specific requirements.
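Under the hood, LM Studio exposes a local HTTP server that speaks the OpenAI chat-completions protocol, which is what makes this kind of integration possible. The sketch below shows a minimal request to that server using only the Python standard library; the default port (1234) is LM Studio's documented default, while the model name and prompt are placeholders you would replace with whatever model you have loaded locally.

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions protocol.
# http://localhost:1234 is LM Studio's default server address.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

payload = {
    "model": "local-model",  # hypothetical name; use the model loaded in LM Studio
    "messages": [
        {"role": "user", "content": "Explain what a Kotlin data class is."}
    ],
}

request = urllib.request.Request(
    LMSTUDIO_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(request, timeout=10) as resp:
        reply = json.load(resp)
        # The response mirrors OpenAI's schema: choices[0].message.content
        print(reply["choices"][0]["message"]["content"])
except OSError:
    # No server listening locally; start one from LM Studio's Developer tab
    print("LM Studio server not reachable at localhost:1234")
```

Because the endpoint is OpenAI-compatible, any client that lets you override the API base URL can talk to it, and no request ever leaves your machine, which is the data-privacy benefit JetBrains points to.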

AI Assistant is available inside JetBrains IDEs and can generate code for any request, suggest a fix for a particular problem, or refactor a function. On average, developers report saving up to eight hours per week with JetBrains AI Assistant.


More Information

JetBrains Website

Related Articles

JetBrains AI Assistant - A Welcome Time Saver 

JetBrains AI Coding Assistant Now Generally Available 

Gemini 1.5 Pro Now Available 

JetBrains Updates IDEs With AI

To be informed about new articles on I Programmer, sign up for our weekly newsletter, subscribe to the RSS feed and follow us on Twitter, Facebook or Linkedin.

