Improved Code Completion With JetBrains Mellum
Written by Sue Gee
Tuesday, 29 October 2024
JetBrains has launched Mellum, a proprietary large language model built specifically for coding. Currently available only with JetBrains AI Assistant, Mellum is claimed to provide faster, smarter, and more contextually aware cloud code completion.

Explaining Mellum's role within JetBrains AI Assistant, Vladislav Tankov, Director of AI at JetBrains, states:

"The JetBrains AI Assistant carefully selects the most optimal model from OpenAI and Google for each use case. However, we recognized that to achieve truly powerful code completion, we needed to augment the Assistant with our own model. That's why the launch of the Mellum model is a huge leap forward for us. At the same time, the true power of our code completion system lies in the Mellum model's deep integration with JetBrains IDEs. The synergy between client-side integration and server-side logic is crucial for providing fast, accurate, and context-aware suggestions, ensuring an enhanced coding experience with impressive results."

We first met JetBrains' AI Assistant in July 2023, when it was still in limited access and had limited functionality. It was an upgrade to an earlier profiling tool, with extra features including an integrated AI chat, code explanations, automated documentation generation, name suggestion, and commit message generation. It became generally available in December 2023, and its chat features continued to rely on models from OpenAI and Google. For cloud code completion, however, JetBrains took the decision to do a complete rewrite using new in-house LLMs.

Improved code completion powered by JetBrains' internally trained large language models has already started to roll out. As we reported in August 2024, the first languages to benefit were Java, Python, and Kotlin in the 2024.2 updates of JetBrains IDEs, followed by Go in the 2024.2.1 release and PHP, JavaScript/TypeScript, CSS, and HTML in the 2024.2.2 release.
Now JetBrains has given details of its proprietary LLM. Mellum is a 4B-parameter model based on the architecture of Llama, the LLM developed by Meta AI, which has exceptional performance in code generation and understanding. Mellum has been trained from scratch specifically for coding. Despite being much smaller than GPT-4, Mellum provides useful suggestions with much lower latency and, unlike GPT-4, it can also be run locally.

As Kirill Krylow explained in a blog post earlier this month, before the LLM was given its name:

"By training a relatively small, highly specialized model and enhancing inference mechanisms, we've significantly improved code completion accuracy and speed."

In the latest blog post introducing Mellum, Valerie Kuzima looks at its performance, reporting that it has led to significantly lower latency and higher acceptance rates for code completions.
Cloud-based code completion in AI Assistant is only available with the AI Pro subscription or in the trial version. Local code completion is bundled and enabled by default in all paid IntelliJ-based IDEs.
More Information
Introducing Mellum: JetBrains' New LLM Built for Developers

Related Articles
JetBrains AI Assistant - A Welcome Time Saver
JetBrains AI Coding Assistant Now Generally Available
JetBrains Updates IDEs With AI
JetBrains Improves AI Assistant
Last Updated ( Wednesday, 30 October 2024 )