Advanced Computing in the Age of AI | Monday, June 24, 2024

IBM Makes a Push Towards Open-Source Services, Announces New watsonx Updates 

Today, IBM announced a series of significant updates to its watsonx platform, aimed at making the platform’s AI capabilities more open, affordable, and flexible.

Announced during the Think 2024 conference – an annual event held in Boston this year – these changes are part of an overall strategy by IBM to invest in and contribute to the open-source AI community.

IBM's Think 2024 conference was held May 20-23, 2024 in Boston, MA. Credit: IBM

“We firmly believe in bringing open innovation to AI. We want to use the power of open source to do with AI what was successfully done with Linux and OpenShift,” said IBM CEO Arvind Krishna. “Open means choice. Open means more eyes on the code, more minds on the problems, and more hands on the solutions. For any technology to gain velocity and become ubiquitous, you’ve got to balance three things: competition, innovation, and safety. Open source is a great way to achieve all three.”

Putting this vision into practice, IBM announced several key initiatives aimed at fostering open innovation in AI – chief among them being the open-sourcing of its powerful Granite model family.

Open-Source Granite Models

One of the most interesting parts of this announcement is that IBM is open-sourcing a family of its Granite models. The models are now available under the Apache 2.0 license on the collaborative platforms Hugging Face and GitHub. These Granite code models range from 3 billion to 34 billion parameters, are trained on code from 116 programming languages, and come in both base and instruction-following variants.

IBM's Granite code models perform exceptionally well across a range of applications and benchmarks. In IBM's testing, the models proved efficient at every size, frequently outperforming rival open-source code models twice their size.

Granite models show strong performance on benchmarks such as HumanEvalPack, HumanEvalPlus, and GSM8K – demonstrating their proficiency in code synthesis, fixing, explanation, editing, and translation for key programming languages like Python, JavaScript, Java, Go, C++, and Rust. The 20 billion parameter Granite base model powers IBM's watsonx Code Assistant for specialized domains, and was also used to train watsonx Code Assistant for Z, which converts monolithic COBOL applications into efficient services for IBM Z.

What’s more, this 20 billion parameter model ranked strongly on BIRD's independent leaderboard for Execution Accuracy and Valid Efficiency Score, demonstrating leadership in natural-language-to-SQL translation, a crucial industry use case.

Additionally, IBM and Red Hat recently announced the launch of InstructLab – an open-source project for enhancing the large language models used in generative AI applications. With InstructLab, developers can build models tailored to their business needs using their own data. IBM intends to integrate these open-source contributions into the upcoming Red Hat Enterprise Linux AI (RHEL AI) solution, providing its clients with additional value.

RHEL AI will provide users with an enterprise-ready version of InstructLab, the open-source Granite models from IBM, and a Linux platform that simplifies AI deployments across hybrid infrastructure environments.

Updates for watsonx

Reinforcing Krishna’s stated commitment to “bringing open innovation to AI”, IBM also announced a new class of watsonx assistants that will be available soon. These include watsonx Code Assistant for Enterprise Java Applications; watsonx Assistant for Z, which transforms how users interact with the system to quickly transfer knowledge and expertise; and an expansion of the watsonx Code Assistant for Z service with code explanation, helping clients understand and document applications through natural language.

On top of these new AI assistants, IBM is also working to expand ecosystem access to watsonx through the addition of third-party models. IBM announced integrations with models from nine organizations: Amazon Web Services, Adobe, Meta, Microsoft, Mistral, Palo Alto Networks, Salesforce, SAP, and the Saudi Data and Artificial Intelligence Authority.

While all of these integrations should help make watsonx more flexible, the plan to work with Meta seems especially interesting. The two companies jointly launched the AI Alliance, which brings organizations from industry, startups, academia, research, and government together with the goal of advancing open, safe, and responsible AI. This most recent announcement stated that IBM watsonx will provide access to Meta’s Llama 3. IBM has already used Meta’s Llama 2 to help build a content engine for the non-profit organization that hosts the GRAMMYs.

The Think 2024 conference is still ongoing, and IBM will have much more to unveil as the event continues. However, the AI era is demanding a push toward open-source principles, and IBM's efforts echo that.