
AI21 Labs releases Jamba 1.5 family of open models: Jamba 1.5 Mini and Jamba 1.5 Large redefine long-context AI with unmatched speed, quality and multilingual capabilities for global enterprises

AI21 Labs has made a significant advance in the AI landscape by releasing the Jamba 1.5 family of open models, consisting of Jamba 1.5 Mini and Jamba 1.5 Large. These models, based on the novel SSM-Transformer architecture, represent a breakthrough in AI technology, especially in handling long-context tasks. AI21 Labs aims to democratize access to these powerful models by releasing them under the Jamba Open Model License, encouraging widespread experimentation and innovation.

Main features of the Jamba 1.5 models

One of the outstanding features of the Jamba 1.5 models is their ability to handle exceptionally long contexts. They have an effective context window of 256,000 tokens, the longest on the open model market. This feature is critical for enterprise applications that require the analysis and summarization of long documents. The models are also well suited to agent-based and retrieval-augmented generation (RAG) workflows, improving both the quality and efficiency of these processes.
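A 256,000-token window changes how much retrieved material a RAG pipeline can pass to the model in a single call. As a rough illustration (the budget split, the 4-characters-per-token heuristic, and the helper names below are assumptions for the sketch, not AI21 specifics), a retrieval step might greedily pack whole documents into the context like this:

```python
# Sketch: packing retrieved documents into a long-context prompt budget.
# The 256K window comes from the article; the reserved-output size and the
# chars-per-token estimate are illustrative assumptions.

CONTEXT_WINDOW = 256_000      # Jamba 1.5 effective context window (tokens)
RESERVED_FOR_OUTPUT = 4_000   # room left for the model's answer (assumption)
CHARS_PER_TOKEN = 4           # rough heuristic; a real pipeline would use the tokenizer

def estimate_tokens(text: str) -> int:
    """Crude token estimate from character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def pack_documents(docs: list[str],
                   budget: int = CONTEXT_WINDOW - RESERVED_FOR_OUTPUT) -> list[str]:
    """Greedily keep whole documents, in retrieval order, until the budget is spent."""
    packed, used = [], 0
    for doc in docs:
        cost = estimate_tokens(doc)
        if used + cost > budget:
            break
        packed.append(doc)
        used += cost
    return packed

docs = ["alpha " * 1000, "beta " * 1000, "gamma " * 1000]
print(len(pack_documents(docs)))  # all three short documents fit easily
```

With a window this large, the packing step rarely has to drop documents at all; the same code with a small model's budget would truncate the list instead.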

In terms of speed, Jamba 1.5 models are up to 2.5 times faster than their competitors on long contexts and offer superior performance across all context lengths within their size class. This speed advantage is critical for companies that need fast turnaround times for tasks such as customer support or processing large amounts of data.

The Jamba 1.5 models also outperform their competitors in terms of quality. Jamba 1.5 Mini is considered the strongest open model in its size class, scoring 46.1 in the Arena Hard benchmark, outperforming larger models such as Mixtral 8x22B and Command-R+. Jamba 1.5 Large goes even further, scoring 65.4, outperforming leading models such as Llama 3.1 70B and 405B. This high-quality performance in various benchmarks underlines the robustness of the Jamba 1.5 models in delivering reliable and accurate results.

Multilingual support and developer readiness

Beyond their technical capabilities, the Jamba 1.5 models offer multilingual support, covering languages such as Spanish, French, Portuguese, Italian, Dutch, German, Arabic and Hebrew, making them versatile tools for global companies operating across different language environments.

For developers, Jamba 1.5 models provide native support for structured JSON output, function calls, document object digestion, and citation generation. These features make the models adaptable to different development needs and enable seamless integration into existing workflows.
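To make the function-calling and structured-output features concrete, here is a minimal sketch of the kind of OpenAI-style request such an integration might build. The tool name, its parameters, and the exact request fields are illustrative assumptions, not AI21's documented schema:

```python
import json

# Sketch of an OpenAI-style tool definition and chat request of the kind a
# function-calling integration with Jamba 1.5 might assemble. The tool
# ("get_order_status") and the field layout are hypothetical.

tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",  # hypothetical tool for the example
        "description": "Look up the shipping status of a customer order.",
        "parameters": {
            "type": "object",
            "properties": {
                "order_id": {"type": "string", "description": "Internal order ID"},
            },
            "required": ["order_id"],
        },
    },
}]

request = {
    "model": "jamba-1.5-mini",
    "messages": [{"role": "user", "content": "Where is order A-1042?"}],
    "tools": tools,
    "response_format": {"type": "json_object"},  # request structured JSON output
}

payload = json.dumps(request)  # serializes cleanly into a JSON request body
print(json.loads(payload)["model"])
```

Consult AI21's API documentation for the authoritative parameter names before wiring this into a real client.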

Use and efficiency

AI21 Labs has ensured that the Jamba 1.5 models are accessible and deployable across multiple platforms. They are available for immediate download on Hugging Face and are supported by major cloud providers including Google Cloud Vertex AI, Microsoft Azure, and NVIDIA NIM. The models are expected to be available on additional platforms soon, such as Amazon Bedrock, Databricks Marketplace, Snowflake Cortex, and others, making them easy to deploy across multiple environments, including on-premises and virtual private clouds.

Another key advantage of Jamba 1.5 models is their resource efficiency. Built on a hybrid architecture that combines the strengths of the Transformer and Mamba architectures, these models require less memory, allowing companies to process large context lengths on a single GPU. AI21 Labs' novel ExpertsInt8 quantization technique improves this efficiency even further, optimizing model performance without compromising quality.
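The memory savings from int8 quantization can be illustrated with a toy example. AI21's ExpertsInt8 targets the MoE expert weights and works inside the inference kernel; the generic per-row symmetric int8 scheme below is only a sketch of the underlying idea, not their implementation:

```python
import random

# Toy per-row ("per-channel") symmetric int8 quantization: each row of
# weights is mapped to int8 [-127, 127] with one float scale per row.
# Halves memory versus fp16 at a small reconstruction error.

def quantize_int8(row: list[float]) -> tuple[list[int], float]:
    """Return int8 values and the scale that maps them back to floats."""
    scale = max(abs(w) for w in row) / 127 or 1.0  # guard all-zero rows
    q = [round(w / scale) for w in row]
    return q, scale

def dequantize(q: list[int], scale: float) -> list[float]:
    return [v * scale for v in q]

random.seed(0)
row = [random.uniform(-1, 1) for _ in range(8)]
q, scale = quantize_int8(row)
restored = dequantize(q, scale)

err = max(abs(a - b) for a, b in zip(row, restored))
print(f"max abs error: {err:.4f}")  # bounded by half the quantization step
```

Rounding error per weight is at most half a quantization step (`scale / 2`), which is why int8 weight quantization typically costs little quality while cutting memory substantially.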

Conclusion

The release of the Jamba 1.5 family by AI21 Labs represents a significant advancement in long context processing. These models set new standards in speed, quality and efficiency and democratize access to cutting-edge AI technology through their open model license. As enterprises continue to look for AI solutions that deliver real value, the Jamba 1.5 models stand out as powerful tools that can meet the needs of complex, large-scale applications. Their availability on multiple platforms and support for multilingual environments further increase their appeal, making them a versatile choice for developers and enterprises.


Check out Jamba 1.5 Mini, Jamba 1.5 Large, and the details. All credit for this research goes to the researchers of this project. Also, don’t forget to follow us on Twitter and join our Telegram Channel and LinkedIn Group. If you like our work, you will love our newsletter.

Don’t forget to join our 49k+ ML SubReddit

Find upcoming AI webinars here


Asif Razzaq is the CEO of Marktechpost Media Inc. A visionary entrepreneur and engineer, Asif strives to harness the potential of artificial intelligence for the greater good. His latest project is the launch of an artificial intelligence media platform, Marktechpost, which is characterized by its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable for a wide audience. The platform boasts over 2 million monthly views, underlining its popularity among readers.
