Jurassic-X: Crossing the neuro-symbolic chasm with the MRKL system
Introducing Wordtune Read
Zero-to-production: bootstrapping a custom model in AI21 Studio
Jamba-1.5: Hybrid Transformer-Mamba Models at Scale
We present Jamba-1.5, new instruction-tuned large language models based on our Jamba architecture.
Jamba: A Hybrid Transformer-Mamba Language Model
Jamba Instruct is based on a novel hybrid Transformer-Mamba mixture-of-experts (MoE) architecture.
Generating Benchmarks for Factuality Evaluation of Language Models
We propose FACTOR: Factual Assessment via Corpus Transformation, a scalable approach for evaluating LM factuality.