News

Introducing Stable LM 2 12B
Product · Bryce Wilson

Introducing the latest additions to our Stable LM 2 language model series: a 12 billion parameter base model and an instruction-tuned variant, trained on 2 trillion tokens across seven languages: English, Spanish, German, Italian, French, Portuguese, and Dutch. This medium-sized model balances strong performance with efficiency, modest memory requirements, and speed, following the framework we established with Stable LM 2 1.6B and detailed in our previously released technical report.
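As with earlier Stable LM 2 releases, the weights are intended to be used through standard open-source tooling. Below is a minimal sketch of text generation with Hugging Face transformers; the repository ID stabilityai/stablelm-2-12b and the generation settings are assumptions for illustration, so check the model card for the official usage.

```python
# Minimal sketch: generating text with Stable LM 2 12B via transformers.
# The repo ID "stabilityai/stablelm-2-12b" is an assumption based on the
# naming of earlier Stable LM 2 releases; verify it on the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-2-12b"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 12B weights to roughly 24 GB
    device_map="auto",           # requires the `accelerate` package
)

inputs = tokenizer("The weather today is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```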

Read More
Introducing Stable Code Instruct 3B
Product · Bryce Wilson

Stable Code Instruct 3B is our latest instruction-tuned large language model, built on top of Stable Code 3B. This model enhances code completion and supports natural language interactions, aiming to make programming and software development tasks more efficient and intuitive. Our analysis suggests that Stable Code Instruct 3B outperforms comparable models such as CodeLlama 7B Instruct and DeepSeek-Coder Instruct 1.3B on a range of coding tasks.
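For readers who want to try the natural language interface, here is a hedged sketch of a chat-style request through Hugging Face transformers; the repository ID stabilityai/stable-code-instruct-3b and the use of the tokenizer's built-in chat template are assumptions, so consult the model card for the exact prompt format.

```python
# Hedged sketch: prompting the instruction-tuned model in chat form.
# The repo ID and chat template below are assumptions; check the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stable-code-instruct-3b"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user",
     "content": "Write a Python function that checks whether a string is a palindrome."}
]
# Render the conversation with the tokenizer's chat template, then generate.
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```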

Read More
Introducing Stable LM 2 1.6B
Product · Guest User

Today, we are introducing the first language models from our new Stable LM 2 series: a 1.6 billion parameter base model and an instruction-tuned version. The base model is trained on…

Read More
Stable Code 3B: Coding on the Edge
Product · Guest User

Stable Code, an upgrade from Stable Code Alpha 3B, specializes in code completion and outperforms its predecessor in efficiency and multi-language support. It runs on standard laptops, including those without a dedicated GPU, and adds capabilities such as Fill-in-the-Middle (FIM) and an expanded context size.
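To illustrate what FIM means in practice, the sketch below asks the model to complete the middle of a file given its prefix and suffix. The sentinel tokens follow the common <fim_prefix>/<fim_suffix>/<fim_middle> convention; whether Stable Code 3B uses exactly these tokens is an assumption to verify against the model card.

```python
# Minimal sketch of Fill-in-the-Middle (FIM) prompting. The sentinel tokens
# below follow a common convention; that Stable Code 3B uses exactly these
# tokens is an assumption, so confirm against the tokenizer's special tokens.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stable-code-3b"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The model sees the code before and after the gap, then fills in the middle.
prompt = "<fim_prefix>def add(a, b):\n    <fim_suffix>\n\nprint(add(2, 3))<fim_middle>"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```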

Read More
Introducing Japanese Stable LM Beta
Product · Guest User

Introducing Stability AI Japan's JSLM Beta series, the latest in Japanese language technology. Tailored for the Japanese language, the series is headlined by the powerful JSLM Beta 70B, which brings 70 billion parameters to Japanese language processing.

Read More