News

Databricks says DBRX, its latest open-source LLM, outperforms its open-source competitors, potentially offering enterprises a low-cost way to train it on their own data for generative AI use cases.
Data lakehouse provider Databricks has unveiled a new large language model (LLM) training method, TAO, which will allow enterprises to train models without labeled data.
Databricks, the Data and AI company, is introducing DBRX, a general-purpose large language model (LLM) that enables organizations around the world to cost-effectively build, train, and serve their own LLMs.
Today Databricks released Dolly 2.0, the next version of the large language model (LLM) with ChatGPT-like human interactivity (aka instruction-following) that the company released just two weeks ago.
Databricks unveils “mixture of experts” AI model. The San Francisco-based company Databricks announced on Wednesday a new natural language model called DBRX, which it says performs better than established open-source models.
In March, Databricks rolled out one of the first fruits of its MosaicML buy: a new LLM called DBRX. With DBRX, Databricks can offer its roughly 12,000 customers a secure cloud where they can also train models on their own data.
Databricks attributes DBRX’s speed to its MoE architecture, built on the company’s MegaBlocks research and open-source projects, which allows the model to output tokens at a very high rate.
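To make the routing idea concrete, here is a minimal sketch of a top-k-routed mixture-of-experts feed-forward layer in PyTorch. The names (SimpleMoE, n_experts, top_k, d_model) are illustrative assumptions, not Databricks’ MegaBlocks implementation, which additionally relies on block-sparse kernels that this sketch omits.

```python
# Minimal top-k mixture-of-experts (MoE) layer sketch in PyTorch.
# All names here (SimpleMoE, n_experts, top_k, ...) are hypothetical;
# this is not DBRX/MegaBlocks code, just an illustration of the technique.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, d_model=64, d_hidden=128, n_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores every token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is an ordinary feed-forward block.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):  # x: (num_tokens, d_model)
        weights = F.softmax(self.router(x), dim=-1)        # (tokens, experts)
        top_w, top_idx = weights.topk(self.top_k, dim=-1)  # pick k experts/token
        top_w = top_w / top_w.sum(dim=-1, keepdim=True)    # renormalize gates
        out = torch.zeros_like(x)
        # Only the selected experts run on each token; the rest stay idle.
        for e, expert in enumerate(self.experts):
            token_ids, slot = (top_idx == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            gate = top_w[token_ids, slot].unsqueeze(-1)
            out.index_add_(0, token_ids, gate * expert(x[token_ids]))
        return out

if __name__ == "__main__":
    layer = SimpleMoE()
    tokens = torch.randn(10, 64)
    print(layer(tokens).shape)  # torch.Size([10, 64])
```

Because each token passes through only top_k of the n_experts experts, an MoE model activates just a fraction of its total parameters per token, which is the basis of the throughput claim above.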
Databricks acquired MosaicML for $1.3 billion and is steadily rolling out tools that help developers create AI apps rapidly. The Mosaic research team at Databricks developed the new TAO method.