French AI startup Mistral is introducing new model customization options, including paid plans, that let developers and enterprises fine-tune its generative models for particular use cases.
The first is self-service. Mistral has released a software development kit (SDK), Mistral-Finetune, for fine-tuning its models on workstations, servers and small datacenter nodes.
In the readme for the SDK’s GitHub repository, Mistral notes that the SDK is optimized for multi-GPU setups but can scale down to a single Nvidia A100 or H100 GPU for fine-tuning smaller models like Mistral 7B. Fine-tuning on a data set such as UltraChat, a collection of 1.4 million dialogs with OpenAI’s ChatGPT, takes around half an hour using Mistral-Finetune across eight H100s, Mistral says.
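To make that concrete, here's a rough sketch of what kicking off a multi-GPU run with the SDK might look like. It assumes a torchrun-style launcher, a "train" entrypoint module and a YAML config path; none of those details come from the article itself, so treat the names as placeholders and defer to the repository's README.

```python
# Illustrative sketch only: launches a hypothetical Mistral-Finetune training
# run across eight GPUs with torchrun. The "train" module name and the YAML
# config path are placeholders; the SDK's README is the authoritative source.
import subprocess

NUM_GPUS = 8                     # e.g. eight H100s, as in Mistral's UltraChat timing
CONFIG_PATH = "example/7B.yaml"  # hypothetical config: base model, data paths, run dir

subprocess.run(
    [
        "torchrun",
        f"--nproc_per_node={NUM_GPUS}",  # one worker process per GPU
        "-m", "train",                   # assumed SDK training entrypoint
        CONFIG_PATH,
    ],
    check=True,
)
```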
For developers and companies who prefer a more managed solution, there are Mistral's newly launched fine-tuning services, available through the company's API. Compatible for now with two of Mistral's models, Mistral Small and the aforementioned Mistral 7B, the services will gain support for more of the company's models in the coming weeks, Mistral says.
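The managed route follows the usual hosted fine-tuning pattern: upload training data, create a job against a supported base model, then poll until the tuned model is ready. The Python sketch below illustrates that pattern against Mistral's API; the class, method and status names are assumptions rather than quotes from Mistral's documentation, so verify them against the official API reference before use.

```python
# Hypothetical sketch of a managed fine-tuning flow via Mistral's API.
# Method, field and status names are illustrative assumptions; consult
# Mistral's API documentation for the real client interface.
import os
import time

from mistralai.client import MistralClient  # assumed client package

client = MistralClient(api_key=os.environ["MISTRAL_API_KEY"])

# 1. Upload a JSONL file of chat-formatted training examples.
with open("chat_examples.jsonl", "rb") as f:
    training_file = client.files.create(file=("chat_examples.jsonl", f))

# 2. Create a fine-tuning job against one of the supported base models
#    (Mistral Small or Mistral 7B, per the announcement).
job = client.jobs.create(
    model="open-mistral-7b",
    training_files=[training_file.id],
)

# 3. Poll until the job finishes, then use the resulting model name
#    in ordinary chat-completion calls.
while job.status not in ("SUCCESS", "FAILED"):
    time.sleep(30)
    job = client.jobs.retrieve(job.id)

print("Fine-tuned model:", job.fine_tuned_model)
```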
Lastly, Mistral is debuting custom training services, currently available only to select customers, to fine-tune any Mistral model for an organization's apps using its own data. "This approach enables the creation of highly specialized and optimized models for their specific domain," the company explains in a post on its official blog.
Mistral, which my colleague Ingrid Lunden recently reported is seeking to raise around $600 million at a $6 billion valuation from investors including DST, General Catalyst and Lightspeed Venture Partners, is no doubt looking to grow revenue as it faces considerable — and growing — competition in the generative AI space.
Since Mistral unveiled its first generative model in September 2023, it's released several more, including a code-generating model, and rolled out paid APIs. But it hasn't disclosed how many users it has or what its revenue looks like.