Outscale combines AI and sovereignty with Mistral AI

The cloud subsidiary of Dassault Systèmes, a pioneer in the sovereign cloud thanks to its SecNumCloud certification, continues to develop its IT service offering in response to the issue of data sovereignty.

On the occasion of its Experiences conference, the French cloud provider announced the launch of its LLMaaS (LLM-as-a-Service) offering.

Large language models are the central building block of generative artificial intelligence.

Productivity, security and sovereignty for GenAI

To position itself in this segment of the AI market, while continuing to follow a sovereign logic, Outscale benefits from the partnership concluded earlier this year between its parent company and the unicorn Mistral AI.

By combining its cloud technologies with the French startup’s models, Outscale promises companies “a turnkey solution for the rapid development of generative AI applications, combining performance, security and sovereignty”.

Qualified SecNumCloud 3.2 by ANSSI since 2023, the cloud provider intends to address organizations’ concerns about security and protection against extraterritoriality.

Its environment is thus meant to ensure “full protection of intellectual property and sensitive data” of users. Outscale asserts that SecNumCloud is synonymous with immunity “against extraterritorial laws, in particular those arising from non-European jurisdictions.”

Immunity from extraterritorial laws

The generative AI vendor’s pitch for its LLMaaS offering is therefore to enable companies to use commercial LLMs for various use cases (content creation, business process optimization, predictive analytics, etc.) without sacrificing security.

“Separation of data by customer, combined with strict IP address management, strengthens security by isolating sensitive information and restricting access to only authorized users,” explains Outscale.

The cloud host further highlights that, through a RAG (retrieval-augmented generation) approach, data is protected by being “accessible to AI models only when needed.”
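Outscale does not publish the details of its RAG pipeline, but the principle it describes can be sketched in a few lines: retrieval selects only the documents relevant to a query, and only those are injected into the model’s prompt, so the rest of the corpus is never exposed to the LLM. The document store, the bag-of-words “embedding”, and the function names below are all illustrative assumptions, not Outscale’s implementation.

```python
from collections import Counter
import math

# Toy document store; in a real deployment this would be a vector database.
DOCS = [
    "Contract renewal terms for customer Alpha expire in June.",
    "Incident report: API latency spiked during the March deployment.",
    "Quarterly revenue grew 12% driven by the analytics product line.",
]

def embed(text: str) -> Counter:
    """Crude bag-of-words 'embedding', used only for illustration."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    """Inject only the retrieved context into the prompt sent to the LLM,
    so the model sees sensitive data 'only when needed'."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("Why did API latency increase?"))
```

In production the bag-of-words similarity would be replaced by embeddings from a model such as one of Mistral AI’s, but the access-control property is the same: documents outside the retrieved set never reach the model.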
