EMEA AI Model Builders

Explore and deploy sovereign AI models. Built to ensure data control, governance, and independence, these models are accelerated by NVIDIA’s AI inference platform and powered by NVIDIA-accelerated infrastructure.

Explore Models
View Performance


EMEA AI Model Builder - Black Forest Labs

FLUX is a family of advanced models engineered to transform imagination into precise and realistic images. From image generation to multi-stage image editing, creators can use natural language or multimedia context to control, transform, or refine visual content through an intuitive workflow. FP4-quantized checkpoints significantly reduce memory requirements, and NVIDIA TensorRT™ enhancements provide up to 2X faster inference.

Get started with the models for your development environment.

Model

FLUX.2 Model on Hugging Face

FLUX.2 by Black Forest Labs is a 32B parameter flow matching transformer model capable of generating and editing (multiple) images. The model is released under the FLUX.2-dev Non-Commercial License.

Model

FLUX.1-dev Model by Black-Forest-Labs | NVIDIA NIM

FLUX.1-dev by Black Forest Labs is a cutting-edge generative AI model that creates high-quality images from simple text prompts.
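As a rough illustration of how a hosted image model like this is typically invoked, the sketch below assembles a text-to-image request payload. The endpoint path and payload fields are assumptions modeled on NVIDIA API catalog conventions, not taken from this page; check the model's page on build.nvidia.com for the exact schema.

```python
# Sketch: building a text-to-image request for a FLUX.1-dev NIM endpoint.
# The URL path and payload fields below are assumptions based on NVIDIA API
# catalog conventions -- consult the model page for the exact schema.

INVOKE_URL = "https://ai.api.nvidia.com/v1/genai/black-forest-labs/flux.1-dev"  # assumed path

def build_flux_request(prompt: str, width: int = 1024, height: int = 1024,
                       steps: int = 50, seed: int = 0) -> dict:
    """Assemble the JSON payload for a single image-generation call."""
    return {
        "prompt": prompt,
        "width": width,
        "height": height,
        "steps": steps,
        "seed": seed,
    }

payload = build_flux_request("a lighthouse at dawn, photorealistic")

# To actually send the request (requires an API key from build.nvidia.com):
# import requests
# headers = {"Authorization": "Bearer <NVIDIA_API_KEY>", "Accept": "application/json"}
# response = requests.post(INVOKE_URL, headers=headers, json=payload)
print(payload["prompt"])
```

Keeping payload construction in a small helper like this makes it easy to vary resolution, steps, and seed when experimenting.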

EMEA AI Model Builder - Dicta

Dicta specializes in building high-performing, open language models for the Hebrew language, delivering robust performance and cultural awareness. The DictaBERT suite includes models fine-tuned for various difficult Hebrew tasks using custom-designed architectures, as well as powerful base models that can be fine-tuned for any task. DictaLM is a suite of generative large language models that sets a new baseline for accuracy and fluency in on-prem Hebrew AI, leveraging advanced optimization through NVIDIA TensorRT-LLM and Dynamo for seamless production-scale deployment.

Get started with the models for your development environment.

Model

Dicta-LM 3.0 Model Demo

Dicta-LM 3.0 (including the -thinking variant) is a powerful open-weight collection of LLMs, trained on extensive corpora of Hebrew and English texts.

Model

DictaLM 2.0 - Instruct Chat Demo

Check out DictaLM 2.0 Instruct on Hugging Face to chat in Hebrew or English with an advanced AI language model fine-tuned for instruction-based conversations.
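For a sense of what an instruction-formatted prompt to such a model looks like, here is a minimal sketch. It assumes a Mistral-style `[INST]` chat format (DictaLM 2.0 builds on that lineage); in practice you would let the Hugging Face tokenizer's `apply_chat_template()` produce this for you rather than formatting by hand.

```python
# Sketch: manually formatting a single chat turn in a Mistral-style
# instruction format, as an illustration of what the tokenizer's
# apply_chat_template() produces for instruct models like DictaLM 2.0.

def format_instruct_prompt(user_message: str) -> str:
    """Wrap a single user turn in [INST] ... [/INST] tags after the BOS token."""
    return f"<s>[INST] {user_message} [/INST]"

# Hebrew works the same as English -- the message is just text:
prompt = format_instruct_prompt("ספר לי על ירושלים")  # "Tell me about Jerusalem"
print(prompt)
```

The model's completion is then generated after the closing `[/INST]` tag; multi-turn conversations repeat the pattern with prior turns prepended.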

Model

DictaBERT Collection

A collection of state-of-the-art language models for Hebrew, fine-tuned for various tasks.

EMEA AI Model Builder - JetBrains

JetBrains is a global software company specializing in intelligent, productivity-enhancing tools for software developers and teams. Its open-source LLM, Mellum, optimizes code-related tasks such as code completion and intelligent code suggestions within developer environments. Following a Llama-style architecture, the transformer-based model has 4B parameters and an 8,192-token context window and is trained on 4T+ tokens spanning multiple programming languages.

Integrate

Get started with the right tools and frameworks for your development environment.

Optimize

Optimize inference workloads for large language models (LLMs) with TensorRT-LLM. Learn how to set up and get started using TensorRT-LLM.
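As a hedged sketch of what serving a model like Mellum through TensorRT-LLM's high-level Python API can look like: the call requires an NVIDIA GPU and the `tensorrt_llm` package, so it is wrapped in a function rather than executed inline, and the Hugging Face model id is an assumption to be checked against the JetBrains page.

```python
# Sketch: serving JetBrains' Mellum with TensorRT-LLM's high-level LLM API.
# Requires an NVIDIA GPU and the tensorrt_llm package, so the import and
# call are deferred into a function. The model id is an assumption.

def complete_code_with_mellum(prompt: str) -> str:
    """Generate a code completion with Mellum via TensorRT-LLM (GPU only)."""
    from tensorrt_llm import LLM, SamplingParams  # deferred: GPU-only dependency

    llm = LLM(model="JetBrains/Mellum-4b-base")  # assumed Hugging Face id
    params = SamplingParams(temperature=0.2, max_tokens=64)
    outputs = llm.generate([prompt], params)
    return outputs[0].outputs[0].text

# Example (on a GPU machine):
# print(complete_code_with_mellum("def fibonacci(n):"))
```

The high-level `LLM` API handles engine building and batching internally; for production-scale deployments, the TensorRT-LLM documentation covers quantization and multi-GPU options.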

Get started with the models for your development environment.

Model

AI Models from JetBrains: Powering Developer Productivity with Advanced Machine Learning Tools on Hugging Face

The JetBrains Hugging Face page hosts AI models and tools developed by JetBrains to improve developer productivity and coding with advanced machine learning techniques.

EMEA AI Model Builder - Mistral AI

Mistral AI delivers industry-leading accuracy and efficiency for enterprise AI with its Mistral 3 family of open-source, multilingual, and multimodal models. Mistral’s models span frontier-level to compact sizes, enabling distributed intelligence across cloud and edge. They are optimized across NVIDIA supercomputing and edge platforms, empowering researchers and developers to experiment, customize, and accelerate AI innovation while democratizing access to frontier-class technologies.

Integrate

Get started with the right tools and frameworks for your development environment.

Get started with the models for your development environment.

Model

Mistral Large 3

Mistral’s first mixture-of-experts model, Mistral Large 3 is trained with 41B active and 675B total parameters.

Model

Ministral 3 14B

Ministral 3 14B Instruct 2512 FP8 is a powerful and efficient language model with vision capabilities, fine-tuned for instruction tasks.

Model

Mistral Small 24B on NVIDIA Build

Mistral Small 3 (2501) sets a new benchmark in the "small" Large Language Models category below 70B, boasting 24 billion parameters and achieving state-of-the-art capabilities comparable to larger models.

EMEA AI Model Builder - Bielik.AI (SpeakLeash)

Bielik is a family of open-source, lightweight LLMs (Apache-2.0) developed by the SpeakLeash Foundation and trained with ACK Cyfronet AGH. Designed for business applications and continuously improved through community-driven development, Bielik excels in document understanding, summarization, and knowledge extraction.

Explore

Explore sample applications to learn about different use cases for Bielik models.

Integrate

Get started with the right tools and frameworks for your development environment.

Optimize

Optimize inference workloads for large language models (LLMs) with TensorRT-LLM. Learn how to set up and get started using TensorRT-LLM.

Get started with the models for your development environment.

Model

Bielik-11B on NVIDIA Build

Bielik-11B-v2.6-Instruct is a generative text model with 11 billion parameters, developed to process and understand the Polish language with high precision.

Model

Bielik-11B on Hugging Face

Bielik-11B-v2.6-Instruct is an 11-billion-parameter generative text model, fine-tuned for instruction following in Polish, developed by SpeakLeash and ACK Cyfronet AGH to deliver high-precision Polish language understanding and response generation.

EMEA AI Model Builder - Technology Innovation Institute (TII)

The Falcon LLM series is a family of multilingual, open-source large language models developed by TII. Built for efficiency, long-context understanding, and strong reasoning, Falcon models span multiple architecture designs, including Transformers, state space models (SSMs), and hybrid Transformer-SSMs, and support a wide range of global languages, including dedicated variants for Arabic and regional dialects. They excel at knowledge-intensive, STEM, and long-context tasks, making them highly versatile for diverse real-world applications.

Get started with the models for your development environment.

Model

Falcon3-7b-instruct Model by Tiiuae on NVIDIA NIM

The Falcon3-7B-Instruct is an open foundation model designed for state-of-the-art performance in reasoning, language understanding, instruction following, code generation, and mathematics. It supports long-context tasks with a token limit of up to 32K and multilingual capabilities in English, French, Spanish, and Portuguese.

Model

Falcon-H1 Family of Models and Falcon Arabic

Falcon-H1 Family of Hybrid-Head Language Models (Transformer-SSM), including 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B (pretrained & instruction-tuned).

Model

Falcon 3 Collection

The Falcon3 family of open foundation models is a set of pretrained and instruct LLMs ranging from 1B to 10B parameters.

EMEA AI Model Builder - University College London (UCL)

UK-LLM is a sovereign AI initiative led by UCL, developed in partnership with NVIDIA and Bangor University to support speakers of minority UK languages such as Welsh, Cornish, Irish, Scottish Gaelic, and Scots. It uses open-source techniques and datasets from NVIDIA Nemotron™ to enable advanced reasoning in English and Welsh, ensuring cultural and linguistic accuracy for public services such as healthcare, education, and legal resources.

Explore

Explore sample applications to learn about different use cases for UK-LLM models.

Integrate

Get started with the right tools and frameworks for your development environment.

Optimize

Optimize inference workloads for large language models (LLMs) with TensorRT-LLM. Learn how to set up and get started using TensorRT-LLM.

Get started with the models for your development environment.

Model

BritLLM on Hugging Face

BritLLM is a collection of multilingual language-model training corpora and models, developed using machine-translated data from a single source language for broad pretraining applications.

EMEA AI Model Builder - University of Ljubljana

The Generative Model for Slovene (GaMS) is a large language model family developed by researchers in the Centre for Language Resources and Technologies at the University of Ljubljana’s Faculty of Computer and Information Science. The family of LLMs is designed specifically for Slovene while also supporting other South Slavic languages and English.

Integrate

Get started with the right tools and frameworks for your development environment.

Optimize

Optimize inference workloads for large language models (LLMs) with TensorRT-LLM. Learn how to set up and get started using TensorRT-LLM.

Get started with the models for your development environment.

Model

Try the GaMS model

GaMS-9B is a 9-billion-parameter generative language model, continually pretrained on a corpus of about 10 billion words of Slovene, English, and some Croatian, Serbian, and Bosnian text.

Model

Model Card for GaMS-27B

GaMS-27B represents the new, improved, and larger generation of the GaMS (Generative Model for Slovene) family. The models are based on Google's Gemma 2 family and continually pretrained on Slovene, English, and some portion of Croatian, Serbian, and Bosnian corpora.

More Resources


Join the NVIDIA Developer Program


Get Training and Certification


Accelerate Your Startup


Ethical AI

NVIDIA believes Trustworthy AI is a shared responsibility and we have established policies and practices to enable development for a wide array of AI applications. When downloaded or used in accordance with our terms of service, developers should work with their supporting model team to ensure this model meets requirements for the relevant industry and use case and addresses unforeseen product misuse. Please report security vulnerabilities or NVIDIA AI Concerns here.

Try top community models today.

Explore Models