1. Introduction

Docker Model Runner, introduced in Docker Desktop 4.40 (at the time of writing, available on Mac with Apple Silicon), streamlines local AI development by simplifying the deployment and management of large language models (LLMs). It addresses common challenges such as complex setup, high cloud inference costs, and data privacy concerns.

By providing an OpenAI-compatible Inference API, Model Runner enables seamless integration with frameworks like Spring AI, allowing developers to build AI-powered applications locally with ease. In this tutorial, we’ll learn how to set up Docker Model Runner and create a Spring AI application that connects to it. By the end, we’ll have a fully functional local AI application leveraging a powerful LLM.

2. Docker Model Runner

Docker Model Runner is a tool designed to simplify the deployment and execution of LLMs inside Docker containers. It acts as an AI inference engine that offers a wide range of models from various providers.

Let’s see the key features that Docker Model Runner includes:

  • Simplified Model Deployment: Models are distributed as standard Open Container Initiative (OCI) artifacts on Docker Hub under the ai namespace. This makes it easy to pull, run, and manage AI models directly within Docker Desktop.

  • Broad Model Support: Supports a variety of LLMs from multiple providers, such as Mistral, LLaMA, and Phi-4, ensuring flexibility in model selection.

  • Local Inference: Runs models locally, enhancing data privacy and eliminating dependency on cloud-based inference.

  • OpenAI-Compatible API: Provides a standardized API that integrates effortlessly with existing AI frameworks, reducing development overhead.
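To make the "OpenAI-compatible" point concrete, here's a minimal sketch of the request body such an API accepts. The model name (ai/gemma3) and the endpoint path in the comment are illustrative values matching the setup we use later in this tutorial, not part of any official contract:

```java
// Sketch of the request body an OpenAI-compatible chat endpoint expects.
// Model name and endpoint path below are illustrative, not authoritative.
public class ChatRequestSketch {

    // Builds a minimal OpenAI-style chat completion request body
    static String payload(String model, String userMessage) {
        return """
            {"model": "%s",
             "messages": [{"role": "user", "content": "%s"}]}"""
            .formatted(model, userMessage);
    }

    public static void main(String[] args) {
        // A request like this would be POSTed to an endpoint such as
        // http://localhost:12434/engines/v1/chat/completions
        System.out.println(payload("ai/gemma3", "What is Docker Model Runner?"));
    }
}
```

Because the request and response shapes follow the OpenAI convention, any client library that speaks that protocol, including Spring AI's OpenAI starter, can talk to Model Runner unchanged.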

3. Set Up Environment

This section outlines the prerequisites for using Docker Model Runner, along with the Maven dependencies we need to create a Spring AI application that uses it.

3.1. Prerequisites

To use Docker Model Runner, we’ll need a few things:

  • Docker Desktop 4.40 or later: Installed on a Mac with Apple Silicon.

  • Java 21 or later: Required for Spring AI development.

  • Model Runner-compatible LLM: A model compatible with Docker Model Runner, such as LLaMA or Gemma 3.

3.2. Maven Dependencies

Let’s start by importing the spring-boot-starter-web, spring-ai-openai-spring-boot-starter, spring-ai-spring-boot-testcontainers, and junit-jupiter dependencies to the pom.xml:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
    <version>1.0.0-M6</version>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-spring-boot-testcontainers</artifactId>
    <version>1.0.0-M6</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>1.19.8</version>
    <scope>test</scope>
</dependency>

4. Enabling and Configuring Docker Model Runner

This section outlines the steps to enable Docker Model Runner and pull a specific model using two distinct methods.

4.1. Enable Model Runner With a Specific TCP Port

First, let’s  enable Model Runner and expose it on a specific TCP port (e.g., 12434):

docker desktop enable model-runner --tcp 12434

This configures Model Runner to serve its OpenAI-compatible API at http://localhost:12434/engines. In our Spring AI application, we need to configure the api-key, model, and base URL to point to the Model Runner endpoint:

spring.ai.openai.api-key=${OPENAI_API_KEY}
spring.ai.openai.base-url=http://localhost:12434/engines
spring.ai.openai.chat.options.model=ai/gemma3
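As a quick sanity check, we can hit the endpoint directly before wiring up Spring AI. Assuming the engine exposes the usual OpenAI-style /v1/models route under the base path above (a reasonable but unverified assumption), this lists the models currently served:

```shell
# Hypothetical sanity check: ask Model Runner's OpenAI-compatible API
# which models it currently serves on the port we configured above
curl http://localhost:12434/engines/v1/models
```

If this returns a JSON model list, the Spring AI base-url above is pointed at the right place.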

4.2. Enable Model Runner With Testcontainers

We can run the following command to enable Model Runner without specifying a port:

docker desktop enable model-runner

This sets up Model Runner to run on the default internal Docker network. Then, we use Testcontainers and set the base-url, api-key, and model as follows:

@TestConfiguration(proxyBeanMethods = false)
class TestcontainersConfiguration {
    @Bean
    DockerModelRunnerContainer socat() {
        return new DockerModelRunnerContainer("alpine/socat:1.8.0.1");
    }

    @Bean
    DynamicPropertyRegistrar properties(DockerModelRunnerContainer dmr) {
        return (registrar) -> {
          registrar.add("spring.ai.openai.base-url", dmr::getOpenAIEndpoint);
          registrar.add("spring.ai.openai.api-key", () -> "test-api-key");
          registrar.add("spring.ai.openai.chat.options.model", () -> "ai/gemma3");
        };
    }
}

The TestcontainersConfiguration class is a Spring Boot @TestConfiguration designed for integration testing with Testcontainers, and it defines two beans. The first is a DockerModelRunnerContainer that starts a container from the alpine/socat:1.8.0.1 image; socat forwards traffic from the test to the internal model-runner.docker.internal service, exposing Model Runner to the host. The second is a DynamicPropertyRegistrar that registers the Spring AI properties at runtime: the base URL comes from the container's getOpenAIEndpoint() method, the API key is a simple placeholder (test-api-key), and the model is ai/gemma3. Finally, @TestConfiguration(proxyBeanMethods = false) keeps bean creation lightweight, since no inter-bean method calls need proxying. With this setup, tests talk to a locally running model without any external dependencies.

4.3. Pulling and Verifying the Gemma 3 Model

Now, after enabling Model Runner using one of the options, we pull the Gemma 3 model:

docker model pull ai/gemma3

Then, we can confirm it’s available locally:

docker model list

This command lists all locally available models, including ai/gemma3.

5. Integration With Spring AI

Now, let’s create a simple controller to interact with the model:

@RestController
class ModelRunnerController {
    private final ChatClient chatClient;

    public ModelRunnerController(ChatClient.Builder chatClientBuilder) {
        this.chatClient = chatClientBuilder.build();
    }

    @GetMapping("/chat")
    public String chat(@RequestParam("message") String message) {
        return this.chatClient.prompt()
          .user(message)
          .call()
          .content();
    }
}

5.1. Testing Model Runner With a Specific TCP Port

With the properties from section 4.1 in place, the OpenAI client points to the Model Runner endpoint and uses the model we pulled earlier. Now, we start the application and test the /chat endpoint:

curl "http://localhost:8080/chat?message=What%20is%20the%20future%20of%20AI%20development?"

The response will be generated by the Gemma 3 model running in Model Runner.
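Note that the message travels as a query parameter, so it has to be URL-encoded (the %20 sequences in the curl call above). If we call the endpoint from Java instead of curl, the standard library handles the encoding for us; the host and port below are assumptions matching a default Spring Boot setup:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class ChatUrlExample {

    // Encodes the message and appends it as the controller's "message" parameter
    static String chatUrl(String baseUrl, String message) {
        return baseUrl + "/chat?message="
          + URLEncoder.encode(message, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // URLEncoder encodes spaces as '+', which is valid in query strings
        System.out.println(chatUrl("http://localhost:8080",
          "What is the future of AI development?"));
    }
}
```

This keeps arbitrary user input safe to place in the URL, which matters once messages contain characters like '&' or '='.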

5.2. Testing Model Runner With Testcontainers

Let’s create the ModelRunnerApplicationTest class. It will import the TestcontainersConfiguration class and call the sample controller:

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@Import(TestcontainersConfiguration.class)
class ModelRunnerApplicationTest {

    @Autowired
    private TestRestTemplate restTemplate;

    @LocalServerPort
    private int port;

    @Test
    void givenMessage_whenCallChatController_thenSuccess() {
        // given
        String baseUrl = "http://localhost:" + port;
        String userMessage = "Hello, how are you?";

        // when
        ResponseEntity<String> response = restTemplate.getForEntity(
          baseUrl + "/chat?message=" + userMessage, String.class);

        // then
        assertThat(response.getStatusCode().is2xxSuccessful()).isTrue();
        assertThat(response.getBody()).isNotEmpty();
    }
}

The @Import(TestcontainersConfiguration.class) imports the TestcontainersConfiguration class, which defines a DockerModelRunnerContainer (running alpine/socat:1.8.0.1). Also, it dynamically registers Spring AI properties (e.g., spring.ai.openai.base-url, spring.ai.openai.api-key, spring.ai.openai.chat.options.model). This ensures the test environment is configured with a mock AI service endpoint provided by the Testcontainers-managed container.

6. Conclusion

Docker Model Runner provides a developer-friendly, privacy-focused, and cost-effective solution for running LLMs locally, particularly for those building GenAI applications within the Docker ecosystem. In this article, we explored Docker Model Runner’s capabilities and demonstrated its integration with Spring AI. As always, the source code is available over on GitHub.
