1. Overview

In this article, we will be looking at the Jetty library. Jetty provides a web server that can run as an embedded container and integrates easily with the javax.servlet library.

2. Maven Dependencies

To get started, we'll add the Maven dependencies for the jetty-server and jetty-servlet libraries:
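A minimal sketch of these dependencies might look like the following (the version shown here is only an example; check Maven Central for the latest release):

```xml
<!-- jetty-server: the embedded server core -->
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-server</artifactId>
    <version>9.4.27.v20200227</version>
</dependency>
<!-- jetty-servlet: ServletHandler and servlet support -->
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-servlet</artifactId>
    <version>9.4.27.v20200227</version>
</dependency>
```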


3. Starting Jetty Server With Servlet

Starting the Jetty embedded container is simple. We need to instantiate a new Server object and set it to start on a given port:

public class JettyServer {
    private Server server;

    public void start() throws Exception {
        server = new Server();
        ServerConnector connector = new ServerConnector(server);
        connector.setPort(8090);
        server.setConnectors(new Connector[] {connector});
    }
}

Let's say that we want to create an endpoint that will respond with an HTTP status code of 200 and a simple JSON payload if everything goes well.

We'll create a class that extends the HttpServlet class to handle such requests; this class is single-threaded and blocks until completion:

public class BlockingServlet extends HttpServlet {

    protected void doGet(
      HttpServletRequest request, 
      HttpServletResponse response)
      throws ServletException, IOException {
        response.setContentType("application/json");
        response.setStatus(HttpServletResponse.SC_OK);
        response.getWriter().println("{ \"status\": \"ok\"}");
    }
}

Next, we need to register the BlockingServlet class in the ServletHandler object by using the addServletWithMapping() method and start the server:

ServletHandler servletHandler = new ServletHandler();
server.setHandler(servletHandler);

servletHandler.addServletWithMapping(BlockingServlet.class, "/status");
server.start();

If we wish to test our servlet logic, we need to start our server within the test setup by using the previously created JettyServer class, which is a wrapper of the actual Jetty server instance:

public void setup() throws Exception {
    jettyServer = new JettyServer();
    jettyServer.start();
}

Once started, we will send a test HTTP request to the /status endpoint:

String url = "http://localhost:8090/status";
HttpClient client = HttpClientBuilder.create().build();
HttpGet request = new HttpGet(url);

HttpResponse response = client.execute(request);

assertThat(response.getStatusLine().getStatusCode()).isEqualTo(200);

4. Non-Blocking Servlets

Jetty has good support for asynchronous request processing.

Let's say that we have an enormous, I/O-intense resource that takes a long time to load, blocking the executing thread for a substantial amount of time. It is better if that thread can be liberated to handle other requests in the meantime, instead of waiting for some I/O resource.

To provide such logic with Jetty, we can create a servlet that will use the AsyncContext class by calling the startAsync() method on the HttpServletRequest. This code will not block the executing thread but will perform the I/O operation in a separate thread, returning the result when ready using the AsyncContext.complete() method:

public class AsyncServlet extends HttpServlet {
    private static String HEAVY_RESOURCE 
      = "This is some heavy resource that will be served in an async way";

    protected void doGet(
      HttpServletRequest request, HttpServletResponse response)
      throws ServletException, IOException {
        ByteBuffer content = ByteBuffer.wrap(HEAVY_RESOURCE.getBytes());

        AsyncContext async = request.startAsync();
        ServletOutputStream out = response.getOutputStream();
        out.setWriteListener(new WriteListener() {
            public void onWritePossible() throws IOException {
                while (out.isReady()) {
                    if (!content.hasRemaining()) {
                        response.setStatus(200);
                        async.complete();
                        return;
                    }
                    out.write(content.get());
                }
            }

            public void onError(Throwable t) {
                getServletContext().log("Async Error", t);
                async.complete();
            }
        });
    }
}

We are writing the ByteBuffer to the OutputStream, and once the whole buffer is written, we signal that the result is ready to return to the client by invoking the complete() method.

Next, we need to add the AsyncServlet as a Jetty servlet mapping:

servletHandler.addServletWithMapping(AsyncServlet.class, "/heavy/async");

We can now send a request to the /heavy/async endpoint – that request will be handled by Jetty in an asynchronous way:

String url = "http://localhost:8090/heavy/async";
HttpClient client = HttpClientBuilder.create().build();
HttpGet request = new HttpGet(url);
HttpResponse response = client.execute(request);

String responseContent = IOUtils.toString(
  response.getEntity().getContent(), StandardCharsets.UTF_8);
assertThat(responseContent).isEqualTo(
  "This is some heavy resource that will be served in an async way");

When our application handles requests in an asynchronous way, we should configure the thread pool explicitly. In the next section, we will configure Jetty to use a custom thread pool.

5. Jetty Configuration

When we run our web application in production, we might want to tune how the Jetty server processes requests. This is done by defining a thread pool and applying it to our Jetty server.

To do this, we have three configuration settings that we can set:

  • maxThreads – To specify the maximum number of threads that Jetty can create and use in the pool
  • minThreads – To set the initial number of threads in the pool that Jetty will use
  • idleTimeout – This value in milliseconds defines how long a thread can be idle before it is stopped and removed from the thread pool. The number of remaining threads in the pool will never go below the minThreads setting

With these, we can configure the embedded Jetty server programmatically by passing the configured thread pool to the Server constructor:

int maxThreads = 100;
int minThreads = 10;
int idleTimeout = 120;

QueuedThreadPool threadPool = new QueuedThreadPool(maxThreads, minThreads, idleTimeout);

server = new Server(threadPool);

Then, when we start our server, it will use threads from the configured thread pool.

6. Conclusion

In this quick tutorial, we saw how to work with an embedded Jetty server and how to test our web application.

As always, the code is available over on GitHub.

