1. Overview

In this article, we will be looking at the Jetty library. Jetty provides a web server that can run as an embedded container and integrates easily with the javax.servlet library.

2. Maven Dependencies

To get started, we'll add Maven dependencies for the jetty-server and jetty-servlet libraries:

<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-server</artifactId>
    <version>9.4.3.v20170317</version>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-servlet</artifactId>
    <version>9.4.3.v20170317</version>
</dependency>

3. Starting Jetty Server With Servlet

Starting the Jetty embedded container is simple. We need to instantiate a new Server object and configure it to listen on a given port:

public class JettyServer {
    private Server server;

    public void start() throws Exception {
        server = new Server();
        ServerConnector connector = new ServerConnector(server);
        connector.setPort(8090);
        server.setConnectors(new Connector[] {connector});

Let’s say that we want to create an endpoint that will respond with an HTTP 200 status code and a simple JSON payload if everything goes well.

We’ll create a class that extends the HttpServlet class to handle such requests; this class is single-threaded and blocks until completion:

public class BlockingServlet extends HttpServlet {

    protected void doGet(
      HttpServletRequest request, 
      HttpServletResponse response)
      throws ServletException, IOException {
 
        response.setContentType("application/json");
        response.setStatus(HttpServletResponse.SC_OK);
        response.getWriter().println("{ \"status\": \"ok\"}");
    }
}

Next, still inside the start() method, we create a ServletHandler, register the BlockingServlet class with it by using the addServletWithMapping() method, and start the server:

ServletHandler servletHandler = new ServletHandler();
server.setHandler(servletHandler);

servletHandler.addServletWithMapping(BlockingServlet.class, "/status");
server.start();

To test our servlet logic, we need to start our server in the test setup, using the previously created JettyServer class, which wraps the actual Jetty server instance:

@Before
public void setup() throws Exception {
    jettyServer = new JettyServer();
    jettyServer.start();
}
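
Similarly, we can stop the server after each test so the port is released. Here is a minimal sketch, assuming our JettyServer wrapper also exposes a stop() method that simply delegates to server.stop():

@After
public void teardown() throws Exception {
    // shut the embedded server down so the port is freed for the next test
    if (jettyServer != null) {
        jettyServer.stop();
    }
}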

Once the server is started, we can send a test HTTP request to the /status endpoint:

String url = "http://localhost:8090/status";
HttpClient client = HttpClientBuilder.create().build();
HttpGet request = new HttpGet(url);

HttpResponse response = client.execute(request);
 
assertThat(response.getStatusLine().getStatusCode()).isEqualTo(200);
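
If we also want to verify the JSON payload returned by BlockingServlet, we can read the response body in the same test; a short sketch, assuming Apache Commons IO (which we also use later in this article) is on the test classpath:

// read the response body and check the JSON produced by BlockingServlet
String responseContent = IOUtils.toString(
  response.getEntity().getContent(), StandardCharsets.UTF_8);
assertThat(responseContent).contains("\"status\": \"ok\"");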

4. Non-Blocking Servlets

Jetty has good support for asynchronous request processing.

Let’s say that we have an enormous resource that is I/O-intensive and takes a long time to load, blocking the executing thread for a substantial amount of time. It is better if that thread can be freed to handle other requests in the meantime, instead of waiting for some I/O resource.

To provide such logic with Jetty, we can create a servlet that will use the AsyncContext class by calling the startAsync() method on the HttpServletRequest. This code will not block the executing thread but will perform the I/O operation in a separate thread, returning the result when it is ready using the AsyncContext.complete() method:

public class AsyncServlet extends HttpServlet {
    private static final String HEAVY_RESOURCE 
      = "This is some heavy resource that will be served in an async way";

    protected void doGet(
      HttpServletRequest request, HttpServletResponse response)
      throws ServletException, IOException {
 
        ByteBuffer content = ByteBuffer.wrap(
          HEAVY_RESOURCE.getBytes(StandardCharsets.UTF_8));

        AsyncContext async = request.startAsync();
        ServletOutputStream out = response.getOutputStream();
        out.setWriteListener(new WriteListener() {
            @Override
            public void onWritePossible() throws IOException {
                while (out.isReady()) {
                    if (!content.hasRemaining()) {
                        response.setStatus(200);
                        async.complete();
                        return;
                    }
                    out.write(content.get());
                }
            }

            @Override
            public void onError(Throwable t) {
                getServletContext().log("Async Error", t);
                async.complete();
            }
        });
    }
}

We write the ByteBuffer to the ServletOutputStream, and once the whole buffer is written, we signal that the result is ready to be returned to the client by invoking the complete() method.

Next, we need to add the AsyncServlet as a Jetty servlet mapping:

servletHandler.addServletWithMapping(
  AsyncServlet.class, "/heavy/async");

We can now send a request to the /heavy/async endpoint; that request will be handled by Jetty in an asynchronous way:

String url = "http://localhost:8090/heavy/async";
HttpClient client = HttpClientBuilder.create().build();
HttpGet request = new HttpGet(url);
HttpResponse response = client.execute(request);

assertThat(response.getStatusLine().getStatusCode())
  .isEqualTo(200);
String responseContent = IOUtils.toString(
  response.getEntity().getContent(), StandardCharsets.UTF_8);
assertThat(responseContent).isEqualTo(
  "This is some heavy resource that will be served in an async way");

When our application handles requests in an asynchronous way, we should configure the thread pool explicitly. In the next section, we will configure Jetty to use a custom thread pool.

5. Jetty Configuration

When we run our web application in production, we might want to tune how the Jetty server processes requests. This is done by defining a thread pool and applying it to our Jetty server.

To do this, we have three configuration settings that we can set:

  • maxThreads – To specify the maximum number of threads that Jetty can create and use in the pool
  • minThreads – To set the minimum number of threads that Jetty will keep in the pool (this is also the initial pool size)
  • idleTimeout – This value in milliseconds defines how long a thread can be idle before it is stopped and removed from the thread pool. The number of remaining threads in the pool will never go below the minThreads setting

With these, we can configure the embedded Jetty server programmatically by passing a configured thread pool to the Server constructor:

int maxThreads = 100;
int minThreads = 10;
int idleTimeout = 120;

QueuedThreadPool threadPool = new QueuedThreadPool(maxThreads, minThreads, idleTimeout);

server = new Server(threadPool);

Then, when we start our server, it will use threads from the configured thread pool.
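
Putting it all together, a minimal sketch of how the thread pool could be wired into the start() method of our JettyServer wrapper, keeping the connector and servlet mapping from earlier, might look like this:

public void start() throws Exception {
    // create the server with a bounded, queued thread pool
    QueuedThreadPool threadPool = new QueuedThreadPool(100, 10, 120);
    server = new Server(threadPool);

    // the connector and servlet mapping are configured exactly as before
    ServerConnector connector = new ServerConnector(server);
    connector.setPort(8090);
    server.setConnectors(new Connector[] {connector});

    ServletHandler servletHandler = new ServletHandler();
    server.setHandler(servletHandler);
    servletHandler.addServletWithMapping(BlockingServlet.class, "/status");

    server.start();
}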

6. Conclusion

In this quick tutorial, we saw how to use Jetty as an embedded server and tested our web application.

As always, the code is available over on GitHub.
