
1. Introduction

In web development, repeated HTTP requests are common for testing server performance, monitoring uptime, or validating application behavior. This helps to observe patterns, detect issues, and gather useful data. A popular tool for making HTTP requests in Linux is cURL, a command-line tool developers and sysadmins use to interact with web services and APIs.

Manually repeating requests isn’t practical. Fortunately, Linux offers ways to automate this by combining curl with other tools.

In this tutorial, we’ll explore methods such as for loops, the Apache Benchmark (ab) tool, and the watch command. Understanding these options lets us choose the right one based on the task, whether for simple checks or complex testing.

2. Basic URL Request With cURL

The curl command is a powerful utility in Linux used to transfer data from or to a server using various protocols, most commonly HTTP and HTTPS. It’s extremely useful for sending basic URL requests from the terminal, making it an essential tool for developers and sysadmins who need to interact with APIs or check web server responses.

To install curl on a Debian-based Linux system, we can use apt:

sudo apt install curl

Once installed, we can run a basic URL request by executing the curl command:

curl http://www.example.com

In this example, curl sends a simple GET request to http://www.example.com. By default, curl retrieves the content from the URL and displays it in the terminal. This is useful when we need to quickly check if a website is up, inspect its contents, or interact with web services.
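For a quick up/down check, we don't need the whole page body. As a minimal sketch, we can ask curl to print only the HTTP status code: -s silences progress output, -o /dev/null discards the body, and -w prints the http_code write-out variable:

```shell
# Print only the HTTP status code of the response
status=$(curl -s -o /dev/null -w '%{http_code}' http://www.example.com/)
echo "HTTP status: $status"
```

A 200 indicates the site is up, while a 000 means curl couldn't reach the server at all.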

3. URL Sequence Substitution With cURL

cURL offers a convenient feature called URL sequence substitution, which allows us to send multiple requests with varying query parameters by simply defining a range of values. This method is particularly useful when we need to automate requests to URLs that differ only by a small parameter, such as ID numbers or page numbers, without having to manually modify each URL. It’s an efficient way to reduce keystrokes and quickly perform batch requests.

Let’s see an example of using URL sequence substitution to send requests with a numeric range:

curl "http://www.example.com/?[1-20]"

In this example, curl automatically substitutes the numbers from 1 to 20 in the URL’s query string, sending 20 GET requests to http://www.example.com/ with the corresponding values. This is an easy way to make repetitive requests where only a small part of the URL changes.
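Beyond a plain numeric range, curl's URL globbing also accepts a step value, as well as letter ranges such as [a-z]. As a sketch against the placeholder example.com, the following requests pages 1, 3, and 5, and uses the url_effective write-out variable to print each URL that was requested:

```shell
# Request pages 1, 3, and 5 using a step value in the range,
# printing the URL of each request on its own line
urls=$(curl -s "http://www.example.com/?page=[1-5:2]" \
  -o /dev/null -w '%{url_effective}\n' || true)
echo "$urls"
```

We quote the URL so the shell doesn't try to expand the brackets itself.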

If we need to incorporate other query parameters, we can assign the sequence to a “throwaway” variable. We also quote the URL so the shell doesn’t interpret the & character as a background operator:

curl "http://www.myurl.com/?myVar=111&fakeVar=[1-20]"

In this case, myVar remains constant, while fakeVar cycles through the numbers 1 to 20. This technique is particularly useful when we want to automate multiple requests with different values for one parameter, while other parts of the URL stay the same.

4. Using a for Loop With cURL

A for loop in bash allows us to repeat commands multiple times, making it a great way to automate URL requests with curl. This method is useful when we need to send multiple requests for testing purposes, such as load testing or checking the behavior of a web service over time. By using the for loop, we can easily control the number of repetitions and add delays between requests.

Let’s repeat a URL request 10 times using a for loop:

for i in {1..10}; do curl http://www.google.com; done

Here, the for loop runs 10 times, making a request to http://www.google.com during each iteration. The loop variable i counts from 1 to 10, though it’s not used inside the loop. The command curl http://www.google.com sends a GET request each time, fetching the content and displaying it in the terminal.

This method is highly flexible since we can add conditions or introduce delays by using sleep, allowing us to customize how often and how quickly the requests are made.
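As a sketch, the following loop adds a one-second delay between iterations and records the status code of each request (example.com is a placeholder URL):

```shell
# Repeat a request with a delay, collecting each status code
codes=()
for i in {1..3}; do
  code=$(curl -s -o /dev/null -w '%{http_code}' http://www.example.com/)
  codes+=("$code")
  echo "request $i -> HTTP $code"
  sleep 1   # wait one second between requests
done
```

Storing the codes in an array makes it easy to post-process the results, for example to count how many requests failed.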

5. Using watch With cURL

The watch command allows us to repeatedly run a command at fixed intervals and display the output in real-time. When combined with curl, it becomes a powerful tool for continuously monitoring the status of a URL or server. This method is especially useful for tasks like uptime monitoring, where we need to check a website’s response over a period of time without manual intervention.

The watch command is part of the procps package, which comes preinstalled on most distributions. If it’s missing, we can install it:

sudo apt install procps

5.1. Basic Usage of watch With cURL

A simple use case is to send a request every two seconds to check the status of a URL:

watch -n 2 curl http://example.com

Here, watch runs the curl command every two seconds, updating the terminal with the latest output from the URL. This setup helps us monitor responses in real-time, making it easier to detect issues like downtime or slow responses.

5.2. watch With a for Loop

For more advanced control, we can combine watch with a for loop to send multiple requests within each interval. This is useful when we need to send more than one request per cycle or handle additional processing. Let’s send five requests every two seconds:

watch -n 2 'for i in {1..5}; do curl http://example.com; done'

In this case, watch executes the for loop every two seconds. The loop sends five consecutive curl requests to the URL during each interval. This combination allows us to control both the frequency of the intervals (using watch) and the number of requests sent (using the for loop). It’s an ideal approach for advanced scenarios where we need repetitive tasks with greater flexibility.

6. Apache Benchmark

Apache Benchmark (ab) is a specialized tool for performance testing and load generation. It allows us to quickly send a high volume of requests to a URL, making it ideal for stress-testing web servers or applications. With ab, we can simulate multiple users hitting a URL simultaneously, which helps us measure how well a server handles traffic under load.

To install ab on a Linux system, we can use:

sudo apt install apache2-utils

Once installed, we can run a benchmark test using the ab command:

ab -n 100 -c 10 http://example.com/

In this example, ab sends 100 requests to http://example.com/, with 10 requests being processed concurrently. The -n option specifies the total number of requests, while the -c option defines how many of those requests should be sent at the same time.

This allows us to simulate multiple users accessing the website, which is useful for evaluating server performance. After running the command, ab generates a detailed report that includes statistics such as requests per second, time per request, and server response times.

7. Conclusion

In this article, we’ve explored several ways to automate repetitive URL requests using command-line tools such as curl, for loops, watch, and ab. Each tool has a specific application: curl suits simple checks, while the others cover continuous monitoring and stress testing.

curl’s flexibility makes it excellent for sending simple requests and performing sequence substitution over a range of URLs. Bash’s for loops offer finer control over the number of repetitions and their timing. The watch command provides real-time monitoring, which is especially helpful for uptime checks, while ab performs load testing by simulating high traffic.

With these methods, developers and sysadmins can choose the most suitable tool for a particular test or monitoring task.