1. Introduction

cURL is a powerful command-line tool for making HTTP requests and transferring data across many protocols, including HTTPS, FTP, and SCP, which makes it a versatile choice for moving large files over the Internet. Moreover, cURL provides a straightforward syntax for sending requests and handling responses.

In this tutorial, we’ll explore various solutions and techniques for sending large files using cURL.

Notably, when interacting with a server or an API via cURL, we must choose options that align with the server’s requirements. In particular, the appropriate HTTP method depends on the design of the server-side application or script.

2. Installing cURL

On Ubuntu and other Debian-based distributions, we can use the apt-get package manager to install cURL:

$ sudo apt-get install curl

On CentOS and Red Hat Enterprise Linux (RHEL) distributions, we can use the yum package manager to install cURL:

$ sudo yum install curl

On Fedora and newer RHEL-based systems, we can use the dnf package manager instead:

$ sudo dnf install curl

3. Basic File Transfer

To begin with, let’s create a simple setup to better understand what happens on the server and client side.

Specifically, we’ll listen on port 8080 via the nc command. Then, we’ll send a small example text file from another terminal. Finally, we can see exactly what was sent and received.

First, we list the content of the file that’s going to be sent:

$ cat sampleFile.txt

Lorem ipsum dolor sit amet, 
consectetur adipiscing elit, 
...
in culpa qui officia deserunt mollit anim id est laborum.

Then, we’ll prepare a simulation of the server-side connection via nc:

$ nc -l -p 8080

In this example, the nc command listens on port 8080. We used -l for listening and -p to define our port.

Now, we’ll open a new terminal and send sampleFile.txt:

$ curl -X POST -T sampleFile.txt localhost:8080
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   446    0     0  100   446      0     34  0:00:13  0:00:01 --:--:--     0
[...]

Here, we used the -T (--upload-file) option to read and include the file contents in the request body.

Finally, we can see what was sent to the server:

$ nc -l -p 8080
POST /sampleFile.txt HTTP/1.1
Host: localhost:8080
User-Agent: curl/7.81.0
Accept: */*
Content-Length: 446
Expect: 100-continue

Lorem ipsum dolor sit amet, consectetur adipiscing elit, 
... 
in culpa qui officia deserunt mollit anim id est laborum.

As we can see, we received the HTTP request headers and the content of sampleFile.txt on the server side at port 8080.

Now, we’re able to send a big file to the specified URL. Since -T streams the file instead of loading it into memory, the data is transmitted in the request body regardless of its size. Besides, we can use the PUT method instead of POST; in fact, -T sends a PUT request by default unless we override the method with -X. The choice between upload methods ultimately depends on the server-side configuration.
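We can check the default method locally. The following sketch starts a throwaway HTTP server via python3 (assuming python3 is installed; the port 8080 and file names are arbitrary choices) and inspects the request line cURL produces when -T is used without -X:

```shell
# Start a throwaway local server; it logs each request line to server.log
python3 -m http.server 8080 2> server.log &
SERVER_PID=$!
sleep 1

# Upload a small test file with -T and no explicit -X
echo "test data" > sampleFile.txt
curl -s -o /dev/null -T sampleFile.txt http://localhost:8080/sampleFile.txt

kill "$SERVER_PID"

# The log shows which method cURL issued by default
grep "PUT /sampleFile.txt" server.log
```

The test server answers PUT with an error status, but its log still records the request line, which is enough to confirm the method cURL chose.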

4. Using Standard Input

Sending file data via standard input can be achieved with cURL’s -d @- option, which reads the request body from stdin instead of taking it as a command-line argument. However, the approach below first base64-encodes the whole file into a shell variable, which inflates the data by roughly a third and holds all of it in memory, so it can cause performance problems and run into size limitations with genuinely large files:

$ DATA=$(base64 "big_file.iso")
$ echo "{ \"data\": \"$DATA\" }" | curl -X POST -H "Content-Type: application/json" -d @- https://example.com

The file content is read, base64-encoded, and then sent as the data field of a JSON payload, using cURL’s -d @- option to read the body from stdin.
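For clarity, we can reproduce the payload construction locally with a small stand-in file (the name big_file.iso is just an example) and inspect what would be piped into cURL:

```shell
# Create a small stand-in for the large file (no trailing newline)
printf 'hello' > big_file.iso

# Build the same JSON payload that the snippet above pipes into cURL
DATA=$(base64 big_file.iso)
echo "{ \"data\": \"$DATA\" }" > payload.json

cat payload.json
# → { "data": "aGVsbG8=" }
```

If the server accepts a raw body instead of JSON, curl --data-binary @big_file.iso sends the file contents directly and avoids both the base64 overhead and the intermediate shell variable.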

5. Splitting Large Files

When dealing with file size limitations, splitting large files into smaller parts can be a solution. cURL doesn’t have built-in support for file splitting. However, we can use external tools like split on Linux to create chunks from the file and then send each chunk individually.

First, let’s split the large file:

$ split -b 500M /path/to/largefile.vhd largefile_chunk

In this example, we used -b to specify the size of each output file. Hence, each split file should have a maximum size of 500 megabytes.

Let’s check the chunked files via ls:

$ ls -l
-rw-r--r-- 1 user group 500M Jun 9 12:00 largefile_chunkaa
-rw-r--r-- 1 user group 500M Jun 9 12:01 largefile_chunkab
-rw-r--r-- 1 user group 500M Jun 9 12:02 largefile_chunkac
...

As we can see, each chunk has a maximum size of 500 MB, and its name starts with largefile_chunk.

Now, we can use a for loop and send files via cURL:

$ for file in largefile_chunk*; do
  curl -X POST -F "file=@$file" https://example.com/upload
done

In this example, we sent all chunked files in a loop, one by one.
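Since split names the chunks alphabetically, the receiver can rebuild the original file by concatenating them in lexical order. A quick local round trip with a small random file (standing in for the real large file) demonstrates this:

```shell
# Create a 3 MB test file from random data
dd if=/dev/urandom of=largefile.bin bs=1M count=3 2>/dev/null

# Split it into 1 MB chunks: largefile_chunkaa, largefile_chunkab, ...
split -b 1M largefile.bin largefile_chunk

# Reassemble the chunks in lexical order and verify the result
cat largefile_chunk* > reassembled.bin
cmp largefile.bin reassembled.bin && echo "chunks reassemble cleanly"
```

The server-side upload handler would perform the equivalent concatenation once all chunks have arrived.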

6. Compressing Files

To reduce the size of the transfer, we can compress files before sending them via cURL, which is especially useful for large files. Smaller payloads can result in faster transfer speeds and lower bandwidth consumption.

Let’s compress the big file via tar:

$ tar -czvf compressed_file.tar.gz big_file.iso

In this case, we created a compressed archive named compressed_file.tar.gz by compressing the file big_file.iso using gzip compression. The -c option creates a new archive, -z specifies gzip compression. Moreover, -v enables verbose mode, and -f specifies the output filename.

Now, we’ll send the compressed file:

$ curl -X POST -T compressed_file.tar.gz https://example.com

Here, we used cURL with the POST method to send the compressed file compressed_file.tar.gz to the specified endpoint. Again, the -T option specifies the file to upload, and https://example.com is the destination.
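On the receiving end, the archive can be unpacked with tar -xzf. As a local sanity check with a small stand-in file, we can confirm the compression round trip is lossless:

```shell
# Create a stand-in for the large file and compress it
echo "sample payload" > big_file.iso
tar -czf compressed_file.tar.gz big_file.iso

# Extract into a separate directory and compare with the original
mkdir -p extracted
tar -xzf compressed_file.tar.gz -C extracted
cmp big_file.iso extracted/big_file.iso && echo "archive round trip OK"
```

The -C option tells tar to change into the given directory before extracting, which keeps the unpacked copy separate from the original for comparison.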

7. Remote Upload

If the large file is already hosted on another server, we can use cURL to remotely transfer the file to the destination server without downloading it to our local machine:

$ curl -o /dev/null -X POST -H "Content-Type: application/json" \
  -d '{"url":"https://remote-server.com/big_file.iso"}' \
  https://example.com/remote-upload

In this case, cURL sends a JSON payload containing the URL of a large file to https://example.com/remote-upload via a POST request. The server receives the request and initiates a remote upload process of the file specified by the URL. As a result, the response received from the server is discarded, and nothing is saved on the local machine.

Here, we used several options:

  • -o /dev/null discards the server’s response by writing it to /dev/null instead of saving it locally
  • -X POST sets the HTTP method to POST
  • -H specifies the content type of the data we’re sending to the server, which is JSON in this case
  • -d specifies the data we want to send to the server

The server should handle the request, process the URL, and perform the necessary actions.

8. Conclusion

In this article, we explored various techniques for transferring large files using cURL. We discussed sending files, using standard input, splitting large files, compressing files, and remote uploading. These methods provide flexible options for efficient file transfers over the Internet.

In conclusion, cURL is a powerful tool for sending big files over the Internet, and it offers several ways to adapt file transfers to different servers. We can choose the appropriate method, such as POST or PUT, depending on what the server expects.

One way is by including the file in the request using the -T option. This works for files of any size. If the server needs authentication, we can use the -u option to provide our username and password.

Sometimes, there are limits on file size. In those cases, we can split the big file into smaller parts and send them separately. Another option is to compress the file before sending it, which makes it faster and uses less data.

Finally, if the file is already on another server, we can use cURL to transfer it directly to the destination server without downloading it to our local machine.
