April 12, 2026
Normally, when you download a file over HTTP (through your browser, or with wget/curl), you download it sequentially, over a single connection. What if you could speed it up (as far as your connection and the target server allow) by downloading the file in multiple parts in parallel? This is one of the things that aria2 does: it uses the HTTP Range request header to request different byte ranges of the file over separate HTTP connections, and then merges the segments back into one file.
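Conceptually, each connection asks the server for a different byte range (with curl you would do this by hand via -r start-end), and the pieces are concatenated afterwards. Here is a minimal local sketch of that split-and-merge step, using dd on a local file instead of the network (all filenames made up):

```shell
# Split a file into two "ranges" with dd, then merge them back and
# verify the result matches the original. This mimics what aria2 does
# after fetching each range over its own HTTP connection
# (roughly: curl -r 0-9 URL -o part0; curl -r 10- URL -o part1).
printf 'aria2 merges byte ranges back together' > original.bin
dd if=original.bin of=part0 bs=1 count=10 2>/dev/null   # bytes 0-9
dd if=original.bin of=part1 bs=1 skip=10 2>/dev/null    # bytes 10-end
cat part0 part1 > merged.bin
cmp -s original.bin merged.bin && echo "segments merge losslessly"
```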
With aria2c (the command-line interface for aria2) set to 16 connections, let's download a test file from the Tele2 speed test service:
aria2c --max-connection-per-server 16 \
--split 16 \
--min-split-size 1M \
http://speedtest.tele2.net/10GB.zip
You can shorten the options to -x 16 -s 16 -k 1M.
You can even use it as the downloader for yt-dlp: yt-dlp --downloader aria2c --downloader-args "aria2c:-x 16 -s 16 -k 1M".
Caveats: opening many parallel connections to a single server can be considered antisocial, and may get you rate-limited or blocked. And the target server might not support Range requests at all.
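One quick way to check is to look for the Accept-Ranges response header (though its absence doesn't strictly prove a server will refuse Range requests), using the same speed-test URL as above:

```shell
curl -sI http://speedtest.tele2.net/10GB.zip | grep -i accept-ranges
# "Accept-Ranges: bytes" means segmented downloads should work.
```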
If you want to queue multiple files, note that by default aria2c treats several URLs on the command line as mirrors of the same file. Pass -Z (--force-sequential) to make each URL a separate download, or list the URLs in a file and use -i/--input-file; then --max-concurrent-downloads N (-j N) controls how many of them are downloaded at the same time.
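For example (made-up URLs), this queues three separate downloads and runs at most two at a time:

```shell
# -Z: treat each URL as its own download (not mirrors of one file)
# -j 2: at most two downloads in flight at once
aria2c -Z -j 2 \
  http://example.com/a.iso \
  http://example.com/b.iso \
  http://example.com/c.iso
```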
This is just one of the many things it can do. It can also download a file from multiple sources/mirrors, download torrents, etc. – see the aria2 documentation.
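For the multi-source case: without -Z, listing several URLs tells aria2c that they all point to the same file, and it will pull segments from all of them (made-up mirror URLs):

```shell
# Both URLs are treated as sources for a single file.
aria2c http://mirror1.example.com/file.iso http://mirror2.example.com/file.iso
```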