Linux curl Command Examples

Transferring files using the curl command

curl


"cURL" is a computer software project that provides a library (libcurl) and a command line tool called curl. Curl is a tool to transfer data to or from a server using one of the many supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP). Support for authentication , Proxy servers, transfer resume, SSL are all catered for. For the full list of functionality see the "man pages" for curl.

Below are examples of how to install curl on popular Linux distributions along with some useful examples of the curl tool in action.



Installing cURL


On many distributions you will find that curl is already included. If not, it can easily be installed:


Debian/Ubuntu Installing Curl


To install curl on a Debian or Ubuntu system, you issue the "sudo apt-get install curl" command. Before running it, we will first check whether curl is already installed; if not, we will update our package lists and then install it:

First, let's check whether curl has been installed by using the command "dpkg -s curl":



john@ubuntu1404:~$ dpkg -s curl
dpkg-query: package 'curl' is not installed and no information is available
Use dpkg --info (= dpkg-deb --info) to examine archive files,
and dpkg --contents (= dpkg-deb --contents) to list their contents.

It is always advisable to issue the "update" command first so that the latest package lists are retrieved.



john@ubuntu1404:~$ sudo apt-get update

As we can see from the previous output of the "dpkg -s curl" command, curl has not been installed on this particular Ubuntu 14.04 LTS system. Now we can issue the install command as follows:



john@ubuntu1404:~$ sudo apt-get install curl
[sudo] password for john:
Reading package lists... Done
Building dependency tree      
Reading state information... Done
The following NEW packages will be installed
  curl
0 to upgrade, 1 to newly install, 0 to remove and 0 not to upgrade.
Need to get 123 kB of archives.
After this operation, 313 kB of additional disk space will be used.
Get:1 http://gb.archive.ubuntu.com/ubuntu/ trusty/main curl amd64 7.35.0-1ubuntu2 [123 kB]
Fetched 123 kB in 0s (888 kB/s)
Selecting previously unselected package curl.
(Reading database ... 165068 files and directories currently installed.)
Preparing to unpack .../curl_7.35.0-1ubuntu2_amd64.deb ...
Unpacking curl (7.35.0-1ubuntu2) ...
Processing triggers for man-db (2.6.7.1-1) ...
Setting up curl (7.35.0-1ubuntu2) ...

Curl is now installed on your Debian/Ubuntu system. We can verify this by using the "dpkg -s curl" command again:



john@ubuntu1404:~$ dpkg -s curl
Package: curl
Status: install ok installed
Priority: optional

The above is only an excerpt from the output of the command. When you run this command you will also see information relating to dependencies, the maintainer, the version and, quite often, the website of the package provider.
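Whatever the distribution, a quick, package-manager-agnostic way to confirm that the curl binary is available is to query the shell's PATH. A minimal sketch (the printed messages are our own, not standard output of any tool):

```shell
#!/bin/sh
# Portable check: "command -v" succeeds only if curl is found on the PATH.
# This works the same on Debian/Ubuntu, Red Hat/CentOS and openSUSE/SLES.
if command -v curl >/dev/null 2>&1; then
    echo "curl is installed"
else
    echo "curl is not installed"
fi
```

This avoids remembering a different query command (dpkg, rpm, zypper) for each distribution.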


Red Hat/CentOS Installing curl


In the majority of instances curl will be automatically installed with the chosen server build. We can verify that curl is installed either by using the "rpm -qa" command or by issuing the relevant "yum search" command:

Below is an example taken from a Red Hat Server:



[root@rhel02 ~]# rpm -qa | grep curl
curl-7.19.7-26.el6_2.4.i686
libcurl-7.19.7-26.el6_2.4.i686
python-pycurl-7.19.0-8.el6.i686

We can see from the above output that the relevant packages are already installed.

If no "curl" packages were found we would have to issue the command "yum install curl" to install the relevant curl packages.


openSUSE 13.1/SLES


If you are working with openSUSE or SLES (SUSE Linux Enterprise Server), you can use the "zypper se curl" command to verify whether the relevant packages have already been installed:



linux-z2jb:~ # zypper se curl
Loading repository data...
Reading installed packages...

S | Name                | Summary                                   | Type     
--+---------------------+-------------------------------------------+-----------
i | curl                | A Tool for Transferring Data from URLs    | package  
  | curl                | A Tool for Transferring Data from URLs    | srcpackage
  | curlftpfs           | Filesystem for mounting FTP hosts using-> | package  
  | flickcurl           | Command-Line Tools for the Flickr Web S-> | package  
  | flickcurl-doc       | C Library API to the Flickr Web Service-> | package  
  | libcurl-devel       | A Tool for Transferring Data from URLs    | package  
i | libcurl4            | Version 4 of cURL shared library          | package  
  | libcurl4-32bit      | Version 4 of cURL shared library          | package  
  | libflickcurl-devel  | C Library API to the Flickr Web Service   | package  
  | libflickcurl0       | C Library API to the Flickr Web Service   | package  
  | libflickcurl0-32bit | C Library API to the Flickr Web Service   | package  
  | perl-WWW-Curl       | Perl extension interface for libcurl      | package  
  | php5-curl           | PHP5 Extension Module                     | package  
i | python-pycurl       | PycURL -- cURL library module             | package  
  | python-pycurl-doc   | Documentation for python-curl             | package  
  | python3-pycurl      | PycURL -- cURL library module             | package  
  | python3-pycurl-doc  | Documentation for python-curl             | package  
  | tclcurl             | Tcl Binding to libcurl                    | package  
  | xmms2-plugin-curl   | Curl Support for xmms2                    | package 


From the above output we can see that the "curl", "libcurl4" and "python-pycurl" packages are installed. This is indicated by an "i" in the first column.

If "curl" is not installed, it can be installed by issuing the command "zypper in curl. This command will install curl along with any dependencies.



curl command examples


Below are some examples of the curl command in use:


Retrieving a web page



$ curl landoflinux.com/linux_wget_command.html

By issuing the "curl" command followed by a web address, we can retrieve the page information back. This output is displayed directly to your terminal.

If you want to send the output from this command to a file, you can use some simple redirection:



john@ubuntu01-pc:~$ curl landoflinux.com/linux_wget_command.html > test.txt
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 22085  100 22085    0     0   198k      0 --:--:-- --:--:-- --:--:--  199k


Now our output will be sent to a file called "test.txt". Information relating to the file transfer is indicated in the output.
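The same redirection can be tried without touching the network: curl also speaks the FILE protocol, so a local file can stand in for the web server. The paths below are arbitrary examples, not files referenced elsewhere in this article:

```shell
#!/bin/sh
# Create a small local file to act as the "remote" resource
printf 'example page content\n' > /tmp/page.txt
# Fetch it with curl (-s suppresses the progress meter) and redirect to a file
curl -s file:///tmp/page.txt > /tmp/test.txt
cat /tmp/test.txt
```

Running this prints "example page content", confirming the transfer reached the redirected file.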

Another option for saving retrieved files to a local destination is to use the output option flags "-o" or "-O".


Specifying a file name with the curl command option -o


The "-o" specifies that the specified name is to be used for the retrieved file:



john@ubuntu01-pc:~$ curl -o myfile.html http://landoflinux.com/linux_wget_command.html
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 22085  100 22085    0     0   160k      0 --:--:-- --:--:-- --:--:--  160k

To check, you can view the first 5 lines of the retrieved file "myfile.html" by issuing the command "head -n 5 myfile.html":



john@ubuntu01-pc:~$ head -n 5 myfile.html 
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"><head><title>Linux - curl command - www.LandofLinux.com</title>

<meta http-equiv="content-type" content="application/xhtml+xml; charset=UTF-8" />
<meta name="author" content="www.landoflinux.com" />

Using the retrieved file name with the curl option -O


The "-O option specifies that the saved file will use the same name as the file on the remote server.



john@ubuntu01-pc:~$ curl -O http://landoflinux.com/linux_wget_command.html
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 22085  100 22085    0     0   154k      0 --:--:-- --:--:-- --:--:--  155k

We should now have a file called "linux_wget_command.html" in our current directory:



john@ubuntu01-pc:~$ file linux_wget_command.html 
linux_wget_command.html: HTML document, ASCII text

john@ubuntu01-pc:~$ ls -l linux_wget_command.html 
-rw-rw-r-- 1 john john 22085 Apr 23 21:21 linux_wget_command.html

In the above output I have used the "file" command to describe the retrieved file. We can also see the file in the output of the "ls" command.
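To see the contrast with "-o", the same offline trick works with "-O". Because the saved file keeps the remote name (here the made-up name source_page.html), we download into a separate directory to avoid overwriting the source:

```shell
#!/bin/sh
mkdir -p /tmp/dl
# The "remote" file; its name will be reused locally by -O
printf 'saved via -O\n' > /tmp/source_page.html
cd /tmp/dl
# -O writes the file into the current directory under the remote name
curl -s -O file:///tmp/source_page.html
cat /tmp/dl/source_page.html
```

After this runs, /tmp/dl contains source_page.html, named by the URL rather than by us.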


Downloading Multiple Files


As we have seen from the above examples, we can specify a single file to retrieve; however, it is also possible to retrieve multiple files in a similar way. Below is an example of two files being retrieved with the "-O" option specified:



john@ubuntu01-pc:~$ curl -O http://landoflinux.com/linux_books.html -O http://landoflinux.com/linux_wget_command.html
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  6093  100  6093    0     0   6094      0 --:--:-- --:--:-- --:--:--  6093
100 22085  100 22085    0     0   314k      0 --:--:-- --:--:-- --:--:--  314k

The above output shows the two specified files being retrieved successfully, as indicated by the "100" in the first (percentage) column.
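The multiple-download form can also be sketched offline; one curl invocation, one "-O" per URL (the file names below are invented for the example):

```shell
#!/bin/sh
# Two local files acting as the remote resources
printf 'first page\n'  > /tmp/a.html
printf 'second page\n' > /tmp/b.html
mkdir -p /tmp/multi
cd /tmp/multi
# Each URL gets its own -O, and both are fetched in a single invocation
curl -s -O file:///tmp/a.html -O file:///tmp/b.html
cat a.html b.html
```

Both a.html and b.html appear in the current directory after the single command completes.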


Continuing - Resuming a file transfer using -C -


The option "-C can be used to continue/resume a file transfer that has been disrupted. This is a very useful feature if you are downloading a large file and you want to interrupt the transfer or the transfer gets disrupted. The "-C -" tells curl to automatically find out where/how to resume the file transfer. The output/input files are then used to calculate the starting position (resume position).

To demonstrate this feature we can initiate a large download and then interrupt it by pressing "CTRL + C". We should then be able to resume the download by specifying the "-C -" option:



john@ubuntu01-pc:~$ curl -O http://debian-handbook.info/download/stable/debian-handbook.pdf
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  8 28.6M    8 2522k    0     0  2448k      0  0:00:12  0:00:01  0:00:11 2446k

Now we can resume the download by issuing the command "curl -C - -O http://debian-handbook.info/download/stable/debian-handbook.pdf"



john@ubuntu01-pc:~$ curl -C - -O http://debian-handbook.info/download/stable/debian-handbook.pdf
** Resuming transfer from byte position 3710976
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 25.1M  100 25.1M    0     0   834k      0  0:00:30  0:00:30 --:--:--  699k

From the above output we can see that the transfer has resumed and completed successfully.
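The resume mechanics can be sketched offline as well, since curl's FILE protocol also honours "-C -": if the output file already holds a partial copy, curl takes its size as the byte offset and appends only the remainder. (One caveat worth noting: with remote protocols, resumption additionally depends on the server honouring range requests.)

```shell
#!/bin/sh
# The full "remote" file: 10 bytes
printf '0123456789' > /tmp/full.bin
# Simulate an interrupted download: only the first 5 bytes arrived
printf '01234' > /tmp/part.bin
# -C - inspects the partial file's size and fetches only the missing tail
curl -s -C - -o /tmp/part.bin file:///tmp/full.bin
cat /tmp/part.bin
```

After the resumed transfer, /tmp/part.bin contains the complete 10-byte content.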


Downloading files from FTP servers with curl


As we have seen in the above examples, we can download files over HTTP with ease using curl. It is just as easy to use curl to download files from an FTP server:



john@ubuntu01-pc:~/curl$ curl -O ftp://ftp.myftptest.com/distro/x86_64/iso/test/test-server-1.0-x86_64-dvd.iso.txt
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  452k  100  452k    0     0   121k      0  0:00:03  0:00:03 --:--:--  121k
john@ubuntu01-pc:~/curl$ ls -rtl
total 456
-rw-rw-r-- 1 john john 463514 Apr 23 22:00 test-server-1.0-x86_64-dvd.iso.txt

We can also specify authentication credentials by adding the "-u" parameter followed by a valid userid and password:



$ curl -u userid:password -O ftp://myftptest.com/files/myfile.txt 

In the above example, we would have to substitute a real userid/password combination for the transfer to work. You can also list the contents of directories on an FTP server by specifying the path to the directory.



john@ubuntu01-pc:~/curl$ curl ftp://ftp.myftptest.com/distro/x86_64/iso/
drwxrwsr-x    6 ftp      ftp          4096 Apr 08 21:04 test
drwxrwsr-x    6 ftp      ftp          4096 Apr 08 21:04 beta
drwxrwsr-x    5 ftp      ftp          4096 Apr 08 21:04 alpha
drwxrwsr-x    6 ftp      ftp          4096 Apr 08 21:04 rc

In the above example we are listing the directories that can be found under the specified path.


Uploading Files to an FTP Server using the -T option


Curl can also be used to upload files to an FTP server. To accomplish this you will need to use the "-T" option.



john@ubuntu01-pc:~/curl$ curl -u userid:password -T test.txt ftp.myftptest.com/
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    55    0     0  100    55      0     55  0:00:01 --:--:--  0:00:01    55

In the above example we transferred the file "test.txt" to the server "ftp.myftptest.com".


Summary


Although we have shown several examples of curl in use, there are many more features and functions available within this tool. More information can be found on the main homepage for curl:
Curl Homepage


curl --help


For a quick overview of some of the parameters available, you can issue the command "curl --help" or consult the man pages with the command "man curl".



john@ubuntu01-pc:~$ curl --help
Usage: curl [options...] <url>
Options: (H) means HTTP/HTTPS only, (F) means FTP only
     --anyauth       Pick "any" authentication method (H)
 -a, --append        Append to target file when uploading (F/SFTP)
     --basic         Use HTTP Basic Authentication (H)
     --cacert FILE   CA certificate to verify peer against (SSL)
     --capath DIR    CA directory to verify peer against (SSL)
 -E, --cert CERT[:PASSWD] Client certificate file and password (SSL)
     --cert-type TYPE Certificate file type (DER/PEM/ENG) (SSL)
     --ciphers LIST  SSL ciphers to use (SSL)
     --compressed    Request compressed response (using deflate or gzip)
 -K, --config FILE   Specify which config file to read
     --connect-timeout SECONDS  Maximum time allowed for connection
 -C, --continue-at OFFSET  Resumed transfer offset
 -b, --cookie STRING/FILE  String or file to read cookies from (H)
 -c, --cookie-jar FILE  Write cookies to this file after operation (H)
     --create-dirs   Create necessary local directory hierarchy
     --crlf          Convert LF to CRLF in upload
     --crlfile FILE  Get a CRL list in PEM format from the given file
 -d, --data DATA     HTTP POST data (H)
     --data-ascii DATA  HTTP POST ASCII data (H)
     --data-binary DATA  HTTP POST binary data (H)
     --data-urlencode DATA  HTTP POST data url encoded (H)
     --delegation STRING GSS-API delegation permission
     --digest        Use HTTP Digest Authentication (H)
     --disable-eprt  Inhibit using EPRT or LPRT (F)
     --disable-epsv  Inhibit using EPSV (F)
     --dns-servers    DNS server addrs to use: 1.1.1.1;2.2.2.2
     --dns-interface  Interface to use for DNS requests
     --dns-ipv4-addr  IPv4 address to use for DNS requests, dot notation
     --dns-ipv6-addr  IPv6 address to use for DNS requests, dot notation
 -D, --dump-header FILE  Write the headers to this file
     --egd-file FILE  EGD socket path for random data (SSL)
     --engine ENGINE  Crypto engine (SSL). "--engine list" for list
 -f, --fail          Fail silently (no output at all) on HTTP errors (H)
 -F, --form CONTENT  Specify HTTP multipart POST data (H)
     --form-string STRING  Specify HTTP multipart POST data (H)
     --ftp-account DATA  Account data string (F)
     --ftp-alternative-to-user COMMAND  String to replace "USER [name]" (F)
     --ftp-create-dirs  Create the remote dirs if not present (F)
     --ftp-method [MULTICWD/NOCWD/SINGLECWD] Control CWD usage (F)
     --ftp-pasv      Use PASV/EPSV instead of PORT (F)
 -P, --ftp-port ADR  Use PORT with given address instead of PASV (F)
     --ftp-skip-pasv-ip Skip the IP address for PASV (F)
     --ftp-pret      Send PRET before PASV (for drftpd) (F)
     --ftp-ssl-ccc   Send CCC after authenticating (F)
     --ftp-ssl-ccc-mode ACTIVE/PASSIVE  Set CCC mode (F)
     --ftp-ssl-control Require SSL/TLS for ftp login, clear for transfer (F)
 -G, --get           Send the -d data with a HTTP GET (H)
 -g, --globoff       Disable URL sequences and ranges using {} and []
 -H, --header LINE   Custom header to pass to server (H)
 -I, --head          Show document info only
 -h, --help          This help text
     --hostpubmd5 MD5  Hex encoded MD5 string of the host public key. (SSH)
 -0, --http1.0       Use HTTP 1.0 (H)
     --http1.1       Use HTTP 1.1 (H)
     --http2.0       Use HTTP 2.0 (H)
     --ignore-content-length  Ignore the HTTP Content-Length header
 -i, --include       Include protocol headers in the output (H/F)
 -k, --insecure      Allow connections to SSL sites without certs (H)
     --interface INTERFACE  Specify network interface/address to use
 -4, --ipv4          Resolve name to IPv4 address
 -6, --ipv6          Resolve name to IPv6 address
 -j, --junk-session-cookies Ignore session cookies read from file (H)
     --keepalive-time SECONDS  Interval between keepalive probes
     --key KEY       Private key file name (SSL/SSH)
     --key-type TYPE Private key file type (DER/PEM/ENG) (SSL)
     --krb LEVEL     Enable Kerberos with specified security level (F)
     --libcurl FILE  Dump libcurl equivalent code of this command line
     --limit-rate RATE  Limit transfer speed to this rate
 -l, --list-only     List only mode (F/POP3)
     --local-port RANGE  Force use of these local port numbers
 -L, --location      Follow redirects (H)
     --location-trusted like --location and send auth to other hosts (H)
 -M, --manual        Display the full manual
     --mail-from FROM  Mail from this address (SMTP)
     --mail-rcpt TO  Mail to this/these addresses (SMTP)
     --mail-auth AUTH  Originator address of the original email (SMTP)
     --max-filesize BYTES  Maximum file size to download (H/F)
     --max-redirs NUM  Maximum number of redirects allowed (H)
 -m, --max-time SECONDS  Maximum time allowed for the transfer
     --metalink      Process given URLs as metalink XML file
     --negotiate     Use HTTP Negotiate Authentication (H)
 -n, --netrc         Must read .netrc for user name and password
     --netrc-optional Use either .netrc or URL; overrides -n
     --netrc-file FILE  Set up the netrc filename to use
 -N, --no-buffer     Disable buffering of the output stream
     --no-keepalive  Disable keepalive use on the connection
     --no-sessionid  Disable SSL session-ID reusing (SSL)
     --noproxy       List of hosts which do not use proxy
     --ntlm          Use HTTP NTLM authentication (H)
     --oauth2-bearer TOKEN  OAuth 2 Bearer Token (IMAP, POP3, SMTP)
 -o, --output FILE   Write output to <file> instead of stdout
     --pass PASS     Pass phrase for the private key (SSL/SSH)
     --post301       Do not switch to GET after following a 301 redirect (H)
     --post302       Do not switch to GET after following a 302 redirect (H)
     --post303       Do not switch to GET after following a 303 redirect (H)
 -#, --progress-bar  Display transfer progress as a progress bar
     --proto PROTOCOLS  Enable/disable specified protocols
     --proto-redir PROTOCOLS  Enable/disable specified protocols on redirect
 -x, --proxy [PROTOCOL://]HOST[:PORT] Use proxy on given port
     --proxy-anyauth Pick "any" proxy authentication method (H)
     --proxy-basic   Use Basic authentication on the proxy (H)
     --proxy-digest  Use Digest authentication on the proxy (H)
     --proxy-negotiate Use Negotiate authentication on the proxy (H)
     --proxy-ntlm    Use NTLM authentication on the proxy (H)
 -U, --proxy-user USER[:PASSWORD]  Proxy user and password
     --proxy1.0 HOST[:PORT]  Use HTTP/1.0 proxy on given port
 -p, --proxytunnel   Operate through a HTTP proxy tunnel (using CONNECT)
     --pubkey KEY    Public key file name (SSH)
 -Q, --quote CMD     Send command(s) to server before transfer (F/SFTP)
     --random-file FILE  File for reading random data from (SSL)
 -r, --range RANGE   Retrieve only the bytes within a range
     --raw           Do HTTP "raw", without any transfer decoding (H)
 -e, --referer       Referer URL (H)
 -J, --remote-header-name Use the header-provided filename (H)
 -O, --remote-name   Write output to a file named as the remote file
     --remote-name-all Use the remote file name for all URLs
 -R, --remote-time   Set the remote file's time on the local output
 -X, --request COMMAND  Specify request command to use
     --resolve HOST:PORT:ADDRESS  Force resolve of HOST:PORT to ADDRESS
     --retry NUM   Retry request NUM times if transient problems occur
     --retry-delay SECONDS When retrying, wait this many seconds between each
     --retry-max-time SECONDS  Retry only within this period
     --sasl-ir       Enable initial response in SASL authentication
 -S, --show-error    Show error. With -s, make curl show errors when they occur
 -s, --silent        Silent mode. Don't output anything
     --socks4 HOST[:PORT]  SOCKS4 proxy on given host + port
     --socks4a HOST[:PORT]  SOCKS4a proxy on given host + port
     --socks5 HOST[:PORT]  SOCKS5 proxy on given host + port
     --socks5-hostname HOST[:PORT] SOCKS5 proxy, pass host name to proxy
     --socks5-gssapi-service NAME  SOCKS5 proxy service name for gssapi
     --socks5-gssapi-nec  Compatibility with NEC SOCKS5 server
 -Y, --speed-limit RATE  Stop transfers below speed-limit for 'speed-time' secs
 -y, --speed-time SECONDS  Time for trig speed-limit abort. Defaults to 30
     --ssl           Try SSL/TLS (FTP, IMAP, POP3, SMTP)
     --ssl-reqd      Require SSL/TLS (FTP, IMAP, POP3, SMTP)
 -2, --sslv2         Use SSLv2 (SSL)
 -3, --sslv3         Use SSLv3 (SSL)
     --ssl-allow-beast Allow security flaw to improve interop (SSL)
     --stderr FILE   Where to redirect stderr. - means stdout
     --tcp-nodelay   Use the TCP_NODELAY option
 -t, --telnet-option OPT=VAL  Set telnet option
     --tftp-blksize VALUE  Set TFTP BLKSIZE option (must be >512)
 -z, --time-cond TIME  Transfer based on a time condition
 -1, --tlsv1         Use TLSv1 (SSL)
     --trace FILE    Write a debug trace to the given file
     --trace-ascii FILE  Like --trace but without the hex output
     --trace-time    Add time stamps to trace/verbose output
     --tr-encoding   Request compressed transfer encoding (H)
 -T, --upload-file FILE  Transfer FILE to destination
     --url URL       URL to work with
 -B, --use-ascii     Use ASCII/text transfer
 -u, --user USER[:PASSWORD][;OPTIONS]  Server user, password and login options
     --tlsuser USER  TLS username
     --tlspassword STRING TLS password
     --tlsauthtype STRING  TLS authentication type (default SRP)
 -A, --user-agent STRING  User-Agent to send to server (H)
 -v, --verbose       Make the operation more talkative
 -V, --version       Show version number and quit
 -w, --write-out FORMAT  What to output after completion
     --xattr        Store metadata in extended file attributes
 -q                 If used as the first parameter disables .curlrc
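Alongside the help text, "curl --version" is a quick way to confirm which version is installed and which of the protocols listed at the top of this article your particular build actually supports:

```shell
#!/bin/sh
# The first line shows the curl and libcurl versions; the "Protocols:"
# line lists every protocol this build can speak (http, ftp, scp, ...)
curl --version
```

The feature list at the end of the output (SSL, IPv6, libz and so on) tells you which compile-time options were enabled.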