Although I have been working with Solaris for about four years, there are still many things I cannot do using only the command line. One of the things I had never done via the command line is downloading files from the Internet.
The magic command is wget. Just pass the download link as an argument to the command.
I tested this command on a CentOS machine and it worked very well:
]# wget http://www.smartfoxserver.com/products/download.php?d=76
--2010-12-12 13:20:26-- http://www.smartfoxserver.com/products/download.php?d=76
Resolving www.smartfoxserver.com... 62.149.227.100
Connecting to www.smartfoxserver.com|62.149.227.100|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: ../download/SFSPRO_linux_1.6.6.tar.gz [following]
--2010-12-12 13:20:27-- http://www.smartfoxserver.com/download/SFSPRO_linux_1.6.6.tar.gz
Connecting to www.smartfoxserver.com|62.149.227.100|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 136461484 (130M) [application/x-gzip]
Saving to: `SFSPRO_linux_1.6.6.tar.gz'
100%[===================================================================================================================>] 136,461,484 221K/s in 9m 2s
2010-12-12 13:29:29 (246 KB/s) - `SFSPRO_linux_1.6.6.tar.gz' saved [136461484/136461484]
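One detail worth noting: the URL above points to a PHP script, so if the server had not redirected to the real file name, wget would have saved the download as download.php?d=76. The -O option lets you choose the output file name yourself. A minimal sketch of the same download with an explicit name (reusing the URL above):
]# wget -O SFSPRO_linux_1.6.6.tar.gz "http://www.smartfoxserver.com/products/download.php?d=76"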
I did not need anything advanced, but you can do miracles with wget; for example, you can download an entire site. See more details in reference [1].
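As a sketch of what mirroring a whole site looks like (the URL here is just a placeholder, and the exact options you want may vary): --mirror downloads recursively, -p fetches the images and stylesheets each page needs, -k converts the links so the copy can be browsed locally, and -np keeps wget from climbing to the parent directory.
]# wget --mirror -p -k -np http://www.example.com/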
Reference:
[1] http://www.simplehelp.net/2008/12/11/how-to-download-files-from-the-linux-command-line/