
Tech Resources

Friday, May 28, 2010

Downloading website for offline use with wget

Wget is a computer program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from World Wide Web and get, connotative of its primary function. Its best features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more.
So I decided to download a website for offline use with wget. Here we are downloading my website to a folder under /tmp.

Command:
#mkdir /tmp/vidyadhar
#cd /tmp/vidyadhar
#/usr/bin/wget -r -Nc -mk http://vidyadhards.blogspot.com/

Options explained

-r  Turn on recursive retrieval
-N  Turn on time-stamping
-c  Continue getting partially downloaded files
-m  Create a mirror (implies -r, -N, and infinite recursion depth)
-k  Convert the links for offline viewing
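Because -m already implies -r and -N, the command above can be written more compactly. A minimal sketch of the equivalent invocation (the URL and destination folder are the ones used above; the echo line is just so you can review the command before letting it loose on a server):

```shell
#!/bin/sh
# Build the mirror command step by step. -m already implies -r and -N,
# so "wget -m -k -c URL" is equivalent to the "-r -Nc -mk" form above.
site="http://vidyadhards.blogspot.com/"
dest="/tmp/vidyadhar"

cmd="wget -m -k -c $site"

mkdir -p "$dest"
echo "$cmd"            # review the command first
# cd "$dest" && $cmd   # uncomment to actually start the download
```

The actual download line is left commented out here, since mirroring a live site can take a while and generates real traffic.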

Download time depends on the size of the website's contents.

All the content will be fetched into the /tmp/vidyadhar/vidyadhards.blogspot.com directory.
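Once the download finishes, a quick sanity check is to count the fetched HTML pages and confirm the entry point exists. A sketch (wget names the directory after the host; the mock-setup lines only exist so the check can be demonstrated without actually performing the download, and should be dropped in real use):

```shell
#!/bin/sh
# wget leaves the mirror under a directory named after the host.
mirror="/tmp/vidyadhar/vidyadhards.blogspot.com"

# -- mock setup so the check below runs without a real download --
mkdir -p "$mirror"
: > "$mirror/index.html"
# ----------------------------------------------------------------

# Count fetched HTML pages and confirm the site's entry point exists.
pages=$(find "$mirror" -name '*.html' | wc -l)
echo "HTML pages fetched: $pages"
[ -f "$mirror/index.html" ] && echo "entry point OK: $mirror/index.html"
```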
For more options, see "man wget".
P.S. Don't use this for illegal purposes.
