How to Download an Entire Website for Offline Browsing

By DominicD
03-11-2011
Download and Copy an Entire Website with Winhttrack

How would you like to download and copy an entire website? Not just browse it online, but copy all of its contents and save them to your computer, so you have the whole site available even when you are offline. WinHTTrack lets you do exactly that. It's fast, easy to understand, and free.
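Under the hood, a website copier like this walks the site's links and saves each URL to a matching file path on disk. Here is a minimal sketch of that URL-to-path mapping in Python (my own illustration of the idea, not HTTrack's actual code):

```python
from urllib.parse import urlparse
import os

def url_to_local_path(url, base_dir):
    """Map a URL to a local file path, the way offline copiers lay out a mirror."""
    parts = urlparse(url)
    path = parts.path
    if path.endswith("/") or path == "":
        path += "index.html"  # directory URLs become index.html
    # strip the leading "/" so os.path.join treats it as relative
    return os.path.join(base_dir, parts.netloc, path.lstrip("/"))

print(url_to_local_path("http://hamsterhideout.com/forum/", "mirror"))
```

Every page the copier fetches gets written to a path like this, which is why the finished mirror browses just like the live site.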

Download and install Winhttrack from HTTrack Website Copier - Offline Browser

For this demo of WinHTTrack, I will download the static content of “hamsterhideout.com”. My younger sister loves hamsters, so I thought this would be a good site for her to browse while offline.


Input the project name (this will be the saved project file of WinHTTrack)
Input the base path (this is where the files will be saved on your computer)


Input the website address

Click on Set Options to review the download settings
WinHTTrack's default settings are already sufficient for downloading most websites. For demo purposes, we will go through the important options that can be tweaked to adjust what gets downloaded.



Downloading rules

The default maximum transfer rate is a modest 25 KB/s. This is deliberately slow, so that HTTrack does not overload the website you are copying with too many requests at once. Some websites will throttle or block a single visitor that sends too many requests too quickly.
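The throttling idea is simple: after each chunk is fetched, pause long enough that the average rate stays under the cap. A rough sketch of that calculation (my own illustration, not HTTrack's code):

```python
def throttled_delay(bytes_downloaded, max_rate_bytes_per_sec, elapsed_sec):
    """Seconds to sleep so the average transfer rate stays at or below the cap."""
    required_time = bytes_downloaded / max_rate_bytes_per_sec
    return max(0.0, required_time - elapsed_sec)

# e.g. 100 KB fetched in 1 s with a 25 KB/s cap -> pause 3 s before continuing
print(throttled_delay(100 * 1024, 25 * 1024, 1.0))
```

Raising the cap in Set Options shrinks these pauses, at the risk of the site noticing you.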

You can also adjust the mirroring depth. This limits how many link levels below the starting page WinHTTrack will follow and copy.
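Depth limiting can be pictured as a breadth-first walk of the site's link graph that stops following links a fixed number of levels below the start page. A toy sketch (the page names are made up):

```python
from collections import deque

def crawl_to_depth(links, start, max_depth):
    """Breadth-first walk of a link graph, stopping max_depth levels below the start."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        if depth == max_depth:
            continue  # don't follow links out of pages at the depth limit
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append((target, depth + 1))
    return seen

# toy site: home links to two sections, each section links deeper
site = {"home": ["gallery", "care"], "gallery": ["photo1"],
        "care": ["diet"], "diet": ["recipes"]}
print(sorted(crawl_to_depth(site, "home", 2)))
```

With depth 2, "recipes" (three clicks from home) is never fetched, which is exactly how the depth setting keeps a mirror from ballooning.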



Include/exclude the download of certain file types

By default, WinHTTrack downloads all known static files, such as pictures, videos, and sound files. You can adjust these settings if you would like to include specific filenames or exclude other file types.
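HTTrack's scan rules are `+`/`-` wildcard patterns and, as I understand them, a later matching rule overrides an earlier one. A rough Python sketch of that matching idea (the patterns shown are just examples):

```python
from fnmatch import fnmatch

def allowed(url, rules):
    """Apply HTTrack-style +/- wildcard rules in order; the last match wins."""
    keep = True
    for rule in rules:
        sign, pattern = rule[0], rule[1:]
        if fnmatch(url, pattern):
            keep = (sign == "+")
    return keep

# take everything, skip archives and executables, but keep one specific zip
rules = ["+*", "-*.zip", "-*.exe", "+*/downloads/manual.zip"]
print(allowed("http://example.com/pic.jpg", rules))
print(allowed("http://example.com/tool.exe", rules))
```

This is why rule order matters in the Set Options filter box: a broad exclude can still be punched through by a narrower include after it.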



Website Copying in progress!

After you confirm your download options, WinHTTrack goes into action and downloads the entire website to a folder on your hard drive. The download might take a while, depending on the total size of the HTML and media files.



Browsing the offline copy

The screenshot above shows the downloaded website as it is saved on my computer. WinHTTrack mostly copies static content. Dynamic content such as PHP pages, forms, scripts, and ads is not copied, but the code that references it is preserved, so the offline copy can still display that content whenever the browser can fetch it online.
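That link handling can be pictured as a rewrite pass over each saved page: links into the mirrored site become relative local paths, while links to other hosts are left alone so the browser fetches them online. A simplified regex-based illustration (host names are made up, and real copiers do much more careful HTML parsing):

```python
import re

def rewrite_links(html, site_host):
    """Turn absolute links to the mirrored host into relative local paths;
    leave links to other hosts (ads, external scripts) untouched."""
    pattern = re.compile(r'href="https?://%s/([^"]*)"' % re.escape(site_host))
    return pattern.sub(r'href="\1"', html)

page = '<a href="http://example.com/care.html">care</a> <a href="http://ads.net/b.js">ad</a>'
print(rewrite_links(page, "example.com"))
```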



All the downloaded files

This folder contains the WinHTTrack project file and all the downloaded website contents. You can copy this folder to another computer for offline browsing later.



WinHTTrack is a free, easy-to-use, and fast website copier. It is especially useful for copying sites with static content such as picture galleries, text and documents, and file downloads. It is also a good tool for keeping an archived history of your favorite websites.
  #1  
By kaycee on 03-12-2011, 10:54 AM

I was surprised when I saw this article, because I have been looking for something like this. I have tried it once before, I think with this very same app, HTTrack, but I don't think I succeeded, so that is why I got excited when I saw this post.

I wanted to copy a website, www dot w3schools dot com, because I was learning HTML and maybe CSS and Flash basics. But each time I tried to copy that site, it took ages and the operation never finished. Sometimes it showed up to about 500 MB of copied content, but it either would not go above that or would never complete.

I kept wondering whether it was even possible to perform the task with the app, or whether the website was too large, or whether something was preventing it from completing successfully.

Well, as it is now, I may have to try it again since it's coming from the main man DominicD.. lol.. hope you don't mind..

Or is there anything else I could have done to make it work? Maybe some settings I should have configured properly?
  #2  
By DominicD on 03-12-2011, 12:32 PM

Hi kaycee, I just sent you a PM. Check your inbox.
  #3  
By DominicD on 03-13-2011, 12:04 AM

I also found this program useful for generating known URL links that follow a certain pattern.



This is the download link:
URLGen.exe

I've used this URL generator program myself. I used it to download images from a server that has no index file for the images and whose directory does not allow automated listing.
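A pattern-based URL generator like this can be sketched in a few lines of Python (the host and filename pattern below are made up):

```python
def generate_urls(template, start, end, width=3):
    """Expand a numeric pattern into a list of URLs, e.g. img001.jpg ... img010.jpg."""
    return [template.format(str(n).zfill(width)) for n in range(start, end + 1)]

# hypothetical image server with sequentially numbered files
urls = generate_urls("http://example.com/images/img{}.jpg", 1, 5)
for u in urls:
    print(u)
```

You could feed a list like this straight into a downloader, which is handy when the server hides its directory index but the filenames are predictable.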

Hope it helps.