Ripping A Web Page



Legolas
02-08-2004, 09:58 PM
I need to rip a very small web page completely to a disk. How do I go about this?

dgmortal
02-08-2004, 10:05 PM
Offline Explorer (http://www.metaproducts.com/mp/mpProducts_Detail.asp?id=1)

h1
02-08-2004, 11:34 PM
WinHTTrack.

And I'm sure sharedholder will have a couple objections... :D

shn
02-08-2004, 11:49 PM
Originally posted by haxor41789@8 February 2004 - 17:34
WinHTTrack.

And I'm sure sharedholder will have a couple objections... :D
:lol: :lol: :lol:

DWk
02-08-2004, 11:56 PM
Originally posted by haxor41789@8 February 2004 - 16:34
WinHTTrack.

And I'm sure sharedholder will have a couple objections... :D
I use this program for that sort of thing. Pretty nifty and l33t :P B)

Triadcool
02-09-2004, 12:07 AM
Hey, that's not funny

:smilie4:

h1
02-09-2004, 12:19 AM
Damn straight it's not funny. :ninja:

sharedholder
02-09-2004, 12:26 AM
BEFORE YOU RIP A SITE, READ THE DAMN Terms of Use FIRST. Hope this helps. :)

I.am
02-09-2004, 12:45 AM
Originally posted by sharedholder@8 February 2004 - 17:26
BEFORE YOU RIP A SITE, READ THE DAMN Terms of Use FIRST. Hope this helps. :)
Here we go again :lol:

Triadcool
02-09-2004, 12:57 AM
I hope not. :unsure:

DWk
02-09-2004, 01:17 AM
What happened before? :ph34r:

Mercy
02-09-2004, 05:04 AM
B)

h1
02-09-2004, 05:46 AM
Triadcool ripped sharedholder and muscleman's site... three times.

shn
02-09-2004, 08:13 AM
Originally posted by Legolas@8 February 2004 - 15:58
I need to rip a very small web page completely to a disk. How do I go about this?
If you don't want to be blocked by websites that don't allow website copiers, just change the user agent string. There is an option to do that in WinHTTrack; you can also set an option to bypass the robots.txt file.
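The same user-agent trick works outside WinHTTrack too. Here's a minimal sketch in Python of fetching a single page with a browser-style User-Agent and saving it to disk; the URL, the User-Agent string, and the output filename are just placeholder assumptions, not anything from this thread:

# Fetch one page while presenting a browser-like User-Agent,
# so servers that block "website copier" agents still serve it.
import urllib.request

url = "http://example.com/page.html"  # placeholder URL
req = urllib.request.Request(
    url,
    headers={
        # Placeholder browser-style User-Agent string
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; rv:115.0) Gecko/20100101 Firefox/115.0"
    },
)

with urllib.request.urlopen(req) as resp:
    html = resp.read()

# Save the fetched page to disk.
with open("page.html", "wb") as f:
    f.write(html)

This only grabs the one HTML file; a full mirror (images, stylesheets, linked pages) is what tools like WinHTTrack or Offline Explorer handle for you.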

DWk
02-09-2004, 12:36 PM
Originally posted by haxor41789@8 February 2004 - 22:46
Triadcool ripped sharedholder and muscleman's site... three times.
LMAO :lol: :lol: :lol: :lol: :lol: :lol: :lol: