PtokaX forum

Stuff => Offtopic => Topic started by: blackwings on 04 December, 2004, 01:59:49

Title: image downloader
Post by: blackwings on 04 December, 2004, 01:59:49
Does anyone know a good program that can download all the images on a site, including images that are on subpages?

Because it's really time-consuming to click on millions of links in big image galleries; it would go much quicker if you could download all the images and then sort them on your own computer.
Title:
Post by: Herodes on 04 December, 2004, 14:24:49
HTTrack Website Copier (http://www.httrack.com/). Support open source :)
Title:
Post by: blackwings on 04 December, 2004, 15:52:51
Quote: Originally posted by Herodes
HTTrack Website Copier (http://www.httrack.com/). Support open source :)
This program downloads everything, not just images. I hope that I won't be detected and accused of stealing the website.

EDIT: I found the filtering option, but I can't seem to find the correct settings so it will only download pictures :( When I try, either it downloads everything or it doesn't download anything at all :S

EDIT 2: Herodes, since you know the program, could you maybe post a little guide on how to make it download just images, including images on subpages, etc.?
Title:
Post by: plop on 04 December, 2004, 23:12:07
Try Mozilla Firefox with the DownThemAll extension.

plop
Title:
Post by: blackwings on 04 December, 2004, 23:51:03
Quote: Originally posted by plop
Try Mozilla Firefox with the DownThemAll extension.

plop
I tried that, but it only downloads images on the current page you are on. It won't download images on subpages :(
Title:
Post by: Herodes on 05 December, 2004, 10:54:18
There is extensive documentation on the filter feature of HTTrack ...

Read the manual and you'll find a solution ...

btw ... if some webmasters don't want their site to be downloaded, copied, or stolen, they can use a robots.txt file properly ...
Accusing others won't help a bit .. ;)
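
For reference, a scan-rule setup along these lines should do it. This is an untested sketch (example.com is a placeholder); the "-*" rule excludes everything first, then the "+" rules re-include HTML pages, so the crawler can still follow links to subpages, plus the image types:

httrack "http://example.com/gallery/" -O "./mirror" "-*" "+*.html" "+*.htm" "+*.jpg" "+*.jpeg" "+*.gif" "+*.png" -v

The -O option sets the output directory and -v gives verbose output; check the HTTrack manual for the exact rule semantics.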
Title:
Post by: NotRabidWombat on 05 December, 2004, 17:47:15
<3 wget

wget -m -A .gif,.bmp,.jpg http://www.google.com

Downloading lots-o-pr0n? :-) That will mirror the entire site (downloading only the filtered images). If you want to limit the depth of recursive link following:

wget -r -l2 -A .gif,.bmp,.jpg http://www.google.com

That will recursively follow links with a max depth of 2.

-NotRabidWombat
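
A variant worth knowing (a sketch against a placeholder URL): wget's -np (--no-parent) flag stops the crawl from climbing above the starting directory, which helps when the gallery lives in a subfolder:

wget -r -l2 -np -A .gif,.bmp,.jpg,.jpeg,.png http://example.com/gallery/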
Title:
Post by: blackwings on 05 December, 2004, 23:00:52
Quote: Originally posted by NotRabidWombat
<3 wget

wget -m -A .gif,.bmp,.jpg http://www.google.com

Downloading lots-o-pr0n? :-) That will mirror the entire site (downloading only the filtered images). If you want to limit the depth of recursive link following:

wget -r -l2 -A .gif,.bmp,.jpg http://www.google.com

That will recursively follow links with a max depth of 2.

-NotRabidWombat
Is that a command in some program?