Does anyone know a good program that can download all the images on a site, including images that are on subpages?
It's really time consuming to click on millions of links on big image galleries; it would go much quicker if you could download all the images and then sort them on your own computer.
HTTrack Website Copier (http://www.httrack.com/). Support open source :)
Quote: Originally posted by Herodes
HTTrack Website Copier (http://www.httrack.com/). Support open source :)
This program downloads everything, not just images. I hope that I won't be detected and accused of stealing the website.
EDIT: I found the filtering option, but I can't seem to find the correct settings so it will only download pictures :( When I try, either it downloads everything or it doesn't download anything at all :S
EDIT 2: Herodes, since you know the program, could you maybe post a little guide on how to make it download just the images, including images on subpages etc.?
Try Mozilla Firefox with the DownThemAll extension.
plop
Quote: Originally posted by plop
Try Mozilla Firefox with the DownThemAll extension.
plop
Tried that, but it only downloads images on the page you are currently on. It won't download images on subpages :(
There is extensive documentation on HTTrack's filters ...
Read the manual and you'll find a solution ...
btw ... if some webmasters don't want their site to be downloaded or copied or stolen, they can set up a robots.txt file properly ...
accusing others won't help a bit .. ;)
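Off the top of my head, something like this should do it with the command-line version (untested, so treat it as a rough sketch; swap www.example.com for the real site ... the GUI has the same filters under Set options > Scan rules):

httrack "http://www.example.com/" -O "./mirror" "-*" "+*.html" "+*.htm" "+*.gif" "+*.jpg" "+*.png" "+*.bmp"

The "-*" excludes everything first, and the "+*.html" / "+*.htm" rules are needed so it can still crawl the subpages to find the images.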
<3 wget
wget -m -A .gif,.bmp,.jpg http://www.google.com
Downloading lots-o-pr0n? :-) That will mirror the entire site (downloading only the filtered images). If you want to limit the depth of recursively followed URLs:
wget -r -l2 -A .gif,.bmp,.jpg http://www.google.com
That will recursively follow links with a max depth of 2.
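And if you want all the images dumped into a single folder for sorting on your own computer (a quick untested sketch: -nd skips recreating the site's directory structure and -P sets the target folder):

wget -r -l2 -nd -P images -A .gif,.bmp,.jpg http://www.google.com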
-NotRabidWombat
Quote: Originally posted by NotRabidWombat
<3 wget
wget -m -A .gif,.bmp,.jpg http://www.google.com
Downloading lots-o-pr0n? :-) That will mirror the entire site (downloading only the filtered images). If you want to limit the depth of recursively followed URLs:
wget -r -l2 -A .gif,.bmp,.jpg http://www.google.com
That will recursively follow links with a max depth of 2.
-NotRabidWombat
Is that a command in some program?