image downloader
 



Started by blackwings, 04 December, 2004, 01:59:49


blackwings

Does anyone know a good program that can download all the images on a site, including images that are on subpages?

Because it's really time consuming to click on millions of links in big image galleries; it would go much quicker if you could download all the images and then sort them on your own computer.


Herodes

Httrack Website Copier - open source :)

blackwings

#2
Quote from: Herodes
Httrack Website Copier - open source :)
This program downloads everything, not just images. I hope I won't be detected and accused of stealing the website.

EDIT: I found the filtering option, but I can't seem to get the settings right so it will only download pictures :( When I try, it either downloads everything or doesn't download anything at all :S

EDIT 2: Herodes, since you know the program, could you maybe post a little guide on how to make it download just images, including images on subpages etc?


plop

try mozilla firefox with the downTHEMall extension.

plop
http://www.plop.nl lua scripts/howto's.
http://www.thegoldenangel.net
http://www.vikingshub.com
http://www.lua.org

>>----> he who fights hatred with hatred, drives the spreading of hatred <----<<

blackwings

#4
Quote from: plop
try mozilla firefox with the downTHEMall extension.

plop
Tried that, but it only downloads images on the current page that you are on. It won't download images on subpages :(


Herodes

There is extensive documentation on the filter thingie of Httrack ...

Read the manual and you'll find a solution ...

btw ... if a webmaster doesn't want their site to be downloaded or copied or stolen, they can use a robots.txt file properly ...
accusing others won't help a bit .. ;)
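For anyone reading later: HTTrack's filters are +/- wildcard rules ("scan rules"). A dry-run sketch of the idea — the URL, output directory, and exact rule set are placeholders that would need tuning per site, and HTML pages generally have to stay allowed so the crawler can follow links to the images:

```shell
# Hypothetical HTTrack invocation that keeps HTML (needed for crawling)
# plus common image types, and excludes everything else ("-*").
# Printed rather than executed, since running it needs httrack installed
# and network access; remove the echo to actually run it.
URL="http://example.com/gallery/"      # placeholder site
OUT="./gallery-mirror"                 # placeholder output directory
CMD="httrack $URL -O $OUT +*.html +*.jpg +*.jpeg +*.gif +*.png -*"
echo "$CMD"
```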

NotRabidWombat

<3 wget

wget -m -A .gif,.bmp,.jpg http://www.google.com

Downloading lots-o-pr0n? :-) That will mirror the entire site (only downloading the filtered images). If you want to limit the depth of recursively followed URLs:

wget -r -l2 -A .gif,.bmp,.jpg http://www.google.com

That will recursively follow links with a max depth of 2.
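A couple of related GNU wget switches can tidy the result; a dry-run variation on the commands above (the URL is a placeholder; -np never ascends above the start directory, -nd collapses the directory tree so all images land in one folder):

```shell
# Dry-run sketch: print the command instead of running it
# (remove the echo to execute; running needs wget and network access).
# -r      recursive retrieval       -l2  max depth 2
# -np     never ascend to parent    -nd  no directory tree on disk
# -A ...  comma-separated accept list of file suffixes
WGET_CMD="wget -r -l2 -np -nd -A .gif,.bmp,.jpg,.jpeg,.png http://example.com/gallery/"
echo "$WGET_CMD"
```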

-NotRabidWombat


I like childish behavior. Maybe this post will be deleted next.

blackwings

Quote from: NotRabidWombat
<3 wget

wget -m -A .gif,.bmp,.jpg http://www.google.com

Downloading lots-o-pr0n? :-) That will mirror the entire site (only downloading the filtered images). If you want to limit the depth of recursively followed URLs:

wget -r -l2 -A .gif,.bmp,.jpg http://www.google.com

That will recursively follow links with a max depth of 2.

-NotRabidWombat
Is that a command in some program?

