| What exactly are you trying to do??
It's easy to copy many files in one swoop. From DOS, you can just
'cd' to the source directory and type 'copy *.zip C:\target'. From
Windows, highlight the files you want and copy them over (drag and
drop, or Ctrl+C / Ctrl+V).
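The Unix shell equivalent is a one-line glob. Here's a minimal sketch; the /tmp/src and /tmp/dest paths and the file names are made up for illustration:

```shell
# Sketch: bulk copy with a wildcard, the Unix analogue of DOS 'copy *.zip'.
mkdir -p /tmp/src /tmp/dest
cd /tmp/src
touch a.zip b.zip c.zip readme.txt   # sample files for the demo

cp *.zip /tmp/dest/                  # the glob matches only the .zip files
ls /tmp/dest                         # a.zip b.zip c.zip
```

The shell expands the `*.zip` glob before `cp` runs, so `cp` just sees a list of matching filenames followed by the destination directory.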
If what you're trying to do is copy several files off the web, WS_FTP
(I _think_ that's the name, it's off the top of my head) also lets you
highlight multiple files and transfer them in one go. Of course you
need an FTP connection to the other site. Even plain command-line ftp
has the 'mget' command.
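An 'mget' session looks roughly like this (an illustrative transcript only; the host name and directory are made up):

```
$ ftp ftp.example.com
ftp> cd /pub/zips
ftp> prompt          turns off the per-file "get this one? y/n" questions
ftp> binary          zip files must be transferred in binary mode
ftp> mget *.zip
ftp> bye
```

Without 'prompt', ftp will stop and ask you to confirm every single file, which defeats the point of a bulk fetch.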
If you want to do this with links on a web page, just remember the web
(WWW) was created to be interactive, one operation at a time: mouse
clicks on hypertext pages and graphics arranged by the web page's
designer. Downloading many things at once doesn't seem to be one of
WWW's objectives.
On the other hand, I know I have downloaded a number of files at once
from web links, just by clicking on another link while the first one
is still copying. But I don't recall whether each file was an 'http:'
or 'ftp:' link, in case it matters.
|
| If you're looking for something to trawl round and grab all the zip
files from a site (or any other kind of file) then you need something
like Internet Marauder (find it on www.shareware.com). It follows http
links to files and grabs them all back for you, so if you have access
to a web page which links to ten zip files, point the marauder at the
page and it'll grab all ten for you.
Alen.
|