baslinux - [BL] FTP was: (pure-ftpd pppd passwd)

  • From: Lee Forrest <lforrestster AT gmail.com>
  • To: baslinux AT lists.ibiblio.org
  • Subject: [BL] FTP was: (pure-ftpd pppd passwd)
  • Date: Sun, 31 Dec 2006 18:01:18 +0000

On Sun, Dec 31, 2006 at 04:39:40PM +0000, Lee Forrest wrote:
> On Sun, Dec 31, 2006 at 11:59:20PM +0000, sindi keesan wrote:
> > > I can't imagine doing FTP without a full client available,
> > > so that you can connect to the server and browse the files
> > > there. But it wouldn't have to be anything as large and
> > > complex as ncftp.
> >
>
> > I made scripts to move files from my computer to and from the
> > BL directory at my sdf shell account using ftpput and ftpget.
> > I already know what is where.
>
> But that isn't true of all the FTP servers out there, is it? If
> you want to limit yourself to knowing what's on one FTP server,
> go ahead.

I've been looking over the FTP protocol (RFC 959), after being
nudged by a dim memory: you can retrieve a list of the files in a
directory on an FTP server with the LIST (or NLST) command.

That means you can essentially run ls on it, with any FTP client
that will send that command.
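
Just to sketch it (this is from my reading of the RFC, not a live
session), the conversation on the control connection would go
something like this, with the listing itself coming back over a
separate data connection in more or less 'ls -l' format:

    client: PASV
    server: 227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)
    client: LIST /pub
    server: 150 Opening ASCII mode data connection for file list
            [listing arrives on the data connection]
    server: 226 Transfer complete

The data port is p1*256 + p2, which is the part any script has to
parse out for itself.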

According to the wget manpage on my debian box, all you have to
do is run 'wget ftp://foo.foo.foo/' and it will do just that,
handling the required anonymous login for you:

If you specify a directory (the above would get you to the root
directory, I'd think) Wget will retrieve the directory listing,
parse it and convert it to HTML.

/quote
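
So with a full GNU wget, something like this (foo.foo.foo standing
in for a real host, /pub/ for a real directory) ought to hand you a
browsable listing:

    wget ftp://foo.foo.foo/pub/

It logs in as anonymous, grabs the listing for /pub, and writes an
HTML version of it locally (exactly what the output file gets named
seems to depend on the wget version).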

But the wget on BL seems to be very limited compared
to that one.

I hope I'm wrong, because that would make the BL (busybox) wget
truly functional for FTP retrievals.

This is what I got when I entered 'wget -v ftp://...'

wget: illegal option -- v
BusyBox v1.01 (2006.01.02-09:39+0000) multi-call binary

Usage: wget [-c|--continue] [-q|--quiet] [-O|--output-document
file] [--header 'header: value'] [-Y|--proxy on/off] [-P DIR] url

wget retrieves files via HTTP or FTP

Options:
-c continue retrieval of aborted transfers
-q quiet mode - do not print
-P Set directory prefix to DIR
-O save to filename ('-' for stdout)
-Y use proxy ('on' or 'off')

/quote
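
So no listings at all, as far as I can tell. It does look like it
will still pull down a single file if you already know the full
path, e.g. (path made up):

    wget ftp://foo.foo.foo/pub/somefile.tgz

- which is fine for grabbing a known package, but no help for
browsing.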

The manpage for wget on my debian OS is 1266 lines long.

If you can't get those listings with wget, I'll look into
writing a netcat script for it.

But first, I need to look at the ftpget manpage, because
that's probably a pretty small utility.
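
For what it's worth, here is the sort of thing I have in mind for
the netcat version - completely untested, anonymous login assumed,
HOST and DIR made up, and the sleeps are pure guesswork:

    #!/bin/sh
    # Rough sketch: grab an FTP directory listing with nc.
    HOST=foo.foo.foo
    DIR=/pub

    mkfifo ctl.in
    # Keep the control connection open for the whole session.
    nc $HOST 21 <ctl.in >ctl.out &
    exec 3>ctl.in

    # RFC 959 wants CRLF line endings, hence printf rather than echo.
    printf 'USER anonymous\r\n' >&3 ; sleep 1
    printf 'PASS guest@\r\n'    >&3 ; sleep 1
    printf 'PASV\r\n'           >&3 ; sleep 1

    # The 227 reply gives the data port as (h1,h2,h3,h4,p1,p2);
    # the port number is p1*256 + p2.
    PORT=`sed -n 's/.*(\([0-9,]*\)).*/\1/p' ctl.out | tail -1 | \
          awk -F, '{print $5 * 256 + $6}'`

    # Open the data connection first, then ask for the listing on
    # the control connection; the listing arrives on the data side.
    nc $HOST $PORT > listing &
    printf 'LIST %s\r\n' "$DIR" >&3 ; sleep 3
    printf 'QUIT\r\n' >&3

    wait
    rm -f ctl.in
    cat listing

No promises that the busybox nc and sed are up to all of that, but
the logic should be close.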

[delete]

Lee
--
BasicLinux: Small is Beautiful
http://www.basiclinux.com.ru
