
freetds - Re: [freetds] freebcp file limit

  • From: "James K. Lowden" <jklowden AT freetds.org>
  • To: FreeTDS Development Group <freetds AT lists.ibiblio.org>
  • Subject: Re: [freetds] freebcp file limit
  • Date: Mon, 5 Dec 2005 20:37:54 -0500

Li, Maggie (IT) wrote:
> I need to freebcp a large file (about 15 million lines, 4 GB in size),
> but after about 5 million lines, it gives a segmentation fault.

Wow. Thanks for the report.

It would take quite a bit of effort for me to reproduce that. My testing
facilities are across the Internet. Even if they have that much space, it
would take hours just to transmit the data.

I do have Sybase installed around here somewhere, on a machine that's
normally booted into something else. If you could send me your DDL, I'll
work on it. It would be very helpful, too, if you could send a backtrace.
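If gdb(1) is handy, a minimal sketch of capturing one, using the same
sort of placeholder table, file, and login names as in the script below:

$ gdb --args freebcp db.own.table in file.dat -S server -c -U user -P pass
(gdb) run
... wait for the segmentation fault ...
(gdb) bt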


Also, what does "ident $(command -v freebcp)" tell you?

> Is there a size limit for freebcp?

No, there's not supposed to be. The file is read in chunks and
transmitted to the server piecemeal. You apparently have tickled a bug,
some bit of memory that isn't being freed.

The good news is there may be a workaround. As long as your data file is
in plain text (not a "native" bcp file), you can split it up. You could
use sed(1) to do the job, or run something like this, which takes
advantage of the first/last row options of freebcp:

#! /bin/sh
# Emit one freebcp command per million-row chunk, using freebcp's
# -F/-L (first/last row) options.  Substitute your own login for
# the "user" and "pass" placeholders.
BCP='freebcp db.own.table in file.dat -S server -c -U user -P pass'
for FIRST in $(jot 16 0)    # jot 16 0 prints 0..15 (BSD); "seq 0 15" on GNU
do
    LAST=$(( FIRST + 1 ))
    printf "${BCP} -b 100000 -F %8d -L %8d\n" \
        $(( FIRST * 1000000 + 1 )) $(( LAST * 1000000 ))
done

Pipe the output to sh, as in

$ ./foo | sh -e -x

and go to lunch.
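If you'd rather cut the file up front, instead of having each freebcp
invocation skip over all the earlier rows, split(1) will do it. A sketch,
again with placeholder names:

#! /bin/sh
# split(1) writes million-line pieces as chunk.aa, chunk.ab, ...;
# load each piece in turn.
split -l 1000000 file.dat chunk.
for F in chunk.*
do
    freebcp db.own.table in "$F" -S server -c -b 100000 -U user -P pass
done

Cleaning up the chunk.* files afterward is left to you.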

HTH.

--jkl

