
freetds - Memory leak when doing many INSERTs

  • From: "Eric Deutsch" <edeutsch AT systemsbiology.org>
  • To: "TDS Development Group" <freetds AT franklin.oit.unc.edu>
  • Subject: Memory leak when doing many INSERTs
  • Date: Mon, 10 Sep 2001 18:32:59 -0700



Hi all, I've noticed what appears to be a memory leak while doing lots
of INSERTs. I'm using Perl 5.6.0 + DBD::Sybase 0.91 + FreeTDS (both 0.51a
and the latest CVS) under Red Hat 7.0, connecting to SQL Server 2000.

If I do:

$querystring = "INSERT INTO leaktest (intfield,charfield,realfield) VALUES ( 20,'chicken',3.1415 )";
while (1) {
    $dbh->do($querystring);
    print ".";
}

after a suitable setup, and watch my process with top, the memory usage grows
steadily at a rate of maybe 5 MB/min. There's no difference if I use
prepare() and execute().
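
For completeness, here's the whole test with the setup included, using the
prepare()/execute() variant; the server name, database, and credentials below
are placeholders for my real ones:

#!/usr/bin/perl
use strict;
use DBI;

# Placeholder connection details -- substitute your own.
my $dbh = DBI->connect('dbi:Sybase:server=MYSERVER;database=testdb',
                       'user', 'password', { RaiseError => 1 });

# prepare()/execute() leaks at the same rate as do() for me.
my $sth = $dbh->prepare("INSERT INTO leaktest (intfield,charfield,realfield) VALUES ( 20,'chicken',3.1415 )");
while (1) {
    $sth->execute;
    print ".";
}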

I have some real code (as opposed to the above example) that parses and
loads some huge datasets from files into my DB, and after one
particularly large set my process was killed because it had hogged all the
client machine's memory (probably after running for an hour). This is a bit
of a snag in what I'm trying to do. I could start breaking up the work, but
this is awkward for various reasons.
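
In case it helps anyone reproduce this, a crude way to quantify the growth
(instead of eyeballing top) is to log the resident size from
/proc/self/status every so many rows; this assumes Linux, and reuses $dbh
and $querystring from the snippet above:

# Report VmRSS in kB from /proc/self/status (Linux-specific).
sub vm_rss_kb {
    open my $fh, '<', '/proc/self/status' or return -1;
    while (<$fh>) {
        return $1 if /^VmRSS:\s+(\d+)\s+kB/;
    }
    return -1;
}

my $rows = 0;
while (1) {
    $dbh->do($querystring);
    print STDERR "rows=$rows rss=", vm_rss_kb(), " kB\n"
        if ++$rows % 10000 == 0;
}

If the ~5 MB/min figure from top is real, the logged RSS should climb
steadily with the row count.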

Has anyone encountered/studied/noticed this before? Any advice? I suppose
I could try taking Perl out of the loop to see if it's a FreeTDS issue or a
Perl DBI + DBD::Sybase issue.
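
An intermediate step I could try first (keeping Perl but cutting DBI and
DBD::Sybase out) is sybperl's Sybase::DBlib, which drives FreeTDS's
DB-Library layer directly -- assuming sybperl is built against FreeTDS.
Roughly (placeholder credentials again):

use Sybase::DBlib;

my $dbh = new Sybase::DBlib 'user', 'password', 'MYSERVER';

my $sql = "INSERT INTO leaktest (intfield,charfield,realfield) VALUES ( 20,'chicken',3.1415 )";
while (1) {
    $dbh->dbcmd($sql);
    $dbh->dbsqlexec;
    while ($dbh->dbresults != NO_MORE_RESULTS) { }    # drain results
    print ".";
}

If that still leaks, the problem is below the DBI layer; if not, it points
at DBI/DBD::Sybase.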

I'm assuming it's not unreasonable to expect a Perl script to be able to
run INSERTs until SQL Server explodes without the client running into memory
issues. Is that true?

thanks,
Eric







