  • From: "Patrick Dunnigan" <patrick.dunnigan AT centivia.com>
  • To: "FreeTDS Development Group" <freetds AT lists.ibiblio.org>
  • Subject: Re: [freetds] ctlib bulk text copy segmentation fault
  • Date: Tue, 8 Mar 2005 12:18:35 -0500

Bill,
I found that text size was not set in freetds.conf. I went ahead and set it
to a large number (a meg for starters, 1048576) and I was able to get a
small string into the TEXT field.

The odd thing is that I had considered this to some degree previously and
had set the textsize immediately after the connection was made, but it had
no effect (I still received the error):

ret = ct_command(cmd, CS_LANG_CMD, "set textsize 64512", CS_NULLTERM,
                 CS_UNUSED);
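
One thing I'm now wondering about (a guess on my part, not verified against
my code): ct_command() only queues the language command, so it would have
had no effect unless I also sent it with ct_send() and drained the results
with ct_results(). Roughly:

CS_RETCODE ret;
CS_INT restype;

ret = ct_command(cmd, CS_LANG_CMD, "set textsize 1048576",
                 CS_NULLTERM, CS_UNUSED);
if (ret == CS_SUCCEED)
        ret = ct_send(cmd);     /* the command is only queued until sent */
/* drain every result set so the command actually completes */
while (ret == CS_SUCCEED && ct_results(cmd, &restype) == CS_SUCCEED)
        continue;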


Anyway, I was able to get the inserts working, but when I try to bulk load
more than 131060 bytes into the text field, it croaks again in the same
spot. That is just short of 128K (131072 bytes, less 4 bytes for the integer
column and perhaps 8 bytes of other overhead; just a guess).

I am looking to load whole files into the SQL Server text field, so a 128K
limit will be a roadblock.

The Sybase Open Client lib has a function called blk_textxfer that delivers
the data in chunks, which I think would solve this issue. It seems that this
is a stub in the FreeTDS libraries. I also see references to textptrs in
DBD::Sybase, etc., but I don't see the equivalent in ctlib.

Have you been successful in bulk loading large amounts of data using these
methods? Or is there a way I can send the data in chunks, say 64K at a time,
along the lines of the sketch below?
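
Something like this is what I have in mind, pieced together from the Sybase
Open Client docs (untested, and the binding convention for chunked columns
is my assumption; FreeTDS stubs blk_textxfer out anyway):

#include <stdio.h>
#include <bkpublic.h>

/* Sketch: send one row whose TEXT column (column 2) is streamed from
 * an open file in 64K pieces instead of one huge buffer. */
static CS_RETCODE
send_text_row(CS_BLKDESC * blkdesc, CS_DATAFMT * textfmt, FILE * fp,
              CS_INT total)
{
        char chunk[65536];
        size_t n;
        CS_INT outlen;

        /* A NULL buffer marks the column for chunked transfer in
         * Open Client; the total length goes in the datalen pointer. */
        if (blk_bind(blkdesc, 2, textfmt, NULL, &total, NULL) != CS_SUCCEED)
                return CS_FAIL;

        /* transfers the non-text columns; the TEXT data is now pending */
        if (blk_rowxfer(blkdesc) != CS_SUCCEED)
                return CS_FAIL;

        /* feed the text value to the server chunk by chunk */
        while ((n = fread(chunk, 1, sizeof(chunk), fp)) > 0)
                if (blk_textxfer(blkdesc, (CS_BYTE *) chunk, (CS_INT) n,
                                 &outlen) != CS_SUCCEED)
                        return CS_FAIL;

        return CS_SUCCEED;
}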

I don't think I should post the TDS dump to the list (it's 319808 bytes),
but I can email it to you if you'd like.

Thanks,
Patrick

----- Original Message -----
From: "Thompson, Bill D (London)" <bill_d_thompson AT ml.com>
To: "FreeTDS Development Group" <freetds AT lists.ibiblio.org>
Sent: Tuesday, March 08, 2005 3:57 AM
Subject: RE: [freetds] ctlib bulk text copy segmentation fault


Hi Patrick,

I had time for a quick look last night and reckoned this was probably
the problem.

There are a number of places in FreeTDS where handling the potentially
enormous sizes of text/blob data poses a problem. We have gotten round this
(to some extent) by using the "set textsize <n>" command, which sets a limit
on the size of text fields. We issue the command when each connection is
established, provided a textsize value is specified in the freetds.conf
file.

We pick up a default textsize of 64K from the [global] section of the
freetds.conf file:

[global]
...
# If you get out of memory errors, it may mean that your client
# is trying to allocate a huge buffer for a TEXT field.
# (Microsoft servers sometimes pretend TEXT columns are
# 4 GB wide!) If you have this problem, try setting
# 'text size' to a more reasonable limit
text size = 64512


Have you got this in your freetds.conf file?

If so, then we need to look further into the problem. A TDSDUMP log file
would help.
If not, then I *think* this would solve your problem.

Let us know,

Bill



-----Original Message-----
From: freetds-bounces AT lists.ibiblio.org
[mailto:freetds-bounces AT lists.ibiblio.org] On Behalf Of Patrick Dunnigan
Sent: 07 March 2005 20:54
To: FreeTDS Development Group
Subject: Re: [freetds] ctlib bulk text copy segmentation fault


I checked out RC10 and had the same problem.

I've gone further into the debugging and found what I think is the root
cause. During blk_init, there is a piece of code that initializes a memory
buffer for each column in the data, out to its maximum length. I was able
to get the bulk text insert to work when I changed the size of the
allocation. Once I changed the number to something under 1 GB, I was able
to successfully insert the data (my data, by the way, was only 100 bytes or
less while testing this out).

blk.c blk_init ----->

if (is_numeric_type(curcol->column_type)) {
        curcol->bcp_column_data =
                tds_alloc_bcp_column_data(sizeof(TDS_NUMERIC));
        ((TDS_NUMERIC *) curcol->bcp_column_data->data)->precision =
                curcol->column_prec;
        ((TDS_NUMERIC *) curcol->bcp_column_data->data)->scale =
                curcol->column_scale;
} else {
        /* original line, commented out:
         * curcol->bcp_column_data =
         *      tds_alloc_bcp_column_data(curcol->on_server.column_size);
         */
        /* max number I could allocate: */
        curcol->bcp_column_data = tds_alloc_bcp_column_data(1048576 * 1007);
}


Now, I have 1 GB of RAM in this machine (I believe; see meminfo below), so
that may be why it's barfing above the 1048576*1007 limit. But many
machines will not have 2 GB (the maximum size of a SQL Server TEXT column)
free to allocate.

As a workaround, since I know the size of the data I'm going to transfer,
I'm going to try allocating the size of the data going in, as opposed to
the size of the database column, by adding a parameter onto
blk_init(....., int datasize), roughly as sketched below. Please let me
know your opinion of this temporary measure.
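
For illustration, the idea is just this (a hypothetical helper, not the
actual FreeTDS code; the name alloc_bcp_buffer is mine):

/* Allocate the per-column bulk-copy buffer from the size of the data
 * actually being loaded, not the declared column size (2 GB for TEXT). */
static void
alloc_bcp_buffer(TDSCOLUMN * curcol, TDS_INT datasize)
{
        TDS_INT need = curcol->on_server.column_size;

        if (datasize > 0 && datasize < need)
                need = datasize;  /* e.g. the size of the file being loaded */

        curcol->bcp_column_data = tds_alloc_bcp_column_data(need);
}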

Thanks,
Patrick


----- Original Message -----
From: "Thompson, Bill D (London)" <bill_d_thompson AT ml.com>
To: "FreeTDS Development Group" <freetds AT lists.ibiblio.org>
Sent: Monday, March 07, 2005 1:04 PM
Subject: RE: [freetds] ctlib bulk text copy segmentation fault


Hi Patrick,

Thanks for your well-researched and well-documented bug report.

I think it IS a bug. BCP of text data should indeed be possible,
although it is a (sometimes problematic) special case.

I'll look into the causes and advise (probably tomorrow).

Bill

-----Original Message-----
From: freetds-bounces AT lists.ibiblio.org
[mailto:freetds-bounces AT lists.ibiblio.org] On Behalf Of Patrick Dunnigan
Sent: 07 March 2005 17:17
To: freetds AT lists.ibiblio.org
Subject: [freetds] ctlib bulk text copy segmentation fault


Hi All,
I am attempting to do a bulk copy IN to a SQL Server 2000 table that has
two columns, an INT and a TEXT, using FreeTDS 0.63RC9 on RH 7.3 with TDS 8.

The binds seem to be returning properly. When I call the blk_rowxfer
function, it produces a segmentation fault. I have traced it down to the
cs_convert call performed inside _blk_get_col_data (as below) in blk.c, for
the second column in the database table - the TEXT field. Further, the
segmentation fault occurs because of the destination buffer parameter,
bindcol->bcp_column_data->data:

result = cs_convert(ctx, &srcfmt, (CS_VOID *) src, &destfmt,
                    (CS_VOID *) bindcol->bcp_column_data->data, &destlen);

Now, when I do nothing but change the field type in SQL Server to, say,
VARCHAR, the bulk copy works without modifying my bind parameters (which
are set to the text variety). If I change the database field type back to
TEXT, it fails.

I know it's the bindcol->bcp_column_data->data parameter (the pointer to
the destination data buffer) because I created a stub function containing
only a printf and a return statement, and removed each parameter in turn
until the call succeeded and returned.

With this parameter in the cs_convert call, it segfaults simply by calling
the function. The blk_bind for this column appears to be functioning
properly and returns CS_SUCCEED.

Therefore my questions for the community are as follows:
- Are bulk text inserts even possible with FreeTDS and MSSQL?
- I've been through the examples and the web; examples of TEXT bulk copies
are sparse. Any guidance?
- What other info can I provide? I can provide the code if needed; I
basically took the bulk-copy-in example from src/ctlib/unittests and
modified it for the text operation (the binds look roughly like the sketch
below).
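
For what it's worth, the binds look roughly like this (a sketch from
memory, adapted from the unittest; the variable names are mine):

CS_DATAFMT intfmt, textfmt;
CS_INT id = 1;
CS_INT id_len = sizeof(id);
CS_SMALLINT id_ind = 0;
char text_buf[100] = "some text data";
CS_INT text_len;
CS_SMALLINT text_ind = 0;

text_len = (CS_INT) strlen(text_buf);

/* column 1: the INT */
memset(&intfmt, 0, sizeof(intfmt));
intfmt.datatype = CS_INT_TYPE;
intfmt.count = 1;
blk_bind(blkdesc, 1, &intfmt, &id, &id_len, &id_ind);

/* column 2: the TEXT */
memset(&textfmt, 0, sizeof(textfmt));
textfmt.datatype = CS_TEXT_TYPE;
textfmt.maxlength = sizeof(text_buf);
textfmt.count = 1;
blk_bind(blkdesc, 2, &textfmt, text_buf, &text_len, &text_ind);

blk_rowxfer(blkdesc);   /* <-- this is where the segmentation fault occurs */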

Thanks
Patrick