
xom-interest - Re: [XOM-interest] OutOfMemoryError

xom-interest AT lists.ibiblio.org

Subject: XOM API for Processing XML with Java

  • From: Wolfgang Hoschek <wolfgang.hoschek AT mac.com>
  • To: Elliotte Harold <elharo AT metalab.unc.edu>
  • Cc: xom-interest AT lists.ibiblio.org, m AT lhaza.com
  • Subject: Re: [XOM-interest] OutOfMemoryError
  • Date: Fri, 20 Jan 2006 11:31:53 -0800

On Jan 20, 2006, at 11:05 AM, Elliotte Harold wrote:

Wolfgang Hoschek wrote:
Looks like Gsell Martin is using a streaming NodeFactory, so he should be able to process arbitrarily large documents rather than being bounded by a constant memory limit. Unless his custom NodeFactory code has a hidden memory leak, of course.
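For reference, a minimal sketch of the streaming-NodeFactory pattern being discussed: each completed record is handled and then dropped from the tree, so memory stays bounded by the size of one record. The <record> element name and the process() step are illustrative assumptions, not taken from the thread.

    import nu.xom.Builder;
    import nu.xom.Element;
    import nu.xom.NodeFactory;
    import nu.xom.Nodes;

    // Handles each completed <record> element as it finishes parsing and
    // then discards it, so the in-memory tree never holds more than one.
    public class RecordStreamer extends NodeFactory {

        private Element root;

        public Element makeRootElement(String name, String namespace) {
            root = super.makeRootElement(name, namespace);
            return root;
        }

        public Nodes finishMakingElement(Element element) {
            if (element == root) {
                return new Nodes(element); // the document must keep its root
            }
            if ("record".equals(element.getLocalName())) {
                process(element);          // application-specific handling
                return new Nodes();        // drop the record from the tree
            }
            return new Nodes(element);     // keep everything else, e.g. the
                                           // children of the current record
        }

        private void process(Element record) {
            System.out.println(record.toXML());
        }

        public static void main(String[] args) throws Exception {
            Builder builder = new Builder(new RecordStreamer());
            builder.build(args[0]); // file name or URL of the large document
        }
    }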

I still have not received enough information to be sure, but I think his problem is that he has some Base64-encoded content in his document that is so large that merely creating a single String from it overflows memory. SAX can break such large chunks into multiple calls to characters(). XOM does not do this: it assumes you have at least enough memory to hold a single node, and that may not be true here.
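To illustrate the SAX side of that point: a parser is free to report one large text node through many characters() calls, so a handler can forward each chunk to a sink without ever building the whole String. A rough sketch, assuming a hypothetical <payload> element holding the encoded data:

    import java.io.File;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.Writer;

    import javax.xml.parsers.SAXParserFactory;

    import org.xml.sax.Attributes;
    import org.xml.sax.SAXException;
    import org.xml.sax.helpers.DefaultHandler;

    // Forwards the character data of <payload> chunk by chunk; memory use
    // does not depend on how large the text node is.
    public class ChunkForwardingHandler extends DefaultHandler {

        private final Writer sink;
        private boolean inPayload;

        public ChunkForwardingHandler(Writer sink) {
            this.sink = sink;
        }

        public void startElement(String uri, String localName,
                                 String qName, Attributes atts) {
            if ("payload".equals(qName)) inPayload = true;
        }

        public void endElement(String uri, String localName, String qName) {
            if ("payload".equals(qName)) inPayload = false;
        }

        public void characters(char[] ch, int start, int length)
                throws SAXException {
            if (!inPayload) return;
            try {
                sink.write(ch, start, length); // handle this chunk and move on
            } catch (IOException e) {
                throw new SAXException(e);
            }
        }

        public static void main(String[] args) throws Exception {
            Writer out = new FileWriter(args[1]);
            SAXParserFactory.newInstance().newSAXParser()
                .parse(new File(args[0]), new ChunkForwardingHandler(out));
            out.close();
        }
    }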

OK. That makes sense.

But trying to stream arbitrarily large binary data as base64 inside XML just isn't a good idea to begin with, both conceptually and in terms of efficiency and scalability. Better to use an explicit binary packaging format such as MIME multipart. That way one can get hold of an InputStream for the binary data and read it however one likes, without scalability limits.
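For what it's worth, the InputStream point looks roughly like the sketch below, independent of which packaging library produces the stream. The helper class is only illustrative, and the Base64 wrapper uses java.util.Base64, which only appeared later, in Java 8.

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.util.Base64;

    // Copies a binary part in fixed-size chunks: constant memory use,
    // regardless of how large the payload is.
    public final class BinaryPartCopier {

        public static long copy(InputStream part, OutputStream sink)
                throws IOException {
            byte[] buffer = new byte[8192];
            long total = 0;
            int read;
            while ((read = part.read(buffer)) != -1) {
                sink.write(buffer, 0, read);
                total += read;
            }
            return total;
        }

        // If the payload really does arrive Base64-encoded, it can still be
        // decoded incrementally by wrapping the stream (Java 8+), rather
        // than materializing one giant String.
        public static InputStream decodeBase64(InputStream encoded) {
            return Base64.getMimeDecoder().wrap(encoded);
        }
    }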

Bottom line: I don't think XOM can or should do anything about this problem.

Wolfgang.



