
xom-interest - Re: [XOM-interest] XOM and Large Files

xom-interest AT lists.ibiblio.org

  • From: Mike Colbert <mbcolbert AT yahoo.com>
  • To: Michael Kay <mike AT saxonica.com>, 'Elliotte Harold' <elharo AT metalab.unc.edu>
  • Cc: 'John Cowan' <cowan AT ccil.org>, xom-interest AT lists.ibiblio.org
  • Subject: Re: [XOM-interest] XOM and Large Files
  • Date: Wed, 26 Jul 2006 12:51:41 -0700 (PDT)

--- Michael Kay <mike AT saxonica.com> wrote:

> As a matter of interest, how long is it taking to process 1Gb, and how much
> memory does it use?
>

Haha, I have a feeling you already know the answer.

I was using 1Gb as an arbitrary limit to help define what I meant by a large
file. I had only tested previously with a couple of files in the 100M-200M
range (I should have stated that, I apologize). In reality, I can split them
up however I want. 1Gb seems like a decent goal, since I have many gigabytes
of data to process.

However ...

530.2M seems to be about the largest I can squeeze through XOM. The next size
up that I tried was 707.6M, and that ran out of heap space.

The time to process 530M was about 2 mins:

---> Wed Jul 26 15:10:01 EDT 2006
---> Wed Jul 26 15:11:55 EDT 2006

The memory utilization was as follows (maxed out):

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
27558 mzc29 17 0 3811m 3.6g 9860 R 99.9 15.1 3:19.20 java

Wolfgang, looks like the streaming NodeFactory may be the way to go! I'll have
to try that, but I can go ahead and process what I need with the current setup,
using smaller files.
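
In case it's useful to anyone following along, here's a rough sketch of the
streaming NodeFactory approach Wolfgang is describing. The class name and the
"record" element name are just placeholders for whatever the per-record
element in my files actually is; the pattern is to override
finishMakingElement and return an empty Nodes for each record so the tree
never grows beyond one record at a time:

import java.io.File;
import nu.xom.Builder;
import nu.xom.Element;
import nu.xom.NodeFactory;
import nu.xom.Nodes;

public class RecordStreamer extends NodeFactory {

    private final Nodes empty = new Nodes();

    public Nodes finishMakingElement(Element element) {
        // "record" is a placeholder for the repeating element in the file
        if ("record".equals(element.getLocalName())) {
            process(element);   // do the real per-record work here
            return empty;       // drop the record so it isn't kept in memory
        }
        // keep everything else (in particular the root element) as usual
        return super.finishMakingElement(element);
    }

    private void process(Element record) {
        // placeholder for the actual processing
        System.out.println(record.getLocalName());
    }

    public static void main(String[] args) throws Exception {
        Builder builder = new Builder(new RecordStreamer());
        builder.build(new File(args[0]));
    }
}

The root element has to be left intact, which is why only the per-record
elements are discarded; memory use then stays roughly flat regardless of
file size.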

For a 177.3M file, it only takes 2.2G/3.8G of memory and roughly 30 seconds.

Thanks to all!
Mike





