Out of process memory error when writing a large XML file using Oracle XML DB


We've run into a problem writing a large XML file using Oracle 9i's XML DB facility. The query generates about 3 million lines, and Oracle responds with the following error message:

ERROR at line 1:
ORA-04030: out of process memory when trying to allocate 4012 bytes
(qmxtgCreateBuf,kghsseg: kolaslCreateCtx)
ORA-06512: at "....", line 1154
ORA-06512: at line 1
ERROR: 
ORA-00600: internal error code, arguments: [%s], [%s], [%s], [%s], [%s], [%s],
[%s], [%s]

In the alert log:

Errors in file d:/db/admin/acc1/udump/acc1_ora_8112.trc:
ORA-00600: internal error code, arguments: [729], [104], [space leak], [], [], [], [], []

We've tried increasing the process memory, but that had hardly any effect.

Is there a way to have Oracle use less memory for the XML (a 'lazy manifestation'/write-through switch, or something like that)?


You need to manage the memory consumed by the process by using bulk operations with limited (paged) fetches. For example:

DECLARE
  CURSOR c_customer IS
    SELECT CUSTOMER.id, CUSTOMER.name FROM CUSTOMER;
  TYPE customer_array_type IS TABLE OF c_customer%ROWTYPE INDEX BY BINARY_INTEGER;
  customer_array customer_array_type;
  fetch_size     NUMBER := 5000; -- scale this value to manage memory
BEGIN
  -- Open (create) the XML file
  OPEN c_customer;
  LOOP
    FETCH c_customer BULK COLLECT
      INTO customer_array LIMIT fetch_size;
    FOR i IN 1 .. customer_array.COUNT LOOP
      NULL; -- add XML nodes for the current chunk
    END LOOP;
    EXIT WHEN c_customer%NOTFOUND;
  END LOOP;
  CLOSE c_customer;
  -- Close (flush) the XML file
END;
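The placeholder comments above can be filled in with `UTL_FILE`, which streams each chunk to disk so that only one chunk of rows is ever held in memory. This is only a sketch, not the original answer's implementation: the `XML_DIR` directory object, the file name, the `CUSTOMER` table, and the element names are all assumptions.

```sql
DECLARE
  CURSOR c_customer IS
    SELECT CUSTOMER.id, CUSTOMER.name FROM CUSTOMER;
  TYPE customer_array_type IS TABLE OF c_customer%ROWTYPE INDEX BY BINARY_INTEGER;
  customer_array customer_array_type;
  fetch_size     NUMBER := 5000; -- scale this value to manage memory
  out_file       UTL_FILE.FILE_TYPE;
BEGIN
  -- 'XML_DIR' must be an existing directory object (assumption)
  out_file := UTL_FILE.FOPEN('XML_DIR', 'customers.xml', 'w');
  UTL_FILE.PUT_LINE(out_file, '<?xml version="1.0"?>');
  UTL_FILE.PUT_LINE(out_file, '<customers>');
  OPEN c_customer;
  LOOP
    FETCH c_customer BULK COLLECT
      INTO customer_array LIMIT fetch_size;
    FOR i IN 1 .. customer_array.COUNT LOOP
      -- write one element per row; only this chunk is in memory
      UTL_FILE.PUT_LINE(out_file,
        '<customer id="' || customer_array(i).id || '">'
        || customer_array(i).name || '</customer>');
    END LOOP;
    EXIT WHEN c_customer%NOTFOUND;
  END LOOP;
  CLOSE c_customer;
  UTL_FILE.PUT_LINE(out_file, '</customers>');
  UTL_FILE.FCLOSE(out_file);
END;
/
```

Because each chunk is written out and released before the next fetch, the process memory footprint stays roughly proportional to `fetch_size` rather than to the full 3 million lines.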

In cases where the file size would exceed the OS file size limit, you will have to split the output across multiple files.
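Splitting the output can be done by closing the current file and opening a new one every N rows. A hedged sketch using `UTL_FILE` (the `XML_DIR` directory object, the file naming scheme, and the `CUSTOMER` table are assumptions for illustration):

```sql
DECLARE
  max_rows_per_file CONSTANT NUMBER := 1000000; -- rows per output file
  rows_in_file      NUMBER := 0;
  file_no           NUMBER := 1;
  out_file          UTL_FILE.FILE_TYPE;

  PROCEDURE open_next_file IS
  BEGIN
    -- 'XML_DIR' is an assumed directory object
    out_file := UTL_FILE.FOPEN('XML_DIR',
                               'customers_' || file_no || '.xml', 'w');
    UTL_FILE.PUT_LINE(out_file, '<customers>');
  END;

  PROCEDURE close_current_file IS
  BEGIN
    UTL_FILE.PUT_LINE(out_file, '</customers>');
    UTL_FILE.FCLOSE(out_file);
  END;
BEGIN
  open_next_file;
  FOR rec IN (SELECT CUSTOMER.id, CUSTOMER.name FROM CUSTOMER) LOOP
    UTL_FILE.PUT_LINE(out_file,
      '<customer id="' || rec.id || '">' || rec.name || '</customer>');
    rows_in_file := rows_in_file + 1;
    -- roll over to a new file once the row limit is reached
    IF rows_in_file >= max_rows_per_file THEN
      close_current_file;
      file_no      := file_no + 1;
      rows_in_file := 0;
      open_next_file;
    END IF;
  END LOOP;
  close_current_file;
END;
/
```

Writing the root element's open and close tags in the helper procedures keeps every output file a well-formed XML document on its own.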