Out of memory issues in Javelin using Oracle Data Pump Import

Article ID: 15388

Updated On:

Products

CA Test Data Manager (Data Finder / Grid Tools)

Issue/Introduction



I am trying to run an Oracle Data Pump import of several million rows, and Javelin crashes every time I run the step. Is there a way I can run the data pump without running into an Out Of Memory condition?

Environment

Javelin - all versions

Resolution

Javelin is a 32-bit Windows application, so only a limited amount of memory is available for processing your tasks. Javelin was also never designed to handle mass data transfers, because of the memory limitation of a 32-bit application. If you need to use Javelin to data pump a large amount of data, consider batching the data into smaller chunks and running it over multiple iterations (we call this 'slice and dice').

To do this, you will need to modify your SQL statement. One way to 'slice and dice' this problem over 10 iterations is to add an extra condition to the WHERE clause of your SQL statement, as shown below. This also lets you distribute the rows evenly over as many iterations as you want.

 

SELECT ... FROM ... WHERE ... AND MOD(numerickey, total_iteration) = current_iteration

 

If I were using 10 for my total iterations and the numerickey was 23, then MOD(23, 10) = 3, and this record will only be data pumped on the iteration of the Javelin flow where current_iteration is 3.
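
As an illustration only, one iteration's query might look like the following, assuming a hypothetical CUSTOMER table with a numeric CUSTOMER_ID key (substitute your own table, key column, and any existing WHERE conditions):

SELECT *
  FROM CUSTOMER
 WHERE MOD(CUSTOMER_ID, 10) = 3   -- total_iteration = 10, current_iteration = 3

Running the Javelin flow 10 times, stepping the final value from 0 through 9, covers every row exactly once, because MOD of a non-negative numeric key by 10 always returns a value from 0 to 9.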

 

Although this is just an example, you will need to build and maintain your own 'slice and dice' query if your data pump query cannot run within the memory available to Javelin.