The customer tried an 8 GB file that they had generated with their own Java program. It broke at around the 4 GB mark, but when they split the file up and ran the pieces through the converter, it worked.
I tried a 21 GB file and it broke after 3-4 GB, reporting that a record did not have the leading 32-byte record identifier.
The files are large, so generating traces at a high trace level is going to be difficult.
Does the File Conversion Utility have any limits on file size or number of records?
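Until we know whether there is a hard limit, splitting the input below the point where it breaks is the only workaround confirmed so far. A rough Java sketch of such a splitter, assuming one record per line in the .txt input (if the records are binary or span lines, the split would have to respect the real record boundaries instead):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Splits a large input file into pieces that stay below the point where the
// converter currently fails, so each piece can be converted on its own.
public class FileSplitter {

    // Stay comfortably under the ~4 GB mark where the conversion breaks.
    private static final long MAX_CHUNK_BYTES = 3L * 1024 * 1024 * 1024;

    public static void main(String[] args) throws IOException {
        Path input = Path.of(args[0]);
        int chunk = 0;
        long written = 0;
        BufferedWriter out = newChunkWriter(input, chunk);
        try (BufferedReader in = Files.newBufferedReader(input, StandardCharsets.UTF_8)) {
            String line;
            while ((line = in.readLine()) != null) {
                // Roll over to a new chunk before crossing the size threshold
                // (line.length() + 1 approximates bytes for single-byte data).
                if (written + line.length() + 1 > MAX_CHUNK_BYTES) {
                    out.close();
                    out = newChunkWriter(input, ++chunk);
                    written = 0;
                }
                out.write(line);
                out.newLine();
                written += line.length() + 1;
            }
        } finally {
            out.close();
        }
    }

    private static BufferedWriter newChunkWriter(Path input, int chunk) throws IOException {
        Path part = Path.of(input + ".part" + chunk);
        return Files.newBufferedWriter(part, StandardCharsets.UTF_8);
    }
}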
Test Data Manager - Mainframe
TDM
This packed decimal field's DG AFL has the following .txt (Windows) and z/OS definitions:
3,ACCOUNT_ID,12,33,2,String,,,,,,,
z/OS ,ACCOUNT_ID,6,1,2,"PackedSigned:11,0",,,,,,,
Original ACCOUNT_ID sample values:
000000306235
000000306649
000000306653
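As background, the 6-byte length in the z/OS definition follows from packed decimal storage: two digits per byte with the last nibble holding the sign, so 11 digits plus a sign nibble fill exactly 6 bytes. A small illustrative Java sketch (the 0xC/0xD sign nibbles are the common convention; this is not the utility's own code):

public class PackedDecimalDemo {

    // Packs a digit string plus a trailing sign nibble, two nibbles per byte.
    // 0xC = positive, 0xD = negative (the usual COMP-3 convention).
    static byte[] packSigned(String digits, boolean negative) {
        int nibbles = digits.length() + 1;            // digits plus one sign nibble
        byte[] out = new byte[(nibbles + 1) / 2];
        int nibbleIndex = nibbles - 2;                // position of the last digit
        for (int i = digits.length() - 1; i >= 0; i--, nibbleIndex--) {
            int d = digits.charAt(i) - '0';
            if (nibbleIndex % 2 == 0) {
                out[nibbleIndex / 2] |= (byte) (d << 4);   // high nibble
            } else {
                out[nibbleIndex / 2] |= (byte) d;          // low nibble
            }
        }
        out[out.length - 1] |= (byte) (negative ? 0x0D : 0x0C);
        return out;
    }

    public static void main(String[] args) {
        // First sample value trimmed to 11 digits (one leading zero dropped
        // to fit PackedSigned:11,0).
        byte[] packed = packSigned("00000306235", false);
        StringBuilder hex = new StringBuilder();
        for (byte b : packed) {
            hex.append(String.format("%02x", b));
        }
        // Prints: 6 bytes: 00000306235c
        System.out.println(packed.length + " bytes: " + hex);
    }
}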
The formulas did not pad the values with leading zeros, which caused the data to be out of alignment:
@add(6234,~ROWNUM~)@
@add(6647,~ROWNUM~)@
@add(6650,~ROWNUM~)@
The correct solution is:
@leftpad(@add(6234,~ROWNUM~)@,0,@subtract(16,@length(@add(6234,~ROWNUM~)@ )@)@)@
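For anyone more comfortable outside the formula syntax, here is the same idea in plain Java, just to show what the padding achieves. The width of 12 is an assumption based on the sample values above (the formula pads relative to 16), so use whatever width the real layout requires:

public class LeftPadDemo {

    // Generates the same value as @add(base,~ROWNUM~)@ and zero-pads it to a
    // fixed width so the positional layout stays aligned.
    static String accountId(long base, long rownum, int width) {
        long value = base + rownum;
        return String.format("%0" + width + "d", value);  // e.g. 6235 -> 000000006235 for width 12
    }

    public static void main(String[] args) {
        for (long rownum = 1; rownum <= 3; rownum++) {
            // Width 12 matches the sample values above; change it to match the layout.
            System.out.println(accountId(6234, rownum, 12));
        }
        // Prints 000000006235, 000000006236, 000000006237
    }
}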