There is a job created for SQL Server that runs a database query and writes the data to a CSV file. This is done as follows:
!Read the report and echo it onto the PostProcess report
:SET &HND# = PREP_PROCESS_REPORT(,,"REP")
:SET &RET# = GET_PROCESS_LINE(&HND#)
:PRINT &RET#
!Write the report to a TXT file
:SET &RET# = WRITE_PROCESS(&HND#,"C:\TEMP\file.csv",WIN.HOST01, LOGIN.WIN.USER)
:PRINT "WRITE RC=&RET#"
Is there any limitation on:
the amount of output the SQL job can register
the number of rows that can be read with PREP_PROCESS_REPORT
the number of rows that can be written with WRITE_PROCESS
Release : 12.3
Component : AUTOMATION ENGINE
There should be no limitations in PREP_PROCESS_REPORT() or WRITE_PROCESS() themselves.
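For reference, a minimal sketch of the full read-and-write pattern is below. It assumes the agent WIN.HOST01 and login object LOGIN.WIN.USER from the question, loops over every line of the report with PROCESS...ENDPROCESS rather than reading a single line, and opens a separate data sequence for the file write so the write is not affected by the loop having run through the first sequence:

```
!Read the whole report line by line and echo it onto the PostProcess report
:SET &HND# = PREP_PROCESS_REPORT(,,"REP")
:PROCESS &HND#
:  SET &LINE# = GET_PROCESS_LINE(&HND#)
:  PRINT &LINE#
:ENDPROCESS
:CLOSE_PROCESS &HND#
!Write the report to a CSV file on the agent, using a fresh data sequence
:SET &HND2# = PREP_PROCESS_REPORT(,,"REP")
:SET &RET# = WRITE_PROCESS(&HND2#,"C:\TEMP\file.csv",WIN.HOST01,LOGIN.WIN.USER)
:PRINT "WRITE RC=&RET#"
:CLOSE_PROCESS &HND2#
```

This is a sketch, not a drop-in replacement; adjust the report type, path, agent, and login object to your environment.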
The only concern would be the limits defined on the agent for the number of lines transferred to the database. By default, the block size for reports is 8,000 characters (set by the UC_HOSTCHAR_* key "REPORT_BLKSIZE") and the maximum report size is 120 blocks (set by the UC_HOSTCHAR_* key "MAX_REPORT_SIZE"). So a report larger than 8,000 x 120 = 960,000 characters will not be stored fully in the database, and that stored report is exactly what PREP_PROCESS_REPORT() reads. Anyone using these functions should verify in testing that the report is fully transferred to the database and increase MAX_REPORT_SIZE if needed.
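The character limit above is just the product of the two settings. A quick sketch of the arithmetic (the default values come from the UC_HOSTCHAR_* keys named above; the helper function is hypothetical, for illustration only):

```python
# Default UC_HOSTCHAR_* values described above
REPORT_BLKSIZE = 8000   # characters per report block
MAX_REPORT_SIZE = 120   # maximum number of blocks stored

# Largest report that is fully stored in the database
max_chars = REPORT_BLKSIZE * MAX_REPORT_SIZE
print(max_chars)  # 960000

# Hypothetical check: would a report of a given size be truncated?
def is_truncated(report_length_chars: int,
                 blksize: int = REPORT_BLKSIZE,
                 max_blocks: int = MAX_REPORT_SIZE) -> bool:
    return report_length_chars > blksize * max_blocks

print(is_truncated(1_000_000))  # True: larger than 960,000 characters
print(is_truncated(500_000))   # False: fits within the default limit
```

If your reports approach this size, either raise MAX_REPORT_SIZE or split the query output.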
As always, it is recommended to run this in a test environment before running it in production. If any issues arise, please open a case with Broadcom Support.