Arithmetic overflow error converting IDENTITY to data type int.
Article ID: 103714
CA Process Automation Base
Symptoms of this issue are CA Process Automation failing to start and the following error appearing in the log files: Arithmetic overflow error converting IDENTITY to data type int:
2018-06-28 14:24:09,139 ERROR [org.hibernate.util.JDBCExceptionReporter] [ NodeManager_10] Arithmetic overflow error converting IDENTITY to data type int.
2018-06-28 14:24:09,139 WARN [com.optinuity.c2o.c2oserver.DBRecoveryStateStore] [ NodeManager_10] Failed to write service state to database: d4d92543-75ff-49b0-93ef-a4f7521edf3b; ID: 0 try count: 1 (will retry)
org.hibernate.exception.DataException: could not insert: [com.optinuity.c2o.bean.C2ORecoverySvcState]
More detail on this error is available in this Stack Overflow thread: https://stackoverflow.com/questions/2295731/maximum-for-autoincremental-int-primary-key-in-sqlserver
This shows the limit of the INT data type is 2147483647. Looking in the C2ORecoveryState table (the table the log file shows as having the issue), the last ID value (an INT column) is 2146367269, meaning we have likely exhausted that counter and are attempting to create values above 2147483647, which throws the error and causes issues with flow execution.
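To make the arithmetic concrete, the sketch below (plain Python, using the last ID value quoted above) shows how close the IDENTITY counter was to the signed 32-bit ceiling of SQL Server's INT type:

```python
# SQL Server's INT is a signed 32-bit integer, so an IDENTITY column
# of type INT overflows once its next value passes this ceiling.
INT_MAX = 2**31 - 1          # 2147483647

# Last ID value observed in the C2ORecoveryState table (from this article).
last_id = 2146367269

# Inserts remaining before the counter passes the INT ceiling.
remaining = INT_MAX - last_id
print(remaining)             # 1116378
```

With roughly 1.1 million IDs of headroom left, a busy Orchestrator writing recovery-state rows can plausibly reach the ceiling between the time the value was checked and the time the error began appearing.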
This is something we have not run into before with Process Automation and is ultimately due to your usage patterns exceeding a limitation on a field we need to write data to.
The log excerpt above, from the PAM c2o.log file, shows the problem as encountered.
Release:
Component: ITPAM
We first attempted to fix this by modifying the data type in the C2ODBDefinitions.xml file, but because we did not fully understand that file, the changes caused Process Automation to fail to start.
After discussion, we decided simply to drop the C2ORecoveryState database table, as it contained only 27 records. After dropping the table and starting Process Automation, the table was recreated automatically and the Orchestrator began writing ID values in the thousands.
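The reason dropping the table works can be illustrated with a toy model of an IDENTITY(1,1) column (this is an illustration only, not SQL Server internals): the counter lives with the table, so recreating the table restarts the counter at its seed.

```python
INT_MAX = 2**31 - 1  # SQL Server INT ceiling: 2147483647

class IdentityColumn:
    """Toy model of a SQL Server IDENTITY(seed, increment) INT column.

    Illustration only: shows why dropping and recreating the table
    resets the counter back to the seed value.
    """

    def __init__(self, seed=1, increment=1):
        self.seed = seed
        self.increment = increment
        self.next_value = seed

    def insert(self):
        # Mimic the failure mode from the log: once the next value
        # exceeds the INT ceiling, the insert fails.
        if self.next_value > INT_MAX:
            raise OverflowError(
                "Arithmetic overflow error converting IDENTITY to data type int.")
        value = self.next_value
        self.next_value += self.increment
        return value

    def drop_and_recreate(self):
        # Dropping the table discards its IDENTITY counter; the
        # recreated table starts again from the seed.
        self.next_value = self.seed
```

In this model, a column whose counter has passed INT_MAX raises on every insert, while a fresh (recreated) column hands out low ID values again, matching the "ID values in the thousands" observed after the fix.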
This should resolve the issue until it can be corrected in the next major release.