Load Data Warehouse job fails with error ORA-06502: PL/SQL: numeric or value error: Bulk bind: Error in define on DWH_INV_ASSIGN_SUM_FACTS_LOAD or DWH_INV_TASK_SUM_FACTS_LOAD
Article ID: 101868
Clarity PPM for ITG, Clarity PPM Federal, Clarity PPM SaaS - Application, Clarity PPM On Premise
On running the Load Data Warehouse job, Full or Incremental, the following error is thrown every time:

ClarityDB - isOracle? - An error occurred executing this job entry :
Couldn't execute SQL: BEGIN DWH_INV_ASSIGN_SUM_FACTS_LOAD ( P_DBLINK => 'CLARITYDBLINK', P_LAST_LOAD_DATE => to_date('1910/01/01 00:00:00', 'yyyy/MM/dd HH24:mi:ss'), P_CURRENT_DIM_LOAD_DATE => to_date('2018/06/04 13:31:43', 'yyyy/mm/dd HH24:MI:SS'), P_ARRAY_SIZE => 50000 ); END;
[CA Clarity][Oracle JDBC Driver][Oracle]ORA-20100: ENCOUNTERED EXCEPTION WHILE INSERTING INTO DWH_INV_ASSIGN_SUMMARY_FACTS. SQLERRM : ORA-06502: PL/SQL: numeric or value error: Bulk bind: Error in define
ORA-06512: at "CLARITY154DWH.DWH_INV_ASSIGN_SUM_FACTS_LOAD", line 66
ORA-06512: at line 2
A user entered an invalid value (over 1,000,000,000,000,000) for ETC, Pending ETC, Pending Actuals, or another numeric field on the assignment. We do not recommend entering impossible values, as this is not how the product is meant to be used. In some cases a user enters billions of hours for ETC, which could never be worked in a single lifespan.
Release:
Component: ODDWH
DE41741: Entering a very high value on an assignment in PPM is allowed, which then causes the Load Data Warehouse job to fail with ORA-06502: PL/SQL: numeric or value error: Bulk bind: Error in define. This is fixed in CA PPM 15.5. The fix ensures that any invalid entry exceeding the limit is set to 0 in the DWH, so the job no longer fails if a user enters an invalid value.
Query the PRASSIGNMENT table and look for any values that fall outside the limits (for example, ETC, Pending ETC, or actuals values over 1,000,000,000,000,000).
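A minimal sketch of such a query is shown below. The column names used here (PRESTSUM for ETC, PRACTSUM for actuals) and the key columns are assumptions based on common Clarity schemas; verify them against your own database before running it.

```sql
-- Sketch: find assignment rows with implausibly large values.
-- Column names are assumptions; adjust to match your PRASSIGNMENT schema.
SELECT PRID, PRTASKID, PRRESOURCEID, PRESTSUM, PRACTSUM
FROM   PRASSIGNMENT
WHERE  PRESTSUM > 1000000000000000
   OR  PRACTSUM > 1000000000000000;
```

The returned rows identify the assignments whose values need to be corrected.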
Once you find them, correct the value on each affected assignment from the CA PPM UI, then run the Load Data Warehouse job again.