SUMMARY: Using the lookup SCH_BROWSE_RESOURCE_BY_CODE on a custom attribute may cause the Incremental Load Data Warehouse job to fail after a new resource is created and the incremental job runs. This appears to affect Postgres databases only.
STEPS TO REPRODUCE:
1. On a Postgres database, run a Full Load of the Data Warehouse and confirm it completes successfully.
2. Create a new attribute on the Investment Object
- Make it a lookup type attribute, using the lookup ID 'SCH_BROWSE_RESOURCE_BY_CODE' as the lookup
3. Save and return
4. Create a new resource. Only the basic information on the New Labor Resource page (from the Home side) is required.
5. Run an Incremental Load of the Data Warehouse
Expected Results: The job completes successfully
Actual Results: On Postgres databases the job fails on the DWH_RES_RESOURCE table. Note that DWH_RES_RESOURCE does populate, but the new resource is not added to DWH_RES_RESOURCE_LN (the sketch below can be used to confirm the gap).
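To confirm the symptom, a query along these lines can list resources that reached DWH_RES_RESOURCE but have no matching row in DWH_RES_RESOURCE_LN. The table names come from this article; the RESOURCE_KEY join column is an assumption and should be verified against your DWH schema:

SELECT r.*
FROM DWH_RES_RESOURCE r
LEFT JOIN DWH_RES_RESOURCE_LN ln ON ln.RESOURCE_KEY = r.RESOURCE_KEY  -- join column is assumed
WHERE ln.RESOURCE_KEY IS NULL;  -- resources with no language rows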
Log error: ERROR 2022-03-30 20:38:11,501 [init of Execute SQL For Lookups (pg).0 (Thread-205702)] dwh.event Execute SQL For Lookups (pg) - An error occurred, processing will be stopped:
Couldn't execute SQL: CALL DWH_DIM_REFS_MAPPING_SP
(
p_dwh_table => 'DWH_RES_RESOURCE',
p_last_load_date => to_timestamp('2022/03/30 19:12:47', 'yyyy/MM/dd HH24:mi:ss')::TIMESTAMP WITHOUT TIME ZONE
);
ERROR: Encountered exception while updating reference columns for DWH_RES_RESOURCE. SQLERRM: operator does not exist: character varying = numeric
Where: PL/pgSQL function dwh_dim_refs_mapping_sp(character varying,timestamp without time zone) line 77 at RAISE
ERROR 2022-03-30 20:38:11,603 [init of Execute SQL For Lookups (pg).0 (Thread-205702)] dwh.event Execute SQL For Lookups (pg) - Error initializing step [Execute SQL For Lookups (pg)]
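The SQLERRM points to a Postgres type mismatch: Postgres does not implicitly cast between character varying and numeric in an equality comparison, so comparing a varchar reference column against a numeric key raises exactly this error. A minimal sketch that reproduces the same error class outside of Clarity (table and values are purely illustrative, not the Clarity DWH schema):

-- Illustrative only; not part of the Clarity DWH schema
CREATE TABLE lookup_demo (resource_code VARCHAR(30));
INSERT INTO lookup_demo (resource_code) VALUES ('5001234');

-- Fails with: operator does not exist: character varying = numeric
SELECT * FROM lookup_demo WHERE resource_code = 5001234.0;

-- Runs once both sides share a type, e.g. by casting the column
SELECT * FROM lookup_demo WHERE CAST(resource_code AS NUMERIC) = 5001234.0;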
Reported as DE64636
Workaround: Run a Full Load of the Data Warehouse.
Release : SAAS
Component : Clarity Data Warehouse
Resolved in 16.0.3