PostgreSQL is logging duplicate key errors in the pg_log file.
Here is a snippet of the error message:
ERROR: duplicate key value violates unique constraint "at_evidences_pkey"
DETAIL: Key (story_id, vertex_id, type, occ_index)=(4069, 117447, ErrorEventStatement, 3) already exists.
STATEMENT: insert into at_evidences(story_id, vertex_id, start_time, end_time, fork, type, occ_index, latest, statements) values($1, $2, $3, $4, $5, $6, $7, $8, $9)
ERROR: duplicate key value violates unique constraint "at_evidences_pkey"
DETAIL: Key (story_id, vertex_id, type, occ_index)=(4063, 116896, ErrorEventStatement, 4) already exists.
STATEMENT: insert into at_evidences(story_id, vertex_id, start_time, end_time, fork, type, occ_index, latest, statements) values($1, $2, $3, $4, $5, $6, $7, $8, $9)
ERROR: duplicate key value violates unique constraint "at_evidences_pkey"
DETAIL: Key (story_id, vertex_id, type, occ_index)=(4031, 116114, ErrorEventStatement, 13) already exists.
STATEMENT: insert into at_evidences(story_id, vertex_id, start_time, end_time, fork, type, occ_index, latest, statements) values($1, $2, $3, $4, $5, $6, $7, $8, $9)
LOG: could not receive data from client: An existing connection was forcibly closed by the remote host.
Release : 10.7 SP3
Component : APM Agents
The error message appears in the Postgres logs only. This behaviour results from the fix delivered in 10.7 SP3: the exception is now caught on the application side, which is why the error no longer appears in the APM logs.
APM saves the failed evidences and handles them later, so this error in the Postgres log can be ignored; it is already handled at the application level. The sketch below illustrates the general pattern.
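
For context, the following is a minimal, illustrative Java/JDBC sketch of this kind of application-level handling. It is not the actual APM Agents code: the EvidenceWriter class, EvidenceRow value object, insertEvidence method, retryQueue, and the assumed column types are all hypothetical. It only shows how an application can catch PostgreSQL's unique_violation (SQLSTATE 23505) on this insert and defer the failed row for later handling, which is why the ERROR still shows up in pg_log but never surfaces in the application's own logs.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Illustrative only: names and column types are assumptions, not the APM implementation.
public class EvidenceWriter {

    // PostgreSQL SQLSTATE code for unique_violation
    private static final String UNIQUE_VIOLATION = "23505";

    private static final String INSERT_SQL =
        "insert into at_evidences(story_id, vertex_id, start_time, end_time, "
      + "fork, type, occ_index, latest, statements) "
      + "values(?, ?, ?, ?, ?, ?, ?, ?, ?)";

    // Failed rows are queued here and reconciled later by a background task (not shown).
    private final Queue<EvidenceRow> retryQueue = new ConcurrentLinkedQueue<>();

    public void insertEvidence(Connection conn, EvidenceRow row) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            ps.setLong(1, row.storyId);
            ps.setLong(2, row.vertexId);
            ps.setTimestamp(3, row.startTime);
            ps.setTimestamp(4, row.endTime);
            ps.setInt(5, row.fork);
            ps.setString(6, row.type);
            ps.setInt(7, row.occIndex);
            ps.setBoolean(8, row.latest);
            ps.setString(9, row.statements);
            ps.executeUpdate();
        } catch (SQLException e) {
            if (UNIQUE_VIOLATION.equals(e.getSQLState())) {
                // Duplicate key on (story_id, vertex_id, type, occ_index):
                // PostgreSQL still writes the ERROR to its own log, but the
                // application swallows it here and defers the row instead of
                // propagating a failure to the caller.
                retryQueue.add(row);
            } else {
                throw e; // any other database error is a real failure
            }
        }
    }

    // Hypothetical value object mirroring the at_evidences columns.
    public static class EvidenceRow {
        long storyId;
        long vertexId;
        Timestamp startTime;
        Timestamp endTime;
        int fork;
        String type;
        int occIndex;
        boolean latest;
        String statements;
    }
}

Note that with this pattern the server-side ERROR in pg_log is expected and harmless; only exceptions with a SQLSTATE other than 23505 indicate a genuine problem.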