When browsing the largest workflow tokens, the Orchestrator process restarts, generating a 502/503 error in the UI
Article ID: 369686
Products
VMware Aria Suite
Issue/Introduction
You are using Aria Automation Orchestrator.
When browsing the largest workflow tokens in the UI, the Aria Automation Orchestrator server process restarts, generating 502 or 503 errors in the dashboard.
In some cases a heap dump is created; in other cases an error message is logged to the journal.
Some workflow tokens are very large (>10 MB), causing memory allocation issues when their content is decompressed.
The same issue can also occur during compression.
Impact:
Users cannot browse or work with the largest tokens. When these tokens are opened, the Orchestrator process restarts, generating a 502/503 error in the UI.
Resolution
This issue is resolved in Aria Automation Orchestrator 8.18.1.
Workaround:
Prerequisites:
You have valid backups or temporary snapshots of the appliance(s) participating in the cluster.
You have the root username and password for the appliance(s).
You have access to an SSH tool or utility.
Procedure: Delete the largest tokens
You can delete the largest tokens so that the UI no longer attempts to load their oversized content.
SSH into one appliance in the cluster.
Run the following command to connect to PostgreSQL, then switch to the vco-db database:
vracli dev psql
\c vco-db
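Before deleting anything, you may want to check how many tokens would be affected and how much space they occupy. This is an optional inspection query, a sketch that assumes the same vmo_workflowtokenstatistics schema and 10 MB threshold used by the cleanup queries in the next step:

```sql
-- Count workflow tokens at or above 10 MB, plus tokens with no recorded
-- size (both are targeted by the cleanup queries). sum() ignores NULLs,
-- so total_size covers only the tokens with a recorded size.
select count(*)                                  as oversized_tokens,
       pg_size_pretty(sum(s.tokensize)::bigint)  as total_size
from vmo_workflowtokenstatistics s
where s.tokensize is null or s.tokensize >= 10000000;
```

If the count is zero, the cleanup queries below will delete nothing and can be skipped.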
Copy the following queries and run them:
delete from vmo_workflowtoken where id in
  (select tokenid from vmo_workflowtokenstatistics s where (s.tokensize is null or s.tokensize >= 10000000));
delete from vmo_workflowtokencontent where workflowtokenid in
  (select tokenid from vmo_workflowtokenstatistics s where (s.tokensize is null or s.tokensize >= 10000000));
delete from vmo_workflowtokenstatistics where tokenid in
  (select tokenid from vmo_workflowtokenstatistics s where (s.tokensize is null or s.tokensize >= 10000000));
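Deleting rows does not immediately return disk space to the operating system; PostgreSQL reclaims dead-row space during vacuuming. The following is a standard PostgreSQL maintenance step, not an Orchestrator-specific requirement, and autovacuum will eventually perform it on its own:

```sql
-- Reclaim dead-row space left by the deletes and refresh planner statistics.
vacuum (analyze) vmo_workflowtoken;
vacuum (analyze) vmo_workflowtokencontent;
vacuum (analyze) vmo_workflowtokenstatistics;
```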
Procedure: Reduce the maximum allowed size of persisted token content