Dataflow server returns error "The maximum concurrent task executions is at its limit [xx]" when launching a new task

Article ID: 297091

Products

Support Only for Spring

Issue/Introduction

With Spring Cloud Data Flow for VMware Tanzu, launching a new task fails and the Data Flow server returns the error "The maximum concurrent task executions is at its limit [20]", even though no other running tasks are associated with the Data Flow server.

2020-04-21T11:09:24.145-00:00 [APP/PROC/WEB/0] [OUT] Caused by: org.springframework.messaging.MessagingException: Exception thrown while invoking TaskLauncherListener#process[1 args]; nested exception is org.springframework.web.client.HttpServerErrorException$InternalServerError: 500 Internal Server Error: [[{"logref":"IllegalStateException","message":"Cannot launch task mytask. The maximum concurrent task executions is at its limit [20]."}]]
2020-04-21T11:09:24.145-00:00 [APP/PROC/WEB/0] [OUT] at org.springframework.cloud.stream.binding.StreamListenerMessageHandler.handleRequestMessage(StreamListenerMessageHandler.java:64) ~[spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
2020-04-21T11:09:24.145-00:00 [APP/PROC/WEB/0] [OUT] at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:127) ~[spring-integration-core-5.2.3.RELEASE.jar:5.2.3.RELEASE]

...

2020-04-21T11:09:24.145-00:00 [APP/PROC/WEB/0] [OUT] Caused by: org.springframework.web.client.HttpServerErrorException$InternalServerError: 500 Internal Server Error: [[{"logref":"IllegalStateException","message":"Cannot launch task mytask. The maximum concurrent task executions is at its limit [20]."}]]
2020-04-21T11:09:24.145-00:00 [APP/PROC/WEB/0] [OUT] at org.springframework.web.client.HttpServerErrorException.create(HttpServerErrorException.java:100) ~[spring-web-5.2.3.RELEASE.jar:5.2.3.RELEASE]
2020-04-21T11:09:24.145-00:00 [APP/PROC/WEB/0] [OUT] at org.springframework.web.client.DefaultResponseErrorHandler.handleError(DefaultResponseErrorHandler.java:172) ~[spring-web-5.2.3.RELEASE.jar:5.2.3.RELEASE]
2020-04-21T11:09:24.145-00:00 [APP/PROC/WEB/0] [OUT] at org.springframework.web.client.DefaultResponseErrorHandler.handleError(DefaultResponseErrorHandler.java:112) ~[spring-web-5.2.3.RELEASE.jar:5.2.3.RELEASE]


Resolution

Spring Cloud Data Flow (SCDF) allows a user to limit the maximum number of concurrently running tasks for each configured platform to prevent the saturation of IaaS/hardware resources.

The limit defaults to 20 for all supported platforms. If the number of concurrently running tasks on a platform instance is greater than or equal to the limit, the next task launch request fails, and an error message is returned via the RESTful API, Shell, or UI.
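To see how close a platform is to its limit, the Data Flow server's REST API exposes a current-executions endpoint. The sketch below is a hedged example: the server URL is a placeholder, and the exact response field names (`runningExecutionCount`, `maximumTaskExecutions`) are assumptions that may vary between SCDF versions.

```shell
# Query the Data Flow server for the per-platform running-execution count
# versus the configured maximum. The URL is a placeholder for your own
# Data Flow server; field names are assumptions and may differ by version.
curl -s https://dataflow.example.com/tasks/executions/current \
  | jq '.[] | {name, runningExecutionCount, maximumTaskExecutions}'
```

If `runningExecutionCount` is already at `maximumTaskExecutions`, the next launch request on that platform will be rejected with the error above.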

With Spring Cloud Data Flow for VMware Tanzu, every running task container in the current org and space is included in the running execution count, whether or not it was launched by Spring Cloud Data Flow. Thus, even when no tasks associated with the Data Flow instance are running, other running tasks in the same space can still trigger this error.
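Because unrelated task containers in the same space count toward the limit, it can help to enumerate the Cloud Foundry tasks of every app in the current org and space. This is a rough sketch using the cf CLI; skipping the header lines of `cf apps` output is an assumption about the CLI's formatting and may need adjusting for your CLI version.

```shell
# List CF tasks for every app in the current org/space; any RUNNING task
# counts toward the Data Flow concurrent-task limit.
# NOTE: "NR>3" (skipping the `cf apps` header lines) is an assumption
# about the CLI's output format.
for app in $(cf apps | awk 'NR>3 {print $1}'); do
  echo "== tasks for $app =="
  cf tasks "$app"
done
```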

To increase the limit (e.g., from 20 to 30), specify the concurrent-task-limit parameter when creating the service instance, as shown below:

$ cf create-service p-dataflow standard data-flow -c '{"concurrent-task-limit": 30}'
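For a service instance that already exists, the same parameter can presumably be changed with `cf update-service`. This is a hedged example; confirm that your tile version supports updating `concurrent-task-limit` on an existing instance before relying on it.

```shell
# Raise the limit on an already-created Data Flow service instance,
# then check that the update has completed before launching tasks again.
cf update-service data-flow -c '{"concurrent-task-limit": 30}'
cf service data-flow
```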


Note: In addition to the case above, where tasks in the space hit the limit, there was a separate issue in which all tasks running on the platform were counted as currently running tasks, regardless of whether they were associated with the Data Flow server. That issue could cause the same error; it has been fixed in OSS SCDF v2.5.1 and SCDF for VMware Tanzu v1.8.

For further details, please refer to the following pull request: https://github.com/spring-cloud/spring-cloud-dataflow/pull/3775