GPSS failing to start job with "fail to get kafka meta data: Local: Broker transport failure" error

Article ID: 296535


Products

VMware Tanzu Greenplum

Issue/Introduction

Starting a GPSS job fails with "Local: Broker transport failure" when SSL is enabled:

[root@7f7159220181 scripts]# gpsscli start insert
start job: insert failed, reason: InitJob: getPartitionsMetadata: reader: insert fail to get kafka meta data: Local: Broker transport failure

This error suggests that the connection GPSS attempts to open to the Kafka broker is being refused.

First, check that connectivity is set up correctly and working (see the sketch after this list):
 
  • Verify that you can telnet from the GPSS server to the Kafka broker on the correct port
  • Verify that kafkacat can connect to the broker and retrieve metadata
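For example, a quick check from the GPSS server might look like the following. The broker address hostname1:9092 matches the job YAML shown below; the keytab path and principal are placeholders, so substitute the values from your own PROPERTIES section:

# Check that the broker port is reachable from the GPSS server
telnet hostname1 9092

# Check that kafkacat can fetch cluster metadata using the same
# security settings as the job (passed as librdkafka properties via -X).
# The keytab and principal below are placeholders.
kafkacat -L -b hostname1:9092 \
  -X security.protocol=SASL_PLAINTEXT \
  -X sasl.kerberos.service.name=kafka \
  -X sasl.kerberos.keytab=/path/to/kafka.keytab \
  -X sasl.kerberos.principal=kafka/[email protected]

If kafkacat also fails here, the problem is with broker connectivity or the security configuration rather than with GPSS itself.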


Environment

Product Version: 5.28

Resolution

This issue can be caused by the order of the sections in the job YAML file. To correct it, make sure that the PROPERTIES section comes before the INPUT section.
 
Below, the order of the sections in the insert job YAML is changed accordingly.

From:
KAFKA:
  INPUT:
    SOURCE:
      BROKERS: hostname1:9092
      TOPIC: gpsspoc
  [.....]
  PROPERTIES:
    sasl.kerberos.service.name: kafka
    security.protocol: SASL_PLAINTEXT
    sasl.kerberos.keytab: /opt/cloudera/security/[email protected]
    sasl.kerberos.principal: [email protected]
    ssl.ca.location: /opt/cloudera/security/pki/ca_certs.pem

To:
KAFKA:
  PROPERTIES:
    sasl.kerberos.service.name: kafka
    security.protocol: SASL_PLAINTEXT
    sasl.kerberos.keytab: /opt/cloudera/security/[email protected]
    sasl.kerberos.principal: [email protected]
    ssl.ca.location: /opt/cloudera/security/pki/ca_certs.pem
  INPUT:
    SOURCE:
      BROKERS: hostname1:9092
      TOPIC: gpsspoc
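
After reordering the YAML, remove the old job, resubmit it, and start it again. A sketch, assuming the job definition is saved as insert.yaml:

# Remove the job created from the old YAML, resubmit, and restart
gpsscli remove insert
gpsscli submit --name insert insert.yaml
gpsscli start insert

With PROPERTIES placed before INPUT, the security settings are applied before GPSS contacts the broker, and the Kafka metadata request should succeed.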