An error was encountered:
An error occurred while calling o104.count.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 14 in stage 0.0 failed 4 times, most recent failure: Lost task 14.3 in stage 0.0 (TID 182) (server_name.com executor 12): com.zaxxer.hikari.pool.HikariPool$PoolInitializationException: Failed to initialize pool: FATAL: too many connections for role ".."
at com.zaxxer.hikari.pool.HikariPool.throwPoolInitializationException(HikariPool.java:544)
at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:536)
at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:112)
at com.zaxxer.hikari.pool.HikariDataSource.<init>(HikariDataSource.java:72)
at io.pivotal.greenplum.spark.jdbc.HikariProvider$.createDataSource(ConnectionManager.scala:170)
at io.pivotal.greenplum.spark.jdbc.ConnectionManager.getPoolDataSource(ConnectionManager.scala:74)
at io.pivotal.greenplum.spark.jdbc.ConnectionManager.getPoolConnection(ConnectionManager.scala:55)
at io.pivotal.greenplum.spark.jdbc.ConnectionManager.getConnection(ConnectionManager.scala:40)
at io.pivotal.greenplum.spark.jdbc.ConnectionManager$.getConnection(ConnectionManager.scala:28)
at io.pivotal.greenplum.spark.GreenplumRowIterator.<init>(GreenplumRowIterator.scala:45)
at io.pivotal.greenplum.spark.GreenplumRDD.compute(GreenplumRDD.scala:49)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
...
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
It is recommended to review the current setting of max_connections, along with the connection limit of the affected role, using the linked documentation, and to confirm that sufficient connections are available for the job to complete successfully. Note that the connector initializes a connection pool for each Spark task that reads from Greenplum (as shown in the stack trace above), so highly parallel reads can exhaust these limits quickly.
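As a first step, the configured limits and current usage can be checked directly against Greenplum. The following is a minimal sketch, assuming the psycopg2 driver is available and using placeholder host, database, and role names; the same queries can also be run from psql.

import psycopg2

# Placeholders: adjust the host, database, and role names for your environment.
conn = psycopg2.connect(host="gpmaster.example.com", dbname="mydb",
                        user="gpadmin", password="changeme")
try:
    with conn.cursor() as cur:
        # Server-wide connection ceiling.
        cur.execute("SHOW max_connections;")
        print("max_connections:", cur.fetchone()[0])

        # Per-role limit; -1 means the role has no explicit limit.
        cur.execute("SELECT rolconnlimit FROM pg_roles WHERE rolname = %s;",
                    ("spark_role",))
        print("role connection limit:", cur.fetchone()[0])

        # Connections currently held by that role.
        cur.execute("SELECT count(*) FROM pg_stat_activity WHERE usename = %s;",
                    ("spark_role",))
        print("connections in use:", cur.fetchone()[0])
finally:
    conn.close()

If the limits are already set as intended, the number of connections the job opens can be reduced instead, since the stack trace shows a connection pool being initialized per Spark task. The sketch below is PySpark; the format name ("greenplum") and the option names ("partitionColumn", "partitions") are assumptions that should be verified against the documentation for the connector version in use, and all other values are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("greenplum-read").getOrCreate()

# Fewer partitions means fewer concurrent Greenplum connections for this read.
gpdf = (
    spark.read.format("greenplum")
    .option("url", "jdbc:postgresql://gpmaster.example.com:5432/mydb")
    .option("user", "spark_role")
    .option("password", "changeme")
    .option("dbschema", "public")
    .option("dbtable", "my_table")
    .option("partitionColumn", "id")
    .option("partitions", "8")
    .load()
)
print(gpdf.count())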