When trying to insert data into a Hive table, the following exception is seen:
Exception:
org.apache.hadoop.hive.ql.lockmgr.LockException: No record of lock could be found, may have timed out
Killing DAG...
java.io.IOException: org.apache.hadoop.hive.ql.lockmgr.LockException: No record of lock could be found, may have timed out
    at org.apache.hadoop.hive.ql.exec.Heartbeater.heartbeat(Heartbeater.java:84)
    at org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor.monitorExecution(TezJobMonitor.java:293)
    at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:167)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1606)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1367)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1179)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1006)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:996)
This is caused by a known bug in the BoneCP connection pooling library, which Hive uses by default for its metastore database connections.
Work around the problem by switching Hive from the BoneCP library to the DBCP library:
1. Open Ambari.
2. Navigate to "Services / Hive / Configs / Advanced / Custom hive-site", click "Add Property", and enter the following key/value pair:
datanucleus.connectionPoolingType = dbcp
3. Restart the affected Hive services so the change takes effect.
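On a cluster that is not managed by Ambari, the same setting can be added directly to hive-site.xml instead. The file path below is a typical default and may differ on your distribution:

```xml
<!-- In hive-site.xml (commonly /etc/hive/conf/hive-site.xml). -->
<!-- Tells DataNucleus to use Apache Commons DBCP for metastore
     connection pooling instead of the default BoneCP library. -->
<property>
  <name>datanucleus.connectionPoolingType</name>
  <value>dbcp</value>
</property>
```

After editing the file, restart the Hive Metastore and HiveServer2 processes for the new pool type to be picked up.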