How to create a heap dump for hiveserver2 to analyze heap out of memory issues in Pivotal HD


Article ID: 294596


Products

Services Suite

Issue/Introduction

If HiveServer2 stops responding because of out-of-memory errors, it may be necessary to collect a heap dump of the process for analysis. This knowledge base article explains how to configure HiveServer2 so that a heap dump is written automatically when an OutOfMemoryError occurs.


Error message

2016-05-27 13:46:55,629 ERROR [HiveServer2-Handler-Pool: Thread-35]: thrift.ProcessFunction (ProcessFunction.java:process(41)) - Internal error processing OpenSession 
2016-05-27 13:46:55,629 ERROR [HiveServer2-Handler-Pool: Thread-33]: thrift.ProcessFunction (ProcessFunction.java:process(41)) - Internal error processing OpenSession 
java.lang.OutOfMemoryError: Java heap space 
at java.util.Hashtable$Entry.clone(Hashtable.java:1052) 
at java.util.Hashtable.clone(Hashtable.java:613) 
at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:696)


Environment


Resolution

To have HiveServer2 write a heap dump when an out-of-memory error occurs, apply the following steps:


1. Open Ambari and navigate to HIVE > Configs > Advanced > Advanced hive-env > hive-env template, then add the following lines at the end:

if [ "$SERVICE" = "hiveserver2" ]; then 
 export HADOOP_CLIENT_OPTS="$HADOOP_CLIENT_OPTS -XX:HeapDumpPath=/var/log/hive -XX:+HeapDumpOnOutOfMemoryError" 
fi  

2. Save the configuration change.

3. Restart all services requested to be restarted by Ambari.

4. Confirm the changes have taken effect by logging in to the hiveserver2 host and inspecting the HiveServer2 process arguments with ps.
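As a minimal sketch of this check: on the hiveserver2 host, run ps -ef, find the HiveServer2 Java process, and confirm its command line contains the two new JVM flags. Since no live cluster is assumed here, the sample command line below is a hypothetical stand-in for the real ps output.

```shell
# Hypothetical verification sketch. On the real host you would run:
#   ps -ef | grep -i hiveserver2
# and look for the flags in the Java command line. The sample command line
# below is an assumption standing in for actual ps output.
SAMPLE_CMDLINE="java -Xmx1024m -XX:HeapDumpPath=/var/log/hive -XX:+HeapDumpOnOutOfMemoryError org.apache.hive.service.server.HiveServer2"

# Check that both flags from the hive-env template change are present.
for FLAG in 'HeapDumpPath=/var/log/hive' 'HeapDumpOnOutOfMemoryError'; do
  echo "$SAMPLE_CMDLINE" | grep -q "$FLAG" && echo "found: $FLAG"
done
```

If either flag is missing from the real process arguments, the hive-env template change did not take effect; re-check step 1 and restart HiveServer2 again.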


Additional Information
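After the next OutOfMemoryError, the JVM writes a dump file named java_pid&lt;pid&gt;.hprof into the directory given by -XX:HeapDumpPath (/var/log/hive in the configuration above). A hedged sketch of locating it, assuming that path; the file can then be copied off the host and opened in a heap analysis tool such as Eclipse MAT or the JDK's jhat:

```shell
# Sketch, assuming the HeapDumpPath configured above. java_pid<pid>.hprof is
# the JVM's default heap dump file name.
DUMP_DIR="${DUMP_DIR:-/var/log/hive}"
ls "$DUMP_DIR"/java_pid*.hprof 2>/dev/null || echo "no heap dump found yet in $DUMP_DIR"
```

Note that heap dump files are roughly the size of the configured heap, so ensure the HeapDumpPath directory has sufficient free space.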