You can fix this error by raising the container memory for your Hive jobs:
Most tools that run on top of the Hadoop MapReduce framework provide ways to configure these Hadoop-level options for your tasks. Hive offers several ways to do this; three of them are shown here:
1) Pass them directly on the Hive command line:
hive -hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120 -e "select count(*) from test_table;"
2) Set the HIVE_OPTS environment variable before starting Hive:
export HIVE_OPTS="-hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120"
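With HIVE_OPTS exported, any Hive invocation in that shell picks up the settings, so the query can then be run without extra flags (test_table is just the example table used above):
hive -e "select count(*) from test_table;"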
3) Use the set command inside the Hive CLI:
hive> set mapreduce.map.memory.mb=4096;
hive> set mapreduce.reduce.memory.mb=5120;
hive> select count(*) from test_table;
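Note that mapreduce.map.memory.mb and mapreduce.reduce.memory.mb size the YARN container, while the JVM heap inside it is controlled separately by mapreduce.map.java.opts and mapreduce.reduce.java.opts. A common rule of thumb (an assumption here, not part of the original answer) is to keep the heap at roughly 80% of the container size, for example:
hive> set mapreduce.map.java.opts=-Xmx3276m;
hive> set mapreduce.reduce.java.opts=-Xmx4096m;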