java.lang.OutOfMemoryError: Java heap space with Hive

I am using Hive 0.9.0 with Hadoop 1.1.2 and NetBeans, but I get this error and cannot solve the problem. Please help. The code:

    package hive;

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class Hive_test {

        private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

        @SuppressWarnings("CallToThreadDumpStack")
        public static void main(String[] args) throws SQLException {
            // Load the HiveServer1 JDBC driver.
            try {
                Class.forName(driverName);
            } catch (ClassNotFoundException e) {
                e.printStackTrace();
                System.exit(1);
            }
            System.out.println("commencer la connexion"); // "starting the connection"
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://localhost:10000/default", "", " ");
            Statement stmt = con.createStatement();
            ResultSet res = stmt.executeQuery("select * from STATE");
            while (res.next()) {
                System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
                System.out.println("sql terminer"); // "sql finished"
            }
        }
    }

The error is below:

    commencer la connexion
    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:353)
        at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
        at org.apache.hadoop.hive.service.ThriftHive$Client.recv_execute(ThriftHive.java:116)
        at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:103)
        at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
        at org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:132)
        at org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:132)
        at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:122)
        at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at hive.Hive_test.main(Hive_test.java:22)
2 answers

You can raise the container memory settings in Hive to fix this error.

Most tools that run on top of the Hadoop MapReduce framework provide ways to configure these Hadoop-level options for your jobs. There are several ways to do this in Hive; three of them are shown here, and a JDBC-based variant matching the code in the question is sketched after the list.

1) Pass it directly through the Hive command line:

 hive -hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120 -e "select count(*) from test_table;" 

2) Set the HIVE_OPTS environment variable before invoking Hive:

 export HIVE_OPTS="-hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120" 

3) Use the "set" command in the Hive CLI:

    hive> set mapreduce.map.memory.mb=4096;
    hive> set mapreduce.reduce.memory.mb=5120;
    hive> select count(*) from test_table;
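
If you connect over JDBC as in the question, the same session-level settings can usually be issued as "set" statements on the connection before running the query. Below is a minimal sketch under that assumption, reusing the driver and URL from the question; the class name HiveJdbcWithMemorySettings is only illustrative, and whether the old HiveServer1 driver forwards "set" commands in your setup is worth verifying.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class HiveJdbcWithMemorySettings {
        public static void main(String[] args) throws SQLException, ClassNotFoundException {
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://localhost:10000/default", "", "");
            Statement stmt = con.createStatement();

            // Raise the MapReduce container sizes for this session only.
            stmt.execute("set mapreduce.map.memory.mb=4096");
            stmt.execute("set mapreduce.reduce.memory.mb=5120");

            // Now run the query from the question with the larger containers.
            ResultSet res = stmt.executeQuery("select * from STATE");
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2));
            }
            res.close();
            stmt.close();
            con.close();
        }
    }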

Well, in my case, I also needed to set the JVM heap in java.opts; see the sketch after these settings for how the -Xmx values relate to the container sizes:

    set mapreduce.map.memory.mb=4096;
    set mapreduce.map.java.opts=-Xmx3686m;
    set mapreduce.reduce.memory.mb=4096;
    set mapreduce.reduce.java.opts=-Xmx3686m;
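
The -Xmx values are kept below the container sizes so that the JVM heap plus its non-heap overhead still fits inside the YARN container: 3686 MB is roughly 90% of 4096 MB. A small sketch of that arithmetic, assuming the common 0.9 rule of thumb (the ratio and the class/method names are illustrative, not taken from the answer above):

    public class ContainerHeapSizing {

        // Derive a -Xmx option that leaves ~10% of the container for
        // non-heap JVM memory (thread stacks, permgen/metaspace, native buffers).
        static String javaOpts(int containerMb) {
            return "-Xmx" + (int) (containerMb * 0.9) + "m";
        }

        public static void main(String[] args) {
            // Prints the same values used in the answer: 4096 MB -> -Xmx3686m.
            System.out.println("set mapreduce.map.memory.mb=4096;");
            System.out.println("set mapreduce.map.java.opts=" + javaOpts(4096) + ";");
            System.out.println("set mapreduce.reduce.memory.mb=4096;");
            System.out.println("set mapreduce.reduce.java.opts=" + javaOpts(4096) + ";");
        }
    }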
