Hadoop on OSX "Unable to load realm info from SCDynamicStore"

I get this error when running Hadoop on OSX 10.7:

Unable to load realm info from SCDynamicStore put: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/travis/input/conf. Name node is in safe mode.

It does not cause problems with the functionality of Hadoop.
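(Side note: the SafeModeException in the message is separate from the SCDynamicStore warning; if a put ever fails only because the name node is still in safe mode, you can take it out of safe mode manually. A minimal sketch, assuming a Hadoop 1.x-era CLI:)

 # check the current safe-mode state, then leave it if a put is blocked
 hadoop dfsadmin -safemode get
 hadoop dfsadmin -safemode leave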

+57
osx-lion hadoop macos
7 answers

Matthew Buckett's suggestion in HADOOP-7489 worked for me. Add the following to your hadoop-env.sh file:

export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk" 
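For completeness, a sketch of applying the change, assuming a Hadoop 1.x layout where hadoop-env.sh lives under conf/ (in 2.x it is usually under etc/hadoop/):

 # restart the daemons so the new HADOOP_OPTS are picked up,
 # then check that the JVM flags made it onto the running processes
 $HADOOP_HOME/bin/stop-all.sh
 $HADOOP_HOME/bin/start-all.sh
 ps aux | grep krb5.realm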
+76
Oct 21 '11 at 15:05

As an update to this (and to address David Williams' point about Java 1.7), I found that setting just the .realm and .kdc properties was not enough to stop the offending message.

However, by examining the source file that emits the message, I was able to determine that setting the .krb5.conf property to /dev/null was enough to suppress it. Obviously, if you actually have a krb5 configuration, it is better to point it at the real path.

In general, my hadoop-env.sh looks like this:

 HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
 HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf=/dev/null"
+38
Nov 12 '13 at 4:57

I have the same problem on OS X 10.8.2 with Java 1.7.0_21. Unfortunately, the above solution does not fix it on this version :(

Edit: I found a solution to this based on a hint I saw here. In the hadoop-env.sh file, change the JAVA_HOME setting to:

 export JAVA_HOME=`/usr/libexec/java_home -v 1.6` 

(Note the backticks here.)
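If you are unsure which JVMs are installed, java_home can list them; a quick check (-V lists the matching JVMs, -v 1.6 prints the path the backtick expression will expand to):

 /usr/libexec/java_home -V
 /usr/libexec/java_home -v 1.6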

+16
Jun 25 '13 at 5:39

FYI, you can simplify this by specifying only the following:

 export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc=" 

It is also mentioned in HADOOP-7489.

+13
Jan 05 '13 at 15:24

I had a similar problem on Mac OS, and after trying different combinations, this is what worked for me across the board (both Hadoop 1.2 and 2.2):

In $HADOOP_HOME/conf/hadoop-env.sh, set the following lines:

 # Set Hadoop-specific environment variables here.
 export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="

 # The java implementation to use.
 export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

Hope this helps

+5
Nov 19 '13 at 21:19

Also add

YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

before running start-yarn.sh (or start-all.sh) on CDH 4.1.3.
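For reference, a sketch of where that line can live so the YARN scripts pick it up; this assumes a CDH4/Hadoop 2.x layout where yarn-env.sh is sourced by the start scripts:

 # append to etc/hadoop/yarn-env.sh (path may differ on your install)
 YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
 # then start the daemons
 start-yarn.sh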

+4
Feb 27 '13 at 19:19

I hit this error while debugging MapReduce from Eclipse, but it was a red herring. The real problem was that I needed to be remote debugging, by adding debug options to JAVA_OPTS:

 -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044 

And then create a new remote Java application profile in the debug configuration pointing to port 1044.
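A sketch of the shell side of that setup, assuming the job runs in the local JVM (e.g. via LocalJobRunner) so the debug flags reach the process that actually executes the MapReduce code; the jar and driver names here are placeholders:

 export HADOOP_OPTS="$HADOOP_OPTS -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044"
 # hypothetical jar/driver names; the JVM suspends until Eclipse attaches on port 1044
 hadoop jar my-job.jar MyDriver input output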

This article provides more details on the debugging side of things. It talks about Solr, but it works the same way with Hadoop. If you have problems, leave a comment below and I will try to help.

+2
Apr 14 '13 at 4:49


