Could not find winutils binary in hadoop binary path

I get the following error when running namenode for the latest hadoop-2.2 release. I could not find the winutils.exe file in the bin folder. I tried these commands:

 $ bin/hdfs namenode -format
 $ sbin/yarn-daemon.sh start resourcemanager

 ERROR [main] util.Shell (Shell.java:getWinUtilsPath(303)) - Failed to locate the winutils binary in the hadoop binary path
 java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
     at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
     at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
     at org.apache.hadoop.util.Shell.<clinit>(Shell.java:293)
     at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
     at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:863)
+101
hadoop
Oct 27 '13 at 16:49
15 answers

Simple solution: download winutils.exe from here and add it to $HADOOP_HOME/bin

(Source: Click here )

EDIT:

For hadoop-2.6.0 you can download the binaries from the Titus Barik blog.

I needed not only to point HADOOP_HOME to the extracted [path] directory, but also to provide the system property -Djava.library.path=[path]\bin to load the native libs (dll).

+93
Dec 10 '14 at 6:45

If we take the binary distribution of the Apache Hadoop 2.2.0 release directly and try to run it on Microsoft Windows, we will encounter ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path.

The binary distribution of the Apache Hadoop 2.2.0 release does not contain some Windows native components (for example, winutils.exe, hadoop.dll, etc.). These are required (not optional) to run Hadoop on Windows.

Thus, you need to build a native Windows binary distribution of Hadoop from the source code, following the "BUILD.txt" file located inside the Hadoop source distribution. You can also follow these posts for a step-by-step guide with screenshots:

Build, install, configure and run Apache Hadoop 2.2.0 on Microsoft Windows

ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path

+27
Nov 03 '13 at 21:10

If you encountered this problem when running a standalone local application with Spark (i.e. after adding spark-assembly-x.x.x-hadoopx.x.x.jar or the Maven dependency to the project), a simpler solution is to put winutils.exe (downloaded from here) in "C:\winutil\bin". You can then tell Spark where winutils.exe lives by adding the following line to the code:

 System.setProperty("hadoop.home.dir", "C:\\winutil\\");

Source: click here

+26
Oct 29 '15 at 20:53

The statement

 java.io.IOException: Could not locate executable null\bin\winutils.exe

explains that null is received when expanding or resolving the environment variable. If you look at the source of Shell.java in the Common package, you will find that the HADOOP_HOME variable is not set, null is returned instead, and hence the error.

So, to fix this you need to set either HADOOP_HOME or the hadoop.home.dir property.
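The lookup order described above can be illustrated with a small sketch (a hypothetical Python re-implementation of the Java lookup logic, not Hadoop code):

```python
def resolve_hadoop_home(jvm_props, env):
    """Mimic Hadoop's Shell.java: -Dhadoop.home.dir wins, then HADOOP_HOME."""
    home = jvm_props.get("hadoop.home.dir")  # JVM-scoped -D flag first
    if home is None:
        home = env.get("HADOOP_HOME")        # fall back to the env variable
    return home

# Neither is set -> None, so Hadoop builds the path "null\bin\winutils.exe"
print(resolve_hadoop_home({}, {}))  # None
print(resolve_hadoop_home({}, {"HADOOP_HOME": "C:\\hadoop"}))  # C:\hadoop
```

Setting either one makes the lookup succeed, which is why both fixes work.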

Hope this helps.

Thank you Kamleshwar.

+17
Nov 02 '13 at 9:04

I ran into this problem when working with Eclipse. In my case, I had downloaded the correct version of Hadoop (hadoop-2.5.0-cdh5.3.0.tgz), extracted the contents and put it directly on my C drive. Then I went to

Eclipse -> Debug/Run Configurations -> Environment (tab) -> and added

Variable: HADOOP_HOME

Value: C:\hadoop-2.5.0-cdh5.3.0

+11
May 27 '15 at 21:02

You can download winutils.exe here: http://public-repo-1.hortonworks.com/hdp-win-alpha/winutils.exe

Then copy it to HADOOP_HOME/bin.

+7
Aug 14 '15 at 6:29

winutils.exe is used by Spark to run shell commands. You need this file when you run Spark without installing Hadoop.

The steps are as follows:

  1. Download winutils.exe for hadoop 2.7.1 from https://github.com/steveloughran/winutils/tree/master/hadoop-2.7.1/bin [NOTE: If you are using a different hadoop version, please download winutils from the corresponding hadoop version folder on GitHub.]

  2. Now create a "winutils" folder on the C:\ drive, create a bin folder inside it, and copy the winutils.exe file into that folder. The location of winutils.exe will thus be C:\winutils\bin\winutils.exe

  3. Now open the environment variables and set HADOOP_HOME = C:\winutils [NOTE: Please do not add \bin to HADOOP_HOME, and you do not need to add HADOOP_HOME to Path]
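The steps above can be sketched in Python (a hedged sketch; the folder layout follows this answer, and install_winutils is a hypothetical helper name):

```python
import os

def install_winutils(winutils_bytes, root="C:\\winutils"):
    """Create <root>\\bin, drop winutils.exe there, and point HADOOP_HOME at the parent."""
    bin_dir = os.path.join(root, "bin")
    os.makedirs(bin_dir, exist_ok=True)
    exe_path = os.path.join(bin_dir, "winutils.exe")
    with open(exe_path, "wb") as f:
        f.write(winutils_bytes)  # bytes downloaded from the GitHub repo
    # HADOOP_HOME must be the parent folder, NOT ...\bin
    os.environ["HADOOP_HOME"] = root
    return exe_path
```

Note that HADOOP_HOME is set to the winutils root, so Hadoop itself appends \bin when it looks for the executable.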

Your problem should be resolved!

+5
Apr 12 '19 at 21:45

Hadoop requires winutils.exe to execute hadoop-related commands. Download the hadoop-common-2.2.0 zip file. winutils.exe can be found in its bin folder. Extract the zip file and copy it into the local hadoop/bin folder.

+2
Oct 30 '14 at 11:51

I had the same problem. Removing bin\ from the HADOOP_HOME path solved it for me. The HADOOP_HOME variable should look something like this:

 C:\dev\hadoop2.6\ 

A system reboot may be required; in my case, just restarting the IDE was enough.
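The fix can be sketched as a small helper (hypothetical, not part of Hadoop): strip a trailing bin segment so HADOOP_HOME points at the install root:

```python
import ntpath  # Windows path semantics, usable on any platform

def fix_hadoop_home(path):
    """If HADOOP_HOME ends in \\bin, drop that segment; Hadoop appends \\bin itself."""
    head, tail = ntpath.split(path.rstrip("\\"))
    return head + "\\" if tail.lower() == "bin" else path

print(fix_hadoop_home("C:\\dev\\hadoop2.6\\bin"))  # C:\dev\hadoop2.6\
```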

+2
Jan 07 '16 at 10:45

Set the HADOOP_HOME variable on Windows to solve the problem.

You can find the answer in org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0-sources.jar!/org/apache/hadoop/util/Shell.java :

The IOException is thrown from getQualifiedBinPath:

 public static final String getQualifiedBinPath(String executable)
     throws IOException {
   // construct hadoop bin path to the specified executable
   String fullExeName = HADOOP_HOME_DIR + File.separator + "bin"
       + File.separator + executable;

   File exeFile = new File(fullExeName);
   if (!exeFile.exists()) {
     throw new IOException("Could not locate executable " + fullExeName
         + " in the Hadoop binaries.");
   }
   return exeFile.getCanonicalPath();
 }

And HADOOP_HOME_DIR is resolved as:

 // first check the Dflag hadoop.home.dir with JVM scope
 String home = System.getProperty("hadoop.home.dir");

 // fall back to the system/user-global env variable
 if (home == null) {
   home = System.getenv("HADOOP_HOME");
 }
+2
Oct 28 '16 at 10:23
  1. Download winutils.exe
     from the URL:
     https://github.com/steveloughran/winutils/tree/master/hadoop-<version>/bin
  2. Paste it into HADOOP_HOME/bin
     Note: you must set the environment variable:
     User variable:
     Variable: HADOOP_HOME
     Value: the Hadoop or Spark dir
+2
Jun 22 '18 at 11:57

In PySpark, use the lines below to launch a local Spark application from PyCharm:

 import os

 os.environ['HADOOP_HOME'] = "C:\\winutils"
 print(os.environ['HADOOP_HOME'])
+2
Sep 27 '18 at 16:33

I was getting the same problem on Windows. I fixed it by:

  • Downloading hadoop-common-2.2.0-bin-master from the link.
  • Creating a HADOOP_HOME user variable in the environment variables and assigning the path of the hadoop-common bin directory as its value.
  • You can verify this by running the hadoop command in cmd.
  • Restart the IDE and run it.
+1
Sep 15 '17 at 6:45

Download the version of the hadoop folder you need (say, if you are installing Spark on Windows, the hadoop version your Spark was built for) from this link as a zip.

Unzip it to the desired directory. You need a directory of the form hadoop\bin (explicitly create such a hadoop\bin directory structure if you want), with bin containing all the files from the bin folder of the downloaded hadoop. It will contain many files, such as hdfs.dll, hadoop.dll, etc., in addition to winutils.exe.

Now create the HADOOP_HOME environment variable and set it to <path-to-hadoop-folder>\hadoop. Then add ;%HADOOP_HOME%\bin; to the PATH environment variable.

Open a new command prompt and try the command again.
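The expected layout can be verified with a short sketch (a hypothetical check; the file names come from this answer):

```python
import os

def check_layout(hadoop_home, required=("winutils.exe", "hadoop.dll", "hdfs.dll")):
    """Return the list of required files missing from <hadoop_home>/bin; empty means OK."""
    bin_dir = os.path.join(hadoop_home, "bin")
    return [f for f in required if not os.path.isfile(os.path.join(bin_dir, f))]
```

Run it against your HADOOP_HOME value before retrying the failing command.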

+1
Mar 21 '18 at 9:34

I used versions "hbase-1.3.0" and "hadoop-2.7.3". Setting the HADOOP_HOME environment variable and copying the winutils.exe file into the HADOOP_HOME/bin folder solves the problem on Windows. Make sure HADOOP_HOME points to the Hadoop installation folder (the /bin suffix is not required for these versions). I also preferred to use the cross-platform tool Cygwin to get linux/unix functionality (as far as possible), because the HBase team recommends a linux/unix env.

0
Mar 03 '17 at 11:05


