When the hadoop process starts, you can specify external files that should be made available to the job. This is done with the -files argument.
$HADOOP_HOME/bin/hadoop jar /usr/lib/COMPANY/analytics/libjars/MyJar.jar -files hdfs://PDHadoop1.corp.COMPANY.com:54310/data/geoip/GeoIPCity.dat
I don't know whether the files have to be on HDFS, but if this is a job that will run often, putting them there would be a good idea. In the job's code you can then do something like:
// e.g. in the Mapper's setup(); DistributedCache is in org.apache.hadoop.filecache
if (DistributedCache.getLocalCacheFiles(context.getConfiguration()) != null) {
    List<Path> localFiles = Utility.arrayToList(
            DistributedCache.getLocalCacheFiles(context.getConfiguration()));
    for (Path localFile : localFiles) {
        if ((localFile.getName() != null)
                && (localFile.getName().equalsIgnoreCase("GeoIPCity.dat"))) {
            // the cached copy lives on the local file system, so open it as a java.io.File
            File file = new File(localFile.toUri().getPath());
        }
    }
}
That's about it; it's copied and pasted almost directly from working code inside several of our Mappers.
I do not know about the second part of your question. Hope the answer to the first part helps you get started. :)
In addition to -files there is -libjars for including additional jars; I wrote a little about that here: If I have a constructor that requires a path to a file, how can I "fake" that if it is packaged into a jar?
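For completeness, here is a hedged sketch of how the two flags combine on one command line. The jar names, driver class, and input/output paths below are placeholders, not from the original post; both -files and -libjars are generic options, so they must appear before the job's own arguments:

```shell
# Hypothetical jar names and paths, for illustration only.
# Generic options (-files, -libjars) go before the job arguments.
$HADOOP_HOME/bin/hadoop jar MyJar.jar MyDriver \
    -files hdfs://PDHadoop1.corp.COMPANY.com:54310/data/geoip/GeoIPCity.dat \
    -libjars geoip-api.jar,extra-utils.jar \
    /input/path /output/path
```

Each file listed with -files is copied to every task's local working directory, and each jar listed with -libjars is added to the tasks' classpaths.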