We are trying to configure Cloudera 5.5 so that HDFS works against S3 only, and we have already configured the necessary properties in core-site.xml:
<property>
  <name>fs.s3a.access.key</name>
  <value>################</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>###############</value>
</property>
<property>
  <name>fs.default.name</name>
  <value>s3a://bucket_Name</value>
</property>
<property>
  <name>fs.defaultFS</name>
  <value>s3a://bucket_Name</value>
</property>
After this setup, we were able to list the files in the S3 bucket with the command
hadoop fs -ls /
and it shows only the files available on S3.
But when we start the YARN services, the JobHistory Server fails to start with the error below, and we get the same error when we run Pig jobs:
PriviledgedActionException as:mapred (auth:SIMPLE) cause:org.apache.hadoop.fs.UnsupportedFileSystemException: No AbstractFileSystem for scheme: s3a
ERROR org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils Unable to create default file context [s3a://kyvosps]
org.apache.hadoop.fs.UnsupportedFileSystemException: No AbstractFileSystem for scheme: s3a
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:154)
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:242)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:337)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:334)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
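As far as we understand it, the error comes from a config-key lookup. The following is a minimal sketch of that idea (stand-in code, not Hadoop source): the FileContext/AbstractFileSystem path resolves its binding from a key of the form `fs.AbstractFileSystem.<scheme>.impl`, which is a different key from the `fs.s3a.impl` we had set, and when it is absent the lookup fails with the same message as above.

```python
class UnsupportedFileSystemException(Exception):
    pass

def resolve_abstract_fs(conf, scheme):
    # Stand-in for the lookup done on the FileContext/AbstractFileSystem path:
    # it reads "fs.AbstractFileSystem.<scheme>.impl", not "fs.<scheme>.impl".
    key = f"fs.AbstractFileSystem.{scheme}.impl"
    impl = conf.get(key)
    if impl is None:
        raise UnsupportedFileSystemException(
            f"No AbstractFileSystem for scheme: {scheme}")
    return impl

# Only the FileSystem binding is present, as in our original core-site.xml:
conf = {"fs.s3a.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem"}
try:
    resolve_abstract_fs(conf, "s3a")
except UnsupportedFileSystemException as e:
    print(e)  # No AbstractFileSystem for scheme: s3a
```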
While searching on the Internet, we found that we also need to set the following properties in core-site.xml:
<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
  <description>The implementation class of the S3A Filesystem</description>
</property>
<property>
  <name>fs.AbstractFileSystem.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
  <description>The FileSystem for S3A Filesystem</description>
</property>
After setting the above properties, we get the following error:
org.apache.hadoop.service.AbstractService Service org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager failed in state INITED; cause: java.lang.RuntimeException: java.lang.NoSuchMethodException: org.apache.hadoop.fs.s3a.S3AFileSystem.<init>(java.net.URI, org.apache.hadoop.conf.Configuration)
java.lang.RuntimeException: java.lang.NoSuchMethodException: org.apache.hadoop.fs.s3a.S3AFileSystem.<init>(java.net.URI, org.apache.hadoop.conf.Configuration)
    at org.apache.hadoop.fs.AbstractFileSystem.newInstance(AbstractFileSystem.java:131)
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:157)
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:242)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:337)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:334)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:334)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:451)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:473)
    at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.getDefaultFileContext(JobHistoryUtils.java:247)
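Our reading of this new error is that `AbstractFileSystem.newInstance` reflectively looks up a constructor taking `(URI, Configuration)`, and the class we pointed `fs.AbstractFileSystem.s3a.impl` at apparently has no such two-argument constructor. A minimal sketch of that failure mode (simplified stand-in classes, not the real Hadoop ones):

```python
import inspect

class S3AFileSystemLike:
    """Stand-in for a class with no (URI, Configuration) constructor."""
    def __init__(self):
        pass

class AbstractFsLike:
    """Stand-in for a class with the two-argument constructor the loader expects."""
    def __init__(self, uri, conf):
        self.uri, self.conf = uri, conf

def new_instance(cls, uri, conf):
    # Mimic the reflective constructor lookup: count the constructor
    # parameters other than self; anything but two is a mismatch.
    params = list(inspect.signature(cls.__init__).parameters.values())[1:]
    if len(params) != 2:
        raise TypeError(
            f"NoSuchMethodException: {cls.__name__}.<init>(URI, Configuration)")
    return cls(uri, conf)

try:
    new_instance(S3AFileSystemLike, "s3a://bucket_name", {})
except TypeError as e:
    print(e)  # NoSuchMethodException: S3AFileSystemLike.<init>(URI, Configuration)

fs = new_instance(AbstractFsLike, "s3a://bucket_name", {})
print(fs.uri)  # s3a://bucket_name
```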
The jars needed for this are in place, but we are still getting the error. Any help would be great. Thanks in advance.
Update
I tried removing the fs.AbstractFileSystem.s3a.impl property, but it gives me the same first exception I was getting earlier:
org.apache.hadoop.security.UserGroupInformation PriviledgedActionException as:mapred (auth:SIMPLE) cause:org.apache.hadoop.fs.UnsupportedFileSystemException: No AbstractFileSystem for scheme: s3a
ERROR org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils Unable to create default file context [s3a://bucket_name]
org.apache.hadoop.fs.UnsupportedFileSystemException: No AbstractFileSystem for scheme: s3a
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:154)
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:242)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:337)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:334)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:334)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:451)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:473)