I haven't found a solution to my specific problem yet, or at least nothing I've tried works, and it's driving me crazy. This particular combination doesn't turn up much on Google. As far as I can tell, the error occurs while the task is running in the mapper. The input to this job is Avro-schema'd output compressed with deflate, although I also tried it uncompressed.
Avro: 1.7.7 Hadoop: 2.4.1
I am getting the error below and I am not sure why. Here are my job and mapper. The error occurs when the mapper runs.
Sample uncompressed Avro input file (StockReport.SCHEMA$ is defined to match):
{"day": 3, "month": 2, "year": 1986, "stocks": [{"symbol": "AAME", "timestamp": 507833213000, "dividend": 10.59}]}
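For reference, here is a minimal sketch of how an input file like this could be produced with Avro's generic API and the deflate codec. The file name and the cut-down schema (stocks omitted) are placeholders for illustration, not taken from my actual pipeline:

```java
import java.io.File;
import org.apache.avro.Schema;
import org.apache.avro.file.CodecFactory;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;

public class WriteSample {
    public static void main(String[] args) throws Exception {
        // Hypothetical cut-down schema matching the scalar fields of the
        // sample record above (the "stocks" array is omitted for brevity).
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"StockReport\",\"fields\":["
            + "{\"name\":\"day\",\"type\":\"int\"},"
            + "{\"name\":\"month\",\"type\":\"int\"},"
            + "{\"name\":\"year\",\"type\":\"int\"}]}");

        GenericRecord report = new GenericData.Record(schema);
        report.put("day", 3);
        report.put("month", 2);
        report.put("year", 1986);

        try (DataFileWriter<GenericRecord> writer =
                 new DataFileWriter<>(new GenericDatumWriter<GenericRecord>(schema))) {
            // setCodec must be called before create(); level 6 is the
            // deflate default and is just a placeholder choice here.
            writer.setCodec(CodecFactory.deflateCodec(6));
            writer.create(schema, new File("stock_report.avro"));
            writer.append(report);
        }
    }
}
```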
Job:
@Override
public int run(String[] strings) throws Exception {
    Job job = Job.getInstance();
    job.setJobName("GenerateGraphsJob");
    job.setJarByClass(GenerateGraphsJob.class);
    configureJob(job);
    int resultCode = job.waitForCompletion(true) ? 0 : 1;
    return resultCode;
}

private void configureJob(Job job) throws IOException {
    try {
        Configuration config = getConf();
        Path inputPath = ConfigHelper.getChartInputPath(config);
        Path outputPath = ConfigHelper.getChartOutputPath(config);

        job.setInputFormatClass(AvroKeyInputFormat.class);
        AvroKeyInputFormat.addInputPath(job, inputPath);
        AvroJob.setInputKeySchema(job, StockReport.SCHEMA$);

        job.setMapperClass(StockAverageMapper.class);
        job.setCombinerClass(StockAverageCombiner.class);
        job.setReducerClass(StockAverageReducer.class);

        FileOutputFormat.setOutputPath(job, outputPath);
    } catch (IOException | ClassCastException e) {
        LOG.error("A job error has occurred.", e);
    }
}
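An aside, separate from the crash itself: since the map output key and value are Avro records, the new-API Avro mapreduce integration normally expects their schemas to be registered on the job as well. A sketch of what that registration could look like in configureJob, assuming the generated SCHEMA$ fields exist on both classes:

```java
// Sketch: register the Avro schemas of the intermediate map output so
// Hadoop can serialize the key/value records between map and reduce
// (assumes StockYearSymbolKey and StockReport are Avro-generated classes).
AvroJob.setMapOutputKeySchema(job, StockYearSymbolKey.SCHEMA$);
AvroJob.setMapOutputValueSchema(job, StockReport.SCHEMA$);
```

With these setters, the mapper and reducer would typically exchange the records wrapped as AvroKey/AvroValue rather than as raw objects.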
Mapper:
public class StockAverageMapper extends
        Mapper<AvroKey<StockReport>, NullWritable, StockYearSymbolKey, StockReport> {

    private static Logger LOG = LoggerFactory.getLogger(StockAverageMapper.class);

    private final StockReport stockReport = new StockReport();
    private final StockYearSymbolKey stockKey = new StockYearSymbolKey();

    @Override
    protected void map(AvroKey<StockReport> inKey, NullWritable ignore, Context context)
            throws IOException, InterruptedException {
        try {
            StockReport inKeyDatum = inKey.datum();
            for (Stock stock : inKeyDatum.getStocks()) {
                updateKey(inKeyDatum, stock);
                updateValue(inKeyDatum, stock);
                context.write(stockKey, stockReport);
            }
        } catch (Exception ex) {
            LOG.debug(ex.toString());
        }
    }
Schema for the map output key:
{
  "namespace": "avro.model",
  "type": "record",
  "name": "StockYearSymbolKey",
  "fields": [
    { "name": "year", "type": "int" },
    { "name": "symbol", "type": "string" }
  ]
}
Stack trace:
java.lang.Exception: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
    at org.apache.avro.mapreduce.AvroKeyInputFormat.createRecordReader(AvroKeyInputFormat.java:47)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:492)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:735)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
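For context, this IncompatibleClassChangeError is the classic symptom of an Avro mapreduce jar compiled against Hadoop 1: in Hadoop 1.x, org.apache.hadoop.mapreduce.TaskAttemptContext was a class, while in Hadoop 2.x it became an interface, so an avro-mapred build targeting Hadoop 1 blows up exactly here, in AvroKeyInputFormat.createRecordReader. If the project uses Maven, the usual fix is to pull in avro-mapred with the hadoop2 classifier, roughly:

```xml
<!-- avro-mapred 1.7.x defaults to a build against the Hadoop 1 API;
     the hadoop2 classifier selects the build compiled against the
     Hadoop 2 (interface-based) mapreduce classes. -->
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-mapred</artifactId>
  <version>1.7.7</version>
  <classifier>hadoop2</classifier>
</dependency>
```

The same idea applies to any other build tool: make sure the Avro mapreduce artifact on the classpath is the hadoop2 variant.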
Edit: Not that it matters, but I'm working toward reducing this to data from which I can create JFreeChart output. The job doesn't get past the mapper, so that part shouldn't be related.