public class AvroTrevniKeyInputFormat<T> extends org.apache.hadoop.mapreduce.lib.input.FileInputFormat<AvroKey<T>,org.apache.hadoop.io.NullWritable>
InputFormat for Trevni files.
 
 This implementation was modeled on
 AvroKeyInputFormat to allow for an easy
 transition.
 
 A MapReduce InputFormat that can handle Trevni container files.
 Keys are AvroKey wrapper objects that contain the Trevni data. Since Trevni container files store only records (not key/value pairs), the value from this InputFormat is a NullWritable.
 A subset schema to be read may be specified with
 AvroJob#setInputKeySchema(Schema).
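
A job using this input format might be configured as in the following sketch. The driver class, mapper, input/output paths, and the STRING reader schema are hypothetical, chosen for illustration; a real job would use the schema (or subset schema) of its own Trevni records.

```java
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.mapred.AvroKey;
import org.apache.avro.mapreduce.AvroJob;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.trevni.avro.mapreduce.AvroTrevniKeyInputFormat;

public class TrevniReadExample {

  // Hypothetical mapper: keys arrive as AvroKey wrappers around the Trevni
  // data, and values are NullWritable because Trevni container files store
  // only records, not key/value pairs.
  public static class ExtractMapper
      extends Mapper<AvroKey<CharSequence>, NullWritable, Text, NullWritable> {
    @Override
    protected void map(AvroKey<CharSequence> key, NullWritable value, Context context)
        throws IOException, InterruptedException {
      context.write(new Text(key.datum().toString()), NullWritable.get());
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "trevni-read-example");
    job.setJarByClass(TrevniReadExample.class);

    // Read Trevni container files with this input format.
    job.setInputFormatClass(AvroTrevniKeyInputFormat.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));

    // Optionally restrict reading to a subset schema; here a plain string
    // schema is assumed for the example.
    AvroJob.setInputKeySchema(job, Schema.create(Schema.Type.STRING));

    job.setMapperClass(ExtractMapper.class);
    job.setNumReduceTasks(0);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(NullWritable.class);
    FileOutputFormat.setOutputPath(job, new Path(args[1]));

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The same pattern mirrors AvroKeyInputFormat job setup, which is the point of the "easy transition" noted above: only the input format class changes.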
| Constructor and Description | 
|---|
| AvroTrevniKeyInputFormat() | 
| Modifier and Type | Method and Description | 
|---|---|
| org.apache.hadoop.mapreduce.RecordReader<AvroKey<T>,org.apache.hadoop.io.NullWritable> | createRecordReader(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext context) | 
Methods inherited from class org.apache.hadoop.mapreduce.lib.input.FileInputFormat:
addInputPath, addInputPaths, computeSplitSize, getBlockIndex, getFormatMinSplitSize, getInputPathFilter, getInputPaths, getMaxSplitSize, getMinSplitSize, getSplits, isSplitable, listStatus, setInputPathFilter, setInputPaths, setInputPaths, setMaxInputSplitSize, setMinInputSplitSize

createRecordReader

public org.apache.hadoop.mapreduce.RecordReader<AvroKey<T>,org.apache.hadoop.io.NullWritable> createRecordReader(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext context) throws IOException, InterruptedException

Specified by: createRecordReader in class org.apache.hadoop.mapreduce.InputFormat<AvroKey<T>,org.apache.hadoop.io.NullWritable>
Throws: IOException, InterruptedException

Copyright © 2009-2013 The Apache Software Foundation. All Rights Reserved.