Encode video in H.264 from BufferedImages?

I am trying to turn a large set of BufferedImages (pre-saved images created on the fly by my application) into a video using Java and, hopefully, a library that can help with this process.

I have looked at a few different options, such as JCodec (there was no documentation on how to use it), Xuggler (I failed to get it running due to compatibility issues with JDK 5 and related libraries), and a number of other libraries that had very poor documentation.

I am trying to find a library, usable from Java, that (1) creates H.264 video by encoding BufferedImages frame by frame and (2) has documentation, so that I can actually figure out how to use the thing.

Any ideas on what I should look at?

If pure Java source code exists somewhere that achieves this, I would be very interested in seeing it, because I would like to see how someone implemented this functionality and how I can use it!

Thanks in advance...

2 answers

Here is how you can do it with JCodec:

public class SequenceEncoder {
    private SeekableByteChannel ch;
    private Picture toEncode;
    private RgbToYuv420 transform;
    private H264Encoder encoder;
    private ArrayList<ByteBuffer> spsList;
    private ArrayList<ByteBuffer> ppsList;
    private CompressedTrack outTrack;
    private ByteBuffer _out;
    private int frameNo;
    private MP4Muxer muxer;

    public SequenceEncoder(File out) throws IOException {
        this.ch = NIOUtils.writableFileChannel(out);

        // Transform to convert between RGB and YUV
        transform = new RgbToYuv420(0, 0);

        // Muxer that will store the encoded frames
        muxer = new MP4Muxer(ch, Brand.MP4);

        // Add video track to muxer
        outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);

        // Allocate a buffer big enough to hold output frames
        _out = ByteBuffer.allocate(1920 * 1080 * 6);

        // Create an instance of encoder
        encoder = new H264Encoder();

        // Encoder extra data ( SPS, PPS ) to be stored in a special place of MP4
        spsList = new ArrayList<ByteBuffer>();
        ppsList = new ArrayList<ByteBuffer>();
    }

    public void encodeImage(BufferedImage bi) throws IOException {
        if (toEncode == null) {
            toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
        }

        // Perform conversion
        transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);

        // Encode image into H.264 frame, the result is stored in '_out' buffer
        _out.clear();
        ByteBuffer result = encoder.encodeFrame(_out, toEncode);

        // Based on the frame above form correct MP4 packet
        spsList.clear();
        ppsList.clear();
        H264Utils.encodeMOVPacket(result, spsList, ppsList);

        // Add packet to video track
        outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
        frameNo++;
    }

    public void finish() throws IOException {
        // Push saved SPS/PPS to a special storage in MP4
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        // Write MP4 header and finalize recording
        muxer.writeHeader();
        NIOUtils.closeQuietly(ch);
    }
}
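To give a feel for how the class above would be driven, here is a minimal sketch of a caller. The output file name, frame count, and the renderFrame() helper are made-up placeholders, not part of the answer; your application would feed in its own pre-saved images instead.

import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

public class SequenceEncoderDemo {
    public static void main(String[] args) throws IOException {
        // Encode 100 frames into an MP4 file using the SequenceEncoder class above
        SequenceEncoder encoder = new SequenceEncoder(new File("demo.mp4"));
        for (int i = 0; i < 100; i++) {
            encoder.encodeImage(renderFrame(i));
        }
        // Pushes the saved SPS/PPS and writes the MP4 header
        encoder.finish();
    }

    // Placeholder frame source: in a real application these would be the
    // images your code already produces on the fly.
    private static BufferedImage renderFrame(int i) {
        BufferedImage img = new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.DARK_GRAY);
        g.fillRect(0, 0, 640, 480);
        g.setColor(Color.WHITE);
        g.fillRect((i * 6) % 600, 220, 40, 40); // a square sliding across the frame
        g.dispose();
        return img;
    }
}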

JCodec now (jcodec-0.1.9.jar) includes a SequenceEncoder class that directly allows BufferedImages to be written to a video stream.

It took me some time to fix the default import of this new class in Eclipse. After deleting the first import, attempting (as I said above, I could not find some of the classes) to build Stanislav's code myself, and re-importing, I noticed the mistake:

 import org.jcodec.api.awt.SequenceEncoder;
 //import org.jcodec.api.SequenceEncoder;

The second one is thoroughly out of date, and there was no documentation pointing me towards the former.

The relevant method:

private void saveClip(Trajectory traj) {
    // See www.tutorialspoint.com/androi/android_audio_capture.htm
    // for audio cap ideas.
    SequenceEncoder enc;
    try {
        enc = new SequenceEncoder(new File("C:/Users/WHOAMI/today.mp4"));
        for (int i = 0; i < BUFF_COUNT; ++i) {
            BufferedImage image = buffdFramToBuffdImage(frameBuff.get(i));
            enc.encodeImage(image);
        }
        enc.finish();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
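For completeness: buffdFramToBuffdImage and frameBuff above are the answerer's own helpers. If your frames happen to be raw packed-RGB pixel arrays, a conversion along these lines would work; the class and method names here are illustrative, not from the answer.

import java.awt.image.BufferedImage;

class FrameConversion {
    // Turn one raw frame (packed 0xRRGGBB pixels, row-major) into a
    // BufferedImage that SequenceEncoder.encodeImage() accepts.
    static BufferedImage rgbFrameToImage(int[] pixels, int width, int height) {
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        image.setRGB(0, 0, width, height, pixels, 0, width);
        return image;
    }
}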
