Read real-time metadata in Apple HTTP Live Streaming

In the Flash world there are objects representing a streaming connection and a stream (NetConnection and NetStream). Through the Flash API, these objects can be used to inject text metadata into a live stream (NetStream.send()). On the viewing end, you can listen for this data in a Flash player with listeners in ActionScript code. In effect, you can pass function calls through the video stream and have them picked up and executed on the client side.

Does an equivalent concept exist in Apple HTTP Live Streaming?

1 answer

Yes. Metadata is generated into a file using id3taggenerator and embedded into the video using mediafilesegmenter, both of which are included in the HTTP Live Streaming Tools download. Example:

    id3taggenerator -o camera1.id3 -text "Dolly camera"
    id3taggenerator -o camera2.id3 -text "Tracking camera"

There are several types of metadata you can embed, including binary objects; see the man page for details. Next, the generated files need to be referenced from a "meta macro file". This is a simple text file in the following format:

    60 id3 camera1.id3
    120 id3 camera2.id3

The first number is the seconds elapsed from the beginning of the video at which you want the notification to be inserted. I don't remember the exact mediafilesegmenter command; you need to pass at least the macro file, the index, and the video file. A rough sketch follows below.
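As a hedged sketch only: the flag names below (-M for the meta macro file, -i for the index file name, -f for the output directory, -t for the segment duration) and the file names are my assumptions, not taken from the answer above; check man mediafilesegmenter for the exact options.

    # Sketch only -- flag names and paths are assumptions, not verified.
    # See `man mediafilesegmenter` for the real option names.
    mediafilesegmenter -M macro.txt \
        -i prog_index.m3u8 \
        -f /Library/WebServer/Documents/stream \
        -t 10 \
        camera.mp4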

The resulting video contains the metadata, which MPMoviePlayerController publishes as notifications during playback. See this post for details: http://jmacmullin.wordpress.com/2010/11/03/adding-meta-data-to-video-in-ios/
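For the client side, here is a minimal Swift sketch (not part of the original answer) of listening for those notifications. The observer class, stream URL, and the exact Swift bridging of the notification name are assumptions on my part; the underlying Objective-C constant is MPMoviePlayerTimedMetadataUpdatedNotification.

    import MediaPlayer

    /// Minimal sketch: plays an HLS stream and prints each timed ID3 tag as it arrives.
    /// For real use you would also add `player.view` to your view hierarchy.
    final class MetadataObserver {
        private let player: MPMoviePlayerController
        private var token: NSObjectProtocol?

        init(streamURL: URL) {
            player = MPMoviePlayerController(contentURL: streamURL)

            // Posted each time playback reaches an ID3 tag embedded by mediafilesegmenter.
            // (Bridged name of MPMoviePlayerTimedMetadataUpdatedNotification; may vary by SDK.)
            token = NotificationCenter.default.addObserver(
                forName: NSNotification.Name.MPMoviePlayerTimedMetadataUpdated,
                object: player,
                queue: .main
            ) { [weak self] _ in
                // timedMetadata holds the most recently encountered MPTimedMetadata objects.
                guard let items = self?.player.timedMetadata as? [MPTimedMetadata] else { return }
                for item in items {
                    // For tags created with `id3taggenerator -text "..."`, value is the text
                    // string ("Dolly camera", "Tracking camera", ...).
                    print("timed metadata:", item.key ?? "", String(describing: item.value))
                }
            }
            player.play()
        }

        deinit {
            if let token = token {
                NotificationCenter.default.removeObserver(token)
            }
        }
    }

Creating a MetadataObserver with the URL of the stream's .m3u8 playlist starts playback and logs each tag as its timestamp is reached.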

