HTTP Live Streaming

OK, I have been trying to wrap my head around HTTP Live Streaming. I just don't understand it, and yes, I have read all of Apple's documents and watched the WWDC videos, but I am still very confused, so please help a wannabe programmer out!

Is the code you write on the server, not in Xcode? If so, how do I set this up? Do I need to configure something special on my server, like PHP or something else? How do I use the tools that come from Apple ... the segmenter and so on?

Please help me, thanks

+54
ios video-streaming
3 answers

HTTP Live Streaming

HTTP Live Streaming is a streaming standard proposed by Apple. See the latest draft of the standard.

Files involved

  • .m4a for audio (if you want an audio-only stream).
  • .ts for video. This is an MPEG-2 transport stream, usually with an H.264/AAC payload. It contains 10 seconds of video and is created by splitting the original video file or by converting live video.
  • .m3u8 for the playlist. This is a UTF-8 version of the WinAmp playlist format (a minimal sketch follows this list).
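
As a rough sketch (segment names and durations are made up), a simple media playlist might look something like this:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment0.ts
    #EXTINF:10.0,
    segment1.ts
    #EXTINF:10.0,
    segment2.ts
    #EXT-X-ENDLIST

The #EXT-X-ENDLIST tag marks a finished (video-on-demand) playlist; for a live event it is absent while new segments are still being appended.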

Even though it is called live streaming, there is usually a delay of a minute or so during which the video is converted, the .ts and .m3u8 files are written, and your client refreshes the .m3u8 file.

All of these are static files on your server. During live events, more .ts files are added and the .m3u8 playlist is updated.

Since you tagged this question iOS, it is worth mentioning the relevant App Store rules:

  • You can only use progressive download for videos smaller than 10 minutes or 5 MB per 5 minutes. Otherwise, you must use HTTP Live Streaming.
  • If you use HTTP Live Streaming, you must provide at least one stream at 64 Kbps or lower bandwidth (the low-bandwidth stream can be audio-only or audio with a still image); a sketch of such a rendition follows this list.
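
As an illustration of that last rule, the low-bandwidth rendition could be an audio-only transport stream. This is only a sketch (input and output file names are hypothetical), reusing the audio settings from the conversion script later in this answer:

    # Hypothetical sketch: drop the video track (-vn) and keep ~64 kbps audio,
    # using the same audio codec and settings as the video script further down.
    ffmpeg -y -i input.mp4 -vn -acodec libmp3lame -ar 48000 -ab 64k -f mpegts audio-only.ts

The resulting .ts file can then be segmented with mediafilesegmenter in the same way as the video stream.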

Example

Get Streaming Tools

To get the HTTP Live Streaming Tools, download them from the Apple Developer site. The package installs these command-line tools:

  /usr/bin/mediastreamsegmenter
  /usr/bin/mediafilesegmenter
  /usr/bin/variantplaylistcreator
  /usr/bin/mediastreamvalidator
  /usr/bin/id3taggenerator

Descriptions from the help pages:

  • Media Stream Segmenter: Creates segments from MPEG-2 transport streams for streaming over HTTP.
  • Media File Segmenter: Creates segments for streaming over HTTP from media files.
  • Variant Playlist Creator: Creates a variant playlist for stream switching from HTTP Live Streaming segments created by mediafilesegmenter.
  • Media Stream Validator: Validates HTTP Live Streaming streams and servers (see the sketch after this list).
  • ID3 Tag Generator: Creates ID3 tags.
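
As a rough idea of how the validator is used, you can point it at a playlist once it is being served over HTTP (the URL below is made up; set up the web server as described later in this answer first):

    # Hypothetical URL: mediastreamvalidator fetches the playlist and its
    # segments over HTTP and reports any problems it finds with the stream.
    mediastreamvalidator http://example.com/stream/mystream.m3u8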

Create video

Install MacPorts, open Terminal and run sudo port install ffmpeg. Then convert the video to an MPEG-2 transport stream (.ts) using this FFmpeg script:

    # bitrate, width, and height, you may want to change this
    BR=512k
    WIDTH=432
    HEIGHT=240
    input=${1}

    # strip off the file extension
    output=$(echo ${input} | sed 's/\..*//' )

    # works for most videos
    ffmpeg -y -i ${input} -f mpegts -acodec libmp3lame -ar 48000 -ab 64k \
        -s ${WIDTH}x${HEIGHT} -vcodec libx264 -b ${BR} -flags +loop -cmp +chroma \
        -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 0 -refs 0 -coder 0 \
        -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k \
        -maxrate ${BR} -bufsize ${BR} -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 \
        -qmin 30 -qmax 51 -qdiff 4 -level 30 -aspect ${WIDTH}:${HEIGHT} \
        -g 30 -async 2 ${output}-iphone.ts

This will create a single .ts file. Next, we need to split it into segments and create a playlist that lists all of them. We can use Apple's mediafilesegmenter for this:

 mediafilesegmenter -t 10 myvideo-iphone.ts 

This will create one .ts file for every 10 seconds of video, plus a .m3u8 file pointing to all of them.

Set up a web server

To play an .m3u8 file on iOS, we point Mobile Safari at it. Of course, we first have to host the files on a web server. For Safari (or another player) to recognize the files, we also need to add their MIME types. In Apache:

    AddType application/x-mpegURL m3u8
    AddType video/MP2T ts

In lighttpd:

    mimetype.assign = (
        ".m3u8" => "application/x-mpegURL",
        ".ts" => "video/MP2T"
    )

To link this from a web page:

    <html>
      <head>
        <meta name="viewport" content="width=320; initial-scale=1.0; maximum-scale=1.0; user-scalable=0;"/>
      </head>
      <body>
        <video width="320" height="240" src="stream.m3u8" />
      </body>
    </html>

To detect the device orientation, see the article on detecting and setting the iPhone and iPad viewport orientation using JavaScript, CSS and meta tags.

Other things you can do: create versions of the video at different bitrates, embed metadata to be read during playback as notifications, and, of course, have fun programming with MPMoviePlayerController and AVPlayer.

+132

This may help you get started quickly:

    import UIKit
    import MediaPlayer

    class ViewController: UIViewController {

        var streamPlayer: MPMoviePlayerController = MPMoviePlayerController(
            contentURL: NSURL(string: "http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"))

        override func viewDidLoad() {
            super.viewDidLoad()
            streamPlayer.view.frame = self.view.bounds
            self.view.addSubview(streamPlayer.view)
            streamPlayer.fullscreen = true
            // Play the movie!
            streamPlayer.play()
        }
    }

MPMoviePlayerController is deprecated from iOS 9 onwards. We can use AVPlayerViewController or AVPlayer for this purpose. Take a look:

    import AVKit
    import AVFoundation
    import UIKit

AVPlayerViewController:

    override func viewDidAppear(animated: Bool) {
        let videoURL = NSURL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
        let player = AVPlayer(URL: videoURL!)
        let playerViewController = AVPlayerViewController()
        playerViewController.player = player
        self.presentViewController(playerViewController, animated: true) {
            playerViewController.player!.play()
        }
    }

AVPlayer:

    override func viewDidAppear(animated: Bool) {
        let videoURL = NSURL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
        let player = AVPlayer(URL: videoURL!)
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = self.view.bounds
        self.view.layer.addSublayer(playerLayer)
        player.play()
    }

+4
Feb 12 '16 at 5:45

Another explanation, from Cloudinary: http://cloudinary.com/documentation/video_manipulation_and_delivery#http_live_streaming_hls

HTTP Live Streaming (also known as HLS) is an HTTP-based streaming protocol that provides mechanisms that are scalable and adaptable to different networks. HLS works by breaking a video file down into a sequence of small HTTP-based file downloads, each one delivering a short chunk of the video.

As the stream plays, the client player can choose from several alternative streams containing the same material encoded at different data rates. This allows the streaming session to adapt to the available data rate: high-quality playback on networks with high bandwidth, and lower-quality playback on networks where bandwidth is reduced.

At the beginning of the streaming session, the client software downloads the master M3U8 playlist file containing the metadata for the various sub-streams that are available. The client software then decides what to download from the available media files, based on predefined factors such as device type, resolution, data rate, size, and so on.
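
As a rough illustration, that master playlist is itself just a small .m3u8 text file listing the alternative streams and their bandwidths (the paths and bitrates below are made up):

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=64000
    audio-only/prog.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=500000,RESOLUTION=432x240
    low/prog.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720
    high/prog.m3u8

The player picks one of these variant playlists, and can switch between them mid-stream as the measured bandwidth changes.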

+1
May 31 '16 at 12:06


