I'm trying to understand, conceptually, the best way to deliver live streaming audio and video content. I'd like it to be consumable in a web browser with as little proprietary technology as possible. I won't be serving static files over progressive download; these will be real streams, recorded live. How do I broadcast a stream so that playback stays reasonably synchronized with the source, and which protocol is suitable?
Edit:
In my research I found that there are several candidate protocols: RTSP, HTTP streaming, RTMP, and RTP.
HTTP streaming is somewhat unacceptable if you are streaming a live performance/broadcast of some kind, because it relies on TCP (being HTTP-based) and you never drop packets. In a low-bandwidth situation the client can lag significantly behind in playback. ref
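To make that lag concrete, here's a minimal sketch of the HTTP-streaming approach: a tiny Node.js relay that pipes a live encoder feed out to browser clients over an open-ended HTTP response. The encoder URL and ports are made up for illustration; this is not a production setup.

```js
// Minimal sketch of HTTP streaming: relay a (hypothetical) live encoder feed
// at http://localhost:8001/live.mp3 to any browser client that connects.
var http = require('http');

http.createServer(function (req, res) {
  // No Content-Length is set, so the response is an open-ended live feed
  // delivered with chunked transfer encoding over a single TCP connection.
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });

  // Pull from the encoder and push each chunk straight to the client.
  http.get('http://localhost:8001/live.mp3', function (source) {
    source.on('data', function (chunk) {
      // TCP never discards data: if the client's bandwidth dips, chunks
      // queue up rather than being dropped, so playback falls further
      // and further behind the live source.
      res.write(chunk);
    });
    source.on('end', function () {
      res.end();
    });
  });
}).listen(8000);
```

Because everything rides on one reliable connection, a slow client just buffers deeper; there's no mechanism to skip ahead and catch back up with the source.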
RTMP is a proprietary technology that requires a Flash server. Boo on that. The reason I looked at Flash at all is that it's extremely flexible as far as the user is concerned. SoundManager2 provides an excellent JavaScript interface for Flash playback. That's the kind of thing I'd like to find for the client side of the application.
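For what it's worth, the SoundManager2 side is only a few lines. A rough sketch, assuming a hypothetical stream URL and swf path:

```js
// Minimal SoundManager2 sketch; the stream URL and swf path are placeholders.
soundManager.setup({
  url: '/swf/',                 // where SM2's .swf files live (Flash backend)
  onready: function () {
    // A live MP3 stream URL can be passed to createSound() like any file URL.
    var live = soundManager.createSound({
      id: 'liveStream',
      url: 'http://example.com:8000/live.mp3'
    });
    live.play();
  }
});
```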
RTSP/RTP is what Microsoft switched to, abandoning their MMS protocol. RTSP is a control protocol. It's similar to HTTP, with a few distinguishing differences: the server can also talk back to the client, and there are additional verbs such as PAUSE. It's also a stateful protocol, maintained via a session identifier. RTP is the protocol that delivers the payload (the encoded audio or video). There are several open source implementations, one of which is maintained by Apple here. It looks like it can do what I want, and quite a few players support it. Judging from this page here, it seems suitable for live streaming.
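To give a feel for the protocol, here's roughly what an RTSP session looks like on the wire. This is a hand-written illustration with a made-up URL, ports, and session id, not a capture from a real server; note the HTTP-like verbs, the Session header that makes it stateful, and the Transport negotiation that sets up separate RTP ports for the actual media.

```
C->S: DESCRIBE rtsp://example.com/live.sdp RTSP/1.0
      CSeq: 1
      Accept: application/sdp

S->C: RTSP/1.0 200 OK
      CSeq: 1
      Content-Type: application/sdp
      (SDP body describing the audio/video tracks)

C->S: SETUP rtsp://example.com/live.sdp/trackID=1 RTSP/1.0
      CSeq: 2
      Transport: RTP/AVP;unicast;client_port=8000-8001

S->C: RTSP/1.0 200 OK
      CSeq: 2
      Session: 12345678
      Transport: RTP/AVP;unicast;client_port=8000-8001;server_port=9000-9001

C->S: PLAY rtsp://example.com/live.sdp RTSP/1.0
      CSeq: 3
      Session: 12345678

      ... RTP packets carrying the encoded audio/video flow on the negotiated ports ...

C->S: PAUSE rtsp://example.com/live.sdp RTSP/1.0
      CSeq: 4
      Session: 12345678
```

The payload itself never travels over the RTSP connection; RTP (typically over UDP) carries it, which is what lets a client stay near-live instead of stalling on retransmissions.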
Thanks, Josh