I think I'm very close to getting my Java server application to talk to a browser page via WebRTC, but I can't get it to work. I feel like I'm missing something small, so I'm hoping someone here has a suggestion.
I carefully studied the WebRTC examples - the Java unit test ( org.webrtc.PeerConnectionTest ) and the Android example application ( trunk/talk/examples/android ). Based on what I learned there, I built a Java application that uses WebSockets for signaling and tries to send a video stream to Chrome.
The problem is that no video shows up in the browser, even though all my code (both the Javascript and the Java) runs in the order I expect and hits all the logging statements I expect. There is some suspicious output from libjingle's own code in the console log, but I'm not sure what to make of it. I have marked the suspicious log lines with ">>" below. For example, it looks like the video port allocators are destroyed soon after they are created, so something is clearly wrong there. Also, "Changing video state, recv=1 send=0" seems wrong, since the Java side is supposed to send video, not receive it. Perhaps I am misusing the OfferToReceiveVideo option?
If you look at the logs below, you will see that the WebSocket connection with the browser works fine, and that I can successfully send the SDP offer to the browser and receive the SDP answer back. Setting the local and remote descriptions on the PeerConnections also appears to succeed. The HTML5 video element gets its source set to a blob URL, as you would expect. So what could I be missing? Do I need to do anything with ICE candidates, even though my client and server are currently on the same machine? (I have sketched, after my Javascript below, what I think that exchange would have to look like.)
Any advice would be greatly appreciated!
SDP Messages (from the Chrome Javascript Console)
1.134: Java Offer:
v=0
o=- 5893945934600346864 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE video
a=msid-semantic: WMS JavaMediaStream
m=video 1 RTP/SAVPF 100 116 117
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:dJxTlMlXy7uASrDU
a=ice-pwd:r8BRkXVnc4dqCABUDhuRjpp7
a=ice-options:google-ice
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http:
This looks fine to me; the Java offer includes my video stream.
Log output from the native code (libjingle)
(Suspicious lines are marked with ">>")
Camera '/dev/video0' started with format YUY2 640x480x30, elapsed time 59 ms
Ignored line: c=IN IP4 0.0.0.0
NACK enabled for channel 0
NACK enabled for channel 0
Created channel for video
Jingle:Channel[video|1|__]: NULL DTLS identity supplied. Not doing DTLS
Jingle:Channel[video|2|__]: NULL DTLS identity supplied. Not doing DTLS
Session:5893945934600346864 Old state:STATE_INIT New state:STATE_SENTINITIATE Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Setting local video description
AddSendStream {id:JavaMediaStream_v0;ssrcs:[3720473526];ssrc_groups:;cname:nul6R21KmwAms3Ge;sync_label:JavaMediaStream}
Add send ssrc: 3720473526
>> Warning(webrtcvideoengine.cc:2704): SetReceiverBufferingMode(0, 0) failed, err=12606
Changing video state, recv=0 send=0
Transport: video, allocating candidates
Transport: video, allocating candidates
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Udp
Jingle:Port[:1:0::Net[eth0:192.168.0.0/24]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Added port to allocator
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Udp
Jingle:Port[:1:0::Net[tun0:192.168.128.6/32]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Added port to allocator
Ignored line: c=IN IP4 0.0.0.0
Warning(webrtcvideoengine.cc:2309): GetStats: sender information not ready.
Jingle:Channel[video|1|__]: Other side didn't support DTLS.
Jingle:Channel[video|2|__]: Other side didn't support DTLS.
Enabling BUNDLE, bundling onto transport: video
Channel enabled
>> Changing video state, recv=1 send=0
Session:5893945934600346864 Old state:STATE_SENTINITIATE New state:STATE_RECEIVEDACCEPT Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Setting remote video description
Hybrid NACK/FEC enabled for channel 0
Hybrid NACK/FEC enabled for channel 0
SetSendCodecs() : selected video codec VP8/ 1280x720x30fps@2000kbps (min=50kbps, start=300kbps)
Video max quantization: 56
VP8 number of temporal layers: 1
VP8 options : picture loss indication = 0, feedback mode = 0, complexity = normal, resilience = off, denoising = 0, error concealment = 0, automatic resize = 0, frame dropping = 1, key frame interval = 3000
WARNING: no real random source present!
SRTP activated with negotiated parameters: send cipher_suite AES_CM_128_HMAC_SHA1_80 recv cipher_suite AES_CM_128_HMAC_SHA1_80
Changing video state, recv=1 send=0
Session:5893945934600346864 Old state:STATE_RECEIVEDACCEPT New state:STATE_INPROGRESS Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Relay
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Relay
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Tcp
Jingle:Port[:1:0:local:Net[eth0:192.168.0.0/24]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Added port to allocator
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Tcp
Jingle:Port[:1:0:local:Net[tun0:192.168.128.6/32]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Added port to allocator
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=SslTcp
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=SslTcp
All candidates gathered for video:1:0
Transport: video, component 1 allocation complete
Transport: video allocation complete
Candidate gathering is complete.
Capture delay changed to 120 ms
Captured frame size 640x480. Expected format YUY2 640x480x30
Capture size changed : selected video codec VP8/ 640x480x30fps@2000kbps (min=50kbps, start=300kbps)
Video max quantization: 56
VP8 number of temporal layers: 1
VP8 options : picture loss indication = 0, feedback mode = 0, complexity = normal, resilience = off, denoising = 0, error concealment = 0, automatic resize = 1, frame dropping = 1, key frame interval = 3000
VAdapt Frame: 0 / 300 Changes: 0 Input: 640x480 Scale: 1 Output: 640x480 Changed: false
>> Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Port deleted
>> Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Removed port from allocator (3 remaining)
Removed port from p2p socket: 3 remaining
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Port deleted
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Removed port from allocator (2 remaining)
Removed port from p2p socket: 2 remaining
>> Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Port deleted
>> Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Removed port from allocator (1 remaining)
Removed port from p2p socket: 1 remaining
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Port deleted
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Removed port from allocator (0 remaining)
Removed port from p2p socket: 0 remaining
HTML
<html lang="en">
<head>
    <title>Web Socket Signalling</title>
    <link rel="stylesheet" href="css/socket.css">
    <script src="js/socket.js"></script>
</head>
<body>
    <h2>Response from Server</h2>
    <textarea id="responseText"></textarea>
    <h2>Video</h2>
    <video id="remoteVideo" autoplay></video>
</body>
</html>
Javascript
(function() {
    var remotePeerConnection;
    var sdpConstraints = {
        'mandatory' : {
            'OfferToReceiveAudio' : false,
            'OfferToReceiveVideo' : true
        }
    };

    var Sock = function() {
        var socket;

        if (!window.WebSocket) {
            window.WebSocket = window.MozWebSocket;
        }

        if (window.WebSocket) {
            socket = new WebSocket("ws://localhost:8080/websocket");
            socket.onopen = onopen;
            socket.onmessage = onmessage;
            socket.onclose = onclose;
        } else {
            alert("Your browser does not support Web Socket.");
        }

        function onopen(event) {
            getTextAreaElement().value = "Web Socket opened!";
        }

        // The SDP offer arrives from the Java server over the WebSocket.
        function onmessage(event) {
            appendTextArea(event.data);
            var sdpOffer = new RTCSessionDescription(JSON.parse(event.data));
            remotePeerConnection = new webkitRTCPeerConnection(null);
            remotePeerConnection.onaddstream = gotRemoteStream;
            trace("Java Offer: \n" + sdpOffer.sdp);
            remotePeerConnection.setRemoteDescription(sdpOffer);
            remotePeerConnection.createAnswer(gotRemoteDescription, onCreateSessionDescriptionError, sdpConstraints);
        }

        function onCreateSessionDescriptionError(error) {
            console.log('Failed to create session description: ' + error.toString());
        }

        // Send the SDP answer back to the Java server.
        function gotRemoteDescription(answer) {
            remotePeerConnection.setLocalDescription(answer);
            trace("Browser Answer: \n" + answer.sdp);
            socket.send(JSON.stringify(answer));
        }

        // Attach the incoming stream to the video element.
        function gotRemoteStream(event) {
            var remoteVideo = document.getElementById("remoteVideo");
            remoteVideo.src = URL.createObjectURL(event.stream);
            trace("Received remote stream");
        }

        function onclose(event) {
            appendTextArea("Web Socket closed");
        }

        function appendTextArea(newData) {
            var el = getTextAreaElement();
            el.value = el.value + '\n' + newData;
        }

        function getTextAreaElement() {
            return document.getElementById('responseText');
        }

        function trace(text) {
            console.log((performance.now() / 1000).toFixed(3) + ": " + text);
        }
    };

    window.addEventListener('load', function() {
        new Sock();
    }, false);
})();
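To make my ICE question concrete, this is roughly the candidate exchange I imagine might be missing, sketched on the browser side. It is only a guess, not what my code currently does: the 'candidate' message shape and the assumption that the Java side would handle it are my own inventions.

// Sketch only (my assumption, not my current code): forward each local ICE candidate
// to the Java server over the same signalling WebSocket.
remotePeerConnection.onicecandidate = function(event) {
    if (event.candidate) {
        socket.send(JSON.stringify({
            type : 'candidate',
            sdpMid : event.candidate.sdpMid,
            sdpMLineIndex : event.candidate.sdpMLineIndex,
            candidate : event.candidate.candidate
        }));
    }
};

// And when a candidate message arrives from the Java side, add it to the peer connection.
function addRemoteCandidate(msg) {
    remotePeerConnection.addIceCandidate(new RTCIceCandidate({
        sdpMid : msg.sdpMid,
        sdpMLineIndex : msg.sdpMLineIndex,
        candidate : msg.candidate
    }));
}

Presumably the Java side would have to do the mirror image of this (send its candidates from PeerConnection.Observer.onIceCandidate and feed the browser's candidates into peerConnection.addIceCandidate), but I have not implemented any of that yet.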
Java server
public class PeerConnectionManager {

    public void createOffer() {
        peerConnection = factory.createPeerConnection(
                new ArrayList<PeerConnection.IceServer>(),
                new MediaConstraints(),
                new PeerConnectionObserverImpl());