When does socket.io use polling instead of websockets?

I am new to socket.io and wrote my first application in node / express / socket.io. Everything works fine on my nginx server. I want to publish the application to the public, but I fear it simply won't work for many people. Several friends tested it and everything went smoothly (it is a fairly simple application). Here is my concern: right now every connection seems to be using webSockets, which is what I want. But will my application sometimes fall back to "polling" because of something on the client's end? If so, how does socket.io decide when to use polling and when to use webSocket (is it based on browser / version, connection, or something else)? I'm sure it uses webSocket whenever possible, but are there conditions that will knock it down to polling? Also, is there a way to test my application using polling, to find out whether it still works?

I can send the code, but I think this is a general question about how socket.io works.

1 answer

The only time a client will fall back to ajax polling (assuming your server supports it, which it does by default) is when the client browser does not support webSockets (for example, a very old browser) or, possibly, when some proxy in the client's network path does not support webSockets.

webSockets are supported in IE10 + and all recent releases of other browsers.

So in practice it is really only IE8, IE9, or a badly behaved proxy where you might not see webSocket support on the client.

There are no other conditions (other than lack of support) that will "knock down" the connection to polling.


To test your application over polling, you can restrict the socket.io client to the xhr-poll transport so that it never attempts to use webSocket at all.
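A minimal sketch of forcing polling on the client, assuming the standard socket.io client API (the URL is a placeholder for your own server):

```javascript
// Client side: restrict socket.io to HTTP long-polling only,
// so the connection never upgrades to webSocket.
// "https://example.com" is a placeholder for your own server URL.
const socket = io("https://example.com", {
  transports: ["polling"]   // default is ["polling", "websocket"]
});

socket.on("connect", () => {
  // With the option above, this should always report "polling".
  console.log("connected via:", socket.io.engine.transport.name);
});
```

Running your app with this option lets you verify that everything still works for clients that never get a webSocket connection.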


Also, because a webSocket connection starts out as a regular HTTP request and is then "upgraded" to the webSocket protocol, a proxy that only understands plain HTTP can break the upgrade. socket.io detects that situation and falls back to polling instead of webSocket.
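If you want to observe which transport is actually in use, and whether the upgrade to webSocket succeeded, a sketch like this (assuming the standard socket.io client API; the URL is a placeholder) can log it:

```javascript
const socket = io("https://example.com");  // placeholder URL

socket.on("connect", () => {
  // socket.io usually connects with "polling" first and then
  // tries to upgrade to webSocket in the background.
  console.log("initial transport:", socket.io.engine.transport.name);

  socket.io.engine.on("upgrade", (transport) => {
    // Fires only if the upgrade (normally to webSocket) succeeded.
    console.log("upgraded to:", transport.name);
  });
});
```

If the `upgrade` event never fires for some clients, that is a sign a proxy between them and your server is blocking webSockets and they are staying on polling.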

