When creating a TCP client with the socket API, a port on the local host is used to connect to the TCP server.
While the connection is open, that port is not available for another application on the same host to bind to and act as a TCP server.
Since the client's port is dynamically determined (an ephemeral port chosen by the OS), it could happen to be exactly the port that my application wants to use as a server.
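For illustration, here is a minimal sketch of the situation described above: the client never calls bind(), so the kernel assigns an ephemeral local port during connect(), and getsockname() shows which port was chosen. The server address 127.0.0.1:8080 is a placeholder assumption, not part of the original question.

```c
/* Client that lets the OS pick its local (ephemeral) port, then
 * prints which port was chosen. Assumes a server is already
 * listening on 127.0.0.1:8080 (hypothetical address). */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <sys/socket.h>

int main(void)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);

    struct sockaddr_in server;
    memset(&server, 0, sizeof(server));
    server.sin_family = AF_INET;
    server.sin_port = htons(8080);
    inet_pton(AF_INET, "127.0.0.1", &server.sin_addr);

    /* No bind() call here: the kernel assigns an ephemeral
     * local port as part of connect(). */
    if (connect(fd, (struct sockaddr *)&server, sizeof(server)) < 0) {
        perror("connect");
        return 1;
    }

    struct sockaddr_in local;
    socklen_t len = sizeof(local);
    getsockname(fd, (struct sockaddr *)&local, &len);
    printf("kernel chose local port %d\n", ntohs(local.sin_port));

    close(fd);
    return 0;
}
```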
Is it true that a TCP client dynamically selects a port and thereby prevents other programs from acting as a server on that port?
Can the client control which port it uses, to make sure it does not occupy a port that another program needs?
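To make the second question concrete, here is a sketch of what I mean by controlling the client's port: calling bind() on the client socket before connect() to request a fixed local port. The local port 5555 is chosen arbitrarily, and the server address is the same placeholder as above. Is this the intended mechanism?

```c
/* Client that binds its socket to a fixed local port (5555, chosen
 * arbitrarily) before connecting, so the kernel does not pick an
 * ephemeral port for it. Assumes a server on 127.0.0.1:8080. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <sys/socket.h>

int main(void)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);

    struct sockaddr_in local;
    memset(&local, 0, sizeof(local));
    local.sin_family = AF_INET;
    local.sin_addr.s_addr = htonl(INADDR_ANY);
    local.sin_port = htons(5555);   /* fixed local port */

    /* bind() before connect() fixes the client's local port. */
    if (bind(fd, (struct sockaddr *)&local, sizeof(local)) < 0) {
        perror("bind");
        return 1;
    }

    struct sockaddr_in server;
    memset(&server, 0, sizeof(server));
    server.sin_family = AF_INET;
    server.sin_port = htons(8080);
    inet_pton(AF_INET, "127.0.0.1", &server.sin_addr);

    if (connect(fd, (struct sockaddr *)&server, sizeof(server)) < 0) {
        perror("connect");
        return 1;
    }

    close(fd);
    return 0;
}
```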
Ivan Novick