I've created a server for my game, but for some unknown reason the server (most likely) stops receiving packets while the client is still sending them.
I've tried a couple of examples from MSDN, but the server still just stops receiving.
This is what I think happens:
1. Client connects to the server.
2. Server sends packets in response.
3. Client responds to those packets.
4. Server sends a lot of packets (one packet at a time).
5. Client tries to send a packet, but it somehow fails (it does send packets, though!).
6. Server waits and never receives another packet.
7. Client triggers the timeout code on the server side and gets disconnected from the server.
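To be clear about step 7, the timeout on the server side is essentially the following pattern. This is a simplified sketch, not my actual code: ClientSession, mLastReceiveTime, TimeoutSeconds, OnPacketReceived and Disconnect are all placeholder names.

using System;

// Sketch of the per-client timeout check described in step 7.
// All names here are placeholders, not the real server code.
class ClientSession
{
    private DateTime mLastReceiveTime = DateTime.UtcNow;
    private const double TimeoutSeconds = 30.0;   // assumed timeout window

    // Called whenever a packet arrives from this client.
    public void OnPacketReceived()
    {
        mLastReceiveTime = DateTime.UtcNow;
    }

    // Called periodically (e.g. from a timer) to drop silent clients.
    public void CheckTimeout()
    {
        if ((DateTime.UtcNow - mLastReceiveTime).TotalSeconds > TimeoutSeconds)
            Disconnect();
    }

    private void Disconnect()
    {
        // Close the socket and remove the session; details omitted.
    }
}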
Does anyone have an idea how and why this can happen?
Here are the receive functions I use:
BeginReceive (SocketInfo includes data such as the amount of data to receive, the state of the packet (header or content), and the buffer; see the sketch after the code below):
public void BeginReceive(SocketInfo socketInfo)
{
    if (mDisconnected != 0) return;

    var args = new SocketAsyncEventArgs();
    args.Completed += (s, a) => EndReceive(a);
    args.UserToken = socketInfo;
    args.SetBuffer(socketInfo.DataBuffer, socketInfo.Index, socketInfo.DataBuffer.Length - socketInfo.Index);

    // Start the receive; mSocket is this session's connected Socket (assumed field).
    // ReceiveAsync returns false when the operation completed synchronously, in
    // which case Completed is never raised and EndReceive must be called directly.
    if (!mSocket.ReceiveAsync(args))
        EndReceive(args);
}
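For reference, SocketInfo looks roughly like this. Only DataBuffer and Index actually appear in the snippet above; PacketState, State and HeaderSize are stand-in names for the fields described earlier:

// Sketch of the SocketInfo token: it carries the receive buffer, the
// current position in it, and whether we are reading a packet header
// or its content.
public enum PacketState { Header, Content }

public class SocketInfo
{
    public const int HeaderSize = 4;    // assumed header length

    public byte[] DataBuffer;           // receive buffer
    public int Index;                   // bytes received so far
    public PacketState State;           // reading header or content

    public SocketInfo(int bufferSize)
    {
        DataBuffer = new byte[bufferSize];
        Index = 0;
        State = PacketState.Header;
    }
}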
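EndReceive is the completion handler wired up in BeginReceive. It isn't shown above, but it follows the usual SocketAsyncEventArgs completion pattern; this is a simplified sketch (it assumes the SocketInfo fields sketched above, needs using System.Net.Sockets, and ProcessPacket and Disconnect are placeholders for my real handling):

private void EndReceive(SocketAsyncEventArgs args)
{
    var socketInfo = (SocketInfo)args.UserToken;

    // Zero bytes or a socket error means the peer went away.
    if (args.BytesTransferred <= 0 || args.SocketError != SocketError.Success)
    {
        Disconnect();
        return;
    }

    socketInfo.Index += args.BytesTransferred;

    if (socketInfo.Index < socketInfo.DataBuffer.Length)
    {
        // Partial read: re-arm the receive for the rest of the buffer.
        // Forgetting this re-arm is a classic way for a server to
        // silently stop receiving.
        BeginReceive(socketInfo);
        return;
    }

    // A full header or content block has arrived: hand it off, then
    // start over with a fresh header read.
    ProcessPacket(socketInfo);                        // placeholder
    BeginReceive(new SocketInfo(SocketInfo.HeaderSize));
}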