Is there any practical limit on the size of batch calls?

For example, we are sending 2500 pairs of curves to be filleted in a single call to Curve.CreateFillet, and we are getting unpredictable behaviour from our server: sometimes the call succeeds, other times it crashes the server.
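
In outline, the pattern is something like the sketch below (heavily simplified, not our production code; CollectPairs and SendFilletBatch are placeholders for our actual request code, not a real API):

```csharp
using System.Collections.Generic;
using Rhino.Geometry;

static class FilletBatchSketch
{
    // Placeholder: stands in for whatever serializes the curve pairs and
    // posts them to the server's Curve.CreateFillet endpoint in one request.
    static IEnumerable<Arc> SendFilletBatch(IReadOnlyList<(Curve, Curve)> pairs, double radius)
        => throw new System.NotImplementedException();

    static void Run(IReadOnlyList<(Curve, Curve)> pairs, double radius)
    {
        // All ~2500 pairs go up in a single request; it sometimes succeeds
        // and sometimes takes the Compute process down.
        var fillets = SendFilletBatch(pairs, radius);
    }
}
```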

5/5/2021 5:34:50 AM|Fatal|<>c__DisplayClass9.<ReadBytesAsync>b__8|System.ObjectDisposedException: Cannot access a disposed object.
                          Object name: 'SslStream'.
                             at System.Net.Security.SslState.CheckThrow(Boolean authSuccessCheck, Boolean shutdownCheck)
                             at System.Net.Security.SslState.get_SecureStream()
                             at System.Net.Security.SslStream.EndRead(IAsyncResult asyncResult)
                             at WebSocketSharp.Ext.<>c__DisplayClass9.<ReadBytesAsync>b__8(IAsyncResult ar)

There’s no explicit limit, other than available memory and request/response timeouts. How long does the request that throws this exception take?
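
If individual requests are running long enough to approach a timeout, one workaround is to split the batch into smaller chunks so that each request stays short. A minimal sketch (the names are illustrative; the `send` delegate is whatever code currently posts one batch of pairs to the Compute server):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class ComputeBatching
{
    // Send a large batch as several smaller requests so that no single
    // request runs long enough to hit a request/response timeout.
    public static List<TResult> SendInChunks<TItem, TResult>(
        IReadOnlyList<TItem> items,
        int chunkSize,
        Func<IReadOnlyList<TItem>, IEnumerable<TResult>> send)
    {
        var results = new List<TResult>();
        for (int i = 0; i < items.Count; i += chunkSize)
        {
            var chunk = items.Skip(i).Take(chunkSize).ToList();
            results.AddRange(send(chunk));
        }
        return results;
    }
}

// Usage (illustrative): ten requests of 250 pairs instead of one of 2500.
// var fillets = ComputeBatching.SendInChunks(pairs, 250, chunk => SendFilletBatch(chunk, radius));
```

Smaller chunks also limit how much work is lost if one request does fail.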

It varies, for reasons unknown to us, from 10 to 50+ seconds; 15-20 seconds is typical.

When it crashes, we have to restart the Compute process on the server.