Bulk Requests

It’s a bit server-stressful/client-wasteful to call POST /Rhino/Geometry/Point3d/New 100 times if I want 100 points; not to mention things like Promise.all() become a real pain to wrangle.

I did try calling it with [ [ 1, 11, 21 ], [12, 1112, 3112] ] and I’m optimistically getting 200s, but no actual response body.

Would it be possible to have some sort of bulk endpoint for these routes too, where I can pass an array of args and get the results back in one go?


Sounds reasonable; I’ll see what we can do. My first thought would be to support a querystring parameter, something like multiple, on the endpoints so I know the input JSON array should be processed as a list of calls.
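The dispatch idea can be sketched like this — a handler that treats the body as one argument list normally, and as an array of argument lists when the flag is set (the handler names here are stand-ins, not the actual compute server code):

```python
import json

def circle_new(center, radius):
    # Hypothetical single-call handler: returns a dict describing a circle.
    return {"Center": center, "Radius": radius}

def handle_request(body, multiple=False):
    # With ?multiple set, the body is a JSON array of argument lists and
    # each entry becomes one call; otherwise it is a single argument list.
    args = json.loads(body)
    if multiple:
        return json.dumps([circle_new(*a) for a in args])
    return json.dumps(circle_new(*args))

# Single call: one argument list in, one result out.
single = handle_request('[{"X":1.0,"Y":2.0,"Z":3.0}, 12]')

# Bulk call: an array of argument lists in, an array of results out.
bulk = handle_request(
    '[[{"X":1.0,"Y":2.0,"Z":3.0},12],[{"X":1.0,"Y":2.0,"Z":4.0},30]]',
    multiple=True)
```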

The multiple querystring parameter is now supported for all endpoints. Here is a curl example for creating two circles:

curl -H "api_token: steve@mcneel.com" -H "Content-Type: application/json" \
  -d '[[{"X":1.0,"Y":2.0,"Z":3.0},12],[{"X":1.0,"Y":2.0,"Z":4.0},30]]'

Results are just returned as a JSON array. I didn’t take the time to do any optimizations on the server for this, but I plan to. Multiple calls for things like intersections should really be run as tasks, one per item in the input array. This is not implemented at the moment and everything is very serial on the server.
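Since the response is a plain JSON array in request order, the client can pair each result with the arguments that produced it by index. A small sketch of that bookkeeping (payload shape taken from the curl example; the reply here is a stand-in for the server):

```python
import json

# Argument lists for each call, in the same shape as the curl payload.
calls = [
    [{"X": 1.0, "Y": 2.0, "Z": 3.0}, 12],
    [{"X": 1.0, "Y": 2.0, "Z": 4.0}, 30],
]
payload = json.dumps(calls)

# Stand-in for the server's reply: one result per call, in request order.
reply = json.dumps([{"Center": c, "Radius": r} for c, r in calls])

# Request order == response order, so zip() links inputs to outputs.
paired = list(zip(calls, json.loads(reply)))
```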


This is absolutely lovely!!!

Hello Steve,

I’ve got the BulkRequestService on a feature branch.

The timer is just a first pass. It works, but would be better implemented in closer coordination with the enqueueing and dequeueing, using some Tasks and timeouts. I got the 35 seconds down to 15 seconds, and the time savings seem to increase when adding more requests.


It enables writing code just as one would without the service, but capitalizes on situations where the same URI comes up again and again in asynchronous/parallel code. It automatically routes requests to the single or multiple (with querystring) endpoints.
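The batching idea can be sketched as follows: calls to the same URI made within a short window are collected and flushed together as one bulk request, with each caller getting a future for its own result. All names and the flush strategy here are illustrative, not the feature-branch code:

```python
import threading
from collections import defaultdict
from concurrent.futures import Future

class BulkRequestService:
    """Sketch: batch calls to the same URI within a small time window."""

    def __init__(self, post, window=0.05):
        self._post = post        # post(uri, arg_lists) -> list of results
        self._window = window
        self._lock = threading.Lock()
        self._pending = defaultdict(list)   # uri -> [(args, Future), ...]
        self._timers = {}

    def request(self, uri, args):
        fut = Future()
        with self._lock:
            self._pending[uri].append((args, fut))
            if uri not in self._timers:
                t = threading.Timer(self._window, self._flush, args=(uri,))
                self._timers[uri] = t
                t.start()
        return fut

    def _flush(self, uri):
        with self._lock:
            batch = self._pending.pop(uri, [])
            self._timers.pop(uri, None)
        if not batch:
            return
        # In the real service this is where single-vs-?multiple routing
        # would happen; this sketch always posts the whole batch at once.
        results = self._post(uri, [args for args, _ in batch])
        for (_, fut), res in zip(batch, results):
            fut.set_result(res)

calls_seen = []
def fake_post(uri, arg_lists):
    # Records how many calls got batched together, then fakes results.
    calls_seen.append(len(arg_lists))
    return [sum(args) for args in arg_lists]

svc = BulkRequestService(fake_post, window=0.05)
futures = [svc.request("/Rhino/Geometry/Point3d/New", [i, i]) for i in range(5)]
results = [f.result(timeout=2) for f in futures]
```

Callers just await their own future, so code reads the same as making individual requests.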


Thanks Nathan, I’ll check this out

Thanks for typing this up. I spent a bunch of time thinking about this approach last night.

I see what you are doing, but boy does it get complicated, and I can’t tell how you would link the results back to whatever was calling them. The nice thing is that the compute server API allows this pattern for bulk uploads. I definitely want to support “multiple”-style calls from the C# SDK, but at the same time I really want to keep this as simple as possible.

Going off of your concept of creating pipelines of objects, I’m playing around with “Bulk” versions of the compute functions which return an object that encapsulates the function call, JSON data, and return information. I implemented this as a ComputeBlock generic data structure which holds on to the function name and JSON data. The compute functions return ComputeBlock objects, and then you pass an array of these to the ComputeServer class for processing.

This technique got me down from 45 seconds in “serial” mode to around 8 seconds using ComputeBlocks. Essentially the same thing as you were doing, but hopefully with some of the complexity out of the way.
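The SDK in question is C#, but the shape of the ComputeBlock idea can be sketched in Python. Everything here (the node layout, the endpoint string, the process function) is a stand-in, not the actual feature-branch API:

```python
import json
from dataclasses import dataclass

@dataclass
class ComputeBlock:
    # Captures one deferred call: the function (endpoint) name and its
    # JSON-encoded arguments. 'result' is filled in after processing.
    function: str
    data: str
    result: object = None

def circle_block(center, radius):
    # A "Bulk" variant of a compute function: instead of calling the
    # server, it returns a ComputeBlock describing the call.
    return ComputeBlock("Rhino/Geometry/Circle/New",
                        json.dumps([center, radius]))

def process(blocks, post):
    # Groups blocks by function name, sends each group as one bulk call,
    # then writes results back onto the blocks, preserving order.
    by_fn = {}
    for b in blocks:
        by_fn.setdefault(b.function, []).append(b)
    for fn, group in by_fn.items():
        results = post(fn, [json.loads(b.data) for b in group])
        for b, r in zip(group, results):
            b.result = r
    return blocks

def fake_post(fn, arg_lists):
    # Stand-in for the server: echoes back each call's radius argument.
    return [args[1] for args in arg_lists]

blocks = [circle_block({"X": 0, "Y": 0, "Z": 0}, r) for r in (12, 30)]
process(blocks, fake_post)
```

Because each block carries its own result slot, linking outputs back to the originating call is automatic.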

Here’s the feature branch for ComputeBlocks based on your prototype.

I’m autogenerating the RhinoCompute.cs file from our internal RhinoCommon source and can probably automatically create versions of all of the functions that return a ComputeBlock, alongside the current serial return style.


I’m also going to start investigating getting the server computation portion of ?multiple=true requests to be run in parallel. This won’t make much of a difference on the wimpy machine that we are currently using for testing, but if we scale up to a larger core count…
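The server-side change being described amounts to swapping a plain loop for one task per array item. A pure-Python stand-in for both modes (not the compute server’s code; the key point is that task-based map() still preserves response order):

```python
from concurrent.futures import ThreadPoolExecutor

def handle_multiple_serial(handler, arg_lists):
    # Current behavior: each call in the array runs one after another.
    return [handler(*args) for args in arg_lists]

def handle_multiple_parallel(handler, arg_lists):
    # Candidate behavior: one task per array item; pool.map preserves
    # the input order, so the response array is unchanged.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda a: handler(*a), arg_lists))

def add_pair(a, b):
    # Stand-in for an expensive geometry operation.
    return a + b

serial = handle_multiple_serial(add_pair, [[1, 2], [3, 4]])
parallel = handle_multiple_parallel(add_pair, [[1, 2], [3, 4]])
```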

Not sure if this is useful, but you could serialize expression trees to JSON and run them on the server (expression trees are stateless). You would be able to write queries in LINQ that would be evaluated on the server, possibly avoiding unnecessary back-and-forth of data.
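A toy version of that idea: because an expression tree is stateless data, it can round-trip through JSON and be evaluated on the other side. The node schema below is invented for illustration; .NET expression trees are far richer:

```python
import json

def const(v):
    return {"op": "const", "value": v}

def add(l, r):
    return {"op": "add", "left": l, "right": r}

def mul(l, r):
    return {"op": "mul", "left": l, "right": r}

def evaluate(node):
    # Server-side interpreter: walks the deserialized tree. The client
    # ships only data, never live objects.
    if node["op"] == "const":
        return node["value"]
    l, r = evaluate(node["left"]), evaluate(node["right"])
    return l + r if node["op"] == "add" else l * r

# Client builds (2 + 3) * 4, serializes it; server evaluates it.
wire = json.dumps(mul(add(const(2), const(3)), const(4)))
result = evaluate(json.loads(wire))
```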


Rather than coming up with a new language (expression trees), what about posting geometry + a python script, and sending back the results of the script?
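A deliberately naive sketch of the script-posting idea, just to show the request/response shape — the inputs/output names are invented, and a real service would need sandboxing, which this has none of:

```python
import json

def run_script(script, geometry_json):
    # Bind the posted geometry to a well-known name, run the script,
    # and return whatever it assigns to 'output'. NO sandboxing here.
    scope = {"inputs": json.loads(geometry_json)}
    exec(script, scope)
    return json.dumps(scope.get("output"))

script = "output = [p['X'] + p['Y'] for p in inputs]"
result = run_script(script, '[{"X": 1, "Y": 2}, {"X": 3, "Y": 4}]')
```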
