Multiple parallel API web calls in C#

Hi all,
I’m trying to write a C# script that makes multiple URL API requests with “async” and “await”, so thousands of requests would be done “simultaneously” instead of one after the other.

The non-parallel method works.
(I’m sorry, but during editing I lost the string comment giving credit to the original author of these code fragments… it was someone on the old GH forum…)
But it starts each URL API request after the previous one has ended (in a for loop). Not what I want.

In general, I hope someone can help with this… I’ve lost many hours already, and now I’m struggling with…
the script editor removing keywords!!!

Usually the C# script editor “cleans up” the script by adding or removing spaces, and that’s OK.
Here, instead, it completely deletes the “await” keyword every time you enter the editor!! (see comment inside the script)
Is this a bug?

Anyhow, that script works only with “that” slider set at “2”; if I make the lists larger it makes the whole Rhino environment (or GH, who knows…) freeze.

Hoping for help from some of you experts! (16.4 KB)

It’s not non-parallel, and it works, and therefore it’s not making “each url api request after the previous one ended”.

It’s actually quite fast (~75ms / 4 requests) and if you press F5 several times you will see that each request (or, results) comes in arbitrary order, which is because of the asynchronous Parallel.For-loop being used.
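The out-of-order results mentioned above are easy to reproduce outside Grasshopper. A minimal console sketch (my own illustration, not from the attached script) showing that Parallel.For iterations complete in arbitrary order:

```csharp
using System;
using System.Threading.Tasks;

class ParallelOrderDemo
{
  static void Main()
  {
    // Iterations are scheduled across worker threads, so the
    // printed order typically differs from 0..7 and varies per run.
    Parallel.For(0, 8, i => Console.WriteLine("iteration " + i));
  }
}
```

This is why the request results arrive in a different order each time F5 is pressed, even though each result still lands at its own index when stored in an array.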

Fig. IMO Parallel.For isn’t very slow:

But the code was not safe, since storing asynchronous results in a regular string list (`List<string>`) isn’t thread-safe. Instead I recommend using a fixed-length string array, like so:

private void RunScript(List<string> Longitude, List<string> Latitude, ref object AB, ref object B, ref object C)

    // Creating url list
    var latlong_cnt = Longitude.Count + Latitude.Count;
    var urls = new string[latlong_cnt];
    var ix = 0;
    for(int i = 0; i < Longitude.Count; i++)
    {
      for(int j = 0; j < Latitude.Count; j++)
      {
        string text = @"" + Latitude[j] + "&lon=" + Longitude[i] + "&appid=7370f9a3c197cb724539bd8a0309ffeb";  // base url elided in the post
        urls[ix++] = text;
      }
    }

(splitting to make all code visible)…:

    var results = new string[latlong_cnt];  // <-- Thread-safe!
    System.Threading.Tasks.Parallel.For(0, latlong_cnt, i =>
    {
      var request = WebRequest.Create(urls[i]);
      request.Method = "GET";
      request.Timeout = 1000;
      request.Proxy = null;
      try
      {
        var stream = request.GetResponse().GetResponseStream();
        var reader = new StreamReader(stream, System.Text.Encoding.UTF8);
        results[i] = reader.ReadToEnd();
      }
      catch (System.Net.WebException e)
      {
        RhinoApp.WriteLine("Err: " + e.Message);
      }
    });
    AB = results;

The corrected code: (14.1 KB)

The first time a ScriptComponent is run it also needs to compile the code. Perhaps that is why you thought it was slow?

// Rolf

async and await are keywords introduced in .NET Framework 4.5. Only Rhino 6 supports this. However, the script editor, which is a 3rd-party library used within Grasshopper, also needs to support these new keywords and their syntax. As Rolf pointed out, you can use traditional multithreading such as the System.Threading.Tasks namespace (the Task Parallel Library), or you can even do it the old-fashioned way using the Threading namespace. However, multithreading does not always speed things up. The overhead created by starting things in parallel can be greater than solving the problem linearly when the number of operations is low.
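The overhead point above can be demonstrated with a quick timing sketch (my own illustration; the exact numbers depend on the machine, so none are claimed here). For a trivial per-item workload, the parallel loop often loses to the plain loop:

```csharp
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

class OverheadDemo
{
  static void Main()
  {
    const int n = 1000;

    // Plain sequential loop over a trivial operation.
    var sw = Stopwatch.StartNew();
    long sum = 0;
    for (int i = 0; i < n; i++) sum += i;
    sw.Stop();
    Console.WriteLine("sequential: " + sw.ElapsedTicks + " ticks");

    // Same work via Parallel.For: thread scheduling plus the
    // Interlocked.Add needed for thread safety add overhead.
    sw.Restart();
    long psum = 0;
    Parallel.For(0, n, i => Interlocked.Add(ref psum, i));
    sw.Stop();
    Console.WriteLine("parallel:   " + sw.ElapsedTicks + " ticks");
  }
}
```

Web requests are a different story: each item spends most of its time waiting on the network, so there the per-item work is large enough for concurrency to pay off.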


Oh, great tip! Thanks!
But with longer input lists I’ve found an error: var latlong_cnt = Longitude.Count + Latitude.Count; must be corrected with multiplication instead of addition:
var latlong_cnt = Longitude.Count * Latitude.Count;
and now it works, or at least, it doesn’t freeze anymore.

But this is still not parallel; each web call waits for the previous one to finish.
Total execution time is proportional to the number of URLs to call.
This is not working “parallel” at all.
(and I still don’t understand why the “await” keyword is being removed)

– Ops!

Score 1 - 1 to you! :wink:

// Rolf

(I’m on Rhino 6)
I can open something like 150 URLs simultaneously with a browser (reopening an old session with ctrl+shift+T), and they load in a split second (each one independent of the others). Contacting the server, getting the reply and all.
I don’t know if it should be Task.Parallel or else.
I’m not an expert, but I think it should just be possible to make 1000 URL calls or 10 and have it take almost the same time.

For now the script takes 150 ms for 4 calls, and several seconds for around 50 calls… it seems completely un-parallel to me.
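One .NET Framework detail worth checking when calls appear serialized like this: by default, a desktop .NET Framework application only allows 2 concurrent HTTP connections per host, so requests to the same API queue up regardless of how many threads or tasks issue them. Raising the limit before any requests are made (the value 100 below is an arbitrary example) lifts that cap:

```csharp
using System.Net;

class ConnectionLimitSetup
{
  static void Configure()
  {
    // Default is 2 connections per host for non-ASP.NET apps;
    // set this once, before the first WebRequest is created.
    ServicePointManager.DefaultConnectionLimit = 100;
  }
}
```

If most of the URLs point at the same API host, this setting alone can explain “parallel” code that behaves almost sequentially.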

Try using VS. Any “slowness” is probably due to ScriptComponents being very slow to output lists.

In any case, Parallel.For is parallel, there’s no doubt about that… :slight_smile:

If you have many thousands of URLs you can also partition the list before entering the loop. This removes some of the overhead of doing the partitioning while in the loop.
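The partitioning idea can be sketched with the TPL’s range partitioner. This is a minimal illustration assuming the `urls` and `results` arrays from the earlier snippet; the fetch itself is left as a comment:

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

class PartitionSketch
{
  static void Process(string[] urls, string[] results)
  {
    // Split the index range into chunks up front, so each worker
    // handles a whole range instead of one index per iteration.
    var ranges = Partitioner.Create(0, urls.Length);
    Parallel.ForEach(ranges, range =>
    {
      for (int i = range.Item1; i < range.Item2; i++)
      {
        // fetch urls[i] and store the response into results[i],
        // exactly as in the Parallel.For version above
      }
    });
  }
}
```

Each array slot is written by exactly one worker, so the fixed-length array stays thread-safe under this scheme too.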

// Rolf

Are you sure? I assume they’re not all on the screen at once, so all the browser has to do is display a bunch of empty tabs and start populating them at leisure with the visible ones prioritised.

Yes, I just re-tried.
(I see the reload icon on all the tabs for just half a second, then it all stops, and in Task Manager I see a spike in network usage and then it stops)
But that’s how it’s supposed to be; I mean, a browser surely makes tabs work independently of each other.
All the (pseudo-working) methods I’ve tried in C#/Grasshopper make web calls one after the other… for a large number… you wait.

Independent, yes, but there are only so many threads your computer can run simultaneously, so firing off 150 of them means they’ll just get in each other’s way.

I haven’t looked deeply into the code posted here, but Parallel.For is part of the TPL (Task Parallel Library), which is the most modern multi-threading API available in .NET. If stuff doesn’t go faster using it, or just using Tasks straight up, something is wrong elsewhere.


Parallel.For may still block a thread while waiting for a request to complete. If that is the case, then you would only be able to have about as many requests in flight as processors at any given time. You may want to play with Tasks instead, using GetResponseAsync, to see if you get any difference in performance.
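The GetResponseAsync suggestion can be sketched like this (my own sketch, not the attached script): start every request up front and await them all, so no thread pool thread is blocked while the network round-trips are in flight. `FetchAllAsync` is an illustrative name; `urls` is assumed to be the string array built earlier:

```csharp
using System.IO;
using System.Linq;
using System.Net;
using System.Threading.Tasks;

static class AsyncFetch
{
  static async Task<string[]> FetchAllAsync(string[] urls)
  {
    // Kick off one task per URL; all requests are in flight at once.
    var tasks = urls.Select(async url =>
    {
      var request = WebRequest.Create(url);
      request.Method = "GET";
      try
      {
        // await yields the thread while the server responds.
        using (var response = await request.GetResponseAsync())
        using (var reader = new StreamReader(response.GetResponseStream()))
          return await reader.ReadToEndAsync();
      }
      catch (WebException e)
      {
        return "Err: " + e.Message;
      }
    });
    // Results come back in the same order as the input urls.
    return await Task.WhenAll(tasks);
  }
}
```

With this shape, total time is roughly that of the slowest single request rather than the sum of all of them, which is the behavior the original poster expected from the browser comparison.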