AWS RhinoCompute Server Specs

Are there any baseline and/or recommended hardware specs for running a RhinoCompute instance on AWS EC2?
My usage pattern is primarily:

-sending requests as tiny JSON chunks to RC
-RC does in-memory generation of 3D mesh content
-streaming that content back to the caller (rough client-side sketch below)
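
For context, here's roughly what one of those round trips looks like from the client side. This is a minimal sketch using the compute_rhino3d and rhino3dm Python packages; the server URL and API key are placeholders:

```python
import compute_rhino3d.Util
import compute_rhino3d.Mesh
import rhino3dm

# Placeholder endpoint and key for a self-hosted rhino.compute server
compute_rhino3d.Util.url = "http://ec2-xx-xx-xx-xx.compute-1.amazonaws.com/"
compute_rhino3d.Util.apiKey = "my-api-key"

# Tiny JSON request in: ask the server to mesh a simple brep
sphere = rhino3dm.Sphere(rhino3dm.Point3d(0, 0, 0), 12.0)
meshes = compute_rhino3d.Mesh.CreateFromBrep(sphere.ToBrep())

# Mesh content comes back as JSON, which I can decode locally
# or stream straight on to the caller
mesh = rhino3dm.CommonObject.Decode(meshes[0])
print(len(mesh.Vertices), "vertices")
```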

I’m leaning towards starting with:

-single Windows Server VM with a “decent” number of cores
-let the rhino.compute app spin up and down compute.geometry instances on that machine
-first scale up: increase the number of cores as concurrent user load increases (rough sketch below)
-second scale up: duplicate the above instance into an Elastic Load Balancing & Auto Scaling group
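
To make that “first scale up” step concrete, here's roughly what I have in mind. A rough boto3 sketch, where the instance ID and target instance type are placeholders rather than recommendations:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
instance_id = "i-0123456789abcdef0"  # placeholder

# Resizing an EC2 instance is a stop / modify / start cycle
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": "c5.2xlarge"},  # placeholder: more cores for more concurrent users
)

ec2.start_instances(InstanceIds=[instance_id])
```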

Does anyone have a sense of which AWS EC2 instance would be my best starting candidate for this? Even a rough “You’ll wanna start out with at least X, but pretty soon you’ll probably want Y” would be helpful. Thanks!

Honestly, we don’t generally give recommendations because everyone’s use case is so different. That said, your plan for how to proceed sounds reasonable. I’ve found the t2.micro to be so slow at almost anything that I usually use a t2.small or t2.medium as my starting point. I think you could start there and see if that gives you the performance you’re looking for.

Thanks Andy! You saved me at least a few iterations of manual binary search on AWS’s list of instance types.

I recommend using the t3.* or t3a.* instance family, which provides better performance at lower cost than t2. You can read about this here.

For example, the t3.medium instance type is a good choice for running two instances of Rhino and Grasshopper in parallel. On average, each instance of Rhino will have 2GB of RAM available. If you have more memory-intensive workloads, use t3.large or higher (see list of t3 instance types here).

Depending on your workload, it also makes sense to enable unlimited mode, allowing you to use more CPU than the included baseline.
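
If it helps, unlimited mode can be switched on for a running instance. A minimal boto3 sketch, with a placeholder instance ID (the same setting can also be applied at launch or in the console):

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Let the T3 instance burst above its CPU credit baseline
ec2.modify_instance_credit_specification(
    InstanceCreditSpecifications=[
        {"InstanceId": "i-0123456789abcdef0", "CpuCredits": "unlimited"}  # placeholder ID
    ]
)
```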

Thanks Alexander, that’s useful info I wasn’t familiar with, will read up!