I built RhinoMCP, exploring AI-assisted modelling

Hello everyone,

The AI hype train has been rolling for some time now, but I have not yet seen a convincing tool or technique that integrates AI with 3D modelling, largely because there has been no good communication channel between the model and the LLM.

With the recent development of MCP (Model Context Protocol), it has become relatively simple to connect AI models to different data sources and tools, in our case Rhino. This gives us two-way communication between a Rhino 3D model and the AI.

Over the last weekend I built a Rhino plugin, RhinoMCP, as a proof of concept that lets an AI directly control Rhino and assist with modelling.
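Roughly speaking, the setup has two halves: an MCP server that the AI client talks to, and a listener inside Rhino that executes commands against the open document. As a rough illustration only (this is a sketch, not RhinoMCP's actual code; the port, message format and command names are my assumptions), the Rhino-side listener could look like this:

```python
# Hypothetical Rhino-side listener: receives JSON commands over a local
# TCP socket and runs them against the open document. Runs inside Rhino's
# Python, where rhinoscriptsyntax is available.
import json
import socket
import threading

import rhinoscriptsyntax as rs

def handle(command):
    # Dispatch on the command type; only one command is sketched here.
    if command.get("type") == "get_document_info":
        return {"object_count": len(rs.AllObjects() or [])}
    return {"error": "unknown command"}

def serve(port=1999):  # port number is an assumption
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", port))
    server.listen(1)
    while True:
        conn, _ = server.accept()
        with conn:
            command = json.loads(conn.recv(65536).decode("utf-8"))
            # In a real plugin, document access should be marshalled onto
            # Rhino's main thread rather than done from this worker thread.
            conn.sendall(json.dumps(handle(command)).encode("utf-8"))

threading.Thread(target=serve, daemon=True).start()
```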

I made two demos to show its potential.

By exposing different "tools", similar to API endpoints, RhinoMCP lets any AI model understand which functions to call to get different types of jobs done. In the demos these include get_document_info, create_object, modify_object, get_selected_objects, etc.
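For a sense of what the server side of such a tool might look like, here is a minimal sketch using the official MCP Python SDK (FastMCP). The socket bridge, port and payload format are my assumptions and simply mirror the listener sketch above:

```python
# Hypothetical MCP server side: declares a tool the AI can call and
# forwards it to the Rhino listener over a local socket.
import json
import socket

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("rhino")

def send_to_rhino(command: dict) -> dict:
    """Forward a command to the Rhino-side listener and return its reply."""
    with socket.create_connection(("127.0.0.1", 1999)) as sock:
        sock.sendall(json.dumps(command).encode("utf-8"))
        return json.loads(sock.recv(65536).decode("utf-8"))

@mcp.tool()
def get_document_info() -> str:
    """Return basic information about the open Rhino document."""
    return json.dumps(send_to_rhino({"type": "get_document_info"}))

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio to the AI client
```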

Even as a minimum viable product, I already see great potential for AI to contribute to 3D modelling. I fully understand that the geometric complexity, parametric relationships and precision constraints in engineering models work against AI in various respects, so I don't expect it to create a full building from one sentence.

Since I haven't been doing much actual modelling for some time, I would like your ideas on how to pivot the development of the tool to make it more practical. In day-to-day modelling, at what moments do you think an AI assistant would make sense? What I can think of:

  • metadata management: read/write different attributes
  • filters: e.g. select parts that have attribute X with value Y (see the sketch after this list)
  • combining tool chains/scripts: e.g. run my script1, then select the created objects and run script2
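To make the filter idea concrete, a rough sketch of an attribute filter on the Rhino side might look like the following, assuming user text attributes; the function name is made up:

```python
# Hypothetical filter: select every object whose user text attribute
# `key` equals `value`. Runs inside Rhino's Python.
import rhinoscriptsyntax as rs

def select_by_user_text(key, value):
    matches = [obj_id for obj_id in (rs.AllObjects() or [])
               if rs.GetUserText(obj_id, key) == value]
    rs.UnselectAllObjects()
    if matches:
        rs.SelectObjects(matches)
    return matches
```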

I am looking forward to your ideas.

Best,
Chen


Hello Chen, I've watched your amazing demo and would like to discuss it in the context of an urban design workflow.
In urban design we don't need highly precise geometry; we just need quick building massing at the concept stage (floor plates, rough curtain walls).
We expect to input some planning indices (floor area ratio, density, land use type) and site boundaries and have a massing model show up immediately. We should also be able to manually adjust the resulting models and have the planning indices update in response (i.e. the reverse direction). The complex part is that each land use type expects different building types (a residential building naturally looks different from an office tower), and each parcel might mix several functions.
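In its simplest form, the forward direction (planning indices to massing) is just arithmetic; a toy sketch, with the function name and the simple extruded-footprint assumption purely illustrative:

```python
# Toy sketch: derive floor count and extrusion height from planning
# indices, assuming a single extruded footprint per parcel.
def massing_from_index(site_area, far, coverage, floor_height=3.5):
    gross_floor_area = site_area * far      # total buildable floor area
    footprint_area = site_area * coverage   # ground coverage of the footprint
    floors = max(1, round(gross_floor_area / footprint_area))
    return footprint_area, floors, floors * floor_height

# e.g. a 10,000 m2 site with FAR 2.5 and 40% coverage -> 6 floors
print(massing_from_index(10000, 2.5, 0.4))
```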
Please let me know if you are interested in the topic above; our office has some research funding, so maybe we could work closely on it. I'll attach some links below to help you understand, thanks!

https://www.food4rhino.com/en/app/metacitygenerator

Hello Yijie, thanks for sharing your project. It looks quite interesting. I understand that what you need is more of a design generation tool than the general modelling tool I would like to focus on at the moment. However, based on the codebase we already have, you should have a clear path to implementing something that fits your requirements. Looking forward to your further development!