GHPT - Generates a Grasshopper Script from a prompt with Chat GPT

Hey Discourse!

At the aec-tech Seattle Hackathon, my team and I created a new Grasshopper component called GHPT, which takes a prompt and generates a Grasshopper script! It was created at a hackathon, so it's a bit rough around the edges, but it's still quite exciting!

Get it now @ Food4Rhino


How about posting some examples here? Of what you asked for and the reply you got?

I know, this is different… But there have been predictions for four decades that programmers will be obsolete “soon”. This turned up in one of my feeds yesterday, a trip down memory lane of the misery of churning through “next new thing” cycles in software development.

Hype cycles

Nothing is ever needed for everything.

Hey @Joseph_Oster,

In the GitHub repo there is a directory of samples. This script shows all the successes and failures of the current version. It's not meant to make anyone obsolete; I am a programmer, so I'd only be shafting myself :joy:. It's just meant to explore LLMs and see what they are and aren't useful for.

– cs




Sounds quite interesting. I checked out your website. Do I understand it right that it's generating Grasshopper components, or is it generating some kind of Python script?

Hey @the_Monty, it generates a Grasshopper script; it doesn't currently generate Python.

1 Like

If you try using “Download and Install”, that should resolve your package dependency issues. I find I currently have to restart after installing GHPT; I'm not sure why, as I'm not too experienced at making Grasshopper Yak plugins.

– cs

I know but am not that interested. I think it would be instructive to see the queries and results without requiring the plugins but again, my interest is very low.

How do the GH components get created, placed on the canvas and wired together?

You could also click close and it’d just open with a few blank components. (ChatGPT only knows about the default set, so it’s unlikely you’ll miss much)

When the question is fed into the component, it's wrapped in a larger prompt with examples (each example includes a sample response in JSON format) that is sent to ChatGPT. ChatGPT then responds with its answer as JSON matching that schema, which we deserialize into components and use to create them on the canvas.

– cs
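The flow described above can be sketched roughly as follows. This is a hypothetical illustration, not GHPT's actual code: the prompt text, the `Additions` schema, and the function names are assumptions based on the thread, and a canned string stands in for the real ChatGPT call.

```python
import json

# Hypothetical few-shot wrapper prompt: the real one in GHPT's prompt.txt
# contains more examples and a fuller schema.
SYSTEM_PROMPT = (
    "You are a Grasshopper assistant. Answer ONLY with JSON matching the "
    "schema shown in the examples.\n"
    "Example question: Create a circle\n"
    'Example answer: {"Additions": [{"Name": "Circle", "Id": 1}], "Connections": []}\n'
)

def build_prompt(question):
    """Wrap the raw user question in the larger few-shot prompt."""
    return f"{SYSTEM_PROMPT}\nQuestion: {question}\nAnswer:"

def deserialize(reply):
    """Turn the model's JSON reply into a list of component records to place."""
    data = json.loads(reply)
    return [{"name": c["Name"], "id": c["Id"]} for c in data["Additions"]]

prompt = build_prompt("Create a circle and extrude it")

# A canned reply stands in for the actual ChatGPT API call.
reply = '{"Additions": [{"Name": "Circle", "Id": 1}, {"Name": "Extrude", "Id": 2}], "Connections": []}'
components = deserialize(reply)
print(components)  # [{'name': 'Circle', 'id': 1}, {'name': 'Extrude', 'id': 2}]
```

In the real component, the deserialized records would then be instantiated on the Grasshopper canvas and wired up from the `Connections` list.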


I did that, of course. The programming you did to make this work is impressive. The results… meh.

I think the fact that ChatGPT can figure something out and return its thoughts in the schema presented to it is quite impressive. It produces more impressive results with GPT-4, and if the prompt is improved it could likely do some very cool things. I also think asking better questions will yield more impressive results.

Yes, for sure. I wonder if it reads this forum? And wonder too how much computing power goes into answering these queries? The infrastructure under the hood (in the cloud) must be MASSIVE?

“How would Joseph_Oster create a twisted tower in Grasshopper?”
Answer: Sorry there are some “Unrecognized Objects” in the solution
:rofl: :wink:


Thank you for sharing your work. It's exciting to see work in this area, but it's also disappointing that it doesn't live up to the expectations being sold.

These are my first three tests:

No curve length and unnecessary components.

It got the components but not how to connect them.


These three calls cost $0.27-$0.36 from OpenAI, and each request took approximately 30 s to 1 min to complete.

In the sample file there are also many examples of failures, which is strange when all you said was that “it's a bit rough around the edges”.

I think it's unnecessary to leave the panel and the component, although I understand now that it is not a plugin intended for real use at the moment.

This is important: you must let your users know that their OpenAI private key is going to be exposed in a ghpt.json file in the package folder. Calls with this credential cost money, so it's not a good idea to expose it even locally; if you do, at least make the user aware.

Please avoid putting a single component (or just a few, fewer than 10) in its own tab. You can use any of the existing tabs, or the “Extra” tab shared by many third-party plugins.

So, what was supposed to be something fun and exciting has turned into something disappointing by being sold dishonestly. It's a WIP in very early stages, and it certainly doesn't live up to “turn a prompt into a GH definition”. The first thing I thought when I saw your post was that someone had already trained a GPT for GH, and that even so it was not going to work well, because the model has to be local to be fast enough (for the use cases I want), because of the lack of labeled training data, and because an LLM, without turning it into an expert system (with a lot of constraints), is probably not enough to solve the problem.

But still, it's great to have people working on this. I just don't want other people's time to be wasted. What plans do you have for the future?

I just found your prompt.txt file with the system prompt in it. I think this is the most interesting part of your project, because there are already several plugins doing calls to OpenAI API.
I think you should avoid using parameter names, because these can change in important components like Addition. Use the index instead. Also, you should try to minimize the format as much as possible to reduce tokens; for example, instead of:

"Connections" : [
	{
		"To": {
			"Id": 3,
			"ParameterName": "Curves"
		},
		"From": {
			"Id": 1,
			"ParameterName": "Circle"
		}
	},
	{
		"To": {
			"Id": 3,
			"ParameterName": "Curves"
		},
		"From": {
			"Id": 2,
			"ParameterName": "Geometry"
		}
	}
]

you can use:

"Connections" : [ ["3-1", "1-1"], ["3-1", "2-1"] ]

where the number after - is the parameter index.
Also use “A” instead of “Additions” or “C” instead of “Connections”, and explain it at the start of the prompt. Also do not include long examples in the system prompt.
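Decoding the compact format proposed above is straightforward. A minimal sketch, assuming each entry is a `["<toId>-<paramIndex>", "<fromId>-<paramIndex>"]` pair as in the example:

```python
def decode_connection(pair):
    """Parse one ["3-1", "1-1"] entry into ((to_id, to_param), (from_id, from_param))."""
    to, frm = pair
    to_id, to_param = (int(x) for x in to.split("-"))
    from_id, from_param = (int(x) for x in frm.split("-"))
    return (to_id, to_param), (from_id, from_param)

# The compact form of the verbose "Connections" example above.
connections = [["3-1", "1-1"], ["3-1", "2-1"]]
decoded = [decode_connection(p) for p in connections]
print(decoded)  # [((3, 1), (1, 1)), ((3, 1), (2, 1))]
```

The compact form carries the same information in far fewer tokens, which matters because every request pays for the whole prompt plus the whole response.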


Good morning @Dani_Abalde ,

Quite a number of points here to address!

The samples show the things it can and cannot currently do quite well, I thought; we weren't intending to oversell anything! We also used GPT-4, which does produce even better results.

That's fair, it was created at a hackathon, so we couldn't do anything too stupendous within the time constraints. I'll add a little note on Food4Rhino / GitHub.

That’s a good call, I’ll update that

The first line of the post does say it was created at a hackathon, and it is version 0.1.x. It does take a prompt and turn it into a definition, just maybe not at the scale you were hoping for. We also can't really control the cost or speed of OpenAI. It is also free and open source, and you're always welcome to contribute so that we can get to that stage! :blush:

I think a lot of what I’d like to do is polish up the component, improve the prompt and allow for switching between GPT-3.5 and GPT-4. Improve its component understanding, and allow for feeding scripts into ChatGPT to train it so that it can become even better. Oh, and Mac support would be good too.

I do agree, it’s certainly the part that I learnt the most about during the Hackathon, and does show what could be possible.

I’d much rather use indexes, or even GUIDs, but ChatGPT doesn’t know this. It only knows parameter names, and it is often wrong about those too. So this is the only ‘reliable’ way to do it.

For sure, this would be another great improvement.

Overall, thanks for such in depth engagement and feedback @Dani_Abalde

– cs


There is certainly room for improvement, but this is absolutely amazing for something that was created in a little over 24 hours and it seems like everyone else at the hackathon concurred – great work!


Yes, I agree. I think translating a ChatGPT response directly into a Grasshopper solution is a brilliant idea. Doing that in less than 2 days is always an achievement. And the code doesn't look bad at all. But it is obvious that something like this can't work reliably at the moment, and I think that's not the point of a hackathon anyway. In my company we do these frequently for multiple reasons: looking at the latest technologies, social networking, motivating and building confidence… and just having a fun time. Thank you for sharing. Do you know of any events in Europe?


Yeah but you have to install the plugin to take a look at them, it is far from converting a prompt into a GH script.

Cost and speed is a function of how big your prompt is. It is currently unnecessarily long and not very compact. That’s easy to fix.

I would like to contribute code if there are plans to fine-tune the model, since I have already spent some time with GH definition embeddings, which are a way to encode/decode the script into a lighter and better vector representation. I don't see another way to make this idea useful in UX terms. For me it would be more helpful to have an assistant that understands the current GH script and helps you continue it than a natural-language-to-GH-definition translator, and embeddings are mandatory for this.

But I’m happy to help with the prompt too if you need feedback, I don’t think it’s going to work properly with just prompt engineering but it’s interesting how far it can go.

I don't think so. The component name in “Additions” may be mandatory, but not the “ParameterName” field; it can be replaced by an integer id. Names and GUIDs are too long token-wise. What's missing is an explanation of the format you are using in the prompt, so it can understand how it works.

I also recommend sending the messages with roles (I'm not sure, but I think you are not using them), to separate the system prompt, assistant messages, and user messages, instead of using one plain message. This way, besides helping the model understand the prompt context better, you could connect to the chat assistant once per GH session, send the system prompt only once on first use, and then just stream user/assistant messages, dramatically improving cost and speed.
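The role-based structure suggested above can be sketched as follows. This only builds the `messages` payload in the shape the OpenAI chat API expects (a list of `{"role", "content"}` dicts); the API call itself is omitted, and the helper names and prompt text are illustrative, not from GHPT.

```python
def start_session(system_prompt):
    """Begin a session with a single system message, sent once per GH session."""
    return [{"role": "system", "content": system_prompt}]

def add_turn(messages, user_text, assistant_text=None):
    """Append a user message, and optionally the assistant's prior reply."""
    messages.append({"role": "user", "content": user_text})
    if assistant_text is not None:
        messages.append({"role": "assistant", "content": assistant_text})
    return messages

# The long system prompt is paid for once; later turns only add short messages.
messages = start_session("You translate prompts into Grasshopper component JSON.")
add_turn(messages, "Create a circle and extrude it.")
print([m["role"] for m in messages])  # ['system', 'user']
```

Keeping the conversation as one message list also lets follow-up prompts refer back to earlier turns, which a single flat prompt cannot do.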

1 Like

Hey @Dani_Abalde,

It sounds like you would be able to really help us fine tune the prompt. Which is something we’re really interested in improving and learning about. Would you be able to open a PR with some changes for us? We’re all quite new to ChatGPT Prompts

This sounds very interesting, can you expand on this?

1 Like

Just take a look at the OpenAI API:
This is similar to what you can do with plain prompts like “Act as … and the user will provide you a … then you respond with …”, but in a structured way, and I guess the system prompt is remembered throughout the conversation, rather than only until it loses the original context, as happens with ChatGPT at some point. You can play with it from the playground or using my OpenAI-for-GH integration.