Suggestion for the Forum

Hi

I have found ChatGPT to be incredibly helpful in creating Grasshopper components and resolving coding issues. I would like to suggest the addition of ChatGPT as an option in the forums for Rhino and Grasshopper. This would provide users with an additional resource to obtain answers and assistance. Furthermore, it would be beneficial if McNeel developers could contribute to the creation of specific training data to enhance ChatGPT’s performance for Rhino users.

This post was improved using ChatGPT

2 Likes

An OpenAI free account allows 3 requests per minute.
That does not meet the demand of an international public forum.
Are you referring to an OpenAI Plus account?

it would be beneficial if McNeel developers could contribute to the creation of specific training data to enhance ChatGPT’s performance for Rhino users.

Also, ChatGPT's training data does not work that way.
Public information such as the McNeel wiki, the McNeel forum, and the McNeel websites is already in ChatGPT's training data.
There is nothing left to contribute.

Here’s a nice round of Rhino/Python misinformation courtesy of ChatGPT. In reference to a recent post here, I wanted to make sure I hadn’t missed something in RhinoCommon that allowed one to get the volume of a SubD object directly without having to convert it to a Brep first. Conversation follows (JU is me):


Looks good, no? Except… the SubD class does not have a GetVolume() method.

Better, no? Except… rs.IsSubD() and rs.IsSubDClosed() do not exist (yet).

So, finally we have some code that should work - after I questioned ChatGPT's responses. For me that was just a check that I hadn't missed a method that might have recently been added to the API. For someone with no idea of how to script Rhino/Python, however, it would have been a lot more confusing.
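The check described above can be automated: before trusting a method name that ChatGPT suggests, probe the live object with Python's built-in `getattr`. A minimal, Rhino-free sketch - the `FakeSubD` class here is a stand-in for the real `Rhino.Geometry.SubD`, and `resolve_method` is a hypothetical helper, not part of any Rhino API:

```python
def resolve_method(obj, *candidates):
    """Return the first suggested method name that actually exists
    on obj, or None if every suggested name is a hallucination."""
    for name in candidates:
        attr = getattr(obj, name, None)
        if callable(attr):
            return name
    return None

class FakeSubD:
    """Stand-in for a SubD object: it can convert to a Brep,
    but has no direct GetVolume() method."""
    def ToBrep(self):
        return "brep"

subd = FakeSubD()
# ChatGPT claimed GetVolume() exists; probing shows only ToBrep() does.
print(resolve_method(subd, "GetVolume", "ToBrep"))  # -> ToBrep
```

Inside Rhino, the same probe run against a real SubD instance would have exposed the invented method immediately, without writing a full script around it.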

I have used ChatGPT for a bunch of Python components in Grasshopper and scripts in Rhino.

As with a lot of topics, ChatGPT has a tendency to be confident in things that seem plausible, but actually do not exist, like functions that sound like they should be there, but are not.

I think it has to do with the limited training data/examples that are around for Rhinoscript and Rhino-specific commands.

Usually you ask a follow-up question telling it the function does not exist, and it will say something like "Ah yes, sorry for the confusion. You're right, there is no function XY…"

On the other hand I have managed to create quite a few rather complex scripts with the help of ChatGPT. I find that it is beneficial for us, as well as ChatGPT, to develop things in small steps. Also it helps to have ChatGPT outline the necessary steps for a certain problem first and then go through those step by step.

Lastly, it can help to "train" ChatGPT within a single chat, since it remembers the contents of that chat. So I have one chat where I have shown it a bunch of examples of Rhinoscripts with functionality it did not know about (because it was added after September 2021, or whenever ChatGPT's cutoff is). That works really well. If you have plugins and/or web browsing enabled in ChatGPT, you can also ask it to look up some current examples for newer topics, like SubD.
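Priming one chat with examples, as described above, is essentially few-shot prompting. A minimal sketch of assembling such a chat history in Python - the system instruction and example contents below are invented placeholders, and in practice the resulting list would be sent to a chat completion API:

```python
def build_fewshot_messages(system, examples, question):
    """Assemble a chat history that 'teaches' the model new API
    examples before asking the real question (few-shot prompting)."""
    messages = [{"role": "system", "content": system}]
    for user_msg, assistant_msg in examples:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": assistant_msg})
    messages.append({"role": "user", "content": question})
    return messages

msgs = build_fewshot_messages(
    "You write Rhino Python scripts. Prefer functions shown in the examples.",
    [("How do I work with a SubD object?",
      "# paste in a snippet from your own working script here")],
    "Write a script that reports the volume of a selected SubD.",
)
print(len(msgs))  # -> 4
```

The same effect is what you get for free by keeping everything in one long chat: earlier user/assistant turns act as the examples.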

ChatGPT is far from perfect when it comes to coding, but with a bit of help and patience it can create some remarkable tools that would have taken me weeks of forum posts to piece together. A complete newbie will have a hard time with ChatGPT alone. You do have to ask it relevant follow-up questions and know a bit about how to debug code. For someone like me, who has some programming knowledge in other domains and knows quite a bit about Rhino/Grasshopper, but has no real experience in Python and Rhinoscript, it's an incredible help.

1 Like

At least ChatGPT trains people in asking better questions! It is not uncommon that people post dozens of statements but fail to ask a single question…

1 Like

Absolutely. Also, the way we ask questions and the way we ask ChatGPT to behave can have a huge impact on the quality of the output, especially when it comes to reasoning, as shown by the recent paper on Tree-of-Thoughts reasoning (good explanation of it here: Tree of Thoughts - GPT-4 Reasoning is Improved 900% - YouTube).

Making ChatGPT behave as if it were multiple experts is still a new concept, and one that takes some adjusting of our questioning process. But it's something we already do effectively when we post here on the forum, for example: we ask many experts their opinions and get different answers, including answers that build on previous answers by someone else.

1 Like

The big problem with ChatGPT is that it doesn’t appear to have been taught the contexts in which invention (or extrapolation) is OK and those where it is not. If I’m writing Jabberwocky 2 then invention is helpful. If I’m asking about methods in an API it isn’t.

It will be interesting to see whether we can add an effective "do not invent" clause to our interactions.

(The most notorious case of misplaced invention I've seen so far is the one where lawyers asked ChatGPT for legal precedents to support their case in court, and ChatGPT invented half a dozen plausible but entirely fictional ones that said what the lawyers wanted. The lawyers were found out and are now in big trouble with the court.)

2 Likes