Bugs? in Rhino 7/8 WIP: GPU heating and no extrusion handle in SubD (WIP 8)

Here’s a recorded video of the problem. Notice how the GPU clock jumps from 210 MHz straight to 1035 MHz, and how other parameters increase as well, as soon as Rhino opens.
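For anyone who wants to reproduce these readings without GPU-Z, here is a minimal logging sketch using the pynvml Python bindings. It assumes the NVIDIA card is device 0 and that pynvml is installed (`pip install pynvml`); this is just one way to watch the numbers, not anything Rhino-specific:

```python
# Sketch: log graphics clock, temperature, and power draw while opening
# Rhino, to cross-check the GPU-Z readings. Assumes the NVIDIA GPU is
# device 0 and that pynvml (pip install pynvml) is available.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(60):  # one reading per second for a minute
    clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    power = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
    print(f"{clock:4d} MHz  {temp:2d} C  {power:5.1f} W")
    time.sleep(1.0)

pynvml.nvmlShutdown()
```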

Hi @Hans_Christian_Fürst, thanks for posting the video. Apart from the increase in temperature at startup, does the temperature of your graphics card change significantly when you are using Rhino the way you normally do?
I don’t know whether it is possible for Rhino to activate the more powerful graphics card in your system only when it is actively being used. @jeff can you comment on that?

I just tested it. The most significant temperature change takes place at startup, about 10 °C. If I then begin modeling in shaded mode, the temperature increases when I do a lot in the viewports at the same time, but only by 2–3 degrees. Of course, if I start heavy rendering there’s an increase again, but that’s normal.

It seems Rhino behaves normally apart from the initial temperature increase caused by the jump in clock speed, memory clock, and current.

I noticed that the temperature increase from modeling (the subsequent 2–3 °C rise after the initial one) is caused by an increase in current/power; the clock speed doesn’t change.
So it must be something in Rhino’s initialization that drives the clock speed up at the start.

I will post a test from Blender afterwards that shows how Blender increases the clock speed only when needed, instead of setting a permanently high clock speed from the start.

OK, I have now made a video with the temperature utility side by side with Blender. I used Snagit to capture it, and the capture itself raises the temperature because it uses the GPU; the same applies to the first video. But that shouldn’t obscure the difference between Blender and Rhino.

Looking at clock speed and current in Blender, it is clear that Blender only raises these parameters when it needs to, e.g. when orbiting in the viewport, modeling, or zooming in and out. The moment the viewport is static, the GPU clocks down to minimum speed.

That’s the big difference. In Blender the clock speed depends on the resources the viewport needs, but in Rhino the clock speed is set high just by opening the program, and it never goes down until Rhino is closed.

How is it going with this issue?

Neither Rhino nor Blender is “controlling” the video card like that. An application is isolated from the GPU hardware by “20 layers of crap” (as John Carmack would put it) and has zero say over the power settings of the GPU. There’s virtually no way of knowing what’s going on without actually working directly with Nvidia and the laptop manufacturer, who all implement their own power-settings voodoo such that you really can’t tell what sort of performance you’re going to get from the specs. That obviously isn’t going to happen; “using niche 3D software causes temperatures to increase slightly when I think they should not” is not going to get anyone’s attention. You don’t even know if your measurements are correct. Laptops just kind of suck.

Your best bet is to try to get Steve Burke or der8auer interested in it.

Well, that was not exactly the answer I was expecting. I’m not into 3D graphics programming, but the “20 layers of crap” must be some sort of API interaction between the GPU and the 3D program, or there would be no GPU rendering at all.
There’s nothing wrong with my measurements. I have a state-of-the-art ThinkPad laptop that cost $4,500, and I have no problem using all kinds of 3D software, with very few exceptions, Rhino among them.

I don’t even have to measure anything: I can just put my hand close to the laptop frame and feel the heat and hear the fan noise when comparing Rhino and Blender. A 10 °C increase is not a “slight increase in temperature”.

Actually, I was expecting McNeel to contact Nvidia by mail and ask whether they had heard of this before, and if so, perhaps get a hint as to whether it could be solved easily. I don’t need 20 scientists measuring my laptop temperature. The videos should be more than enough to show that there is indeed an issue.

Nvidia absolutely does not care. They barely give a hoot about bugs that cause Rhino to crash, and there are many, many factors involved here.

Well, with that attitude nothing is ever going to happen. I simply refuse to believe that Nvidia won’t respond to a concern from a 3D modeling company. The reason I purchased this laptop was that it had that GPU.

If the GPU can’t work properly with 3D software, my incentive to pay extra for a laptop with a fast GPU is gone, and that would be lost revenue for them. Nvidia now even makes specific drivers for 3D software, targeting users with RTX consumer cards.

A 210 MHz GPU clock means the GPU is in an idle or dormant state. This happens when your PC is using the integrated GPU; it helps reduce heat and power consumption and certainly extends your battery life.
When your 3070 GPU wakes up (triggered by Rhino), it goes to its base clock. This means the GPU is turned on, and you should expect more heat.
What might concern you is the amount of excess heat you get, likely due to the thermal design of the laptop. I suggest you install a video game (a reasonably modern one); I’m fairly sure you will see the same behavior.
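If you want to see these state transitions directly rather than inferring them from the clocks, NVML exposes the GPU performance state (P-state). A rough sketch with pynvml, assuming the dedicated GPU is device 0 (roughly, P0 means maximum performance and P8 means idle):

```python
# Sketch: print the GPU performance state (P-state) whenever it changes,
# e.g. while opening Rhino. Assumes the dedicated NVIDIA GPU is device 0
# and pynvml is installed (pip install pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # dedicated GPU (assumption)

try:
    last = None
    while True:
        pstate = pynvml.nvmlDeviceGetPerformanceState(handle)
        if pstate != last:  # only print on transitions
            clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
            print(f"P{pstate} at {clock} MHz")
            last = pstate
        time.sleep(0.5)
finally:
    pynvml.nvmlShutdown()
```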

One way to handle it is to underclock your GPU so it runs at a comparatively low voltage, which should generate less heat. That’s why I haven’t bought a single laptop in the last decade: hardware manufacturers don’t give thermal design a high priority.
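If you do want to experiment with capping the clocks, recent NVIDIA drivers expose clock locking through NVML. A sketch via pynvml; note this needs administrator rights, not every laptop GPU supports it, and the clock range below is purely illustrative:

```python
# Sketch: cap the graphics clock to reduce heat. Requires admin/root and a
# driver/GPU that supports locked clocks; laptop support varies, so treat
# this as an experiment, not a guaranteed fix.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # dedicated GPU (assumption)

# Limit the graphics clock to the 210-1000 MHz range (illustrative values).
pynvml.nvmlDeviceSetGpuLockedClocks(handle, 210, 1000)

# ... run the Rhino test here ...

# Restore the default clock behavior afterwards.
pynvml.nvmlDeviceResetGpuLockedClocks(handle)
pynvml.nvmlShutdown()
```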

As for the Blender example, I’m pretty sure this is a setting you can check in the Nvidia Control Panel. Rhino picks the OpenGL GPU setting with “Prefer Maximum Performance”, and Blender’s settings likely use “Nvidia Recommended”, which keeps both GPUs available and picks the 3070 whenever needed.

Hi tay,

Thanks for your answer.

Some of my installed software uses special settings in the Nvidia Control Panel, but neither Rhino nor Blender does; they use the global automatic settings and “program controlled”. When I found out about this issue, the first thing I did was investigate whether Rhino was forced to full GPU usage. It isn’t, and neither is Blender. If it were just about settings in the control panel, I wouldn’t have had to take up anyone’s time here.

I actually have a 3D game installed, and when I play, the fans go to maximum, depending on the resolution settings etc. That’s normal and understandable, since there is a lot going on on the screen. By contrast, the GPU temperature rises just from having Rhino open.

I can tell you that laptops have come a long way in the last 10 years, and I’m pretty impressed by the speed, noise, and thermal control. I haven’t had a single blue screen of death or unstable Windows session. If you play a 3D game on a desktop computer, the GPU fan will be noisy too.


@Hans_Christian_Fürst

I tried to reproduce the issue after tweaking some of the Rhino performance settings in the Nvidia Control Panel. In the video you can see:

  1. On startup, Rhino uses the dedicated GPU to construct the UI; that’s why you get the initial boost in frequency. I think this is a good thing, since it expedites the application startup.
  2. After a couple of seconds, when there is no heavy request from Rhino, the GPU returns to idle mode. This cycle repeats as you rotate the view or modify objects (see the sketch after this list).
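To verify that clock-down behavior on your own machine, you can sample GPU utilization and the graphics clock together and check that the clock falls back within a few seconds of the viewport going static. A rough sketch with pynvml, again assuming the dedicated GPU is device 0:

```python
# Sketch: log GPU utilization vs. graphics clock once per second, to check
# whether the clock drops back to idle when the viewport is static.
# Assumes device 0 is the dedicated GPU and pynvml is installed.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(120):  # sample for two minutes
    util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu  # percent
    clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"util {util:3d}%  clock {clock:4d} MHz")
    time.sleep(1.0)

pynvml.nvmlShutdown()
```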

** Please note I’m testing this on a laptop with hybrid graphics, Intel Iris + RTX A1000 (it shows as an A5000 only because I played with the driver). Also, the laptop is unplugged, and I’m sending greetings from Heathrow airport.


Also, can you check this parameter in GPU-Z while running Blender?

I have a feeling that Blender is not utilizing your 3070 Ti at all.
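GPU-Z’s “PerfCap Reason” corresponds to NVML’s clock throttle reasons, so the same check can be scripted. A sketch, once more assuming pynvml and the dedicated GPU as device 0:

```python
# Sketch: read the current clock throttle reasons, the NVML counterpart of
# GPU-Z's "PerfCap Reason". If the bitmask only contains the GPU-idle bit,
# the card is parked, i.e. the application is not keeping it busy.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # dedicated GPU (assumption)

reasons = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(handle)
if reasons & pynvml.nvmlClocksThrottleReasonGpuIdle:
    print("GPU is idle")
if reasons & pynvml.nvmlClocksThrottleReasonSwPowerCap:
    print("Clocks capped by software power limit")
if reasons & pynvml.nvmlClocksThrottleReasonHwSlowdown:
    print("Hardware slowdown (thermal or power brake)")

pynvml.nvmlShutdown()
```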

Yes, I can see that your laptop shows the expected behavior in Rhino. At startup, while drawing windows and loading the core engine, it’s logical that more GPU resources are used; this is much like the behavior in Blender.
But my problem is that the GPU never clocks down or reduces its memory clock when idle. You can see that in my video.

Hmm, my “PerfCap Reason” is Idle all the time, but I doubt that Blender doesn’t use the GPU. If you look at my Blender video, the clock and memory rise the moment I begin to orbit in the viewport.

Thanks for your efforts by the way!