I need a script for poly-count shaming please

Hi All,

The problem:

We are working on some very large assemblies (no worries blokes, there are no blocks there, just layers!) of car stuff, and we are trying to decide how much we retopo (it’s all mesh data from car companies) to bring the entire file size down to a manageable polycount in VR.

In our experience we can run Unreal VR scenes working great with about 6 million polygons on good hardware (RTX cards), but things get less than ideal when our polycount goes up into the 10 million range. And they suck at 15-20 million.

The mathematically correct thing to do would be to retopo everything to end up with a total poly count at/below our threshold. The reality is that it takes a lot of time/money/labor to do that, so we are trying to see if, in some cases, we can get to what we need by only retopoing the parts/objects with the largest polycount.

The scripted solution:

Can we have a script that shows the X largest poly count objects in a file and tells us the cumulative poly count of those objects?

For example…

Script: “how many meshes do you want to shame?”
me: “10”
Script: “Here I have selected the 10 heaviest meshes in this file and their total polycount is 2,537,455 polygons”

The script should only work with visible/pickable objects. It should omit locked objects/locked layers and hidden objects/hidden layers.
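In rough pseudo-code, something like this is what I have in mind (just a sketch assuming rhinoscriptsyntax, totally untested):

```python
# Sketch only: select the N heaviest visible, unlocked meshes and report the total.
import rhinoscriptsyntax as rs

def shame_meshes():
    n = rs.GetInteger("How many meshes do you want to shame?", 10, 1)
    if n is None:
        return

    # Gather visible, unlocked meshes on visible, unlocked layers (32 = mesh).
    candidates = []
    for obj_id in rs.ObjectsByType(32) or []:
        if rs.IsObjectHidden(obj_id) or rs.IsObjectLocked(obj_id):
            continue
        layer = rs.ObjectLayer(obj_id)
        if not rs.IsLayerVisible(layer) or rs.IsLayerLocked(layer):
            continue
        candidates.append((rs.MeshFaceCount(obj_id), obj_id))

    # Heaviest first; select the top N and report their cumulative polycount.
    candidates.sort(key=lambda c: c[0], reverse=True)
    top = candidates[:n]
    rs.UnselectAllObjects()
    rs.SelectObjects([obj_id for _, obj_id in top])
    total = sum(faces for faces, _ in top)
    print("Selected the {} heaviest meshes; their total polycount is {:,} polygons".format(len(top), total))

shame_meshes()
```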

Any takers?

Thanks!

Gustavo

PS: you will be rewarded with cookies, coffee, booze, etc. if you are a McNeel contributor, or with money as a form of appreciation (PayPal/Apple Cash/etc.) if you are not.

Hi Gustavo - I might have something… hold on a bit.

-Pascal

We’ll never know who saw what before posting… :rofl:

See what this is worth -

FindHighPolyCount.py (5.3 KB)

It asks for a poly count threshold - not quite the same but it should be possible to tweak pretty easily, I’ll take a look.

To use the Python script use RunPythonScript, or a macro:

_-RunPythonScript "Full path to py file inside double-quotes"

Here’s another - it works on estimated memory use -

FindLargestObjects.py (4.0 KB)
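If anyone wants to roll their own: RhinoCommon exposes a per-object MemoryEstimate(), so the ranking part is only a few lines. A sketch of the general idea, not the attached script:

```python
# Sketch: rank document objects by RhinoCommon's estimated memory use.
import scriptcontext as sc

def largest_by_memory(n=10):
    # RhinoObject.MemoryEstimate() returns an estimated byte count per object.
    pairs = [(int(obj.MemoryEstimate()), obj.Id) for obj in sc.doc.Objects]
    pairs.sort(key=lambda p: p[0], reverse=True)
    for size, obj_id in pairs[:n]:
        print("{}: ~{:,} bytes".format(obj_id, size))

largest_by_memory()
```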

I guess you want the offspring of these two scripts. They’re in the same folder. I haven’t looked in there for a long time. It’s not like they did not have the opportunity.

Actually the second one may be it - it offers to label with the polygon count, for shaming.

-Pascal

The first one is too accusatory. No one wants to shame a big-boned mesh all by itself. Imagine if an elevator’s alarm said: “The person with the blue shirt and the bucket of Frappuccino needs to get out!”

Largest object (by physical size?) is also not good; we need to know the largest-polycount objects.

Now that I think more about this… if we want to shame 40 objects it should say: “All visible pickable objects are 12,xxx,xxx polygons. The 40 selected objects are 2,5xx,xxx polygons.” …So we know that no matter how much we retopo/optimize those 40 objects we will still be over 9.5M total polygons, so we need to shame more of them. Makes sense?
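In other words, the best case after retopo is the total minus the selected, so (with the made-up numbers above):

```python
# Hypothetical numbers from the example above.
total_visible = 12000000  # all visible/pickable meshes
selected = 2500000        # the 40 shamed meshes
budget = 6000000          # comfortable VR polycount on RTX hardware

# Even if the 40 selected meshes were retopoed down to zero polygons,
# the rest of the file would still weigh:
floor = total_visible - selected  # 9,500,000 -- still way over budget
if floor > budget:
    print("Shame more meshes: best possible case is {:,} polygons".format(floor))
```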

Thanks!

G

What is there now:

The first one selects all meshes above whatever polycount you set.

The second one selects the X (you set it) largest memory-use objects it can find. This should correspond to poly count.

Is that not it, pretty much? I know, more info…

@gustojunk - see how this works. Probably should have started from scratch; messing with an existing script makes a big mess, but it seems to work…

FindLargestMeshes.py (5.1 KB)

-Pascal

How about this shamer tool:

The visible, not-locked meshes are listed highest facecount first (I think quads count for 1 and not 2).
Right number is the facecount.
Left number is the running total, added up starting at the bottom.
Select the meshes you want to see selected in Rhino and hit OK.
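Roughly, the listing is built like this (a sketch of the idea, not the attached file):

```python
# Sketch: list meshes heaviest-first with a bottom-up running total,
# then select whatever the user picks from the list.
import rhinoscriptsyntax as rs

def mesh_shamer():
    meshes = []
    for obj_id in rs.ObjectsByType(32) or []:  # 32 = mesh
        if rs.IsObjectHidden(obj_id) or rs.IsObjectLocked(obj_id):
            continue
        meshes.append((rs.MeshFaceCount(obj_id), obj_id))
    meshes.sort(key=lambda m: m[0], reverse=True)  # heaviest first

    # Left column: running total added from the bottom; right column: facecount.
    totals, running = [], 0
    for faces, _ in reversed(meshes):
        running += faces
        totals.append(running)
    totals.reverse()
    items = ["{:>12,}  {:>10,}".format(cum, faces)
             for (faces, _), cum in zip(meshes, totals)]

    picked = rs.MultiListBox(items, "Pick the meshes to select", "Mesh shamer")
    if picked:
        rs.UnselectAllObjects()
        rs.SelectObjects([meshes[items.index(p)][1] for p in picked])

mesh_shamer()
```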

mesh_shamer.py (1.4 KB)

-Willem

P.S. I think the ReduceMesh tool is pretty good at decimating dense meshes.
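In case someone wants to script that too: RhinoCommon’s Mesh.Reduce is presumably the machinery behind that command. A rough, untested sketch with a made-up target count:

```python
# Sketch: decimate a picked mesh to a target face count via Mesh.Reduce.
import rhinoscriptsyntax as rs
import scriptcontext as sc

def reduce_picked(target_faces=10000):  # target is made up; tune to taste
    obj_id = rs.GetObject("Select mesh to decimate", rs.filter.mesh)
    if obj_id is None:
        return
    mesh = rs.coercemesh(obj_id).Duplicate()
    # Reduce(desiredPolygonCount, allowDistortion, accuracy 1-10, normalizeSize)
    if mesh.Reduce(target_faces, False, 10, False):
        sc.doc.Objects.Replace(obj_id, mesh)
        sc.doc.Views.Redraw()

reduce_picked()
```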

What? muscling in on my beer??

-Pascal

Gustavo,

If I get the prize, can I make it beer for Pascal?

wait wait, I’m not done yet!

-Pascal

The playing field is all yours.

It just got past midnight here so I need to get some sleep before my 2 yo wakes us up at 6:30 again.

“g’night”
-Willem

@pascal,
You are right, the 2-step program might be a better approach. We first select all the heavy meshes. Then we focus-shame only among those. I’ll test it when I’m back in the office.

…but then I see @Willem’s approach and oh man. That’s so revealing and shaming it’s downright poly-abusing!

Since this is trophy generation, I declare you both winners. I need to figure out how to ship something to Willem now that I’m back in the States :slight_smile:

My team will be happy with this.

Thanks!

G

@gustojunk - just to make sure you saw it - I posted a third script above -

https://discourse.mcneel.com/uploads/short-url/3uY4hYNUAIzkHsv6cYGEhRfY60Z.py

The code is a twisted pile of what’s-it, but I think it does what I think you asked - it selects the X heaviest meshes and reports their poly count and the poly count of all the rest. Note it will select the top X polygon counts - if more than one mesh has the same count then the total selected will be more than you asked for.
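To illustrate that tie behavior with made-up numbers:

```python
# Hypothetical data: take the X highest face counts, then select every mesh
# whose count is in that set -- ties can push the selection past X items.
def select_top_counts(meshes, x):
    top_counts = set(sorted({fc for fc, _ in meshes}, reverse=True)[:x])
    return [m for m in meshes if m[0] in top_counts]

demo = [(500, "a"), (500, "b"), (300, "c"), (100, "d")]
print(select_top_counts(demo, 2))  # 3 meshes: the two 500s tie for first place
```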

-Pascal

Hi Gustavo,

After some sleep I wanted to make the data a little better by adding percentages.

mesh_shamer_percent.py (1.7 KB)

What I meant earlier was to give my prize to Pascal, so no need to ship to Europe, just to the West Coast.

enjoy!

-Willem
