Does this mean it could be an option under -ViewCaptureToFile ?
Hi @ThomasAn_, I just made an image -> dithered pointcloud script:
This is easily changed to harvest the viewport frame.
Calculates in a few seconds.
Do you think a pointcloud could be the way to go, or is a bitmap the best output?
It isn't "difficult" to do that either.
EDIT: Here is a viewport -> pointcloud modified version.
Yeah, I like it! My new render settings will be 1 spp.
Ideally the point distribution would be a bit more even. I've tried by projecting blue noise points on the geometry.
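For anyone curious, a "blue noise" distribution like the one mentioned above can be sketched with simple dart throwing: accept a random point only if it keeps a minimum distance to every point accepted so far. This is a minimal illustration (the `blue_noise_points` helper is hypothetical, not from the script in this thread; real implementations usually use Bridson's Poisson-disc algorithm for speed):

```python
import random

def blue_noise_points(n, min_dist, width=1.0, height=1.0, max_tries=10000):
    """Dart-throwing blue noise: candidates closer than `min_dist`
    to any accepted point are rejected, giving an even spread."""
    points = []
    tries = 0
    while len(points) < n and tries < max_tries:
        tries += 1
        p = (random.uniform(0, width), random.uniform(0, height))
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= min_dist ** 2
               for q in points):
            points.append(p)
    return points

pts = blue_noise_points(50, 0.05)
```

The rejection test makes this O(n²) per candidate, which is fine for a few thousand points but too slow for dense stippling in real time.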
I see chairs and a table.
Another way if you just want to show the shape rather than the shadows can be to remesh and turn on vertex display
That's cool!
And great for close-up objects (AND MOST PATENT DRAWINGS ARE!), but distant ones will occlude and turn black(er), and the same goes for planes: they will turn darker with distance, since the dot size is uniform.
Yes, distributing points on the shapes in 3d is definitely not the same thing as distributing them on the 2d shaded image, since it doesn't give any control over the maximum density when projected.
A proper real time stippling shader (as opposed to dithering) operating on the rendered image would be really nice.
This seems wicked fast, clean and detailed, with a fixed noise so the image doesn't flicker in real time.
Ah, cool, but you are using 3 colors instead of 2?
Right, I tweaked it a bit, but just as easily have it do just black and white.
I believe in order to do nice looking dithering, we would need to perform a post process on the final image produced. Otherwise things would not look correct at higher resolution tiled viewcapture output.
The idea is along the right lines, but it can't have points clumping together. That's what Floyd-Steinberg tries to achieve and Stucki improves on. For patent drawings there is also a step of lightening the image, to prevent dark areas from causing excessive point density.
No biggie though, we managed for years without Rhino supporting this. It won't matter now.
I understand.
Clumping can easily be reduced by adjusting either the point size or the density of the points in the image, and mapping black to 90% is no problem either. So shout out if you want to give it a go.
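The lightening step mentioned above for patent drawings ("black = 90%") amounts to compressing the dark end of the tone range before dithering, so pure black never produces 100% dot coverage. A minimal sketch, assuming grayscale values in 0..1 where 0 is black (the `lighten` helper is hypothetical, not from the script in this thread):

```python
def lighten(gray, black_level=0.9):
    """Remap luminance 0..1 so pure black only reaches `black_level`
    of the maximum darkness, i.e. at most ~90% dot coverage."""
    return 1.0 - black_level * (1.0 - gray)
```

Applied per pixel before the dither pass, this keeps the densest stippled regions from filling in solid.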
Here's a post-process using Floyd-Steinberg dithering. Definitely too slow to be used for real-time display. I think if GhGL had compute shader support, then we could make this realtime, but that's a pretty big project to take on. I would also smooth out those chunky pixels into nice anti-aliased circles.
Haha! This reminds me of Monkey Island from '92.
Good fun, jesterking!
Fantastic effort. Thank you for it !
I doubt a real-time viewport is needed for this. As long as -ViewCaptureToFile can produce it, it's good.
Also, Floyd-Steinberg produces these streaks of straight lines; that's why Stucki is better.
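The Stucki variant differs only in the diffusion kernel: the error is spread over a wider footprint and two rows below (divisor 42), which breaks up the diagonal "worm" streaks Floyd-Steinberg tends to produce. A sketch under the same conventions as above (grayscale rows of floats in 0..1; the `stucki` helper is illustrative, not code from this thread):

```python
# (dx, dy, weight) taps of the Stucki kernel; weights sum to 42.
STUCKI = [(1, 0, 8), (2, 0, 4),
          (-2, 1, 2), (-1, 1, 4), (0, 1, 8), (1, 1, 4), (2, 1, 2),
          (-2, 2, 1), (-1, 2, 2), (0, 2, 4), (1, 2, 2), (2, 2, 1)]

def stucki(img):
    """1-bit Stucki error-diffusion dithering on rows of 0..1 floats."""
    h, w = len(img), len(img[0])
    px = [row[:] for row in img]
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            new = 1.0 if px[y][x] >= 0.5 else 0.0
            out[y][x] = int(new)
            err = px[y][x] - new
            for dx, dy, wgt in STUCKI:
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h:
                    px[ny][nx] += err * wgt / 42
    return out
```

The wider kernel costs roughly three times as many adds per pixel, which matters if this ever ends up in a real-time path.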
This might appear like a really dumb question to many of you, but isn't the view we have on our screen a projection onto a plane surface? Wouldn't that mean that these effects don't necessarily need to be applied to the 3d shape, but only to the calculated 2d image on the screen?