I’ve just updated the drivers, clean installation. No luck.
I’ve tried the same on RH7 on a full-Intel laptop:
With antialiasing on, PointSize values from 1 to 16 are displayed correctly; anything bigger than 16 is shown as 16.
With antialiasing off, the point is displayed as a perfect square (like you just said); PointSize values from 1 to 255 are correct, and anything greater is shown as a fixed 255.
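For reference, the clamping behavior described above can be sketched in a few lines. The maxima (16 with antialiasing, 255 without) are just the values observed on this particular driver, not OpenGL guarantees — drivers report their own limits via GL_SMOOTH_POINT_SIZE_RANGE / GL_ALIASED_POINT_SIZE_RANGE:

```python
def clamp_point_size(requested, antialiasing):
    """Mimic the driver-side clamping observed above.

    The maxima are the limits this particular driver reported
    (assumed values, not a spec guarantee).
    """
    max_size = 16.0 if antialiasing else 255.0
    return min(max(float(requested), 1.0), max_size)

print(clamp_point_size(32, True))    # -> 16.0 (antialiased maximum)
print(clamp_point_size(300, False))  # -> 255.0 (aliased maximum)
```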
I will be giving another presentation in two hours on creating billboard effects for GhGL (10 am Pacific time). Please send me a PM if you want an invite. I will post a recording of this later today.
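For anyone who can't make the presentation: the core of a billboard effect is expanding each point into a camera-facing quad. A minimal sketch of the corner math (pure Python with illustrative names; in GhGL this would live in a shader, where the camera right/up vectors come from the view matrix):

```python
def billboard_corners(center, cam_right, cam_up, size):
    """Return the four corners of a camera-facing quad around `center`.

    `cam_right` and `cam_up` are the camera's right/up vectors in
    world space (assumed unit length).
    """
    h = size / 2.0

    def offset(p, a, sa, b, sb):
        return tuple(p[i] + sa * a[i] + sb * b[i] for i in range(3))

    return [
        offset(center, cam_right, -h, cam_up, -h),  # lower-left
        offset(center, cam_right,  h, cam_up, -h),  # lower-right
        offset(center, cam_right,  h, cam_up,  h),  # upper-right
        offset(center, cam_right, -h, cam_up,  h),  # upper-left
    ]

corners = billboard_corners((0, 0, 0), (1, 0, 0), (0, 1, 0), 2.0)
```

Because the quad is built from the camera's own axes, it always faces the viewer regardless of how the view rotates.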
Yes, we can do quite a bit more than what can be done with ShaderToy, since we aren’t limited by WebGL and can work with any sort of geometry generated from Grasshopper. I’m trying to get everything set up so you can install some GhGL-based logic as a “preview component” and just use it directly without having to know how to program it.
Great! I really hope that in the future it can be fully utilized both by shader maniacs and by less experienced “regular” GH users. I’ve already tested some files posted here and there.
You really drew my attention,
I wish you the best implementing it.
Is there any way to output the Light Direction and/or RGBA values back into grasshopper? I’m interested in using the values (in camera & world coordinate space) to inform geometry downstream that can then be used to output 3D Geo or 2D vectors.
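The camera-space part of that request is just a rotation of the world-space direction by the 3x3 rotation block of the view matrix (translation doesn't apply to directions). A rough sketch of the math in pure Python, with illustrative names — actually fetching the viewport's camera axes from Rhino is a separate step:

```python
import math

def world_to_camera_dir(view_rotation, dir_world):
    """Rotate a world-space direction into camera space and normalize it.

    `view_rotation` is the 3x3 rotation part of the view matrix,
    given as a row-major list of lists (assumed orthonormal).
    """
    d = [sum(view_rotation[i][j] * dir_world[j] for j in range(3))
         for i in range(3)]
    n = math.sqrt(sum(c * c for c in d))
    return [c / n for c in d]

# With an identity view rotation, camera space equals world space:
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```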
After seeing some videos about shader creation, I must admit that it looks fantastic.
Thanks @jho for awesome videos on youtube.
I’ve got a (maybe silly) question. Is it possible that custom shader creation could be done solely from nodes? I’m talking about some sort of GH plugin (with, let’s say, 20 components). Or is it impossible to translate that coding language into nodes?
EDIT:
Oh, @stevebaer, you just answered this question a second ago, I guess.
Yes, that would work, because then we could feed the bitmap through the Image Sampler to obtain numerical values. However, that leads me to the question: would the bitmap be ‘live’ and/or parametric?
Honestly, I lack the GLSL/OpenGL comprehension to know whether a live output of screen color/shade values is easily attainable.
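If the shader output did land in a bitmap, turning it back into numbers is the easy part — the Image Sampler is essentially a coordinate-to-color lookup. A pure-Python sketch with a nested-list stand-in for the bitmap (hypothetical function, not GH's actual implementation):

```python
def sample_bitmap(pixels, u, v):
    """Nearest-neighbor sample of a bitmap at normalized (u, v) in [0, 1].

    `pixels` is a list of rows, each row a list of RGBA tuples.
    This mirrors the idea of mapping a coordinate to a color value.
    """
    height = len(pixels)
    width = len(pixels[0])
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return pixels[y][x]

bitmap = [[(255, 0, 0, 255), (0, 255, 0, 255)],
          [(0, 0, 255, 255), (255, 255, 255, 255)]]
```

The open question above — whether that bitmap updates live as the shader renders — is the hard part; the sampling itself is trivial.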
I’m not sure how this would work either. It seems like we would need to specify a single viewport for the image to be generated from, as GhGL currently previews in all viewports.