Kernel smoothing of mesh

Here are some screenshots of (serious) work I am doing at the moment. The idea is to smooth Monte Carlo simulations: a mesh representing a density (purple) is smoothed via a kernel (in red), which gives the blue mesh. I think the “land” is quite pretty. The Z axis is logarithmic, which explains the gaps seen in the render.

The same data without the logarithmic Z scale:


Nice work! Do you have an example of how this works on a simple polyline?

No, I haven’t. It could certainly be done on curves, for example on the z values. I don’t have an idea for general polylines.
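As a rough illustration of the “on z” idea mentioned above, here is a minimal, hypothetical sketch (not the author’s code) of smoothing only the z coordinate of a polyline with a Gaussian kernel, using plain coordinate tuples instead of Rhino types:

```python
import math

def smooth_polyline_z(points, kernel_size):
    """Gaussian-kernel smoothing of the z coordinate of a polyline.

    points: list of (x, y, z) tuples; kernel_size: bandwidth in model units.
    Returns a new list of points whose z values are weighted averages of
    all z values, weighted by a Gaussian of the xy distance.
    """
    smoothed = []
    for xi, yi, zi in points:
        weight_sum = 0.0
        z_sum = 0.0
        for xj, yj, zj in points:
            dist_sq = (xj - xi) ** 2 + (yj - yi) ** 2
            w = math.exp(-dist_sq / (2.0 * kernel_size ** 2))  # Gaussian kernel
            z_sum += w * zj
            weight_sum += w
        smoothed.append((xi, yi, z_sum / weight_sum))
    return smoothed

# A zigzag polyline: smoothing damps the oscillation of z
line = [(float(i), 0.0, (-1.0) ** i) for i in range(6)]
print([round(z, 3) for _, _, z in smooth_polyline_z(line, 1.0)])
```

Because each output z is a normalized weighted average, a constant polyline is left unchanged and an oscillating one is flattened toward its local mean.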

very cool! Is the source code available? Can you apply the algorithm to other types of fuzzy data? Smoothing geographic topography, heat-maps, etc?

The source code is specific to my problem. It is very simple: just implement the equation presented in the link. You need to measure a distance and apply the Gaussian kernel.
When I am near my PC I’ll post the code. 5 lines …

Here is the code

/// <summary>
/// See https://en.wikipedia.org/wiki/Kernel_(statistics)
/// </summary>
/// <param name="arg_meshECEF">Mesh whose vertices carry the data</param>
/// <param name="arg_tab_values">Values to be smoothed at each mesh vertex</param>
/// <param name="arg_KernelSize">Size of the kernel, same unit as the mesh</param>
/// <param name="arg_logInterpolation">If true, arg_tab_values are smoothed using their log values</param>
/// <returns>Array of smoothed data</returns>
public static double[] FilterGaussianKDE(Mesh arg_meshECEF, double[] arg_tab_values, double arg_KernelSize, bool arg_logInterpolation)
{
    if (arg_meshECEF.Vertices.Count != arg_tab_values.Length)
    {
        throw new Exception("KDE filter: the mesh and the tabular data do not have the same number of points");
    }
    double[] filteredData = new double[arg_meshECEF.Vertices.Count];
    int i = 0;
    foreach (Point3d ptECEF in arg_meshECEF.Vertices)
    {
        int j = 0;
        double sumOfKernel = 0.0;
        foreach (Point3d pt2ECEF in arg_meshECEF.Vertices)
        {
            double distSquared = pt2ECEF.DistanceToSquared(ptECEF);
            double kernel = 0.0;
            if (arg_KernelSize > 1e-6)
            {
                kernel = Math.Exp(-distSquared / (2 * arg_KernelSize * arg_KernelSize)); // Gaussian kernel
            }
            else if (distSquared < 1e-6)
            {
                kernel = 1.0; // degenerate bandwidth: only the point itself contributes
            }
            if (arg_logInterpolation)
            {
                filteredData[i] += Math.Log10(arg_tab_values[j]) * kernel;
            }
            else
            {
                filteredData[i] += arg_tab_values[j] * kernel;
            }
            sumOfKernel += kernel;
            j++;
        }
        if (arg_logInterpolation)
        {
            filteredData[i] = Math.Pow(10, filteredData[i] / sumOfKernel);
        }
        else
        {
            filteredData[i] = filteredData[i] / sumOfKernel;
        }
        i++;
    }
    return filteredData;
}
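For readers without Rhino installed, here is a hypothetical Python transcription of the same double loop on plain 2D points (names are mine, not from the original), including the log-interpolation branch where values are averaged in log10 space and converted back:

```python
import math

def gaussian_kde_filter(points, values, kernel_size, log_interpolation):
    """Python sketch of the Gaussian KDE filter above on plain 2D points.

    points: list of (x, y); values: positive value at each point.
    When log_interpolation is True, values are averaged in log10 space
    and then converted back with 10 ** mean.
    """
    out = []
    for xi, yi in points:
        acc = 0.0
        weight_sum = 0.0
        for (xj, yj), vj in zip(points, values):
            dist_sq = (xj - xi) ** 2 + (yj - yi) ** 2
            if kernel_size > 1e-6:
                w = math.exp(-dist_sq / (2.0 * kernel_size ** 2))  # Gaussian kernel
            elif dist_sq < 1e-6:
                w = 1.0  # degenerate bandwidth: only the point itself contributes
            else:
                w = 0.0
            acc += (math.log10(vj) if log_interpolation else vj) * w
            weight_sum += w
        if log_interpolation:
            out.append(10.0 ** (acc / weight_sum))
        else:
            out.append(acc / weight_sum)
    return out

pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
vals = [1.0, 100.0, 1.0]
# In log space, the spike at the middle point is pulled down geometrically
print(gaussian_kde_filter(pts, vals, 1.0, True))
```

With a near-zero bandwidth the kernel degenerates to the identity and the input values come back unchanged, which is a handy sanity check.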

How do you apply the double array to the mesh?

Is it:
mesh.Vertices[i] = mesh.Vertices[i]+mesh.Normals[i]*filteredData[i];

I do the calculation on a flat mesh, and the picture shows a mesh whose Z is displaced depending on arg_tab_values[i] or filteredData[i]. It is a sort of 2.5D mesh; I don’t use a full 3D mesh in the calculations, in order not to duplicate meshes.
To do this on a general 3D mesh you need a distance function between points and a weight associated with each vertex.
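The displacement step for the flat-mesh case can be sketched like this (a hypothetical illustration, not the author’s code): for a flat mesh every vertex normal is (0, 0, 1), so moving each vertex along its normal by `filteredData[i]`, as suggested in the question above, reduces to writing the value into z.

```python
def displace_along_normal(vertices, normals, offsets):
    """Move each vertex along its unit normal by the given offset.

    vertices, normals: lists of (x, y, z) tuples; offsets: list of floats.
    """
    return [
        (x + nx * d, y + ny * d, z + nz * d)
        for (x, y, z), (nx, ny, nz), d in zip(vertices, normals, offsets)
    ]

flat = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]  # flat mesh vertices
up = [(0.0, 0.0, 1.0)] * 2                 # normals of a flat mesh
print(displace_along_normal(flat, up, [2.5, 4.0]))
# → [(0.0, 0.0, 2.5), (1.0, 0.0, 4.0)]
```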

This is cool indeed.
I think it is similar to the methods used in Sandworm;
see the video of how it smooths a 200k-face mesh in 20 ms:

Cool, is this done within a Grasshopper C# environment?

Yes, and it is also implemented in a piece of software built in Visual Studio.

Cool, can you share the full script? Is that it?

I can’t share it. I also think it would be useless, as it is very specific.

Thanks for sharing,

Out of interest: does kernel smoothing relate to this?
Robust Fairing via Conformal Curvature Flow

There is a difference in complexity between the two, and also a difference in purpose, so in my opinion they are not related. Kernel smoothing is just an operator, like addition or multiplication.