I rewrote your code in C# and it was like 20x faster. I'd recommend trying to avoid dict, as they are quite slow.
Tried many of those… I'm using a bit of Poisson here with my own multithreaded twist.
Nope, mine is C#. I'm not aware of the Python overhead, but it may also lie in the data structures; @gankeyu points out exactly that some of those are causing unnecessary and big overheads. Rule of thumb: wherever you can, use a finite array of elements, and you'll be surprised how much it can speed things up. Dynamic allocation of memory just has its time cost.
Not true, at least in Python: especially if you do a lot of searching, dictionaries are far faster than and superior to lists or tuples, especially if you add new values on the fly. Lookups in lists are O(n) and in dictionaries O(1) on average. Sets can be even faster, but you can't easily associate data with other data. Dictionaries and sets are hash tables, which are fast but memory hungry.
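For anyone following along, here is a minimal Python sketch of that difference (the values are illustrative, not from the OP's code):

```python
# Membership lookup: a list is scanned element by element (O(n)),
# while dicts and sets hash the key directly (O(1) on average).
n = 100_000
as_list = list(range(n))
as_dict = {i: i * i for i in range(n)}  # hash table mapping key -> data
as_set = set(as_list)                   # hash table with keys only

target = n - 1
assert target in as_list                   # linear scan, slow for big n
assert target in as_dict                   # hash lookup, fast
assert as_dict[target] == target * target  # dicts associate data with keys
assert target in as_set                    # sets: fast, but no associated data
```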
True, and I'd even add one-dimensional, but…
… as I understand it, it's not the dynamic allocation that's the problem in this case. Dynamic allocation refers to storing data in heap memory (dynamic memory), as opposed to on the stack. Btw, in C# nearly everything is heap allocated.
The copying is the issue here. Fixed-size arrays stay at a fixed memory address, and a fixed amount of space has been pre-allocated for the data type to be stored. Other data structures, like lists, vectors, and so forth, that you can add to on the fly, allocate only so much memory when they are instantiated. While adding elements, once that memory has been filled, the entire data structure needs to be copied to another memory location where more space is available, and this copying takes a lot of time and happens over and over the more items you add.
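That growth-and-copy behaviour is easy to observe; here is a small Python sketch (CPython's list over-allocates spare capacity, and the exact capacities are an implementation detail):

```python
import sys

# Append one element at a time and record the allocated size: it jumps
# only occasionally, because the list over-allocates spare capacity and
# copies everything to a bigger block when that capacity runs out.
sizes = []
items = []
for i in range(64):
    items.append(i)
    sizes.append(sys.getsizeof(items))

growth_events = sum(1 for a, b in zip(sizes, sizes[1:]) if b > a)
# Far fewer reallocations than appends: the copying cost is amortized.
assert 0 < growth_events < len(items)
```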
Perfectly explained. My explanation just wasn't precise enough.
What GUI is that you're showing above in your preview?
Oh, now I see where the highly optimised algorithms come into play! Looks promising!!
dict is faster if you are doing an arbitrary-key lookup. But the OP was utilizing dict as a way to store sequential data. He used 0, 1, 2, … as keys, in which situation list is almost always a better alternative.
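A hypothetical Python sketch of that point (not the OP's actual code): with keys 0, 1, 2, …, a list gives the same lookups without hashing, because the index already is the key.

```python
# Sequential integer keys: the dict adds hashing overhead for nothing,
# since a list index already *is* the key.
points_dict = {i: (i * 0.5, i * 0.25) for i in range(1000)}
points_list = [(i * 0.5, i * 0.25) for i in range(1000)]

assert points_dict[42] == points_list[42]          # same lookup result
assert list(points_dict.values()) == points_list   # same order, same data
```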
Btw, in C# nearly everything is heap allocated.
Yes. The issue here is that in the OP's code, tons of lists are created dynamically, which is a big performance overhead.
The OP's code is a mess! I took a quick peek, but quickly gave up. I mean no offense @mikity_kogekoge, but you should take a look at PEP 8.
My code is a good code always.
The dictionary that has sequential numbers starting from 0 as its keys is used to make merging the results after a parallel computation easier.
What the code is doing is:
- compute a "comfortable" distance between points
- divide the region into cells whose edge is the comfortable distance, and store each point in its cell
- create point->cell and cell->point data structures (done by dictionary)
- compute forces between the points, but only for points in adjacent cells
- update the positions by looking at the forces
(repeat 20 times)
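Those steps can be sketched like this (a simplified Python version with made-up point data; the real code also handles the boundary and runs the force pass in parallel):

```python
import math
from collections import defaultdict

def relax_step(points, ee):
    """One iteration: bin points into ee-sized cells, then push each point
    away from neighbours found in its own and the 8 adjacent cells."""
    cells = defaultdict(list)   # cell -> points in it (cell->point)
    keys = []                   # point index -> its cell (point->cell)
    for p in points:
        key = (math.floor(p[0] / ee), math.floor(p[1] / ee))
        cells[key].append(p)
        keys.append(key)

    new_points = []
    for p, (cx, cy) in zip(points, keys):
        fx = fy = 0.0
        for du in (-1, 0, 1):
            for dv in (-1, 0, 1):
                for q in cells.get((cx + du, cy + dv), ()):
                    d = math.hypot(p[0] - q[0], p[1] - q[1])
                    if 0 < d <= ee:                 # repel only nearby points
                        w = (1 - d / ee) ** 2
                        fx += (p[0] - q[0]) * w
                        fy += (p[1] - q[1]) * w
        new_points.append((p[0] + 0.7 * fx, p[1] + 0.7 * fy))
    return new_points

pts = [(0.1, 0.1), (0.2, 0.1), (3.0, 3.0)]   # two crowded points, one far away
moved = relax_step(pts, ee=1.0)
```

Because only the 3x3 block of neighbouring cells is scanned, each iteration stays roughly linear in the number of points instead of quadratic.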
I rewrote the code in C# and it became 10x faster!!
500pts 0.37s
5000pts 1.9s
50000pts 25.5s
500000pts 5.3m
sampling_C#.gh (6.2 KB)
The algorithm was slightly improved especially around the boundary edge.
I'm not familiar with parallelism in Python, but at least you don't quite need to handle it manually in C#. Let me go through your code and make it faster.
Oh, yes please!
Slightly faster; still much room for improvement. Curve operations are definitely the bottlenecks.
protected override void SolveInstance(IGH_DataAccess DA)
{
unchecked
{
Curve crv = null;
var N = 0;
var iteration = 20;
DA.GetData(0, ref crv);
DA.GetData(1, ref N);
DA.GetData(2, ref iteration);
var sw = new Stopwatch();
sw.Start();
// Build a planar surface from the boundary curve.
var srf = Brep.CreatePlanarBreps(crv, 0.0001)[0];
var rnd = new System.Random();
var face = srf.Faces[0];
var domU = face.Domain(0);
int ss = 100;
var AM = AreaMassProperties.Compute(srf);
var area = AM.Area;
// dd: spacing for the dense initial sampling; ee: the "comfortable" distance,
// also used as the edge length of the spatial grid cells.
double dd = Math.Sqrt(area / (N * ss));
ee = Math.Sqrt(area / ((double)N) * 4d / Math.Sqrt(3d)) * 1.1d;
int NN = (int)(Math.Sqrt(N * ss));
var pts = new List<Point3d>();
_crv = crv;
// Seed candidate points along trim-aware isocurves, then pick N at random.
for (var i = 0; i < NN; i++)
{
double t = ((double)i) / ((double)NN);
double u = domU[0] + (domU[1] - domU[0]) * t;
var cc = face.TrimAwareIsoCurve(1, u);
for (var j = 0; j < cc.Length; j++)
{
var ccc = cc[j];
double L = ccc.GetLength();
int NNN = (int)(L / dd);
if (NNN > 0)
{
pts.AddRange(ccc.DivideByCount(NNN, false).Select(ccc.PointAt));
}
}
}
ppp = pts.OrderBy(x => rnd.NextDouble()).Take(N).ToArray();
// dict1: cell -> points in that cell; dict2: point index -> its cell key.
dict1 = new Dictionary<(int, int), List<Point3d>>();
dict2 = new (int, int)[N];
for (var i = 0; i < N; i++)
{
var p = ppp[i];
int xx = (int)Math.Floor(p.X / ee);
int yy = (int)Math.Floor(p.Y / ee);
var key = (xx, yy);
if (dict1.TryGetValue(key, out var list))
{
list.Add(p);
}
else
{
dict1[key] = new List<Point3d>() { p };
}
dict2[i] = key;
}
var boundary = new Dictionary<(int, int), bool>();
foreach (var key in dict1.Keys)
{
boundary[key] = test(key.Item1, key.Item2);
}
// Relaxation loop: compute repulsion forces in parallel, then move the points.
for (var t = 0; t < iteration; t++)
{
var dict3 = ParallelEnumerable.Range(0, N).AsOrdered().Select(F1).ToArray();
for (var i = 0; i < N; i++)
{
var V = dict3[i];
var p = ppp[i];
var key = dict2[i];
dict1[key].Remove(p);
Point3d newp = p + V * 0.7;
if (boundary[key])
{
var PC = _crv.Contains(newp, Plane.WorldXY, 0.0001);
if (PC == PointContainment.Inside)
{
ppp[i] = newp;
}
else
{
double tt;
_crv.ClosestPoint(p, out tt, 1000);
var T = _crv.TangentAt(tt);
T = new Vector3d(T.Y, -T.X, 0);
T.Unitize();
V -= (T * V) * T;
newp = p + V * 0.7;
ppp[i] = newp;
}
}
else
{
ppp[i] = newp;
}
int xx = (int)Math.Floor(newp.X / ee);
int yy = (int)Math.Floor(newp.Y / ee);
key = (xx, yy);
if (dict1.TryGetValue(key, out var list))
{
list.Add(newp);
}
else
{
dict1[key] = new List<Point3d>() { newp };
}
dict2[i] = key;
}
foreach (var _key in dict1.Keys)
{
if (!boundary.ContainsKey(_key))
{
boundary[_key] = test(_key.Item1, _key.Item2);
}
}
}
sw.Stop();
DA.SetDataList(0, ppp);
DA.SetData(1, sw.ElapsedMilliseconds);
}
}
Point3d[] ppp;
Curve _crv;
Dictionary<(int, int), List<Point3d>> dict1;
(int, int)[] dict2;
double ee;
static readonly int[] jjj = new[] { 0, -1, 1 };
// True if any corner of the cell lies outside the curve, i.e. a boundary cell.
private bool test(int xx, int yy)
{
unchecked
{
double minx = (double)(xx) * (ee);
double maxx = (double)(xx + 1) * (ee);
double miny = (double)(yy) * (ee);
double maxy = (double)(yy + 1) * (ee);
var P = new Point3d(minx, miny, 0);
var Q = new Point3d(maxx, miny, 0);
var R = new Point3d(maxx, maxy, 0);
var W = new Point3d(minx, maxy, 0);
return _crv.Contains(P, Plane.WorldXY) != PointContainment.Inside
|| _crv.Contains(Q, Plane.WorldXY) != PointContainment.Inside
|| _crv.Contains(R, Plane.WorldXY) != PointContainment.Inside
|| _crv.Contains(W, Plane.WorldXY) != PointContainment.Inside;
}
}
// Repulsion force on point i from points in its own and the 8 adjacent cells.
Vector3d F1(int i)
{
unchecked
{
var p = ppp[i];
var key = dict2[i];
var xx = key.Item1;
var yy = key.Item2;
var V = Vector3d.Zero;
for (var iu = 0; iu < 3; iu++)
{
for (var iv = 0; iv < 3; iv++)
{
var u = jjj[iu];
var v = jjj[iv];
key = (xx + u, yy + v);
if (dict1.TryGetValue(key, out var cell))
{
for (var ii = 0; ii < cell.Count; ii++)
{
var q = cell[ii];
var dist = (p.DistanceTo(q));
if (dist <= ee)
{
var L = 1 - dist / ee;
V += (p - q) * (L * L);
}
}
}
}
}
return V;
}
}
Thanks gankeyu! This is very helpful.
It seems like I need to learn LINQ more. I didn't even know Take().
Btw, I guess this syntax is very new. I don't know if it works in the C# node inside GH…
And, heeey! What's this syntax!? It's like a Python tuple… I didn't know recent C# had it!
It's ValueTuple. Compared to Tuple, it's about 50% faster when used as a Dictionary key. I don't think GH's C# supports it.
Nope. You would need:
...
List<Point3d> list = null;
if (dict1.TryGetValue(key, out list))
...
Thank you! Thank you! I've been away from C# for a couple of years.
I've been using Python because writing it is faster than C#. But obviously the performance is not…
Is there somebody out there at McNeel who wants to buy this code?
500pts 0.37s
5000pts 1.9s
50000pts 25.5s
500000pts 5.3m
How about $10?
David Rutten is doing Poisson sampling for GH2, so I'm not sure McNeel needs another one.