Spatial Interpolation of Sensor Data

I have an experimental setup consisting of six illuminance sensors in a room.
From these sensors I would like to determine the percentage of the room's floor area that achieves an illuminance of 200 lux.

The sensors are on a LoRaWAN network, so I have managed (via The Things Network) to get the live data into Grasshopper using a webhook and an HTTP listener (credit to the Swiftlet plugin).

To get the most accurate floor-area reading I can, I am thinking I should:

  1. Represent the room with a finer grid in Grasshopper (something like 24 cells).
  2. Read in the six recorded illuminance values and attribute each to the correct cell in the grid.
  3. Estimate the illuminance values for the remaining cells via interpolation.
  4. Add up the areas of all cells where an illuminance above 200 lux is either measured or estimated.
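The steps above can be sketched in plain Python (e.g. inside a GhPython component). This is only a minimal illustration, assuming a rectangular room, inverse-distance weighting (IDW) as the interpolation method, and placeholder sensor positions and readings; your actual geometry and values would come from the Grasshopper grid and the Swiftlet feed.

```python
import math

def idw(x, y, sensors, power=2):
    """Inverse-distance-weighted illuminance estimate at (x, y)
    from a list of (sx, sy, lux) sensor tuples."""
    num = den = 0.0
    for sx, sy, lux in sensors:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return lux  # query point coincides with a sensor
        w = 1.0 / d2 ** (power / 2.0)
        num += w * lux
        den += w
    return num / den

def area_above_threshold(sensors, room_w, room_h, nx, ny, threshold=200.0):
    """Percentage of grid cells whose centre-point estimate
    is at or above the threshold illuminance."""
    cell_w, cell_h = room_w / nx, room_h / ny
    hits = 0
    for i in range(nx):
        for j in range(ny):
            x = (i + 0.5) * cell_w  # cell-centre coordinates
            y = (j + 0.5) * cell_h
            if idw(x, y, sensors) >= threshold:
                hits += 1
    return 100.0 * hits / (nx * ny)

# Hypothetical: six sensors in a 6 m x 4 m room, 6 x 4 = 24 grid cells
sensors = [(1, 1, 350), (3, 1, 280), (5, 1, 220),
           (1, 3, 180), (3, 3, 240), (5, 3, 150)]
print(area_above_threshold(sensors, 6.0, 4.0, 6, 4))
```

Since each cell has equal area here, counting qualifying cells and dividing by the total gives the percentage directly; for an irregular room you would sum the individual cell areas instead.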

Is anybody aware of an existing tool/plugin/workflow that could be used for this application?
I’d be very grateful if anybody could point me in the right direction on this.

Many thanks,

Rory

Hi @Rory_Walsh,

Ladybug Tools has a nice set of visualisation tools.

The easiest way to install the Grasshopper plugin is via the one-click Pollination installer.

For questions about the plugin you can go to the Ladybug forum: