Hey there!
I’ve been working on a project that produces massive amounts of data.
Obviously this is done in CPython (using Jupyter Notebook). After filtering the data, I have a list of dictionaries that I would like to export from JN into GH and read as-is in the Grasshopper Python component, in order to avoid creating heavy CSV files and reading them with GH tools, which is very heavy and time-consuming.
Is that possible? Can I import a .py file with a list, or maybe even a .txt file straight into GH and have the Python component read it?
You can serialise the data using the native pickle module:
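Something along these lines (a minimal sketch; the file path and sample data are just placeholders, and protocol=2 keeps the pickle readable from IronPython 2.7 in case your GH Python component runs that rather than CPython 3):

```python
# In Jupyter Notebook (CPython): dump the filtered list of dicts to disk.
import pickle

data = [{"id": 0, "value": 1.23}, {"id": 1, "value": 4.56}]  # your filtered data

with open(r"C:\temp\data.pkl", "wb") as f:
    pickle.dump(data, f, protocol=2)  # protocol 2 also loads in IronPython 2.7
```

```python
# In the GH Python component: load the list back, structure intact.
import pickle

with open(r"C:\temp\data.pkl", "rb") as f:
    data = pickle.load(f)  # same list of dicts as in Jupyter
```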
And using native json for a more universal solution:
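Same idea, again just a sketch with placeholder paths and data:

```python
# In Jupyter Notebook (CPython): write the list of dicts as JSON.
import json

data = [{"id": 0, "value": 1.23}, {"id": 1, "value": 4.56}]  # your filtered data

with open(r"C:\temp\data.json", "w") as f:
    json.dump(data, f)
```

```python
# In the GH Python component: read the JSON back into a list of dicts.
import json

with open(r"C:\temp\data.json") as f:
    data = json.load(f)
```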
Edit: Assuming Jupyter Notebook can send/host raw textual data at a URL, you can also read that directly (i.e. skip saving to a file). Here’s a quick example demonstrating how to read JSON from a URL and cast it to a Python dictionary/list:
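A minimal sketch (the localhost URL is just a placeholder for wherever the notebook serves the data; the try/except covers both a CPython 3 and an IronPython 2.7 GH component):

```python
# In the GH Python component: fetch and parse JSON straight from a URL.
import json

try:
    from urllib.request import urlopen  # CPython 3
except ImportError:
    from urllib2 import urlopen  # IronPython 2.7

url = "http://localhost:8000/data.json"  # placeholder address

data = json.loads(urlopen(url).read())  # back to a list of dicts
```

The reading happens on the GH side; how the notebook exposes the data at that URL (e.g. a small in-process web server) is up to you.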
This way you don’t have to write to a file, but I’m not sure how that will work with large data; you’ll probably need to segment it, though I didn’t get that far myself.
Also the server needs a bit of work if you want to have decent security.