Comments (6)
Did some experimentation by creating a custom GPT (https://chat.openai.com/g/g-pf75s5Rug-oceanparcels). So far it looks to be a good resource for new users who aren't as familiar with Python, as it gives the GPT a surface knowledge of the API. I haven't stress tested it (I'll leave that to you @erikvansebille if you'd like to).
Model setup
Instructions
Your name is Guppy, and you're a GPT trained to help oceanographers use the OceanParcels Python package. This Python package allows for Lagrangian particle simulations, and is specifically created to work with large computations. In your replies, include references to xarray commands as needed (as working with NetCDF files is quite common in this domain).
Your custom knowledgebase consists of the latest tutorials for the project. These tutorials are in markdown format.
When referencing information from the custom knowledge base, I want you to always provide the name of the markdown file you are providing the help from at the beginning of the response. This reference should be formatted as a URL in the following format: https://docs.oceanparcels.org/en/latest/examples/{tutorial_stem}.html. For example, the knowledgebase file parcels_tutorial.md will result in the following link being mentioned: https://docs.oceanparcels.org/en/latest/examples/parcels_tutorial.html.
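As a sanity check, the stem-to-URL mapping described in the instructions above can be sketched in a few lines of Python (the `tutorial_url` helper is my own illustration, not part of the GPT's configuration):

```python
from pathlib import Path

def tutorial_url(md_filename: str) -> str:
    """Map a knowledge-base markdown file to its docs.oceanparcels.org page."""
    stem = Path(md_filename).stem  # "parcels_tutorial.md" -> "parcels_tutorial"
    return f"https://docs.oceanparcels.org/en/latest/examples/{stem}.html"

print(tutorial_url("parcels_tutorial.md"))
```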
Knowledge
I uploaded the following files:
parcels_tutorial.md
tutorial_jit_vs_scipy.md
tutorial_kernelloop.md
tutorial_interaction.md
tutorial_parcels_structure.md
tutorial_output.md
I converted the notebooks to md (omitting outputs) in the hope of making them more machine-parsable. I kept it limited to a few tutorials to keep to the essentials (and to avoid diluting the knowledge base).
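For reference, "omitting outputs" amounts to clearing each code cell's outputs before conversion. This is a minimal sketch of the idea, operating directly on the standard nbformat-v4 JSON (the `strip_outputs` helper is my own illustration, not the tool actually used):

```python
import json

def strip_outputs(nb_json: str) -> str:
    """Remove outputs and execution counts from an nbformat-v4 notebook."""
    nb = json.loads(nb_json)
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    return json.dumps(nb)

# A tiny notebook with one executed code cell
nb = json.dumps({
    "nbformat": 4, "nbformat_minor": 5, "metadata": {},
    "cells": [{"cell_type": "code", "source": "1 + 1",
               "execution_count": 3,
               "outputs": [{"output_type": "execute_result"}],
               "metadata": {}}],
})
cleaned = json.loads(strip_outputs(nb))
print(cleaned["cells"][0]["outputs"])  # -> []
```

In practice, `jupyter nbconvert --to markdown` with `--ClearOutputPreprocessor.enabled=True` does the equivalent; the snippet just shows what gets stripped.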
Capabilities
- Web Browsing
- DALL·E Image Generation
- Code Interpreter
Disabled all capabilities (Code Interpreter isn't useful, as parcels wouldn't be installed in its sandbox anyway).
Trial conversation (unedited, first try)
Where can I get an overview of how OceanParcels is structured?
To gain an overview of how OceanParcels is structured, you should refer to the "OceanParcels Overview" tutorial. This tutorial is designed to give you a comprehensive look at the structure and functioning of the OceanParcels package.
In this tutorial, you'll learn about the key components of OceanParcels, including its classes and functions, and how they work together to facilitate Lagrangian particle simulations in oceanography. The tutorial covers essential elements such as the FieldSet class, which is used to hold the grid data; the ParticleSet class, which represents a collection of particles; and the different particle classes available in OceanParcels.
The tutorial also provides insights into how OceanParcels handles data, particularly NetCDF files, through the integration with xarray, which is critical for working with large datasets typically encountered in oceanographic studies.
For a detailed understanding, you can access the tutorial here: OceanParcels Overview. This will give you a solid foundation to start using OceanParcels effectively for your oceanographic research.
Provide me a code example that runs an OceanParcels simulation in JIT mode instantiating a line of 10 particles between lan,lot=20,30 and lan,lot=40,30, and running the simulation for 4 days in timesteps of 6 hours. Save the output in NetCDF format.
For running an OceanParcels simulation in JIT (Just-In-Time) mode with a line of 10 particles between the coordinates (lat, lon) = (20, 30) and (40, 30), and simulating for 4 days with timesteps of 6 hours, while saving the output in NetCDF format, you can use the following code example. This example assumes you have a suitable fieldset (e.g., ocean currents data) loaded for the simulation.
```python
from parcels import FieldSet, ParticleSet, JITParticle, AdvectionRK4
from datetime import timedelta
import numpy as np

# Load your fieldset here
# fieldset = FieldSet.from_netcdf(...)

# Define start and end points for the line of particles
start_lat, start_lon = 20, 30
end_lat, end_lon = 40, 30

# Create an array of lats and lons for particles
lats = np.linspace(start_lat, end_lat, 10)
lons = np.full(10, start_lon)

# Create a ParticleSet with a line of particles
pset = ParticleSet(fieldset=fieldset, pclass=JITParticle,
                   lon=lons, lat=lats)

# Define the simulation runtime and timestep
runtime = timedelta(days=4)
dt = timedelta(hours=6)

# Set up a Kernel for advection
kernel = pset.Kernel(AdvectionRK4)

# Execute the simulation
pset.execute(kernel, runtime=runtime, dt=dt,
             output_file=pset.ParticleFile(name="output.nc", outputdt=dt))

# The output will be saved in 'output.nc' in NetCDF format
```
In this code:
- You need to replace the fieldset loading part with your specific data source.
- The linspace function from NumPy is used to evenly distribute 10 particles between the start and end latitudes along the same longitude.
- ParticleSet is instantiated with these positions and the JITParticle class, which is suitable for JIT mode.
- The simulation is run with the AdvectionRK4 kernel for 4 days, with a timestep of 6 hours.
- The output is saved in NetCDF format in a file named "output.nc".
Make sure you have the necessary OceanParcels package and dependencies installed and your environmental data (like ocean current fields) ready to use in this simulation.
Note that in the conversation it gave a broken link to the documentation. That was the first time it did that; the two other times it gave valid links. Your mileage may vary.
from parcels.
Just clarifying, I think https://docsgpt.ai/ and https://github.com/arc53/DocsGPT (website) are two different projects. The first is a commercial product, while the other is an open-source project. A bit of a common name.
OpenAI's Assistant API was just released, and looks to be a great option for this. I'm going to see if I can do some experimentation :)
I have been experimenting with the Assistants API too (for a different project), and it does seem like OpenAI's knowledge retrieval tool is on its way to replace Retrieval Augmented Generation (RAG, which I used in the demo I showed during our meeting).
Awesome, @VeckoTheGecko! I'd be keen to try, but following the link above seems to require a ChatGPT Plus account? I only have the free account at the moment... Or is there another way to access this bot without a Plus account?
No way to get around that unfortunately for this proof of concept. You're welcome to email me a list of prompts you'd like me to test.
If we take this further, we can use the API directly.