rafael-carvalho / meta-surf-forecast

Forked from swrobel/meta-surf-forecast

🌊 Aggregated surf forecast from Surfline, MagicSeaweed & Spitcast APIs

Home Page: https://metasurfforecast.com

License: MIT License



Meta Surf Forecast

Purpose

Pull data from Surfline, MagicSeaweed & Spitcast APIs to display an aggregated surf forecast.

Screenshot

Developer Setup

  1. Install postgres, if you don't have it already: brew install postgresql
  2. Start postgres: brew services start postgresql
  3. Install Ruby dependencies: bundle install
  4. Create your database & seed it with spots: bin/rails db:setup
  5. Install yarn: brew install yarn
  6. Install yarn packages: yarn
  7. Install Invoker: gem install invoker
  8. Set up Invoker: sudo invoker setup
  9. Grab some Spitcast data: bin/rails spitcast:update
  10. Grab some Surfline data: bin/rails surfline:update
  11. Grab some MagicSeaweed data (requires a valid API key): MSW_API_KEY=xxx bin/rails msw:update
  12. Refresh the materialized Postgres view that collates all forecast data into one table: bin/rails database_views:refresh
  13. Start the server: bin/invoker start
  14. Connect to https://surf.test
  15. Score!
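The steps above can be condensed into a single script. This is a sketch, not part of the repo; it assumes Homebrew on macOS and a valid MagicSeaweed key in the MSW_API_KEY environment variable:

```shell
#!/usr/bin/env bash
set -euo pipefail

brew install postgresql yarn
brew services start postgresql

bundle install          # Ruby gems
yarn                    # JS packages
bin/rails db:setup      # create & seed the database

gem install invoker
sudo invoker setup

bin/rails spitcast:update
bin/rails surfline:update
MSW_API_KEY="${MSW_API_KEY:?set your MagicSeaweed key}" bin/rails msw:update
bin/rails database_views:refresh

bin/invoker start       # then open https://surf.test
```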

Note: If you get a security warning in Chrome, click "Advanced" and then "Proceed to surf.test (unsafe)". There's nothing to worry about: you're connecting to your own machine over a self-signed SSL certificate, which Chrome doesn't trust. You'll probably also need to open the Browsersync JavaScript and the Webpacker bundle once each to trust those certificates as well. I'm hoping to find a better workaround for this in the future...

Pull requests are welcome, especially for new data sources and better data visualization (see the TODO below for suggestions).

Adding Spots

Contributing new spots is easy! Make sure you're signed into your GitHub account and edit the seeds file:

  1. Create a new Region/Subregion if necessary. For example, Los Angeles is created like so:
    CA = Region.find_or_create_by(name: 'California')
    LA = Subregion.find_or_create_by(name: 'Los Angeles', region: CA)
    LA.timezone = 'Pacific Time (US & Canada)'
    LA.save!
    You can get valid timezone names from Rails' ActiveSupport::TimeZone list.
  2. Get the Spitcast spot id, slug (unique text id) & lat/lon data using their spot list API (you can change the county at the end of the URL). The slug is spot_id_char in their API.
  3. Go to the MagicSeaweed page for the spot you want to add. Their spot id is the number at the end of the url, and the slug is the text after the slash and before -Surf-Report, ex: for https://magicseaweed.com/Pipeline-Backdoor-Surf-Report/616/ the slug is Pipeline-Backdoor and the id is 616.
  4. Go to the Surfline page for the spot you want to add. Their spot id is also at the end of the url, ex: for https://www.surfline.com/surf-report/venice-beach-southern-california_4211/ it's 4211.
  5. It's strongly encouraged to add all spots for a particular county or region rather than just a single one. Be a pal!
  6. Submit a pull request and I'll get it on the site ASAP!
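For steps 3 and 4, the ids and slugs can be pulled out of the URLs with simple patterns. This is a sketch (the regexes and constant names are my own, and the sites' URL formats could change):

```ruby
# MagicSeaweed: the slug is the text before -Surf-Report,
# and the id is the trailing number.
MSW_URL = %r{magicseaweed\.com/(?<slug>.+)-Surf-Report/(?<id>\d+)}

# Surfline: the id is the number after the final underscore.
SURFLINE_URL = %r{surfline\.com/surf-report/.+_(?<id>\d+)}

msw = 'https://magicseaweed.com/Pipeline-Backdoor-Surf-Report/616/'.match(MSW_URL)
msw[:slug] # => "Pipeline-Backdoor"
msw[:id]   # => "616"

sl = 'https://www.surfline.com/surf-report/venice-beach-southern-california_4211/'.match(SURFLINE_URL)
sl[:id] # => "4211"
```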

Use the following as a template. Delete the lines for surfline_id, msw_id, etc, if that spot doesn't exist on that particular site.

  {
    name: 'County Line',
    lat: 34.051,
    lon: -118.964,
    surfline_id: 4203,
    msw_id: 277,
    msw_slug: 'County-Line-Yerba-Buena-Beach',
    spitcast_id: 207,
    spitcast_slug: 'county-line-malibu-ca',
    subregion: LA,
  },

Data Sources

New API

Old API

Surfline's old API is undocumented and unauthenticated, but their website consumes it via JavaScript, so it was fairly easy to reverse-engineer. It returns JSON with a very odd structure: each time-sensitive item contains an array of daily arrays of values, which correspond to timestamps provided in a separate set of arrays. For example (lots of data left out for brevity):

"Surf": {
  "dateStamp": [
      [
        "January 24, 2016 04:00:00",
        "January 24, 2016 10:00:00",
        "January 24, 2016 16:00:00",
        "January 24, 2016 22:00:00"
      ],
      [
        "January 25, 2016 04:00:00",
        "January 25, 2016 10:00:00",
        "January 25, 2016 16:00:00",
        "January 25, 2016 22:00:00"
      ]
    ],
  "surf_min": [
      [
        2.15,
        1.8,
        1.4,
        1
      ],
      [
        0.7,
        0.4,
        0.3,
        0.3
      ]
    ]
}
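Flattening those parallel arrays back into timestamped values is straightforward. A sketch in Ruby, using only the fields shown in the example above (the full response carries many more):

```ruby
require 'json'

# Fragment modeled on the example response: each resource pairs parallel
# arrays of daily timestamp arrays and daily value arrays.
raw = <<~JSON
  {
    "Surf": {
      "dateStamp": [
        ["January 24, 2016 04:00:00", "January 24, 2016 10:00:00"],
        ["January 25, 2016 04:00:00", "January 25, 2016 10:00:00"]
      ],
      "surf_min": [
        [2.15, 1.8],
        [0.7, 0.4]
      ]
    }
  }
JSON

surf = JSON.parse(raw)['Surf']

# Flatten the daily arrays, then zip timestamps with values.
pairs = surf['dateStamp'].flatten.zip(surf['surf_min'].flatten)
# => [["January 24, 2016 04:00:00", 2.15], ["January 24, 2016 10:00:00", 1.8], ...]
```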

Requests are structured as follows:

https://api.surfline.com/v1/forecasts/<spot_id>?resources=&days=&getAllSpots=&units=&usenearshore=&interpolate=&showOptimal=&callback=

This is a breakdown of the querystring params available:

| Param | Values | Effect |
| --- | --- | --- |
| spot_id | integer | Surfline spot id that you want data for. A typical Surfline URL is https://www.surfline.com/surf-report/venice-beach-southern-california_4211/ where 4211 is the spot_id. You can also get this from the response's id property. |
| resources | string | Any comma-separated list of "surf,analysis,wind,weather,tide,sort". There could be more available that I haven't discovered. "sort" gives an array of swells, periods & heights that are used for the tables on spot forecast pages. To see the whole list, just set "all". |
| days | integer | Number of days of forecast to get. This seems to cap out at 16 for wind and 25 for surf. |
| getAllSpots | boolean | false returns an object containing the single spot you requested; true returns an array of data for all spots in the same region as your spot, in this case "South Los Angeles". |
| units | string | e returns American units (ft/mi); m uses metric. |
| usenearshore | boolean | The best that I can gather, you want this set to true to use the more accurate nearshore models, which take into account how each spot's unique bathymetry affects the incoming swells. |
| interpolate | boolean | Provides "forecasts" every 3 hours instead of every 6. These interpolations seem to be simple averages of the values of the adjacent 6-hour forecasts. |
| showOptimal | boolean | Includes arrays of 0s & 1s indicating whether each wind & swell forecast is optimal for this spot. Unfortunately, the optimal swell data is only provided if you include the "sort" resource; it is not included in the "surf" resource. |
| callback | string | JSONP callback function name. |
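Putting those params together, a request URL can be assembled like this. The defaults below are example choices, not the only valid ones:

```ruby
require 'uri'

SURFLINE_V1 = 'https://api.surfline.com/v1/forecasts'.freeze

# Build a v1 forecast URL for a spot; the params mirror the table above.
def surfline_url(spot_id, resources: 'surf,wind', days: 7)
  query = URI.encode_www_form(
    resources: resources,   # comma-separated resource list
    days: days,             # forecast horizon in days
    getAllSpots: false,     # just this spot, not the whole region
    units: 'e',             # 'e' = ft/mi, 'm' = metric
    usenearshore: true,     # use the more accurate nearshore model
    interpolate: false,     # 6-hour intervals, no 3-hour interpolation
    showOptimal: false
  )
  "#{SURFLINE_V1}/#{spot_id}?#{query}"
end

puts surfline_url(4211)
```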

MagicSeaweed has a well-documented JSON API that requires requesting an API key via email. This was a straightforward process and they got back to me quickly with my key.

I've asked MagicSeaweed a few questions and added their responses below:

  • "Our API provides 5 days of forecast data, with segments of data provided for each 3 hour interval during that 5 day time span."
  • "Our data is updated every 3 hours."

Spitcast only provides a list of API endpoints, but the data is sanely-structured JSON so it's pretty easy to parse.

I've asked Jack from Spitcast a few questions and added his responses below:

  • To get more than the default 24 hour forecast for a spot, add dcat=week to the querystring.
  • Why does the site show a size range, but the API only returns one size value? "I actually take the API number and create the max by adding 1/6 the height (in feet), and then create the min by subtracting 1/6 the height."
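In Ruby, Jack's min/max formula works out to the following (the method name is my own):

```ruby
# Derive the displayed size range from the single API value,
# per the formula quoted above: min/max = height ∓ height/6 (in feet).
def spitcast_range(height_ft)
  delta = height_ft / 6.0
  [(height_ft - delta).round(1), (height_ft + delta).round(1)]
end

spitcast_range(3.0) # => [2.5, 3.5]
```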
  • All possible values for shape:
    • Poor
    • Poor-Fair
    • Fair
    • Fair-Good
    • Good

TODO

  • Improve charts:
    • Fix timestamp formatting.
    • Account for min/max size forecast. Currently charts just reflect the max.
    • Display forecast quality ratings. Perhaps color each bar different depending on how good the rating is. Surfline also has an optimal_wind boolean that is being crudely integrated into the display_swell_rating method - improvements welcome.
  • Refresh data on a schedule based on when new data is available (refreshing all forecast sources hourly)
  • Support multiple timezones as opposed to Pacific Time only
  • Don't show forecasts for nighttime hours since they just waste graph space
  • Fetch & display tide/wind/water temperature data from NOAA (they actually have a decent API!)
  • Fetch & display recent buoy trends that are relevant to each spot to give an idea of when swell is actually arriving.
  • Stop manually seeding the db and figure out a way to pull all spots from each data source and automatically associate them to a canonical spot record (probably using geocoding)

Contributors

swrobel, ablelincoln, jcohenho, tobiastom
