solararbiter / solararbiter.github.io

Contains the Solar Forecast Arbiter static website
Home Page: https://solarforecastarbiter.org/
License: MIT License
Probably start with a list of metrics and some references; we can take these from the PMP and stakeholder materials. Order by deterministic, probabilistic, and cost.
Let's not do too much until we've thought through it more carefully.
1\. Evaluate forecasts
9\. Analyze forecasts
3\. Perform forecast evaluation in a standard manner
6\. Administer the framework
and allow missing values. Also add recommendations for resampling
I was testing the Quantile Skill Score in the reports and came across some results that initially confused me. The table below shows metrics for 4 pairs of forecasts:
Forecast | BS | QS | BSS | QSS |
---|---|---|---|---|
Table Mountain Boulder CO Day Ahead GEFS ghi Prob(f <= x) = 0.0% | 0.46 | 46.3 | -0.113 | 0.653 |
Table Mountain Boulder CO Day Ahead GEFS ghi Prob(f <= x) = 40.0% | 0.282 | 39.6 | -0.161 | 0.688 |
Table Mountain Boulder CO Day Ahead GEFS ghi Prob(f <= x) = 50.0% | 0.25 | 44.7 | 0.00e+00 | 0.643 |
Table Mountain Boulder CO Day Ahead GEFS ghi Prob(f <= x) = 100.0% | 0.264 | 93.4 | 0.55 | 0.201 |
Table Mountain Boulder CO Hour Ahead Prob Persistence ghi Prob(f <= x) = 0.0% | 0.415 | 129 | nan | nan |
Table Mountain Boulder CO Hour Ahead Prob Persistence ghi Prob(f <= x) = 40.0% | 0.243 | 122 | nan | nan |
Table Mountain Boulder CO Hour Ahead Prob Persistence ghi Prob(f <= x) = 50.0% | 0.25 | 121 | nan | nan |
Table Mountain Boulder CO Hour Ahead Prob Persistence ghi Prob(f <= x) = 100.0% | 0.585 | 112 | nan | nan |
I came away from this wanting a few things in the Brier score documentation.
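To make the scores above concrete, here is a minimal sketch of the Brier score and Brier skill score against a climatology-style reference. The data and the helper names (`brier_score`, `brier_skill_score`) are hypothetical illustrations, not the core library's functions:

```python
import numpy as np

# Hypothetical example data: probability forecasts and 0/1 observed outcomes.
prob_fx = np.array([0.9, 0.7, 0.2, 0.1])
obs = np.array([1.0, 1.0, 0.0, 1.0])

def brier_score(fx, obs):
    """Mean squared difference between a probability forecast and 0/1 outcomes."""
    return np.mean((fx - obs) ** 2)

def brier_skill_score(fx, obs, ref):
    """BSS = 1 - BS_fx / BS_ref; positive means fx beats the reference."""
    return 1.0 - brier_score(fx, obs) / brier_score(ref, obs)

# Climatology-style reference that always forecasts the observed base rate.
ref = np.full_like(prob_fx, obs.mean())
bs = brier_score(prob_fx, obs)
bss = brier_skill_score(prob_fx, obs, ref)
```

Since lower is better for BS but higher is better for BSS, a forecast can have a larger BS yet a larger BSS than another pair, which is part of the confusion worth documenting.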
Mailchimp will let us link to a web page version of the emails that we send out. We could add those links to the stakeholder page as they come out. Maybe useful for summarizing public stakeholder engagement? Maybe just busy work? Here's what we have so far:
Only an idea.
Pros:
Cons:
Distilled from my notes and @dplarson's notes on the second test trial.
Discuss:
Mainly that when creating an object, the organization is set to the creating user's organization. Thus sharing create permissions with users outside of the role/permission org is pointless.
Probably need the program number.
Consider supporting these climate zones presented at the 2019 PVSC. They're designed as zones for degradation analysis, but I think they'd work for forecasts too. 2019_pvsc_submission.pdf
python package supporting the paper: https://github.com/toddkarin/pvcz
The one at the bottom of https://solarforecastarbiter.org/documentation/framework/
Users shouldn't need to dig to get a concrete idea of what we're doing. Maybe steal text/graphics from the SETO review poster and recent talks.
Two probabilistic forecast metrics were somehow omitted from the metrics page in previous PRs:
These two should be added to the metrics page, with a separate issue/PR open on the core code repo for implementing the functions themselves.
The only documentation we currently have for quality flags and the data validation toolkit is the core API. We need dashboard documentation following the example introduced in #108. Start with the references section of the April survey doc.
Could link to...
the core function docstring: https://solarforecastarbiter-core.readthedocs.io/en/latest/generated/solarforecastarbiter.metrics.deterministic.normalized_mean_absolute.html#solarforecastarbiter.metrics.deterministic.normalized_mean_absolute
or to the RTD source: https://solarforecastarbiter-core.readthedocs.io/en/latest/_modules/solarforecastarbiter/metrics/deterministic.html#normalized_mean_absolute
I'm not aware of any way to cleanly link to the source on github based on function names.
Maybe include this xkcd in the documentation for how to specify site location, especially in a section that describes how to specify an anonymous plant location.
We received a handful of questions about whether we'd use satellite data to validate forecasts. We should record the response in a blog post.
Maybe put it on the homepage?
https://ams.confex.com/ams/2019Annual/meetingapp.cgi/Paper/354714
https://www.ieawindforecasting.dk/
In particular, we can use the Task 36 work products to provide more guidance on evaluation scenario design.
Include links to the Data Use Agreement and other docs.
Update the climate zone map with more refined outlines of the climate zones, preferably in the form of a GIS-type file (e.g. a shapefile) that can be re-used in the future.
Add `Site` data type and `Update` permissions to the permission reference table found at https://solarforecastarbiter.org/documentation/dashboard/administration/#permissions-reference-table
Correct the CRPS definition on the metrics page, which should use `(F_i - O_i)^2`, not `|F_i - O_i|^2`.
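For context, here is a rough numerical sketch of CRPS that makes the squared difference explicit. The helper (`crps_ensemble`) and the grid-based approximation are illustrative assumptions, not the core implementation:

```python
import numpy as np

def crps_ensemble(ens, obs, x_grid):
    """Approximate CRPS = integral of (F(x) - 1{x >= obs})^2 dx on a grid.

    Note the squared difference, not an absolute value."""
    cdf = np.mean(ens[:, None] <= x_grid[None, :], axis=0)  # empirical ensemble CDF
    heaviside = (x_grid >= obs).astype(float)               # observation step function
    integrand = (cdf - heaviside) ** 2
    dx = np.diff(x_grid)
    return np.sum(0.5 * (integrand[:-1] + integrand[1:]) * dx)  # trapezoid rule
```

A handy sanity check: for a single-member "ensemble", CRPS reduces to the absolute error of the point forecast.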
Due to a line break, the [Definitions](../definitions/#probforecastdef) link under the Data Models page (Metadata >> Forecasts >> Probabilistic forecasts: https://solarforecastarbiter.org/datamodel/#probabilistic-forecasts) shows up as plain text instead of as an HTML link.
They're mostly uninhabited and we can greatly reduce the area of the models we have to save
I think the "Data Rights" page should be renamed "Data Policies". I'll do this by end of week unless I hear objections.
The report documentation should include
I also want to consider moving the report subsection from the "working with data" section into its own top level section. Maybe add some examples.
(I couldn't remember the status of the event reports, so I tried to look it up in the documentation.)
https://solarforecastarbiter.org/documentation/dashboard/administration/ is a fine reference. Especially for data sharing, we need more documentation that builds understanding and provides concrete examples.
I've thought highly of the approach described by https://documentation.divio.com/ since I saw their PyCon talk a few years ago, but I haven't actually tried to implement it. So here are some ideas for new documentation in that framework that are specific to data sharing:
How-to:
Tutorials:
Explanation:
If we like the approach we can blow up and reassemble the entire documentation, but that's more work than I want to commit to right now.
Update the documentation to mirror the API/database. For example, sites no longer have network or well known text but do have extra parameters. Should also update the number of new aggregate parameters.
A stakeholder was confused about our event metrics and thought that they were only applicable to solar power ramps. We could improve the writing a little and add a second example such as clear/cloudy.
The getting started section describes how to use the test login information. Not sure if we want to keep the test info, but we definitely need to explain how to log in with a real account.
Perhaps also move the data access control help page's account creation and establishing or joining an organization sections into the primary dashboard help page.
For example, the reference data map shows an SRML site in Richland, but the database does not include it because the data is too old (1996).
Perhaps the thing to do is recreate the google map using a metadata dump from the database. Or perhaps just do SolarArbiter/solarforecastarbiter-dashboard#286
Should explain what this page is (static information about the project, help pages) and link to the dashboard. Remove references to early development and June 2019.
On the metrics page, the Brier Score decomposition is incorrectly listed as `BS = REL + RES + UNC`, when it should actually be `BS = REL - RES + UNC` (i.e. subtract `RES`, not add it).
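A small sketch of the Murphy-style decomposition that shows why the resolution term carries a minus sign. The data and the helper (`brier_decomposition`) are illustrative, not the core library's code:

```python
import numpy as np

def brier_decomposition(fx, obs):
    """Murphy decomposition terms such that BS = REL - RES + UNC.

    Assumes fx takes a small set of discrete probability values (bins)."""
    n = len(fx)
    base_rate = obs.mean()
    unc = base_rate * (1.0 - base_rate)       # uncertainty of the observations
    rel = 0.0
    res = 0.0
    for p in np.unique(fx):
        mask = fx == p
        n_k = mask.sum()
        o_k = obs[mask].mean()                # observed frequency within the bin
        rel += n_k * (p - o_k) ** 2           # reliability (penalty, added)
        res += n_k * (o_k - base_rate) ** 2   # resolution (reward, subtracted)
    return rel / n, res / n, unc
```

With forecasts binned by their exact values, `REL - RES + UNC` reproduces the Brier score exactly; with the plus sign the identity fails.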
This was a typo from when I wrote up the Word report on metrics in Markdown.
Catch-all issue for things I want to change on the metrics page.
New reference: https://www.wpc.ncep.noaa.gov/html/hpcverif.shtml. Worthwhile because it's a standard NCEP product.
number of forecasts that failed to be submitted by the submission deadline
Add content on cost metrics (i.e. the financial costs of mis-forecasting) to the website, based on the document shared by Aidan. Probably add the content to the end of the Metrics page (unless a more obvious place is identified). The content should provide general guidance and mirror the notation used in other parts of the website (e.g. the error metric definitions).
MIT?
Copyright Solar Forecast Arbiter Team?
Currently, the metrics page links to a document that summarizes the metrics. Instead, it would be better to list out the metrics on the metrics page itself (e.g. as a bullet point list).
The main question with this approach is how to represent mathematical symbols (e.g. the square root in the RMSE definition). There are libraries for rendering LaTeX in an HTML page (e.g. MathJax), which may be better than trying to use an ASCII representation (e.g. `RMSE = sqrt( 1/n * sum( (F_i - O_i)^2 ) )`).
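For comparison, the same RMSE definition as it could be written in LaTeX for MathJax rendering:

```latex
\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( F_i - O_i \right)^2}
```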
Not sure if this belongs here, but we should document what permissions allow. For example, having the write_values permission allows one to write values to a specified id, but also allows one to read the interval_length of the parent object and to get the last value uploaded before a certain time.
We should add a link to, or even embed, @lboeman's nice reference data map. It's not clear to me where it should go. We might need a new Reference Data section. Thoughts?
Add a comment form to allow people to submit comments anonymously.
This will require something like google's recaptcha system.
Implementation depends on what kind of system we use for a mailing list. Might only need a link to another page. That's a separate discussion.
And no signup feature unless they can also unsubscribe.
Dashboard currently allows for the following report options:

- MAPE
- NMAE
- NMBE
- NRMSE

and links to the relevant sections of https://solarforecastarbiter.org/metrics. But the links for NMAE and NMBE are broken since those sections don't exist on the page.
We should also clarify the normalization rules as discussed in SolarArbiter/solarforecastarbiter-core#269
@dplarson can you take this on?
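For reference, a sketch of how the normalized metrics could be defined, assuming normalization by a single fixed value such as AC capacity (illustrative helpers, not the core library's implementations; see the normalization discussion linked above):

```python
import numpy as np

def nmae(fx, obs, norm):
    """Normalized mean absolute error, in percent of `norm`."""
    return 100.0 * np.mean(np.abs(fx - obs)) / norm

def nmbe(fx, obs, norm):
    """Normalized mean bias error, in percent of `norm`."""
    return 100.0 * np.mean(fx - obs) / norm

def nrmse(fx, obs, norm):
    """Normalized root mean square error, in percent of `norm`."""
    return 100.0 * np.sqrt(np.mean((fx - obs) ** 2)) / norm
```

Documenting which quantity `norm` refers to (capacity vs. mean observation) would resolve most of the ambiguity.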