
energymodels.jl's Introduction

PyPSA - Python for Power System Analysis


PyPSA stands for "Python for Power System Analysis". It is pronounced "pipes-ah".

PyPSA is an open source toolbox for simulating and optimising modern power and energy systems that include features such as conventional generators with unit commitment, variable wind and solar generation, storage units, coupling to other energy sectors, and mixed alternating and direct current networks. PyPSA is designed to scale well with large networks and long time series.

This project is maintained by the Department of Digital Transformation in Energy Systems at the Technical University of Berlin. Previous versions were developed by the Energy System Modelling group at the Institute for Automation and Applied Informatics at the Karlsruhe Institute of Technology funded by the Helmholtz Association, and by the Renewable Energy Group at FIAS to carry out simulations for the CoNDyNet project, financed by the German Federal Ministry for Education and Research (BMBF) as part of the Stromnetze Research Initiative.

Functionality

PyPSA can calculate:

  • static power flow (using both the full non-linear network equations and the linearised network equations)
  • linear optimal power flow (least-cost optimisation of power plant and storage dispatch within network constraints, using the linear network equations, over several snapshots)
  • security-constrained linear optimal power flow
  • total electricity/energy system least-cost investment optimisation (using linear network equations, over several snapshots and investment periods simultaneously for optimisation of generation and storage dispatch and investment in the capacities of generation, storage, transmission and other infrastructure)
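
For illustration, these calculations are exposed as methods on the Network object. A minimal sketch (assuming the bundled AC-DC example network; pf, lpf and optimize are standard PyPSA entry points, and the security-constrained and investment cases run through the same optimize interface):

import pypsa

n = pypsa.examples.ac_dc_meshed()

n.pf()        # full non-linear power flow
n.lpf()       # linearised power flow
n.optimize()  # least-cost (linear) optimal power flow over all snapshots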

It has models for:

  • meshed multiply-connected AC and DC networks, with controllable converters between AC and DC networks
  • standard types for lines and transformers following the implementation in pandapower
  • conventional dispatchable generators and links with unit commitment
  • generators with time-varying power availability, such as wind and solar generators
  • storage units with efficiency losses
  • simple hydroelectricity with inflow and spillage
  • coupling with other energy carriers (e.g. resistive Power-to-Heat (P2H), Power-to-Gas (P2G), battery electric vehicles (BEVs), Fischer-Tropsch, direct air capture (DAC))
  • basic components out of which more complicated assets can be built, such as Combined Heat and Power (CHP) units and heat pumps.
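
For illustration, these components are added with the same n.add interface as buses and generators. A minimal sketch (the bus and component names here are made up; the attributes are standard PyPSA ones):

import pypsa

n = pypsa.Network()
n.add("Bus", "electricity")
n.add("Bus", "heat")

# battery storage with 4 hours of energy capacity at rated power
n.add("StorageUnit", "battery", bus="electricity", p_nom=50, max_hours=4)

# resistive power-to-heat link with 90% conversion efficiency
n.add("Link", "heater", bus0="electricity", bus1="heat", p_nom=20, efficiency=0.9)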

Documentation

  • Documentation
  • Quick start
  • Examples
  • Known users of PyPSA

Installation

pip:

pip install pypsa

conda/mamba:

conda install -c conda-forge pypsa

Additionally, install a solver.
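
For example, the free solvers GLPK and CBC mentioned below can be installed from conda-forge (a sketch; glpk and coincbc are the conda-forge package names):

conda install -c conda-forge glpk coincbc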

Usage

import pypsa

# create a new network
n = pypsa.Network()
n.add("Bus", "mybus")
n.add("Load", "myload", bus="mybus", p_set=100)
n.add("Generator", "mygen", bus="mybus", p_nom=100, marginal_cost=20)

# load an example network
n = pypsa.examples.ac_dc_meshed()

# run the optimisation
n.optimize()

# plot results
n.generators_t.p.plot()
n.plot()

# get statistics
n.statistics()
n.statistics.energy_balance()

There are more extensive examples available as Jupyter notebooks. They are also described in doc/examples.rst and available as Python scripts in examples/.
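
Networks can also be written to disk and read back later, for instance as netCDF (a minimal sketch; the file name is arbitrary):

# save the solved network ...
n.export_to_netcdf("network.nc")

# ... and load it again
n = pypsa.Network("network.nc")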

Screenshots

PyPSA-Eur optimising capacities of generation, storage and transmission lines (9% line volume expansion allowed) for a 95% reduction in CO2 emissions in Europe compared to 1990 levels.

SciGRID model simulating the German power system for 2015.

Dependencies

PyPSA is written and tested to be compatible with Python 3.7 and above. The last release supporting Python 2.7 was PyPSA 0.15.0.

It leans heavily on the following Python packages:

  • pandas for storing data about components and time series
  • numpy and scipy for calculations, such as linear algebra and sparse matrix calculations
  • networkx for some network calculations
  • matplotlib for static plotting
  • linopy for preparing optimisation problems (currently only linear and mixed-integer linear optimisation)
  • cartopy for plotting the baselayer map
  • pytest for unit testing
  • logging for managing messages

The optimisation uses interface libraries like linopy which are independent of the preferred solver. You can use, for example, one of the free solvers GLPK and CLP/CBC, or the commercial solver Gurobi, for which free academic licenses are available.
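
The solver is selected when calling the optimisation, for example (a sketch; solver_name must match a solver installed on your system):

n.optimize(solver_name="glpk")    # free solver
n.optimize(solver_name="gurobi")  # commercial solver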

Documentation

Please check the documentation.

Contributing and Support

We warmly welcome anyone interested in contributing to this project. If you have ideas or suggestions, or if you encounter problems, feel free to file issues or open pull requests on GitHub.

  • In case of code-related questions, please post on Stack Overflow.
  • For non-programming and more general questions, please refer to the mailing list.
  • To discuss with other PyPSA users, organise projects, share news and get in touch with the community, you can use the Discord server.
  • For bugs and feature requests, please use the PyPSA GitHub Issues page.
  • For troubleshooting, please check the troubleshooting section in the documentation.

Code of Conduct

Please respect our code of conduct.

Citing PyPSA

If you use PyPSA for your research, we would appreciate it if you cite the following paper, using this BibTeX entry:

@article{PyPSA,
   author = {T. Brown and J. H\"orsch and D. Schlachtberger},
   title = {{PyPSA: Python for Power System Analysis}},
   journal = {Journal of Open Research Software},
   volume = {6},
   issue = {1},
   number = {4},
   year = {2018},
   eprint = {1707.09913},
   url = {https://doi.org/10.5334/jors.188},
   doi = {10.5334/jors.188}
}

If you want to cite a specific PyPSA version, each release of PyPSA is stored on Zenodo with a release-specific DOI. The release-specific DOIs are linked from the overall PyPSA Zenodo DOI for versions 0.17.1 and onwards, and from a separate overall Zenodo DOI for versions up to 0.17.0.

Licence

Copyright 2015-2024 PyPSA Developers

PyPSA is licensed under the open source MIT License.


energymodels.jl's Issues

Pathway optimization

There should be a second time coordinate for years, in which investment decisions would be taken.

@FabianHofmann Can you describe how Lisa's work in the pathway optimization branch worked? Or maybe she can describe it herself?

Add support for using ParameterJuMP

ParameterJuMP allows defining variables as parameters and then computing dual values for them. For example, if one sets the capacity of a non-extendable generator or line as a parameter, one can request the dual of the maximum-generation constraint by querying the dual of the p_nom parameter.

Benchmark MOI

MOI's direct mode allows skipping the model copy in JuMP and instead handing all data straight through to Gurobi. Repeat the earlier benchmark to compare memory consumption with PyPSA.

Store variables and constraints in AxisArrays instead of DenseAxisArrays

Context: The commit 01a8b8a was necessary, since JuMP's new DenseAxisArray is not yet on par with the integration that AxisArray provides. For instance, broadcasting does not work properly, which means the user has to be conscious of the differences and convert where appropriate (conversion is practically instant and does not copy the memory, only the axis metadata).

Instead, the ModelView JuMP extension in modelviews.jl should be extended to store dense variable and constraint references directly in AxisArrays.

Add interface to PowerModels.jl

Depends on #2 .

Provide some sort of connector type that inherits from a PowerModels abstract super-type and intercepts the dispatch of PowerModels' ref, var and con to plug in our data and siphon off their variable/constraint references.

Package not compiling

Hi,

The current package does not compile because its dependencies are out of date. Are there any plans to upgrade EnergyModels?

Thanks

Add tests

We need tests to confirm that the produced models are correct and that new changes do not change the formulation.

Compare how JuMP and PowerSimulations handle testing!

Add dictionary and AxisArray based data layer

For changing data in-memory, we need an easy way to override individual components.

Plan: KISS

Add a DictData type which has a dictionary per element class holding the available attributes. The type can also have a fallback Data object to which missing elements or attributes are forwarded, typically a NetCDF-backed one.

Exposes:

  • An interface for adding an element class, similar to madd.
  • Setting an attribute on an existing class.

Do we need a way to mask attributes or classes in the fallback?

Think about how to build StructJuMP models

StructJuMP allows organising and separating the block structure of huge models. In a nutshell, one creates variables and constraints on several StructuredModel instances, one of which is the master while the others define the subproblems.

These instances can then be communicated to parallel solvers like PIPS or DSP.

Do we want to split the variables and constraints of a single component, i.e. generator dispatch being split from capacity? Are there multiple cuts one would like to make?

@fneum Thoughts?

New representation of time

Currently, all addto! functions which formulate the equations per device call

T = axis(m, :snapshots)

effectively requesting the snapshots coordinate from the NetCDF dataset as an AxisArrays.Axis.

In this scheme we can only support a single time dimension, and rolling horizon is effectively not possible without intercepting this axis call in the Data layer.

Wishlist

  1. Special handling of time dimensions as functions on a separate -- probably mutable -- object, which is either passed as an argument to addto! calls or kept and updated on the EnergyModel object (I'm favouring the latter).
  2. We want to be able to model several investment periods, snapshots with different durations (called weights in PyPSA), and maybe several groups of continuous snapshots with different weights per group.

Plan (WIP)

Snapshots

Typically hourly.

Works as in PyPSA. It would be good for time-clustering methods if the duration of a single snapshot were variable. There was an argument that there should be weights separate from durations (?) in the context of storage units. Can you still remember, @fneum?

Investment periods

For instance, one year every five years.

Represented by a separate time coordinate :periods on which capacity values and capital costs can depend.

If that coordinate :periods is managed orthogonally, it is easy to represent changing costs while using weather data for just a single year, i.e. per snapshot. On the other hand, it is not possible to change the snapshot representation for later investment periods (for instance, PLEXOS allows reducing the number of representative days for later periods, although representative days are a difficult concept with seasonal storage units).

@lisazeyen @FabianHofmann Can you elaborate on how investment periods are represented in the PSA.jl pathway_optimization branch? What worked, and what was difficult?

Groups of continuous snapshots (maybe later?)

Modeling seasonal storage by Leander suggests it is possible to use a typical-day approach even with seasonal storage and to represent an energy system with a small error margin, provided the selected days are adequately linked. I do not yet understand how everything fits together, but it effectively boils down to being able to group the operational time coordinate :snapshots into multiple continuous groups and link them together.

Can someone read that paper or similar descriptions and try to distill how to model this? @fneum, you were playing with Leander's tsam library before! Care to comment?

PowerSystems importer or Data layer

Either import a PowerSystems.jl model into the DictData data container (#15) or provide a separate data layer.

The main tricky bit is the same as with the PyPSA models: how to split the generator definitions into separate classes and how to name them.

Maybe one can abstract that problem away?

PowerSystems will then allow running their examples from https://github.com/NREL-SIIP/Examples and provides a tried-and-tested PSS/E RAW and MATPOWER case file importer.

This would be a big contribution to the examples needed for #9.
