
Comments (8)

lafrech avatar lafrech commented on June 1, 2024 1

Also related, we're facing CI limitations in apispec so I'm probably going to move to GHA as soon as I get the time.

marshmallow-code/apispec#747

from webargs.

lafrech avatar lafrech commented on June 1, 2024

Here's the GHA setup I use in flask-smorest.

https://github.com/marshmallow-code/flask-smorest/blob/master/.github/workflows/build-release.yml

It uses a PYPI_API_TOKEN I defined in Settings -> Security -> Secrets -> Actions.
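For reference, a publish job of that shape might look roughly like the sketch below. This is an illustrative YAML fragment, not copied from flask-smorest; the trigger, action versions, and step layout are assumptions.

```yaml
# Hypothetical release workflow using a PYPI_API_TOKEN repository secret.
name: build-release
on:
  push:
    tags: ["*"]

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      - name: Build sdist and wheel
        run: |
          python -m pip install build twine
          python -m build
      - name: Upload to PyPI
        run: python -m twine upload dist/*
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
```

The `__token__` username is how twine authenticates with a PyPI API token rather than a user account.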


sirosen avatar sirosen commented on June 1, 2024

That's a good approach!

I've been somewhat disappointed with the publicly available actions for publishing to PyPI in the past. They just don't seem to offer very much in exchange for adding another dependency to the critical path to publishing.
Writing it out like this seems better. It's extremely simple, and I don't need to jump out to a GitHub repo and start reading an action.yml file to figure out what it does.


sirosen avatar sirosen commented on June 1, 2024

In #687, there was a question of how to make the release step depend on linting. In the flask-smorest build-release workflow, this is done with needs. I don't think the current azure-pipelines setup enforces that other jobs pass before the release job runs, so we'd need to change the release process a little anyway.

I think the simplest thing is to have a near-verbatim copy of the flask-smorest release workflow. That means some minor duplicated linting, but it ensures a release passes checks.
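As a sketch of how needs enforces that ordering (job names and commands here are illustrative, not the actual webargs config):

```yaml
# Illustrative only: the release job runs only after lint and tests succeed.
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: |
          python -m pip install tox
          tox -e lint
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: |
          python -m pip install tox
          tox
  release:
    needs: [lint, tests]  # both jobs must pass before this one starts
    runs-on: ubuntu-latest
    steps:
      - run: echo "build and publish here"
```

If any job listed in needs fails, the release job is skipped entirely.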


lafrech avatar lafrech commented on June 1, 2024

On a related topic, I've been using pip-tools recently (https://github.com/BEMServer/bemserver-core, https://github.com/BEMServer/bemserver-api), inspired by the Pallets repositories and APIFlask [1]. I like the fact that the build can be made reproducible with all dependencies pinned recursively.

I did it a bit differently, with a pre-commit action ensuring pip-compile is run when needed.
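pip-tools publishes a pre-commit hook for this; a .pre-commit-config.yaml entry might look like the sketch below. The rev pin and file paths are illustrative assumptions, not taken from the repos above.

```yaml
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1  # illustrative pin
    hooks:
      - id: pip-compile
        # re-run pip-compile whenever the input or output files change
        files: ^requirements/dev\.(in|txt)$
        args: [requirements/dev.in, --output-file=requirements/dev.txt]
```

This fails the commit if dev.txt is out of date with dev.in, so the pinned file can't silently drift.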

I find it nice, although not perfect. Not perfect for Pallets either, apparently: pallets/werkzeug#2334

What I dislike is that everything is pinned except pre-commit and tox. Maybe that's not so bad. To actually pin them, I'd need to add dedicated requirements files, or add them to dev.in and install dev.txt in GHA and tox.ini, thus installing unneeded stuff (there's no need to install pre-commit in the GHA tox action, and no need to install tox inside a tox task). I decided to just let it go, like they do in Pallets. After all, those are just tools, not libraries imported in my code. Maybe I'm overthinking it.

[1] When it comes to CI, marshmallow-code and Pallets are my sources of inspiration.


sirosen avatar sirosen commented on June 1, 2024

I've put in a PR for this, with most of what I want in place.
I'm going to see about doing a test version of the publishing build which pushes to TestPyPI, but that's intended as follow-up work.

On a related topic, I've been using pip-tools recently (https://github.com/BEMServer/bemserver-core, https://github.com/BEMServer/bemserver-api), inspired by Pallets repositories and API Flask [1]. I like the fact that the build can be reproducible with all dependencies pinned recursively.

I have used pip-tools to deploy applications in the past -- and liked it -- but never as part of working on a library. I've also had good experiences using poetry, FWIW. I'll have to look again at what's being done in Pallets to understand how pip-tools is being used.

When it comes to CI, marshmallow-code and Pallets are my sources of inspiration.

💯 ! I agree! I also tend to look at what the PyPA and PSF repos are doing, as they often seem to do sophisticated and interesting things.

What I dislike is the fact that everything is pinned but pre-commit and tox. ... those are just tools, not library imported in my code. Maybe I'm overthinking it.

I think it's good to think about. At this point, I would actually recommend against pinning things that don't fit neatly into the common pattern of "dev dependencies".

I used to try to pin these sorts of things aggressively in my workflows, but I've stopped after finding that I wasn't getting much out of it other than busywork. For example, I used to pin the version of twine in one of my projects, but then realized that we were just blindly bumping to the next version anyway.

We might need to re-assess after the tox rewrite is done and tox v4 is released, but I'm hoping that all of our configs will be supported and that we'll not need to make any changes.


lafrech avatar lafrech commented on June 1, 2024

I used to pin the version of twine in one of my projects, but then realized that we were just blindly bumping to the next version anyway.

Yeah, exactly my point. Even if it breaks with tox 4, it should be obvious what the problem is and we'll fix it then. No need to add extra work for each minor and patch version.

What can be weird with the Pallets setup I copied (IIUC) is that everything in requirements/dev.in is pinned, but the versions of tox and pre-commit are not actually the versions used during the tests.


davidism avatar davidism commented on June 1, 2024

the versions of tox and pre-commit are actually not the version used during the test

We now use pre-commit.ci, which uses a specific pre-commit image; pre-commit also has its own guarantees about compatibility and hook pinning.

Now that you mention it, tox not being pinned was an oversight. With pip-compile-multi, it's absolutely fine to split things up into lots of small requirements files, like one for a pre-commit env (again, check out pre-commit.ci), one for tox, etc. Although I guess it turns out that tox being unpinned wasn't a big deal; it hasn't caused any issues since I started doing all this.
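For what it's worth, pre-commit.ci reads its own settings from a ci: block inside .pre-commit-config.yaml; a minimal sketch (the values and the skipped hook id are illustrative):

```yaml
ci:
  autoupdate_schedule: monthly
  # hooks that need network access (e.g. a pip-compile hook) can't run on pre-commit.ci
  skip: [pip-compile]
```

Skipped hooks still run in local pre-commit installs; pre-commit.ci just doesn't execute them.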

The nice part about having all dev dependencies pinned is that it's much easier to ensure new contributors have a known-good environment. This was important at conference sprints, where we'd need to get lots of people set up quickly and not run into weird version differences.

Not perfect for Pallets either

Just to be clear, we're fine with pip-tools; pip-compile-multi is a wrapper around it that automates updating lots of requirements files. What I was unhappy with was Dependabot, which was noisy and only worked with the output of plain pip-compile, not the slightly different output of pip-compile-multi.

