
Workflow Development (3blue1brown.com) · 6 comments · closed

3b1b commented on June 9, 2024

from 3blue1brown.com.

Comments (6)

kurtbruns commented on June 9, 2024

After sleeping on it, my take is that we should go with the simple approach. That way we can start converting posts to markdown as early as today.

The simple workflow can then evolve naturally into a more robust version over time, or as the robust version is developed.


vincerubinetti commented on June 9, 2024

Notes on Git LFS from my coworkers who have used it a lot:

Be sure to run git lfs track file.txt before adding the file to the commit; after that, I think you can handle everything else normally. If you forget and commit before running git lfs track, the file is in the commit history forever and causes a world of pain.
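The order of operations matters; a minimal sketch (the `*.mp4` pattern and file name are illustrative placeholders):

```shell
# Track the pattern FIRST -- this writes a rule to .gitattributes
git lfs track "*.mp4"
# .gitattributes now contains:
#   *.mp4 filter=lfs diff=lfs merge=lfs -text

# Only then add and commit; matching files are stored as LFS pointers
git add .gitattributes title-animation.mp4
git commit -m "Add title animation via LFS"
```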

I think there is a monthly bandwidth limit per account: https://docs.github.com/en/github/managing-large-files/versioning-large-files/about-storage-and-bandwidth-usage
That applies to other people downloading files that you've uploaded using git-lfs, IIRC; I've run into issues with it on some old non-greenelab repos.
Not sure if we pay to extend it.

They have given us an exception as a scholarly team; we complained to them back in the day and they gave us either 2 TB or 20 TB/mo forever.


3b1b commented on June 9, 2024

Notes from our conversation this morning; feel free to correct if I'm saying the wrong things.

  • We start by using git-lfs for all things pre-production.
  • We'll then add a GitHub Action that syncs the files tracked by LFS to a single Linode bucket whenever changes are merged into master, with an environment variable dictating where to look for files.
    • Note, this means nixing the development bucket mentioned above. Might as well use LFS for everything where bandwidth is no concern. We can revisit the idea of an intermediate bucket if issues arise.
  • Insofar as any development depends on whether we use Jekyll vs. Hugo, hold off until we make a final decision there; plenty of content work to be done in the meantime.
  • Keep notes on any necessary git-lfs setup for newcomers (seems straightforward).
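As a rough sketch, the sync action described above might look something like the following. This is not the final implementation: the bucket variable name, the checkout step's `lfs: true` input, and the s3cmd invocation are all assumptions.

```yaml
name: Sync media to Linode
on:
  push:
    branches: [master]
jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          lfs: true   # fetch the real LFS files, not just pointers
      - name: Sync to bucket
        env:
          BUCKET: ${{ vars.MEDIA_BUCKET }}   # env var dictating where files go
        run: |
          s3cmd sync --delete-removed test/ "s3://$BUCKET/test/"
```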

Further thoughts

  • Back-of-the-envelope estimate: the topology article had ~20 MB of media. After compression, though, it was closer to 8 MB. So for every 100,000 post views, that would be ~800 GB of bandwidth.
    • This is a sample size of 1, and other posts will be longer/shorter/richer/lighter, but I suspect it's decently representative of a "typical" post.
    • I suspect more visually rich posts will also be the ones with more traffic.
    • In the future, there's room to optimize where we only serve an animation when it's requested, but for now, let's stick with what's simplest and make further judgments only when we have actual user data.
    • As a side note, this means the compressed version of the full library could be around 1 GB.
  • On the topic of compression, we should probably keep the highest quality versions of all media in the repo as "ground truth", then if possible have compression take place when the files are all being synced with the Linode bucket.
  • When creating content, no one should be thinking about data constraints. The more visually rich, as long as it is in the service of a better learning experience, the better.
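Sanity-checking the arithmetic above:

```shell
# ~8 MB of compressed media per post, served 100,000 times:
echo "$(( 8 * 100000 / 1000 )) GB"   # prints "800 GB"
```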


vincerubinetti commented on June 9, 2024

On the topic of compression, we should probably keep the highest quality versions of all media in the repo as "ground truth", then if possible have compression take place when the files are all being synced with the Linode bucket.

This is definitely something we can and should automate. I believe Kurt said that Hugo has some of these compression features built-in.


kurtbruns commented on June 9, 2024

Update on the progress on this issue:

The workflow branch is set up to run an action that syncs the contents of the test directory with the Linode bucket. For example, the file

test/a.txt

is synced to the URL

https://3b1b-posts.website-us-east-1.linodeobjects.com/test/a.txt

Test Workflow

To test the action, download Git LFS, install the extension, and then check out the workflow branch. Make and commit changes (add/edit/delete) to the test directory. To trigger the action, push the changes to origin and watch the action run on GitHub in the Actions tab.
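Concretely, the test procedure amounts to something like the following (the file name is a placeholder; this assumes the default origin remote):

```shell
git lfs install            # one-time: install the LFS hooks for your user
git checkout workflow      # branch with the sync action
echo "hello" > test/b.txt  # make a change in the test directory
git add test/b.txt
git commit -m "Test sync action"
git push origin workflow   # triggers the action; watch the Actions tab
```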

Note: Files that are deleted from the test directory will also be deleted from the bucket on sync.

Examples of git-lfs and bucket URLs can be found by serving the site locally and sorting the lessons by date. I created a git LFS test post which uses the title-animation.mp4 file in the test directory.

Next Steps

Integrating this action with the deploy action should be relatively straightforward: (1) decide on the publish directory structure, (2) update the figure component, and (3) update the deploy workflow.

  1. Build assets to the publish directory as normal. Add rules to .gitignore on the gh-pages branch to prevent these assets from being deployed to GitHub Pages. Update the sync command (see docs) to only sync pattern-matching files, for example files matching the path /publish_dir/lessons/**/*.mp4.

  2. Update the figure component to be environment-aware. Note: the git LFS URL contains the branch name. I think we can pass an environment variable to the build step to construct the URL.

  3. Integrate the sync action into the deploy workflow. In the .github/workflows/sync-test.yaml file I have the Jekyll build step commented out. I haven't tested it yet, but it should work fine. Once set up, we should make sure that non-asset files from (1) are successfully ignored.
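For step (1), the pattern-restricted sync might look like this with s3cmd (the bucket name and glob are illustrative; s3cmd's --include rescues files that an earlier --exclude would skip):

```shell
# Sync only the media assets out of the build output; --delete-removed
# also removes bucket files that no longer exist locally
s3cmd sync --delete-removed \
  --exclude '*' --include 'lessons/*.mp4' \
  publish_dir/ "s3://3b1b-posts/"
```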

Edit: Previously, I recommended that we sync a media folder in the publish directory to the bucket. But, after thinking about it, I changed (1) and (2) to a solution that I think will be easier to implement.


kurtbruns commented on June 9, 2024

I spent some time tinkering over the weekend with this action on a personal Hugo project that implements the above steps. There were a couple of hiccups, but the action works perfectly and is ready to be implemented here to finish the workflow. However, I'll wait until the dust settles on #42.

