
static-webmentions's People

Contributors

dependabot[bot], nekr0z


static-webmentions's Issues

Webmentions are not sent to the old targets when they disappear

Per spec:

If the source URL was updated, the sender SHOULD re-send any previously sent Webmentions (including re-sending a Webmention to a URL that may have been removed from the document), and SHOULD send Webmentions for any new links that appear at the URL.

We do process the updated pages, and we do send webmentions for new links and re-send them for existing links, but we only look for the old links when processing pages whose http-equiv meta tag is set to 410 (denoting that the whole page has been deleted).

We need to always process the old version of the page and account for the removed links.
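
A minimal sketch of the intended behaviour, assuming hypothetical extractTargets and sendWebmention helpers in place of the tool's own extraction and sending code: collect targets from both the old and the new version of the page and send a webmention for every one of them, so that removed targets are still notified.

package mentions

// resendForUpdatedSource mentions every target found in either the old or
// the new version of the source page: new links, kept links, and links that
// have been removed from the document, as the spec requires.
func resendForUpdatedSource(source string, oldHTML, newHTML []byte,
	extractTargets func([]byte) []string, // hypothetical link extractor
	sendWebmention func(source, target string) error, // hypothetical sender
) error {
	targets := map[string]struct{}{}
	for _, t := range extractTargets(oldHTML) { // links the page used to have
		targets[t] = struct{}{}
	}
	for _, t := range extractTargets(newHTML) { // links the page has now
		targets[t] = struct{}{}
	}
	for t := range targets {
		if err := sendWebmention(source, t); err != nil {
			return err
		}
	}
	return nil
}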

Finer-grained inclusion/exclusion when gathering webmentions

Problem

Loving static-webmentions so far, but better exclusion mechanisms would be helpful:

  • I add webmentions to my website at build time, and label their links with rel=nofollow. If static-webmentions then gathers the links inside those displayed webmentions and sends webmentions for them, we could end up with a feedback loop.
  • My <footer> has links that are included in every page. Currently, I manually add the links in my footers to the exclude list in config.toml, but I might eventually add links that change (like a link to the latest git commit displayed by my preferred git forge).

Proposed solutions

Two possible solutions that are not mutually exclusive:

Solution 1: exclusions

  • Exclude links with rel="nofollow" or rel="ugc".
  • Exclude children of certain tags (e.g., <footer> or <section class="webmentions">); a sketch of both exclusions follows below.
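
A rough sketch of Solution 1, not the tool's real implementation, assuming the page is parsed with golang.org/x/net/html: walk the tree and skip <a> elements that carry rel="nofollow" or rel="ugc", or that sit inside an excluded container such as <footer> (a class-based exclusion like <section class="webmentions"> would extend the same ancestor check).

package mentions

import (
	"strings"

	"golang.org/x/net/html"
)

// excludedContainers lists tag names whose descendants are ignored.
var excludedContainers = map[string]bool{"footer": true}

// excludedRel reports whether the link's rel attribute contains nofollow or ugc.
func excludedRel(n *html.Node) bool {
	for _, a := range n.Attr {
		if a.Key != "rel" {
			continue
		}
		for _, v := range strings.Fields(a.Val) {
			if v == "nofollow" || v == "ugc" {
				return true
			}
		}
	}
	return false
}

// insideExcluded reports whether any ancestor is an excluded container.
func insideExcluded(n *html.Node) bool {
	for p := n.Parent; p != nil; p = p.Parent {
		if p.Type == html.ElementNode && excludedContainers[p.Data] {
			return true
		}
	}
	return false
}

// collectTargets gathers href values of links that pass both filters.
func collectTargets(root *html.Node) []string {
	var out []string
	var walk func(*html.Node)
	walk = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" && !excludedRel(n) && !insideExcluded(n) {
			for _, a := range n.Attr {
				if a.Key == "href" {
					out = append(out, a.Val)
				}
			}
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			walk(c)
		}
	}
	walk(root)
	return out
}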

Solution 2: finer-grained inclusions

Currently, static-webmentions extracts the content of .h-entry. If it extracted only .e-content whenever that element is present, it would gather pending webmentions only from links in the article text.
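
A sketch of Solution 2 under the same assumption (parsing with golang.org/x/net/html): pick the .e-content element as the subtree to scan for links, and fall back to .h-entry only when no .e-content is present.

package mentions

import (
	"strings"

	"golang.org/x/net/html"
)

// hasClass reports whether the node's class attribute contains name.
func hasClass(n *html.Node, name string) bool {
	for _, a := range n.Attr {
		if a.Key != "class" {
			continue
		}
		for _, c := range strings.Fields(a.Val) {
			if c == name {
				return true
			}
		}
	}
	return false
}

// findByClass returns the first element (depth-first) carrying the class.
func findByClass(n *html.Node, name string) *html.Node {
	if n.Type == html.ElementNode && hasClass(n, name) {
		return n
	}
	for c := n.FirstChild; c != nil; c = c.NextSibling {
		if found := findByClass(c, name); found != nil {
			return found
		}
	}
	return nil
}

// contentRoot picks the subtree to gather links from: .e-content when
// present, otherwise the whole .h-entry.
func contentRoot(doc *html.Node) *html.Node {
	if e := findByClass(doc, "e-content"); e != nil {
		return e
	}
	return findByClass(doc, "h-entry")
}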

If you have a preferred solution, I could implement it this weekend.

Multi-threading is not used

Obvious areas that could benefit from multi-threading are:

  • processing several files simultaneously (perhaps with a setting for the number of workers, defaulting to 1);
  • sending several webmentions simultaneously (perhaps with a per-domain or, even better, per-endpoint limit, defaulting to 1); a sketch follows below.
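
A minimal sketch, not the project's current code, of the second point: a per-endpoint semaphore (a buffered channel) caps how many webmentions are in flight against any single endpoint, while different endpoints proceed in parallel. The perEndpoint parameter stands in for the proposed setting and defaults to 1.

package mentions

import "sync"

// mention pairs a source/target with the webmention endpoint discovered for
// the target.
type mention struct {
	endpoint string
	source   string
	target   string
}

// sendAll sends every mention, allowing at most perEndpoint concurrent
// requests per endpoint. send is a hypothetical helper that performs the
// actual HTTP POST.
func sendAll(mentions []mention, perEndpoint int, send func(mention) error) {
	if perEndpoint < 1 {
		perEndpoint = 1 // proposed default
	}

	// One semaphore per endpoint, built up front so goroutines only read the map.
	sem := make(map[string]chan struct{})
	for _, m := range mentions {
		if _, ok := sem[m.endpoint]; !ok {
			sem[m.endpoint] = make(chan struct{}, perEndpoint)
		}
	}

	var wg sync.WaitGroup
	for _, m := range mentions {
		wg.Add(1)
		go func(m mention) {
			defer wg.Done()
			sem[m.endpoint] <- struct{}{}        // acquire a slot for this endpoint
			defer func() { <-sem[m.endpoint] }() // release it when done
			_ = send(m)                          // error handling elided in this sketch
		}(m)
	}
	wg.Wait()
}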
