
Comments (6)

carakessler commented on May 24, 2024

New interface:

export interface TaskContext {
  readonly cliArguments: RunArgs;
  readonly cliFlags: RunFlags;
  readonly parsers: Map<ParserName, CreateParser<ParserOptions, Parser<ParserReport>>>;
  readonly config: CheckupConfig;
  readonly pkg: PackageJson;
  readonly paths: string[];
}

from checkup.

scalvert commented on May 24, 2024

Thanks for getting this conversation going, @carakessler!

A couple of things:

  • Should we instead just load the top-level package.json from the root dir and pass it down to tasks, rather than relying on them to load it? This would let tasks operate independently of a configured path. That said, if checkup is run in a directory without a package.json, it should error (it does today).
  • Tasks that read and analyze JSON — are we thinking an example of this is a monorepo, e.g. package.json files in packages/? If so, I'm not sure we should do this, but maybe? Why wouldn't they just run checkup in the package directory itself?

It'd be super helpful to outline a bunch of example calls.

eg.

Checkup running at the top level, no flags:

checkup run

Checkup running at the top level, with single path:

checkup run ./packages/**/*

etc.

I'd show:

  • single path
  • multiple paths
  • different root
  • different root and multiple paths
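
Those four shapes might look like the following (illustrative only — the `--cwd` flag name is an assumption here, and the real flag for changing the root may differ):

```shell
# single path
checkup run ./packages/foo

# multiple paths
checkup run ./packages/foo ./packages/bar

# different root (hypothetical --cwd flag)
checkup run --cwd ../other-project

# different root and multiple paths
checkup run --cwd ../other-project ./lib ./tests
```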


carakessler commented on May 24, 2024

Implementation Details

We want to ensure that tasks all operate on the correct set of files, and make it so developers creating tasks don't need to think about which files to use in order to build them. It is quite easy to make a mistake in this space and completely invalidate the results of a report. Below I break down how this should work for each "type of task" we currently support.

Currently, all tasks have access to a TaskContext object which contains:

export interface TaskContext {
  readonly cliArguments: RunArgs;
  readonly cliFlags: RunFlags;
  readonly parsers: Map<ParserName, CreateParser<ParserOptions, Parser<ParserReport>>>;
  readonly config: CheckupConfig;
}

The cliFlags will contain the paths that checkup should operate on, but I don't want to rely on developers knowing that and using them manually. The nuance here is complex, and when developers build tasks, they won't necessarily be thinking about this use case. I propose we delegate the selection of which files to operate on to the framework, in the following manner:

| Task "type" | What will it operate on? | How will we enforce this? |
| --- | --- | --- |
| Meta | Root package.json; checkup config | Add the root package.json to the TaskContext for MetaTasks. The TaskContext already contains the checkup config, so ✔️ |
| Lint/AST | The globs passed into checkup | The registeredParsers each have an execute function that takes in paths. I propose we remove the paths param and have the parsers execute on the globs passed into checkup as cliArgs. |
| Tasks that analyze JSON | Root package.json. Optional enhancement: all package.jsons in the repo | Option 1: add a subclass like JsonTask that all JSON-reading tasks can use, and put the correct package.json in the TaskContext for that subclass. Option 2: just always add the correct package.json to the TaskContext, for all tasks (we already add registeredParsers to the TaskContext of all tasks, and they don't all necessarily need that). |
| Tasks that search for strings/files | The globs passed into checkup | 1) Add a list of all the file names we want to operate on to the TaskContext. 2) file-searcher.ts and all other utilities of the sort would operate on the globs by default (in the same manner as the parsers). |

Optional enhancement (Lint/AST): keep the paths param on execute and make it optional, in case "power devs" want to override it, and add checks that ensure the paths being passed in are a subset of the globs passed in as cliArgs. Sample use case: I only want to operate on -test.js files, so allow me to run my lint rule on just the -test.js files within the globs passed in as cliArgs. This would be more performant, but is not required for P0.

Worth noting: just because we add the correct package.json to the TaskContext doesn't mean tasks will use it. They can still go read the file themselves; we can enforce this via linting, maybe?

Open question: do we want to support multiple package.jsons? If I were to run checkup on a Yarn workspace, would I want to see the analysis of all package.jsons in the workspace? Probably. A potential solution would be a config option letting people override the default of analyzing only the root package.json, in which case we would look for all package.jsons in the repo/globs provided. This feels like an enhancement to me.
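
The "subset of the CLI globs" check from the optional Lint/AST enhancement could be enforced with a simple guard before handing an author-supplied override to a parser. A sketch, assuming the framework has already expanded the CLI globs into concrete path lists (`assertPathsAllowed` is a hypothetical helper, not an existing checkup API):

```typescript
// Sketch: validate that an author-supplied `paths` override stays within
// the set of files that the CLI globs expanded to.
function assertPathsAllowed(requested: string[], expandedGlobs: string[]): string[] {
  const allowed = new Set(expandedGlobs);
  const invalid = requested.filter((p) => !allowed.has(p));
  if (invalid.length > 0) {
    throw new Error(`Paths not covered by the CLI globs: ${invalid.join(', ')}`);
  }
  return requested;
}

// Sample use case: only operate on -test.js files within the CLI globs.
const cliPaths = ['src/a.js', 'src/a-test.js', 'src/b-test.js'];
const testOnly = cliPaths.filter((p) => p.endsWith('-test.js'));
assertPathsAllowed(testOnly, cliPaths); // ok — a subset of the CLI paths
```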

Everything described above could be made more intuitive/foolproof by creating subclasses for the different types of tasks, so that only the tools/data required by a given task type are made available to the task author. It is also possible to accomplish this via composition, by ensuring that all of our helpers/utils operate on the right files; I just worry that developers aren't required to use those utils by default, leaving room for mistakes. For example, even if the getPackageJson util always returned the right package.json, there is still no way to require developers to use that util to read the package.json, and people may just use fs themselves.
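
The subclassing direction could be sketched like this (all names here — `BaseTask`, `JsonTask`, the trimmed context shape — are hypothetical, not the current checkup API; the point is that each task type's constructor only accepts the slice of context that type is allowed to see):

```typescript
// Hypothetical minimal shape for a root package.json.
interface PackageJsonLike {
  name?: string;
  dependencies?: Record<string, string>;
}

abstract class BaseTask {
  abstract run(): unknown;
}

// A JSON-analyzing task only ever receives the root package.json —
// it has no way to reach for arbitrary files or parsers.
abstract class JsonTask extends BaseTask {
  constructor(protected readonly pkg: PackageJsonLike) {
    super();
  }
}

// Example concrete task: count declared dependencies.
class DependencyCountTask extends JsonTask {
  run(): number {
    return Object.keys(this.pkg.dependencies ?? {}).length;
  }
}
```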

If subclassing is not a direction we want to go in, I propose that we add all the tools required to write these types of tasks to the TaskContext. This entails making the correct package.json and the file-searcher available in the TaskContext. We should also stop exporting the getPackageJson and file-searcher utils from core, so the only way to use them is via the TaskContext. I do worry that providing all tasks with all of the tools needed to do everything may eventually be too much bloat, but we can cross that bridge if/when we get there. Right now there are just three "tools" needed to write tasks (package.json, parsers, file-searchers), and that is OK for now.

None of this is 100% error-proof, as, again, people can just read files themselves manually. But I think putting these tools in the hands of authors in a clean and clear way, combined with adding lint rules that prevent people from using fs (and related libs) within their tasks, should get us where we need to go.
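
The fs ban could be expressed with ESLint's built-in no-restricted-imports rule in a shared lint config for task plugins. A sketch — the message wording and the exact module list (e.g. whether to also ban fs-extra) would be up for discussion:

```json
{
  "rules": {
    "no-restricted-imports": ["error", {
      "paths": [{
        "name": "fs",
        "message": "Read files via the TaskContext (pkg, parsers, file searcher) instead of fs."
      }]
    }]
  }
}
```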


scalvert commented on May 24, 2024

I have some questions/comments:

The registeredParsers each have an execute function that takes in paths. I propose we remove the paths param, and have the parsers execute on the globs passed into checkup as cliArgs

Why would we not just pass the expanded globs from the taskContext into the execute function, which already takes an array of paths? I believe this is what we'd discussed before.

Just always add the correct package.json to the TaskContext, for all tasks. We add registeredParsers to the TaskContext of all tasks, and they don't all necessarily need that

I would opt for this.

Worth noting — just because we add the correct package.json to the TaskContext doesn't mean they will use it. They can still go read the file themselves; we can enforce this via linting, maybe?

I wouldn't bother with this. It's not a huge deal for consumers to do this.

Do we want to have support for multiple package.jsons? If I were to run checkup on a yarn workspace, would I want to see the analysis of all package.jsons in the workspace?

My gut says no for now. They can "drop down" into the subdirectory as they see fit. If we want to support workspaces, I think we should come up with a more robust, well-thought-out solution (e.g. aggregating package.json contents together, and segmenting things like dependencies by package).

Add a list of all the file names we want to operate on to the TaskContext.

I think this is what we'd discussed before. We should be passing in the expanded globs as a path array to all tasks.

could be made more intuitive/idiot proof by creating subclasses for the different types of tasks described

There's also another path we can take, and that's utilizing Partial, Exclude or Omit types for specific tasks, which would narrow the available properties on those types to imply the correct API.
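
That type-narrowing idea might look like the following sketch (the `TaskContext` shape is simplified and the narrowed type names are hypothetical):

```typescript
// Simplified stand-in for the full context.
interface TaskContext {
  cliFlags: { paths: string[] };
  parsers: Map<string, unknown>;
  config: Record<string, unknown>;
  pkg: { name?: string };
}

// Narrow the context per task type with utility types instead of subclassing:
type MetaTaskContext = Pick<TaskContext, 'pkg' | 'config'>;
type LintTaskContext = Omit<TaskContext, 'pkg'>;

function runMetaTask(ctx: MetaTaskContext): string {
  // Accessing ctx.parsers here would be a compile-time error —
  // the narrowed type implies the correct API by itself.
  return ctx.pkg.name ?? '(unnamed)';
}
```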


carakessler commented on May 24, 2024

Why would we not just pass the expanded globs from the taskContext into the execute function, which already takes an array of paths? I believe this is what we'd discussed before.

I wanted to remove the ability for task authors to override the paths param, and not require them to pass in the expanded globs. Who is the "we" in your comment? Each individual task author?


scalvert commented on May 24, 2024

We already freeze the taskContext object, and its members are marked as readonly, so tasks wouldn't be able to override it. I'm saying we'd expand the globs ourselves and provide them to tasks.
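
The freeze behavior can be sketched like this (simplified context shape; note that writes to a frozen object throw a TypeError in strict mode and fail silently otherwise, which is why the assignment is wrapped):

```typescript
// The framework builds and freezes the context before handing it to tasks.
const taskContext = Object.freeze({
  paths: Object.freeze(['src/a.js', 'src/b.js']) as readonly string[],
});

// A task author cannot swap in their own paths:
try {
  (taskContext as any).paths = ['elsewhere.js'];
} catch {
  // TypeError in strict mode; either way, taskContext.paths is unchanged.
}
```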

The we is us; you and I.

