Comments (5)
Thank you for such a detailed proposal, I appreciate that!
In the current implementation, we can filter the files to run using the Rake task's `pattern` option:

```ruby
require 'yard/doctest/rake'

YARD::Doctest::RakeTask.new do |task|
  task.pattern = 'app/**/*.rb'
end
```
Your point about wider adoption is a valid one, so it makes sense to allow one-by-one adoption. In that case, we need a `@doctest [true|false]` tag which marks examples as tests. I am not sure whether it should be turned on or off by default, though, so your feedback is appreciated.
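For illustration, per-example opt-in/opt-out with such a tag might look like this. Note the `@doctest` tag is the proposal under discussion, not an existing yard-doctest feature, and the class and method names are made up:

```ruby
class Greeter
  # A normal, runnable example that should be verified.
  #
  # @example
  #   Greeter.new.hello('world') # => "hello, world"
  # @doctest true
  def hello(name)
    "hello, #{name}"
  end

  # An illustrative pseudo-code example; the proposed tag would
  # exclude it from the doctest run.
  #
  # @example
  #   Greeter.new.hello(some_object_from_your_app)
  # @doctest false
  def noop; end
end
```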
Another possible solution is to generate a configuration which will skip all failing tests by default, similar to RuboCop's `--auto-gen-config`. The result of such a command could be a `doctest_helper_example.rb` with a bunch of `doctest#skip` calls.
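As a sketch, the generated helper might contain skip calls like these (assuming yard-doctest's `YARD::Doctest.configure` block and `doctest.skip`; the skipped method names are placeholders):

```ruby
# doctest_helper_example.rb (generated)
require 'yard-doctest'

YARD::Doctest.configure do |doctest|
  # Examples that were failing when the config was generated:
  doctest.skip 'MyClass#method_with_failing_example'
  doctest.skip 'AnotherClass.some_class_method'
end
```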
from yard-doctest.
In regards to `# doctest: true`, I'd like to avoid confusing people into thinking this is something similar to Ruby's pragmas (`encoding`, `frozen_string_literal`).
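To illustrate the distinction being drawn: Ruby's magic comments are acted on by the interpreter itself, whereas a `# doctest: true` comment would have the same visual shape but be meaningful only to the yard-doctest plugin. A minimal sketch:

```ruby
# frozen_string_literal: true
# ^ a real Ruby pragma: the interpreter itself freezes string literals below.
# doctest: true
# ^ the rejected syntax: same shape, but Ruby ignores it; only the
#   yard-doctest plugin would act on it, which invites confusion.

LITERAL = 'immutable'
puts LITERAL.frozen?  # the pragma on line 1 makes this true
```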
TL;DR I like the config generation idea. It doesn't complicate the runtime code, it gives you a clear list of what doesn't work, doesn't involve any file-flagging mess, and lets you get going right away in an existing codebase.
On implementation: CLI or Rake task? Does pretty much everyone depend on Rake (and if you don't, well, do if you want to use this)?
I personally never liked Rake much... the "ENV vars are kind-of arguments sometimes, and oh, let's also use short generic ones like `version` that you might have defined for any old reason, especially in containers" thing has burnt me badly, but I acknowledge I'm probably in a severe minority here. And I do depend on it everywhere.
The way I understand the config generation approach is that you would want to generate the config when you first added YARD Doctest, since that's the only time a failed example may not mean a failed doctest, then probably update it by hand any time you added an example that isn't meant to doctest?
Yeah, that seems like it would work. It gives you a list of everything being skipped in one place, which is nice if you're eventually aiming to have every `@example` be a doctest. It definitely covers the "I just added doctest and have a lot of examples that won't work, but should, and I want to migrate them gradually" case, which is the use case I've been talking about.
The major downside I see is having to remember to check that list to make sure I don't think something is working when it's actually being skipped, possibly ending up with a false premise baked into the library if you forget or miss it.
I realize that in addition to Ruby that "should run", I traditionally use `@example` for pseudo-ish code that will never run, as the tag just seemed like the place to put that stuff. But this, as well as skipping and file-flagging, leads to a weird situation for the user reading the docs: there's no way to know which examples are tested and which aren't. You may end up back at spending forever trying to get something to run that was never intended to, which is exactly what I don't want for users.
It seems like doctest's inherent paradigm is all-or-nothing: either you doctest, and all `@example` blocks should be valid doctests, or you don't. This has the major advantage of being simple.
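Under that paradigm, every `@example` is written so that its `# =>` annotations actually hold when executed, since those expected values are what yard-doctest asserts against. A minimal sketch (class and method names are made up):

```ruby
class TemperatureConverter
  # Converts a Celsius temperature to Fahrenheit.
  #
  # @example
  #   TemperatureConverter.c_to_f(100) # => 212
  #   TemperatureConverter.c_to_f(0)   # => 32
  def self.c_to_f(celsius)
    celsius * 9 / 5 + 32
  end
end
```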
From this angle, I like the config generation solution: it doesn't mess with the runtime code but it can buy you some time to get everything in order, with a clear list of what isn't yet.
The other road - some `@example` are doctests and some just are not - seems fraught with peril.
You would want a way of clearly marking in source and generated docs what is a doctest and what isn't, and that's just a terrible amount of complication for something that is nice and simple now.
There's always code blocks to demonstrate stuff that doesn't run.
The more I think about this, the more it seems to me that yard-doctest can behave in the following way:
- On first installation, a user runs `yard doctest --auto-gen-config`, which creates or appends its configuration to `doctest_helper.rb`.
- The command collects all examples and runs them, recording those which are failing.
- The command creates a configuration which marks all failing tests as "pending" and dumps the configuration to a file. We can use `doctest#pending` for such examples.
- The next time the user runs `yard doctest`, it runs all examples but ignores the ones marked as "pending" in the configuration file. If an example is passing now (e.g. the user has implemented it correctly) but is still marked as "pending", the command will say that this test should be removed from the configuration file. The logic is similar to RSpec's `skip` vs `pending`.
This way we can get initially passing doctests and disable those that are not passing, but also guarantee that as soon as any of them start passing, they won't be forgotten.
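A sketch of what the generated configuration could look like under this plan. Note that `--auto-gen-config` and `doctest.pending` are proposed here, not existing yard-doctest features, and the pending method names are placeholders:

```ruby
# doctest_helper.rb (as `yard doctest --auto-gen-config` might generate it)
require 'yard-doctest'

YARD::Doctest.configure do |doctest|
  # Failing at generation time; entries should be removed
  # as their examples start passing.
  doctest.pending 'Parser#parse'
  doctest.pending 'Formatter#render'
end
```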
> The other road - some `@example` are doctests and some just are not - seems fraught with peril. You would want a way of clearly marking in source and generated docs what is a doctest and what isn't, and that's just a terrible amount of complication for something that is nice and simple now.
My idea was that any `@example` can be used by a user as a "working" example. Can you share some of the examples which should not be "working", in the sense that they should not be doctests?