explorable-viz / fluid
Data-linked visualisations
Home Page: http://f.luid.org
License: MIT License
Adopt a "native" (compiled) values approach similar to lambdacalc-old, simplifying where possible, enhancing where necessary.
Prototype the following, omitting tracing, annotations and execution indexing:

- `keyof T`
- `Lex.Ctr`, `Lex.Var`
- `setα` helper
- `arithmetic.lcalc`
- `length` parametric in elements
- `length` depends on tail
- `map.lcalc`
- `filter` to function that counts the elements that satisfy the predicate
- `lookup` insensitive to keys in unexplored parts of the tree
- `lookup` sensitive to keys in explored parts of the tree
- `Nil` in reverse erases outermost `Cons` of output
- `Number` class to parse matching strings to JavaScript number

Implementation tasks:
Enough work to draw some bars:

- `Point` datatype (a "path" will just be a list of points, for now)
- `index.html`?
- `Rectangle` datatype
- `stackRight`
- No support for nested patterns?
Data type constructors should automatically introduce projections for extracting field values, as well as some kind of field update notation. This feature potentially interacts with #48 because fields of different constructors may have the same name - in Haskell this would be an error, but we might want to treat the duplicate field like an overloaded function/implicit typeclass. Ideally such accessors/updaters would be first-class.
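For illustration, here is a minimal sketch of such derived accessors/updaters (hypothetical helper and names, not the implementation): given a constructor's field names, one can mechanically derive first-class getters and non-destructive updaters, and a duplicate field name simply yields an accessor that works on any value carrying that field - the "overloaded accessor" behaviour mentioned above.

```typescript
// Toy sketch: derive first-class field accessors/updaters from a list of
// field names. All names here are hypothetical, for illustration only.
type Obj = Record<string, unknown>
type Lens = {
   get: (o: Obj) => unknown,
   set: (o: Obj, v: unknown) => Obj
}

function deriveLenses (fields: string[]): Map<string, Lens> {
   const lenses = new Map<string, Lens>()
   for (const f of fields) {
      lenses.set(f, {
         get: o => o[f],
         set: (o, v) => ({ ...o, [f]: v }) // non-destructive field update
      })
   }
   return lenses
}
```

Because the lenses are ordinary values, they can be passed around and composed, which is what "first-class" buys us here.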
Not sure what to do here - loath to drop the whole demand-driven approach, but also conscious of the fact that implementing incremental demand-driven evaluation properly is a non-trivial task. And a naive non-incremental but exploratory discovery of the output will be insanely redundant.
Notes:

- `match` is essentially trivial when the trie is Top - it simply promotes the value into the corresponding isomorphic trie - so there's not much to gain by worrying about this now
- `match`, but the default demand is now Top and those tests don't invoke `match`
- Instead `__shallowMergeAssign` should recurse into "value" objects.
More integrated view of explained values and the demands which forced them to evaluate. Precursor to implementing a UI.
- `matchArgs` problem: `inj` should expect a trie product, no?
- `matchArgs` problem: don't understand how it should work
- `matchArgs` problem: product (recursive case) looks wrong
- `instanceof` with `is` pattern

"Erased" values (that arise either through forward slicing or through demanding a computation with a variable trie) are represented as `null`, whereas elsewhere I have started to use special `Bot` values (with addresses). In particular, bottom environments can't be represented as `null` because they (probably) need to preserve the structure of the environment that they are a slice of. Also, now that `Kont` is a sensible interface, it is not possible to have `null` implement `Kont`.
Uniformity suggests that values should be treated the same as other erased terms. But there seems to be a more fundamental problem with allowing erased values to not have addresses: it can impact the identity of any computation whose address is determined by an environment containing that value. This violates monotonicity, since slicing can create "new" computations.
A naive first fix might be to simply ensure that erased values always have addresses. This is closer to what we want, but still doesn't work because in the 'non-terminal' cases we delegate the address of the result to a subcomputation. If we somehow don't know what that subcomputation is because of slicing then we can't allocate an address consistent with the "baseline" run, again breaking monotonicity.
I think what this reinforces is the idea that (dynamic) slicing is always with respect to a fixed, fully known, computation. The identity of all subcomputations is fixed for the slice. Slicing simply propagates bottom.
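A minimal sketch of this "fixed computation" view (toy trace type and names, assumed purely for illustration): slicing never creates computations; it only replaces parts of an already-known trace with bottom (here `null`) and propagates demand.

```typescript
// Hypothetical toy trace for addition. The trace structure - and hence the
// identity of every subcomputation - is fixed by the baseline run.
type Trace =
   | { kind: "const", val: number }
   | { kind: "plus", left: Trace, right: Trace }

// Backward slicing over a fixed trace: an undemanded part becomes bottom
// (null); a demanded part keeps its structure, with demand propagated to
// its subcomputations. No new computation is ever created.
function bwdSlice (t: Trace, demanded: boolean): Trace | null {
   if (!demanded) {
      return null // propagate bottom; identities elsewhere are unaffected
   }
   if (t.kind === "const") {
      return t
   }
   // strict addition demands both operands
   return {
      kind: "plus",
      left: bwdSlice(t.left, true) as Trace,
      right: bwdSlice(t.right, true) as Trace
   }
}
```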
Some infrastructure to make testing easier and to gradually tease out an end-user interaction model:

- `parseExample`
- `FwdSlicing` helper
- `BwdSlicing` helper
- `newRevision`
- `on` method which asserts class of syntax node, from other navigation helpers (`needed`, `notNeeded`, `need`, `notNeed`)

This is now supported, although not as useful as one might have hoped given the way we merge patterns into a single trie. Merging `_` with a (named) variable is illegal, so a variable has to be unused on any branch for it to be substituted with `_`.
We also now (like Haskell 98) treat `_` as a reserved identifier, so that it is, for example, illegal to define a function called `_`. The variable pattern grammar is extended specifically to include `_`. (This makes the Nearley parser produce an absurdly long error message, but one which at least starts with the string `Unexpected keyword token: "_"`.)
- `rectsAsPoly` doesn't yet make sense as we lack a generic way to draw a (filled) polygon
- `App.ts`
- `Rect` (will generalise to polygons)
- `fun y x → e` as shorthand for `fun y → fun x → e`
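The shorthand is just iterated right-nesting of single-argument functions; a sketch with a hypothetical minimal AST (not the project's actual `Expr` types):

```typescript
// Hypothetical minimal AST: "fun x → e" as a one-parameter lambda.
type Expr =
   | { kind: "var", name: string }
   | { kind: "fun", param: string, body: Expr }

// Desugar "fun y x → e" into "fun y → fun x → e" by folding the
// parameter list from the right.
function desugarFun (params: string[], body: Expr): Expr {
   return params.reduceRight<Expr>((acc, p) => ({ kind: "fun", param: p, body: acc }), body)
}
```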
A more nested approach to charts, inspired by SVG's nested viewports/coordinate systems and Tomas's Compost library.

- `Graphic` element: list of `GraphicsElement`s with left-to-right rendering order
- `Graphic` rather than `concat` to assemble bar chart example from chart + 2 axes
- `Translate` element not to be imperative but rather to apply to a contained `Graphic`
- `Transpose` to work in the same way and port axes to new approach
- `Bot` and `null` constructs
- annotation
- `closeDefs`
- `lookup`
- `eval` (requires resolving issue with shared values)

As per my "demand-indexed computation" experiment. Imported from rolyp/lambdacalc-old#4.
- `Trie` datatype, analogous to `Constr`:
- `Trie.Var` maps any argument to a continuation, plus environment binding the variable
- `Trie.Prim` (?) that demands a strict value of primitive type
- `PrimOp` should be an expression form?
- `Expr.OpName` (consolidate with `Expr.Var`)
- `Trie.Constr` case via nested tries
- `match..as` semantics, based on tries

Having an expression-trace distinction is nice in some ways, but giving `instantiate` (the function that both turns an expression into a traced value, and gives it an id unique to its runtime environment) a meaningful type is quite hard: expressions include tries, which include nested tries with correspondingly complex types, and those types transform according to the embedding. Without type-level functions, this is hard to capture.
I think it might be preferable to build traced values directly in the parser, as we did previously, so that the only job of instantiate
is to translate the ids, leaving the metatype of the input unchanged. This should allow for more precise typing.
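To make the trie idea concrete, here is a minimal sketch (toy `Value`/`Trie` types, assumed for illustration only): a variable trie matches any value and binds it, while a constructor trie demands a value of a particular constructor. In the full design the continuation of a constructor case is itself a trie for the remaining arguments, which is exactly why a common `Kont` supertype is needed.

```typescript
// Toy values and tries (hypothetical names, not the project's definitions).
type Value = { ctr: string, args: Value[] }
type Env = Map<string, Value>

type Trie<K> =
   | { kind: "var", name: string, kont: K }          // matches anything, binds it
   | { kind: "constr", cases: Map<string, K> }       // demands a particular constructor

// Match a value against a trie, returning the extended environment and the
// selected continuation, or null if the demand is not met.
function match<K> (v: Value, t: Trie<K>, env: Env): [Env, K] | null {
   if (t.kind === "var") {
      return [new Map(env).set(t.name, v), t.kont]
   }
   const kont = t.cases.get(v.ctr)
   return kont === undefined ? null : [env, kont]
}
```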
Subtasks:

- `instantiate` can be handled better via trie map
- `instantiate` should take a `Traced`, and extract its "expression id" to make the new runtime id
- `Kont` supertype(s)
- give `instantiate` functions more meaningful types
- `Expr` module

Some notion of line/polyline with variable thickness. Played about with this:
```typescript
import * as THREE from "three"
import * as Meshline from "three.meshline"

export class ThickPath extends THREE.Geometry {
   constructor (path: THREE.Vector2[]) {
      super()
      // lift the 2D path into the z = 0 plane
      for (const point of path) {
         this.vertices.push(new THREE.Vector3(point.x, point.y, 0))
      }
   }

   object3D (): THREE.Object3D {
      const line = new Meshline.MeshLine()
      line.setGeometry(this)
      const material = new Meshline.MeshLineMaterial({
         color: new THREE.Color(0x000000),
         sizeAttenuation: 0, // constant screen-space thickness
         lineWidth: 0.020
      })
      return new THREE.Mesh(line.geometry, material)
   }
}
```
Allow a single computation to be executed with two different (but related) demands.

To allow two different computations to be executed (with corresponding demands) and compared, we will need to parameterise evaluation on a "world" (or "version"); currently this is global:

If we had this notion then we could probably use it to execute a single computation with two different demands.

- `at`, and make calling `version` part of `at`
- `__version`, `__mergeAssign`, `__merge`
Basic debugging:

- `bot` clears all annotations on an existing computation
- `uneval` produces same expression (id) passed to `eval`
- `uneval` step into every test
- `setall`

Sync with new match design:

- `ExplVal.Match`
- `MatchId` based on `ValId` - requires versioned tries
- `lookup` (merge with `match`)
- `mapMatch`, `mapArgs` by `setκ` (might need to revert to something close to map)
- `eval_` to store matches in traces
- `uneval` to use matches to unmatch

First pass over:

- `bot` for annotated terms
- `join` for environments - actually compute this as we go along, LVar-style
- `uncloseDefs`
- `unmatch`
- `uninstantiate`
- `uneval`
Spec for core language:
Spec for implementation language:
Haven't got any kind of module mechanism but should at least be able to put some core prelude definitions into a source file which is automatically loaded for every program.

- `lib/prelude`
- `lib/graphics`

The behaviour of `match` is quite complex and essentially reconstructs information discarded during evaluation. Instead we could build these structures as we go along.
Extend bar chart example with at least one additional view and use this to drive new features.

- `PathStroke` shouldn't close path by default
- `Scale` transform and use instead of manual division by 30
- `ceiling` primitive

A basic observation that started with our CONCUR 2016 paper and influenced the approach in our ICFP 2017 paper materialises again in this setting. Rather than starting with a meet-preserving "hole-propagating" evaluation semantics which doesn't require a trace, and then concluding that it has a lower adjoint, instead start with the "obvious" definition of the lower adjoint (which does require a trace), and conclude that it has an upper adjoint. That upper adjoint is a "forward dataflow" that pushes holes through the structure of an existing computation.
This way of setting up slicing is preferable whenever (as is turning out to be the majority of the time) hole-propagating forward evaluation doesn't have enough information to implement the upper adjoint. For us this arises in two situations:
Evaluation requires an output environment containing values bound during pattern-matching of the computed value; the "shape" of this environment isn't known statically, because the demand partitions the type and assigns a different pattern context to each partition. When forward slicing, we need to produce an environment slice of the correct shape; a universal "bottom" environment would be too destructive.
When computing the positions of graphical elements (e.g. bars in a bar chart), things like the width of the bars to the left are "dependees"; a naive hole-propagating semantics will be unable to render most of an image once holes appear. What we want instead is to preserve the structure of the image, but use the slicing to highlight parts of the image. (Less sure about this point.)
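The bar-chart point can be made concrete with a toy sketch (hypothetical names, assumed for illustration): a strict, hole-propagating forward pass loses every bar downstream of the first hole, because the running x-position is itself a dependee of each later bar.

```typescript
// Toy bars; null plays the role of a hole.
type Rect = { x: number, width: number }

// Naive hole-propagating forward semantics: a hole in any width poisons
// the running x-position, erasing this bar and every bar after it.
function naiveFwd (widths: (number | null)[]): (Rect | null)[] {
   let x: number | null = 0 // running x-position: a dependee of every later bar
   return widths.map(w => {
      const x0 = x
      if (x0 === null || w === null) {
         x = null // the hole cascades into the running position...
         return null // ...so most of the image is lost
      }
      x = x0 + w
      return { x: x0, width: w }
   })
}
```

What we would prefer is to keep the positions from the baseline run and merely mark the held parts, rather than erase them.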
- `data` in the `bar-chart` example
- `Bot` forms for `Expr` and `Trace`; we need these so that holes still have "addresses", and in particular sliced environments can work with `instantiate`
- `Bot`
- `Bot` environments: build relative to the "complete" (unsliced) world
- `Result` to allow access to prior environment - requires explicit `EvalKey` because same computation with different demands produces output environments with different shapes
- `Bot` continuations: in particular what does this mean when the continuation is a trie?
- `instantiate` using earlier version of expression (see #33)
- `populateScene` should render null elements in a particular way
- `newPathGeometry` should do the same with null points
- `getProp` to have a static type argument?
- `__commit`
The following parses incorrectly without the brackets around `Nil`:

```
concat
(Nil)
(map (map transpose) (axis (length rects') width))
```
Could use a more Haskell-like (function-like) syntax for constructors, but need to consider the fact that constructors are not yet first-class. If we made constructors first-class, we could use #327 to allow the user to write infix `:` instead of `Cons`, which would leave us with array syntax very much like PureScript, although it's not clear how to make that work in patterns. I would probably then drop the old `[x, …xs]` notation from the TypeScript implementation.
Open questions (not all necessarily related to this problem):

- `Cons e e'` rather than `Cons(e, e')`?
- Should `Cons` and `Cons e` be values?
- `[e, e', e'']`

For now what we need is a simple, reasonably familiar design that also leaves us open to improvements in the future. Current proposal:
- `eval` conjoined with annotation on partial constructor being applied (when saturated, consistent with previous monolithic approach)
- `apply` interface to `Unary` and `Binary`, so the `Binary` implementation can take care of creating the `PartialApp`:
- `join` of partial eliminators should be fatal, rather than cause backtracking

Something like a dynamically-typed version of Haskell's type classes. Also look at Clojure's multimethods and Agda's instance arguments. Basically we just want the most brain-dead way of supporting ad hoc polymorphism. There's not really any way to support Haskell-style return-type polymorphism (in the absence of type inference).
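One brain-dead option, in the spirit of Clojure's multimethods (a hypothetical sketch, not a committed design): dispatch on a runtime constructor tag via a mutable case table. This gives ad hoc polymorphism on argument values but, as noted, no return-type polymorphism.

```typescript
// Values carry a constructor tag to dispatch on.
type Tagged = { ctr: string }

// Minimal multimethod: cases can be registered after the fact, so new
// datatypes can join an existing "class" without modifying it.
class MultiMethod<R> {
   private cases = new Map<string, (v: any) => R>()

   method (ctr: string, f: (v: any) => R): this {
      this.cases.set(ctr, f)
      return this
   }

   call (v: Tagged): R {
      const f = this.cases.get(v.ctr)
      if (f === undefined) {
         throw new Error("No case for " + v.ctr)
      }
      return f(v)
   }
}
```

Usage would look like `new MultiMethod<number>().method("Square", s => s.side * s.side)`; dispatch happens only on the tag of the first argument, which is the main limitation relative to Haskell's type classes.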
Use cases to support:
Should be able to write nested matches using `match e e'` by analogy with function definitions.

- `util` submodule, delete unused bits (including `memo`)
- `ValueObject` constructor
- `_` methods in versioned objects - doesn't work

The current system is too complicated for me to get my head around. One issue relates to the interaction between slicing and the (unsatisfying) notion of a "top" demand, which we need in order to force strict evaluation of images.
A "top" demand behaves like a fully nested pattern: for example, evaluating (with "top") a program which outputs a `Rect` which is a pair of points would be akin to evaluating with the following demand:

```
Rect(Point(x, y), Point(h, w)) -> ...
```
If the first point is undefined (evaluates to bottom), then the match of the subsequent point also fails, because the continuation (which in this case is the demand on the second point) is undefined. Thus top demands in general "linearise" the output, effectively causing cascading erasure.
A top demand behaves like a fully strict client application, one which starts with a maximally deep pattern match - effectively requiring the output to have a very specific structure without any independent parts.
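Concretely, a toy sketch (hypothetical types, for illustration only) of why a top demand linearises the output: it demands every leaf, so a single hole anywhere makes the whole match - and hence the whole output - undefined.

```typescript
// Toy values; null plays the role of bottom (a hole).
type V = null | { ctr: string, args: V[] }

// A "top" demand behaves like a maximally deep pattern: every subvalue is
// demanded, so one hole anywhere fails the entire match.
function matchTop (v: V): boolean {
   if (v === null) {
      return false // hole: the demand on this leaf is not met
   }
   return v.args.every(matchTop) // cascades through all subvalues
}
```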
- `dimensions` function to compute size of `GraphicsElement`
- `stackRight` in terms of `width`
- `spaceRight` in terms of `width`
- `x + width` and `y + height`?
- `get_x`/`put_x`
- `__shallowMergeAssign`
- `axis` on `sep`
Currently we depend on the specific structure of the values computed by the application (see #50), whereas maybe the interpreter should operate on "compiled" values directly. At the very least we should be able to do datatype-generic reflection, rather than the currently hardcoded approach.
`getRectsAxes` and helpers (instead of `populateScene`) should:

- `Rect` values directly
- `Path` values directly
- `shape` variable evaluates to `null` in `App.ts`

`__shallowMergeAssign`:

- `__shallowCopy`? (fix by factoring latter through former)
- `lookupArg` to support ES6
- `Object.freeze` for interned objects
- `reflect` to convert `Value` to native value

Punt for now:
Bar chart:

- `barGraph` library function taking categorical data
- `barGraph`, replacing numerical constant 4
- `x-axis` function to return offset, rather than caller to assume `axis_margin + tickLength`

Line chart:

- `lineGraph` library function taking numerical data

x-axis formatting:

y-axis formatting:
Still to do:
Tidy up memoisation infrastructure and put some proper backward-slicing tests in place.
- `uneval` need not return anything - actually for `uninstantiate` and `unmatch` it is useful to propagate (inverse) addresses through function calls
- `lookupArg_` and `lookupArg`
- `bar-chart` example
- `length`: needing results needs only cons cells from input lists
- `normalise`:
- `zipW`:
  - `zipW` itself
  - `op` in body of `zipW` to pair, but not components of pair
to pair, but not components of pairDrop "demand-indexed" evaluation; too much of a distraction and not directly relevant to what we want to do. Closely related to backwards slicing, but not equivalent in that it doesn't automatically compute slices of functions.
We will retain tries-with-variable-binding as a nice approach to pattern matching, but will no longer need "top" tries, which are a not-entirely-straightforward special case at the moment (see #74).
- `Top` tries and argument tries
- `eval_`; drop `EnvEntry`, `evalArgs`
- `EvalKey`, `Result` and `Results` no longer required
- `letrec` can only define recursive functions, not values; revisit `closeDefs`
- `lookup` which returns an environment
- `PrimOp.invoke` and delete demands of primitive ops
- `merge`/LVar semantics? -- no
- `reflect` to propagate annotations
- `PathStroke`
- `bar-chart` test
- `iterate` (no, I think it's always quadratic)
- `MemoKey` or similar and consolidate `ExprId` etc. with keys used by `memo`
Currently `instantiate` translates an expression address by an appropriate sequence of value addresses, and promotes it from an `Expr` to a `Trace`. This requires us to have trace forms that support nulls purely as an interim state after `instantiate` but before `eval`. Moreover, the logic of `instantiate` (while probably equivalent to the formalism) is presented differently, in a way that isn't helpful. We should simplify. I seem to have the worst of both worlds at the moment: a distinction between `Expr` and `Trace`, yet a runtime representation of `Expr`s as `Trace`s.
- `eval` should take an expression form, not a trace
- `eval` to call `instantiate` in the cases specified in the formalism (beta reduction)
- `Trace` forms no longer need to support `null` as an interim state - there may still be something analogous for error cases, e.g. if the function component of an application does not evaluate to a function, then the value returned by the application must be `Bot` (without an address, so that it can be "below" any value with an address)
- `Traced.RecDef`?
- `Traced.Trie`, `Args` and `Kont`
- `join` implementation out of class hierarchy to avoid problem with type signature
- `EvalId` and related gumpf

There is a potential issue with respect to monotonicity across computations - if the demand increases, then a previously "suspended" expression can become needed, and the relation between an expression and any of its traces would need to be an increasing one. For now the trace of an unneeded expression is bottom; we can reintroduce the embedding of expressions into traces later if needed (e.g. to visualise code given only a trace).
"Increasingness" within a computation should be unaffected.