September 4 2015, Barcelona
Massimiliano Mantione
Rich Hickey
(Transducers in Clojure)
Kevin Beaty
(Transducers in Ramda, and much more)
Erik Meijer
(push-pull duality talk)
About Myself
Passionate about languages and compilers
Worked on the V8 team at Google
Overall, worked on JIT compilers for 7+ years
Started as a JavaScript hater
Now using my own language
that transpiles to JavaScript
(Metascript)
Composable algorithmic transformations
Decoupled from input or output sources
They compose directly (like functions)
A stepper function...
A transformer object...
The reduce transformation...
...give us a transducer!
Then we can use
the transduce transformation
Stepper<I,E> :: (I, E) -> I
A function that takes the accumulated value and an element, and returns the new accumulated value
It is the typical argument of reduce
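A minimal TypeScript sketch (names are illustrative): a stepper that sums numbers, used directly as the argument of reduce
type Stepper<I, E> = (acc: I, element: E) => I;
const sum: Stepper<number, number> = (acc, x) => acc + x;
const total = [1, 2, 3].reduce(sum, 0); // 6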
Transformer<I,E,R> :: {
init :: () -> I // initial value
step :: Stepper<I,E> // transformation
result :: I -> R // build the result
}
Three functions, like a generalized stepper (with added begin and completion handlers)
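A minimal TypeScript sketch of that shape, with an illustrative transformer that collects numbers into an array (names are assumptions, not a library API)
interface Transformer<I, E, R> {
  init: () => I;             // initial value
  step: (acc: I, e: E) => I; // transformation
  result: (acc: I) => R;     // build the result
}
const intoArray: Transformer<number[], number, number[]> = {
  init: () => [],
  step: (acc, x) => { acc.push(x); return acc; },
  result: acc => acc
};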
(the base transformation)
TransformerOrStepper :: Transformer | Stepper
reduce<I,E,R> :: (
TransformerOrStepper<I,E,R> // transformation
I // initial value
Iterator<E> // input sequence
) -> R // result
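A hedged TypeScript sketch of that reduce, accepting either a transformer or a bare stepper (all names follow the slide's signature and are assumptions)
type Stepper<I, E> = (acc: I, e: E) => I;
interface Transformer<I, E, R> {
  init: () => I;
  step: Stepper<I, E>;
  result: (acc: I) => R;
}
function reduce<I, E, R>(
  xf: Transformer<I, E, R> | Stepper<I, E>, // transformation
  init: I,                                  // initial value
  input: Iterator<E>                        // input sequence
): R {
  // promote a bare stepper to a transformer
  const t: Transformer<I, E, R> =
    typeof xf === "function"
      ? { init: () => init, step: xf, result: acc => acc as unknown as R }
      : xf;
  let acc = init;
  let r = input.next();
  while (!r.done) {
    acc = t.step(acc, r.value);
    r = input.next();
  }
  return t.result(acc);
}
// reduce((a: number, x: number) => a + x, 0, [1, 2, 3][Symbol.iterator]()); // 6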
Transducer :: TransformerOrStepper -> Transformer
A function that transforms a transformer into another transformer
(let that sink in...)
It transforms transformers!
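For instance, here is a hedged TypeScript sketch of a map transducer: it takes an inner transformer and returns a new one whose step applies f before delegating (names are illustrative)
interface Transformer<I, E, R> {
  init: () => I;
  step: (acc: I, e: E) => I;
  result: (acc: I) => R;
}
// map(f) :: Transformer<I, B, R> -> Transformer<I, A, R>
function map<I, A, B, R>(f: (a: A) => B) {
  return (next: Transformer<I, B, R>): Transformer<I, A, R> => ({
    init: next.init,
    step: (acc, a) => next.step(acc, f(a)),
    result: next.result
  });
}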
(the generalized transformation)
transduce<I,E,R> :: (
Transducer<I,E,R> // transformation
Stepper<I,E> // result builder
I? // initial value
Iterator<E> // input sequence
) -> R // result
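A hedged TypeScript sketch of transduce: wrap the result-building stepper into a transformer, apply the transducer to it, then run the reduction (types and names are assumptions)
type Stepper<I, E> = (acc: I, e: E) => I;
interface Transformer<I, E, R> {
  init: () => I;
  step: Stepper<I, E>;
  result: (acc: I) => R;
}
type Transducer<I, A, B, R> =
  (next: Transformer<I, B, R>) => Transformer<I, A, R>;
function transduce<I, A, B, R>(
  xd: Transducer<I, A, B, R>, // transformation
  builder: Stepper<I, B>,     // result builder
  init: I,                    // initial value
  input: Iterable<A>          // input sequence
): R {
  const xf = xd({
    init: () => init,
    step: builder,
    result: acc => acc as unknown as R
  });
  let acc = xf.init();
  for (const a of input) acc = xf.step(acc, a);
  return xf.result(acc);
}
// With the map transducer sketched earlier:
// transduce(map((x: number) => x * 2), (a: number, x: number) => a + x, 0, [1, 2, 3]); // 12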
Transformations can be composed
Each stepper only deals with its own values
Intermediate steppers don't create intermediate collections
Only the final transformer fully builds the actual result
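A minimal sketch of that fusion in TypeScript (illustrative names, not a specific library's API): a composed map + filter pipeline where each element flows through both steps one at a time, with no intermediate array
interface Transformer<I, E, R> {
  init: () => I;
  step: (acc: I, e: E) => I;
  result: (acc: I) => R;
}
const mapT = <I, A, B, R>(f: (a: A) => B) =>
  (next: Transformer<I, B, R>): Transformer<I, A, R> => ({
    init: next.init,
    step: (acc, a) => next.step(acc, f(a)),
    result: next.result
  });
const filterT = <I, A, R>(pred: (a: A) => boolean) =>
  (next: Transformer<I, A, R>): Transformer<I, A, R> => ({
    init: next.init,
    step: (acc, a) => (pred(a) ? next.step(acc, a) : acc),
    result: next.result
  });
// Composition is plain function composition: each element is incremented
// by mapT, kept by filterT only if even, then handed to the final transformer.
const incrementThenEvens = (next: Transformer<number[], number, number[]>) =>
  mapT<number[], number, number, number[]>(x => x + 1)(
    filterT<number[], number, number[]>(x => x % 2 === 0)(next)
  );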
What does "async computation" mean?
Many say it is about push vs pull
I don't fully buy it
Let's see why
What does dual mean?
It is similar to De Morgan's laws
for "or" and "and"
Reverse the direction of every function!
// getter<T> (pull)
() -> T
// setter<T> (push)
T -> ()
Remember them!
(we'll find them everywhere)
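In TypeScript terms (purely illustrative):
type Getter<T> = () => T;        // pull: the consumer asks for a value
type Setter<T> = (v: T) => void; // push: the producer hands a value over
const pullRandom: Getter<number> = () => Math.random();
const pushToLog: Setter<number> = v => { console.log(v); };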
Iterable<T> :: {
'@@iterator' :: Iterator<T> // getter
}
Observable<T> :: {
Subscribe :: Observer<T> -> (() -> ()) // setter
}
Iterator<T> :: {
// getter of one of three things (pulls them)
next :: () -> ({done :: bool, value :: T} | throws Error)
}
Observer<T> :: {
// three setters (accepts pushes)
onNext :: T -> ()
onCompleted :: () -> ()
onError :: Error -> ()
}
Both specify what to do in case of a value, completion, or an error
What changes is just the direction: push vs pull
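A minimal TypeScript sketch of the two sides (shapes mirror the slides; names are illustrative)
// Pull: the consumer drives, calling next() until done (errors are thrown)
function drain<T>(it: Iterator<T>, handle: (v: T) => void): void {
  let r = it.next();
  while (!r.done) {
    handle(r.value);
    r = it.next();
  }
}
// Push: the producer drives, calling the observer's three setters
interface Observer<T> {
  onNext: (v: T) => void;
  onCompleted: () => void;
  onError: (e: Error) => void;
}
function pushAll<T>(values: T[], obs: Observer<T>): void {
  try {
    for (const v of values) obs.onNext(v);
    obs.onCompleted();
  } catch (e) {
    obs.onError(e as Error);
  }
}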
Remember that transducers don't care about the sequence they operate on?
They work in push contexts, too!
And they work unmodified!
Let's see...
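A hedged TypeScript sketch of the idea: wrap an observer as the downstream transformer, and the very same transducer (e.g. the map sketched earlier) transforms pushed values unmodified; all names here are assumptions
interface Transformer<I, E, R> {
  init: () => I;
  step: (acc: I, e: E) => I;
  result: (acc: I) => R;
}
interface Observer<T> {
  onNext: (v: T) => void;
  onCompleted: () => void;
  onError: (e: Error) => void;
}
// The observer becomes the innermost transformer; the accumulator is unused
function observerTransformer<T>(obs: Observer<T>): Transformer<void, T, void> {
  return {
    init: () => undefined,
    step: (_acc, v) => { obs.onNext(v); },
    result: () => { obs.onCompleted(); }
  };
}
// const logObs = { onNext: console.log, onCompleted: () => {}, onError: console.error };
// const pushStep = map((x: number) => x * 2)(observerTransformer(logObs));
// pushStep.step(undefined, 21); // the observer receives 42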
...but what about async?
A push API looks async, but when the push happens the code is synchronous
So, what construct is really async?
Promise<T> :: {
// Two setters (works in push mode)
resolve :: T -> () // value
reject :: Error -> () // error
}
A promise represents a future value
It models the latency between when the value becomes available and when the code is ready to handle it
It is different from an Observable because it does not model a sequence of values
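A minimal TypeScript sketch of that latency: the producer resolves whenever the value exists, the consumer attaches handlers whenever it is ready, and the promise bridges the gap
const later: Promise<number> = new Promise<number>((resolve, reject) => {
  // the producer pushes exactly one value (or one error), possibly much later
  setTimeout(() => resolve(42), 100);
});
later.then(
  value => console.log("value:", value),  // value handler
  error => console.error("error:", error) // error handler
);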
Async sequences can be implemented using promises
AsyncSequence<T> :: Promise<AsyncSequenceElement<T>>
AsyncSequenceElement<T> :: {
// These three work like an iterator
value :: T?
done :: bool?
error :: Error?
// ...but next gives a promise!
next :: AsyncSequence<T>
}
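A hedged TypeScript sketch of that shape (field names follow the slide; the helper functions are illustrative)
interface AsyncSequenceElement<T> {
  value?: T;
  done?: boolean;
  error?: Error;
  next?: AsyncSequence<T>; // the link to the rest is itself a promise
}
type AsyncSequence<T> = Promise<AsyncSequenceElement<T>>;
// Produce a finite async sequence from an array
function fromArray<T>(items: T[], i = 0): AsyncSequence<T> {
  return Promise.resolve(
    i >= items.length
      ? { done: true }
      : { value: items[i], next: fromArray(items, i + 1) }
  );
}
// Consume it: every element, and the link to the next one, arrives as a promise
async function forEach<T>(seq: AsyncSequence<T>, f: (v: T) => void): Promise<void> {
  const el = await seq;
  if (el.error) throw el.error;
  if (el.done || el.value === undefined) return;
  f(el.value);
  if (el.next) return forEach(el.next, f);
}
// forEach(fromArray([1, 2, 3]), v => console.log(v)); // logs 1, 2, 3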
Like the regular push-pull duality...
...but with promises thrown in to model latency
// async-getter<T> (pull)
() -> Promise<T>
// async-setter<T> (push)
Promise<T> -> ()
An Iterator that gives promises
An observer that expects promises
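Again in TypeScript terms (purely illustrative):
type AsyncGetter<T> = () => Promise<T>;        // pull, with latency on the value
type AsyncSetter<T> = (v: Promise<T>) => void; // push, with latency on the value
const pullLater: AsyncGetter<number> = () =>
  new Promise<number>(resolve => setTimeout(() => resolve(42), 50));
const pushLater: AsyncSetter<number> = p => { p.then(v => console.log(v)); };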
Note that Ramda and underarm
already support this model!
Transducers model composable and reusable transformations of sequences of values
Several modern libraries are starting to support them
Push and Pull are dual
Async is properly represented by promises
-
Code, docs, and slides are on GitHub
Twitter: @M_a_s_s_i, #metascript
-