## streamline/lib/compiler/command

Streamline command line analyzer / dispatcher

* `command.run()`  
  runs the `node-streamline` command line analyzer / dispatcher.
## streamline/lib/compiler/compile

Streamline compiler and file loader

* `script = compile.loadFile(_, path, options)`  
  loads a JavaScript file, transforms it if necessary and returns the transformed source.  
  If `path` is `foo_.js`, the source is transformed and the result is not saved to disk.  
  If `path` is `foo.js` and a `foo_.js` file exists, `foo_.js` is transformed if necessary and saved as `foo.js`.  
  If `path` is `foo.js` and `foo_.js` does not exist, the contents of `foo.js` are returned.  
  `options` is a set of options passed to the transformation engine.  
  If `options.force` is set, `foo_.js` is transformed even if `foo.js` is more recent.
* `script = compile.loadFileSync(path, options)`  
  synchronous version of `compile.loadFile`.  
  Used by the `require` logic.
* `compile.compile(_, paths, options)`  
  compiles the streamline source files in `paths`.  
  Generates a `foo.js` file for each `foo_.js` file found in `paths`.  
  `paths` may be a list of files or a list of directories, which will be traversed recursively.  
  `options` is a set of options for the `transform` operation.
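The `foo_.js` / `foo.js` naming rule above can be sketched as a tiny helper (hypothetical, for illustration only; the real compiler also compares file timestamps and honors `options.force`):

```javascript
// Maps a streamline source name to its compiled output name:
// foo_.js compiles to foo.js; any other file name is left alone.
function outputName(path) {
  return /_\.js$/.test(path) ? path.replace(/_\.js$/, ".js") : path;
}
```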
## streamline/lib/compiler/register

Streamline `require` handler registration

* `register.register(options)`  
  registers `require` handlers for streamline.  
  `options` is a set of default options passed to the `transform` function.
## streamline/lib/compiler/transform

Streamline's transformation engine

* `transformed = transform.transform(source, options)`  
  transforms streamline source.  
  The following `options` may be specified:
  * `tryCatch` controls exception handling
  * `lines` controls line mapping
  * `callback` alternative identifier if `_` is already used
  * `noHelpers` disables generation of helper functions (`__cb`, etc.)
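Conceptually, the transform rewrites a function that takes `_` into standard node callback style. A minimal hand-written illustration of that equivalence (not the actual generated code, which goes through the `__cb` helper and handles exceptions and loops):

```javascript
// A stand-in for any async operation (calls back synchronously here):
function add(a, b, callback) {
  callback(null, a + b);
}

// Roughly what `function sum3(a, b, c, _) { return add(add(a, b, _), c, _); }`
// becomes after transformation: nested callbacks with error forwarding.
function sum3(a, b, c, callback) {
  add(a, b, function (err, ab) {
    if (err) return callback(err);
    add(ab, c, function (err2, abc) {
      if (err2) return callback(err2);
      callback(null, abc);
    });
  });
}
```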
## streamline/lib/require/client/require

Client-side require script

* `id = module.id`  
  the `id` of the current module.
* `module = require(id)`  
  requires a module synchronously.  
  `id` must be a string literal.
* `module = require.async(id, _)`  
  requires a module asynchronously.  
  `id` may be a variable or an expression.
* `main = require.main`  
  returns the main module.
* `require.main(id)`  
  loads the main module from an HTML page.
## streamline/lib/require/server/require

Server-side require handler

Handles require requests coming from the client.

* `dispatcher = require.dispatcher(options)`  
  returns an HTTP request dispatcher that responds to requests issued by the client-side `require` script.  
  The dispatcher is called as `dispatcher(_, request, response)`.
## streamline/lib/streams/server/streams

Server Streams module

The `streams` module contains pull mode wrappers around node streams.

These wrappers implement a pull style API. Instead of having the stream push the data to its consumer by emitting `data` and `end` events, these wrappers let the consumer pull the data from the stream by calling asynchronous `read` methods.

For a bit more background on this design, you can read this blog post.

For a simple example of this API in action, see the google client example.
### Emitter

Base wrapper for all objects that emit an `end` or `close` event.  
All stream wrappers derive from this wrapper.

* `wrapper = new streams.Emitter(stream)`  
  creates a wrapper.
* `emitter = wrapper.emitter`  
  returns the underlying emitter. The emitter stream can be used to attach additional observers.
* `emitter = wrapper.unwrap()`  
  unwraps and returns the underlying emitter.  
  The wrapper should not be used after this call.
### ReadableStream

All readable stream wrappers derive from this wrapper.

* `stream = new streams.ReadableStream(stream, [options])`  
  creates a readable stream wrapper.
* `stream.setEncoding(enc)`  
  sets the encoding.  
  Returns `this` for chaining.
* `data = stream.read(_, [len])`  
  reads asynchronously from the stream and returns a `string` or a `Buffer` depending on the encoding.  
  If a `len` argument is passed, the `read` call returns when `len` characters or bytes (depending on encoding) have been read, or when the underlying stream has emitted its `end` event.  
  Without `len`, the read call returns the data chunks as they are emitted by the underlying stream.  
  Once the end of stream has been reached, the `read` call returns `null`.
* `data = stream.readAll(_)`  
  reads till the end of stream.  
  Equivalent to `stream.read(_, -1)`.
* `stream.unread(chunk)`  
  pushes the chunk back to the stream.  
  Returns `this` for chaining.
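The pull-mode `read` / `unread` pair can be sketched with a buffer-backed reader (a hypothetical stand-alone helper, not the `streams` module itself, and using an explicit callback where the real wrappers take `_`):

```javascript
// Minimal pull-mode reader over a pre-buffered list of chunks.
function BufferedReader(chunks) {
  this.chunks = chunks.slice(); // pending chunks, oldest first
}
// read(callback): pull the next chunk, or null at end of stream.
BufferedReader.prototype.read = function (callback) {
  callback(null, this.chunks.length ? this.chunks.shift() : null);
};
// unread(chunk): push a chunk back so the next read returns it again.
BufferedReader.prototype.unread = function (chunk) {
  this.chunks.unshift(chunk);
  return this; // chaining, like the real wrappers
};
```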
### WritableStream

All writable stream wrappers derive from this wrapper.

* `stream = new streams.WritableStream(stream, [options])`  
  creates a writable stream wrapper.
* `stream.write(_, data, [enc])`  
  writes the data.  
  This operation is asynchronous because it drains the stream if necessary.  
  If you have a lot of small write operations to perform and you don't want the overhead of draining at every step, you can write to the underlying stream with `stream.emitter.write(data)` most of the time and call `stream.write(_, data)` once in a while to drain.  
  Returns `this` for chaining.
* `stream.end()`  
  signals the end of the send operation.  
  Returns `this` for chaining.
### HttpServerRequest

This is a wrapper around node's `http.ServerRequest`.  
This stream is readable (see ReadableStream above).

* `request = new streams.HttpServerRequest(req, [options])`  
  returns a wrapper around `req`, an `http.ServerRequest` object.  
  The `options` parameter can be used to pass `lowMark` and `highMark` values.
* `method = request.method`
* `url = request.url`
* `headers = request.headers`
* `trailers = request.trailers`
* `httpVersion = request.httpVersion`
* `connection = request.connection`
* `socket = request.socket`  
  (same as `http.ServerRequest`)
### HttpServerResponse

This is a wrapper around node's `http.ServerResponse`.  
This stream is writable (see WritableStream above).

* `response = new streams.HttpServerResponse(resp, [options])`  
  returns a wrapper around `resp`, an `http.ServerResponse` object.
* `response.writeContinue()`
* `response.writeHead(head)`
* `response.setHeader(name, value)`
* `value = response.getHeader(head)`
* `response.removeHeader(name)`
* `response.addTrailers(trailers)`
* `response.statusCode = value`  
  (same as `http.ServerResponse`)
### HttpServer

This is a wrapper around node's `http.Server` object:

* `server = new streams.HttpServer(requestListener, [options])`  
  creates the wrapper.  
  `requestListener` is called as `requestListener(request, response, _)` where `request` and `response` are wrappers around `http.ServerRequest` and `http.ServerResponse`.
* `server.listen(_, port, [host])`
* `server.listen(_, path)`  
  (same as `http.Server`)
### HttpClientResponse

This is a wrapper around node's `http.ClientResponse`.  
This stream is readable (see ReadableStream above).

* `response = request.response(_)`  
  returns the response stream.
* `status = response.statusCode`  
  returns the HTTP status code.
* `version = response.httpVersion`  
  returns the HTTP version.
* `headers = response.headers`  
  returns the HTTP response headers.
* `trailers = response.trailers`  
  returns the HTTP response trailers.
* `response.checkStatus(statuses)`  
  throws an error if the status is not in the `statuses` array.  
  If only one status is expected, it may be passed directly as an integer rather than as an array.  
  Returns `this` for chaining.
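The status check described above amounts to a small membership test. A sketch as a hypothetical stand-alone helper (the real method lives on the response wrapper and returns `this`):

```javascript
// Throws if statusCode is not among the expected statuses.
// A single expected status may be passed as an integer.
function checkStatus(statusCode, statuses) {
  if (typeof statuses === "number") statuses = [statuses];
  if (statuses.indexOf(statusCode) < 0)
    throw new Error("invalid status: " + statusCode);
  return statusCode;
}
```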
### HttpClientRequest

This is a wrapper around node's `http.ClientRequest`.  
This stream is writable (see WritableStream above).

* `request = streams.httpRequest(options)`  
  creates the wrapper.  
  The options are the following:
  * `method`: the HTTP method, `'GET'` by default.
  * `headers`: the HTTP headers.
  * `url`: the requested URL (with query string if necessary).
  * `proxy.url`: the proxy URL.
  * `lowMark` and `highMark`: low and high water mark values for buffering (in bytes or characters depending on encoding). Note that these values are only hints as the data is received in chunks.
* `response = request.response(_)`  
  returns the response.
* `request.abort()`  
  aborts the request.
### NetStream

This is a wrapper around streams returned by TCP and socket clients.  
These streams are both readable and writable (see ReadableStream and WritableStream above).

* `stream = new streams.NetStream(stream, [options])`  
  creates a network stream wrapper.
### TCP and Socket clients

These are wrappers around node's `net.createConnection`:

* `client = streams.tcpClient(port, host, [options])`  
  returns a TCP connection client.
* `client = streams.socketClient(path, [options])`  
  returns a socket client.  
  The `options` parameter of the constructor provides options for the stream (`lowMark` and `highMark`). If you want different options for `read` and `write` operations, you can specify them by creating `options.read` and `options.write` sub-objects inside `options`.
* `stream = client.connect(_)`  
  connects the client and returns a network stream.
## streamline/lib/tools/docTool

Documentation tool

Usage:

    node streamline/lib/tools/docTool [path]

Extracts documentation comments from `.js` files and generates an `API.md` file under the package root.

The top of a source file must contain a `/// !doc` marker to enable doc extraction.  
Documentation comments must start with `///` (with 1 trailing space).  
Extraction can be turned off with `/// !nodoc` and turned back on with `/// !doc`.

The tool can also be invoked programmatically with:

* `doc = docTool.generate(_, path)`  
  extracts documentation comments from file `path`.
## streamline/lib/util/flows

Flows Module

The `streamline/lib/util/flows` module contains some handy utilities for streamline code.

### Array utilities

The following functions are async equivalents of the ES5 Array methods (`forEach`, `map`, `filter`, ...):
* `flows.each(_, array, fn, [thisObj])`  
  applies `fn` sequentially to the elements of `array`.  
  `fn` is called as `fn(_, elt, i)`.
* `result = flows.map(_, array, fn, [thisObj])`  
  transforms `array` by applying `fn` to each element in turn.  
  `fn` is called as `fn(_, elt, i)`.
* `result = flows.filter(_, array, fn, [thisObj])`  
  generates a new array that only contains the elements that satisfy the `fn` predicate.  
  `fn` is called as `fn(_, elt)`.
* `bool = flows.every(_, array, fn, [thisObj])`  
  returns true if `fn` is true on every element (and also if `array` is empty).  
  `fn` is called as `fn(_, elt)`.
* `bool = flows.some(_, array, fn, [thisObj])`  
  returns true if `fn` is true for at least one element.  
  `fn` is called as `fn(_, elt)`.
* `result = flows.reduce(_, array, fn, val, [thisObj])`  
  reduces by applying `fn` to each element.  
  `fn` is called as `val = fn(_, val, elt, i, array)`.
* `result = flows.reduceRight(_, array, fn, val, [thisObj])`  
  reduces from end to start by applying `fn` to each element.  
  `fn` is called as `val = fn(_, val, elt, i, array)`.
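The "sequential" behavior of these helpers can be sketched in plain callback style (hypothetical helper named `each`, using an explicit callback where `flows.each` takes `_`):

```javascript
// Applies fn to each element in order; the next element is only
// processed after fn signals completion for the current one.
function each(array, fn, callback) {
  var i = 0;
  (function next(err) {
    if (err) return callback(err);
    if (i >= array.length) return callback(null);
    var elt = array[i], index = i++;
    fn(elt, index, next); // fn calls next() when done, preserving order
  })();
}
```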
### Object utility

The following function can be used to iterate through object properties:

* `flows.eachKey(_, obj, fn)`  
  calls `fn(_, key, obj[key])` for every `key` in `obj`.
### Workflow Utilities

* `fun = flows.funnel(max)`  
  limits the number of concurrent executions of a given code block.

The `funnel` function is typically used with the following pattern:

```javascript
// somewhere
var myFunnel = flows.funnel(10); // create a funnel that only allows 10 concurrent executions.

// elsewhere
myFunnel(_, function(_) { /* code with at most 10 concurrent executions */ });
```

The `diskUsage2.js` example demonstrates how these calls can be combined to control concurrent execution.

The `funnel` function can also be used to implement critical sections. Just set funnel's `max` parameter to 1.
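The counting-and-queuing idea behind a funnel can be sketched in plain callback style (a hypothetical minimal version; the real `flows.funnel` takes streamline functions and also propagates results and errors):

```javascript
// Returns a function that runs at most `max` jobs concurrently;
// extra jobs wait in a FIFO queue. Each job receives a done() callback.
function funnel(max) {
  var active = 0, queue = [];
  function next() {
    if (active >= max || queue.length === 0) return;
    active++;
    var job = queue.shift();
    job(function done() { // job calls done() when it finishes
      active--;
      next(); // start the next waiting job, if any
    });
  }
  return function (job) {
    queue.push(job);
    next();
  };
}
```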
* `results = flows.collect(_, futures)`  
  collects the results of an array of futures.
### Context propagation

Streamline also allows you to propagate a global context along a chain of calls and callbacks. This context can be used like TLS (Thread Local Storage) in a threaded environment. It allows you to have several active chains that each have their own global context.

This kind of context is very handy to store information that all calls should be able to access but that you don't want to pass explicitly via function parameters. The most obvious example is the `locale` that each request may set differently and that your low level libraries should be able to retrieve to format messages.

The `streamline.flows` module exposes two functions to manipulate the context:

* `oldCtx = flows.setContext(ctx)`  
  sets the context (and returns the old context).
* `ctx = flows.getContext()`  
  returns the current context.
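The get/set pair can be sketched as follows (a hypothetical stand-alone version; the real implementation also saves and restores the context around every callback so that each async chain keeps its own context):

```javascript
var context = {}; // the current global context

// Sets the context and returns the previous one, so the caller
// can restore it later (e.g. after handling a request).
function setContext(ctx) {
  var old = context;
  context = ctx;
  return old;
}
function getContext() {
  return context;
}
```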
### Miscellaneous

* `flows.nextTick(_)`  
  `nextTick` function for both browser and server.  
  Aliased to `process.nextTick` on the server side.
* `result = flows.apply(_, fn, thisObj, args, [index])`  
  helper to apply `Function.apply` to streamline functions.  
  Equivalent to `result = fn.apply(thisObj, argsWith_)` where `argsWith_` is a modified argument list in which the callback has been inserted at `index` (at the end of the argument list if `index` is not specified).
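The `argsWith_` construction described above is a simple splice. A sketch (hypothetical helper; `cb` stands for the callback that streamline substitutes for `_`):

```javascript
// Returns a copy of args with cb inserted at index
// (appended at the end when index is not specified).
function argsWith(args, cb, index) {
  var result = args.slice();
  if (index === undefined) index = result.length;
  result.splice(index, 0, cb);
  return result;
}
```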