DataFlow¶
The following documentation explains how to configure and deploy DataFlows via the HTTP service.
HttpChannelConfig¶
asynchronous: bool
    Unused right now; in the future the flow will be accessible over a websocket and used for long-running flows.
dataflow: DataFlow
    Flow to which inputs from requests to path are forwarded.
input_mode: str
    Mode according to which input data is passed to the dataflow. Default: default.

    default
        Inputs are expected to be a mapping of context to a list of input-to-definition mappings, e.g.:

        { "insecure-package": [ { "value": "insecure-package", "definition": "package" } ] }

    preprocess:definition_name
        The input as a whole is treated as a value with the given definition after preprocessing. Supported preprocess tags: [json, text, bytes, stream]
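As an illustration of what the preprocess tags mean, here is a hedged sketch of turning a raw request body into a single value per tag. The function name and behavior are illustrative only, not the service's actual internals; the stream case is left out.

```python
import json

# Hypothetical helper showing how each preprocess tag could map the raw
# request body to one value. Names here are assumptions, not DFFML code.
def preprocess_body(tag: str, raw: bytes):
    """Turn the raw request body into a single value per the preprocess tag."""
    if tag == "json":
        return json.loads(raw)       # parsed JSON becomes the value
    if tag == "text":
        return raw.decode("utf-8")   # decoded string becomes the value
    if tag == "bytes":
        return raw                   # body is used as-is
    if tag == "stream":
        # streaming bodies would be handed over without buffering; out of scope
        raise NotImplementedError("stream handling not sketched")
    raise ValueError(f"unknown preprocess tag: {tag}")

print(preprocess_body("json", b'{"package": "insecure-package"}'))
```

The resulting value would then be wrapped as an input with the definition named after the colon, e.g. preprocess:package.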
path: str
    Route in the server.
output_mode: str
    Mode according to which the output of the dataflow is treated.

    bytes:content_type:OUTPUT_KEYS
        OUTPUT_KEYS is a "."-separated string used as keys to traverse the output of the flow, e.g. given

        results = { "post_input": { "hex": b'speak' } }

        bytes:post_input.hex will return b'speak'.

    text:OUTPUT_KEYS

    json
        Output of the dataflow (a Dict) is passed as JSON.
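The OUTPUT_KEYS traversal described above can be sketched in a few lines. This is an illustrative stand-in, not the service's actual code; traverse_output is a hypothetical name.

```python
from functools import reduce

# Illustrative sketch of how a "."-separated OUTPUT_KEYS string walks the
# nested dict returned by the flow (hypothetical helper, not DFFML code).
def traverse_output(results: dict, output_keys: str):
    # Split "post_input.hex" into ["post_input", "hex"] and index step by step.
    return reduce(lambda d, key: d[key], output_keys.split("."), results)

results = {"post_input": {"hex": b"speak"}}
print(traverse_output(results, "post_input.hex"))  # b'speak'
```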
immediate_response: Dict[str, Any]
    If provided, the server responds with this immediately, while scheduling the dataflow to run. Expected keys:

    status
        HTTP status code for the response.
    content_type
        MIME type; if not given, determined from the presence of body/text/json.
    body/text/json
        One of these, according to content_type.
    headers

    e.g.:

    { "status": 200, "content_type": "application/json", "data": {"text": "ok"} }
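To make the respond-immediately-then-run behavior concrete, here is a minimal asyncio sketch of the pattern immediate_response implies. The handler and stand-in dataflow below are hypothetical, not the service's implementation.

```python
import asyncio

# Stand-in for the real dataflow run (hypothetical).
async def run_dataflow(inputs):
    await asyncio.sleep(0.01)
    return {"processed": inputs}

# Hypothetical handler: schedule the flow in the background, then hand
# back the canned immediate_response right away.
async def handle_request(inputs, immediate_response):
    task = asyncio.get_running_loop().create_task(run_dataflow(inputs))
    return immediate_response, task

async def main():
    response, task = await handle_request(
        {"value": "insecure-package"},
        {"status": 200, "content_type": "application/json", "data": {"text": "ok"}},
    )
    print(response["status"])  # 200, sent before the flow finishes
    await task                 # the flow still completes in the background

asyncio.run(main())
```

The key point is that the client sees the configured response immediately; the dataflow's own output is not part of that response.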