Dev blog for data capture/data passing feature for Rundeck

June 6

  • job copy page should edit log filters
  • job XML: encode/decode workflow and step log filters correctly
  • add "highlight output" filter plugin
  • add "quiet output" filter plugin

worked on build fixes

June

prerelease branch created, worked on build completion

June 2

Add validation of workflow and step log filters in job save/update actions.

Log filters only operate on 'log' events.

June 1

Log filter edit works in job create form.

Experimental: glob expansion for all node values of a data variable:

${data.key*} #all values for all nodes, separated by comma
${data.key*X} #all values for all nodes, separated by X delimiter
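
For example (illustrative values, assuming data.key was captured on node1 and node2):

${data.key*}   # with node1=alpha, node2=beta, expands to: alpha,beta
${data.key*;}  # same data with ';' as delimiter: alpha;beta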

May 30

Updates to Data type filter plugin.

Add @SelectLabels annotation for Java @PluginProperty definitions.
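
A sketch of the intended usage (the @SelectLabels attribute name is my assumption; it pairs with the existing @SelectValues annotation):

import com.dtolabs.rundeck.plugins.descriptions.PluginProperty;
import com.dtolabs.rundeck.plugins.descriptions.SelectLabels;
import com.dtolabs.rundeck.plugins.descriptions.SelectValues;

public class RenderDatatypeExample {
    // raw values stay machine-friendly; labels are what the GUI displays
    @PluginProperty(title = "Data type", description = "Type of data in the output")
    @SelectValues(values = {"application/json", "text/csv", "text/html"})
    @SelectLabels(values = {"JSON", "CSV", "HTML"})
    private String datatype;
}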

May 26

Add colorized replacement text for the Mask Passwords plugin.

Improvements to Tabular data plugin.

Add new "Render Datatype" log filter plugin:

  • allows log messages to mark the start/end of a datatype, e.g. HTML, Markdown, CSV, JSON

Add support for "Workflow level" log filter configuration.

Add a "Log context data" step to stub plugin.

May 25

bugfix: rundeck log format: multiline log events should not conjoin first two lines in rendered output

improvements to bundled spring-bean based plugin loading/registration.

May 24

More work on GUI support for log filters in job editor.

May 23

More work on GUI support for log filters in job editor.

May 22

GUI support for log filters in job editor.

bug fix: remove option in job editor causes JS error

mask passwords plugin: allow specifying replacement string

key value capture plugin: allow specifying regex

May 19

  • Job editor: add workflow step duplicate button
  • begin GUI support for step log filters in job editor

May 18

Fixes for content converter plugins, and simple data capture log filter plugin.

fix: log events emitted by log filter plugins should be within correct context.

fix: node-step context data is uplifted to node-context after the step is complete

fix regression: WorkflowEngineExecutor package name reverted due to plugin compatibility requirement.

May 17

More Data Capture work.

  • log data stored at correct context for node steps
  • Created a simple Key/Value capture plugin:
    • format: RUNDECK:DATA:[KEY]=[VALUE]
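
A concrete illustration of the matching logic that format implies (a sketch, not the actual plugin code):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class KeyValueCaptureSketch {
    // matches lines like: RUNDECK:DATA:buildVersion=1.2.3
    private static final Pattern DATA_LINE =
        Pattern.compile("^RUNDECK:DATA:(\\S+?)=(.*)$");

    public static void main(String[] args) {
        Matcher m = DATA_LINE.matcher("RUNDECK:DATA:buildVersion=1.2.3");
        if (m.matches()) {
            String key = m.group(1);   // "buildVersion"
            String value = m.group(2); // "1.2.3"
            System.out.println(key + " -> " + value);
        }
    }
}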

May 16

Continued working on Data Capture

  • add core components for creating and configuring log filter plugins
  • add more data to the context object available to plugins
  • add domain class support for storing plugin config for step types
  • update the log filter override mechanism: allow a global as well as step-specific override; this enables project- and job-global filters. One reason is the MaskPasswords plugin below
  • created a MaskPasswordsLogFilter plugin, as POC for the filter mechanism
    • obscures any log text that contains a known password value.
    • also handles unix escaped/quoted variants
    • can be applied at job/global level, so that e.g. DEBUG level output (env vars, etc), which does not come via specific Step output, can be handled
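
A rough sketch of the masking behavior (the plugin interface is still in flux, so the shape below is illustrative, not the actual API):

import java.util.Set;

// illustrative sketch only: the real log filter plugin hooks may differ
public class MaskPasswordsSketch {
    private final Set<String> secrets; // known password values to obscure

    public MaskPasswordsSketch(Set<String> secrets) {
        this.secrets = secrets;
    }

    // rewrite each log message before it reaches the log writer
    // (the real plugin also handles unix escaped/quoted variants)
    public String filter(String message) {
        for (String secret : secrets) {
            message = message.replace(secret, "*****");
        }
        return message;
    }
}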

May 12/15

Did some refactoring of BaseWorkflowExecutor. I ended up not using this code, because the result was not as nice as I had wished. The goal was to reduce the complexity of the executeWorkflowStep method.

I tried creating a "Collaboration" pattern:

  • use a common "Collaboration" type to manage multiple actors
  • it contains a shared object which can be used to exchange data
  • it is configured with multiple "Collaborators" which can then be processed in order

Collaborators allow "prepare","process","postprocess", and "finish" actions, so that they can perform setup steps, as well as wrap-up steps after other collaborators.
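
A minimal sketch of the shape I was going for (names from memory, not the actual branch code):

import java.util.List;

interface Collaborator<T> {
    default void prepare(T shared) {}
    default void process(T shared) {}
    default void postprocess(T shared) {}
    default void finish(T shared) {}
}

class Collaboration<T> {
    private final T shared; // shared object used to exchange data between actors
    private final List<Collaborator<T>> collaborators;

    Collaboration(T shared, List<Collaborator<T>> collaborators) {
        this.shared = shared;
        this.collaborators = collaborators;
    }

    // run each phase across all collaborators in order, so later ones can
    // perform setup before, and wrap-up after, the earlier ones
    void run() {
        collaborators.forEach(c -> c.prepare(shared));
        collaborators.forEach(c -> c.process(shared));
        collaborators.forEach(c -> c.postprocess(shared));
        collaborators.forEach(c -> c.finish(shared));
    }
}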

This all mostly worked; however, I did not like several things about it:

  • the logic of the entire flow is now spread across multiple types and phases of the operation (prepare, process, postprocess, etc.), making it hard to reason about what happens when
  • some type safety is lost unless the shared object types are explicit; making them explicit reduces the "dynamic" flexibility of the collaboration pattern

The things I liked about the idea:

  • the logical pieces of the executeWorkflowStep method were broken into composable chunks, ideally making testing each chunk in isolation much simpler
  • new chunks could be added easily
  • potentially the logic or some part of it could be made pluggable

I think the "Collaboration" idea has some merit, but I will have to revisit it and think more about it.

May 11

Begin work on Data Capture.

Add thread bound "logging manager" which can be used to swap in the plugin capture system in place of the normal streaming log writer.

May 10

Added shared data context variable expansion for:

  • Job reference arguments
  • Job reference node filter

May 9

Added shared data context variable expansion for:

  • scripts using syntax @step:group.key/node@ instead of ${step:group.key@node}
  • script URL steps
  • step/nodestep plugins

May 8

  • update "Data content converter" plugins

  • update shared data context expansion

    • fix multi data context "resolve" in case intermediate scopes have no data
    • add "strict" flag and set it to true when any scope is specified in variable expansion template

May 5

Add ContentConverterPlugin implementation to render JSON/CSV/Markdown/HTML into logs directly.
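
To illustrate the kind of conversion involved (a hypothetical converter sketch; not the actual ContentConverterPlugin interface):

// hypothetical sketch: the real plugin interface and method names may differ
public class JsonToHtmlSketch {
    public boolean supports(String dataType) {
        return "application/json".equals(dataType);
    }

    // render a JSON chunk as an escaped preformatted block for the log view
    public String convert(String json) {
        return "<pre>" + escapeHtml(json) + "</pre>";
    }

    private String escapeHtml(String s) {
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }
}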

May 4

Added ability to specify scope when plugins output data values.

May 3

create PR rundeck/rundeck#2482

May 2

Did a lot of work on DataContext. The shared data context now has the step data separated into "views" where a view is either global, a specific node step (number+node) or a specific node. When evaluating variable expansion, the current node is assumed for the context, but when loading the data, the scope can be widened to find a variable in a wider scope. Or the variable can specify the exact scope to use.
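
For example (my reading of the resolution order described above): evaluating ${a.b} during step 2 on node1 checks the step2+node1 view first, then the node1 view, then global. With an explicit scope such as ${a.b@node1}, no widening occurs.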

Todos: (added to goals)

Add lombok to core dependencies, finding it quite useful.

Added tentative syntax for expanding variables in various contexts:

template         | resolves to
'abc'            | 'abc'
'${a.b}'         | 'global'
'${a.b@^}'       | 'global'
'${a.b@node1}'   | 'node1'
'${2:a.b@node1}' | 'step2 node1'
'${2:a.b}'       | 'step2'

Added a test "stub" step and node step plugin, to inject output variables into the data context, allowing properties, JSON, or YAML input text.

May 1

todo: DataContext refactor to support qualified shared context (Step, node)

worked on DataContext

April 28

9:25am: Some more refactoring so far, mostly cleanup. Not happy with the massive code sprawl across unhelpfully named packages.

Created dev blog.

10am:

Fixing tests related to refactoring...

12:30pm:

Propagating thread interruption caused inadvertent test failures; fixed by calling Future.get on all futures instead of Thread.wait.
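
Roughly the shape of the fix (illustrative, not the actual test harness code):

import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;

public class AwaitAllSketch {
    // join every future explicitly, so an interrupt surfaces as an
    // InterruptedException here instead of leaving threads in ambiguous
    // wait states
    static <T> void awaitAll(List<Future<T>> futures) throws ExecutionException {
        for (Future<T> f : futures) {
            try {
                f.get();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // preserve interrupt status
                return;
            }
        }
    }
}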

4:30pm

Additional refactoring and cleanup

April 27

Previous work: add workflow engine feature: pass a shared data object between steps, allow results of one step to be merged to the shared data object, and allow the shared object to produce a data value as input to newly executing operations.

Today I did a bunch of refactoring. I spent some time (sidetrack) updating Grails to 2.5.6, so that finally Java 8 lambdas work correctly. The problem was with the "spring loaded" dependency in 2.4.4 not being able to handle lambda/method references in code loaded via spring beans. The update to Grails 2.5.6 fixed this by updating the dependency. Had to fix a number of minor issues for the upgrade, but finally it works.

The refactoring:

Introduced some lambdas/method references, replaced guava functional stuff with Java 8.

The main "processOperations" method in WorkflowEngine was too complex: SonarLint flagged it. So I spent some time trying to reduce that complexity. Extracted a number of methods, and tried to simplify the loop logic. This left me with a number of methods all passing the same variables around. So I decided to construct a new class that encapsulates the common variables, which also reduced the complexity a bit.
Not satisfied with the WorkflowEngine/WorkflowEngineOperationsProcessor interdependencies right now, but at least the complexity is reduced.

Added a test: verify the shared data object is passed to operations as expected, and merges the data result of previous steps.

Goals:

  • Data passing: workflow steps can export data values that are made available to subsequent steps.
  • Data capture: the step logger system can filter/process output log events to extract useful data values and use the data passing feature to pass them along.

Tasks:

Data Passing:

  • Ability to pass data from one step to the next. This will be via a workflow global data context.
  • the shared data context should manage a set of contextual data: the data from each step, identified by step number, as well as the data from each step+node, identified by a node name and step number
  • operations should update the shared data context appropriately based on their number and with appropriate node data
  • operations should provide the shared data context (qualified by node for node steps) to the ExecutionContext used by workflow step execution
  • step context should allow Output data to specify a scope, e.g. export to global scope instead of local scope.
  • variable expansion should resolve values by widening scope if scope is not specified
  • do not widen scope if scope is explicit
  • workflow step execution should enhance the data variable expansion ${option.x} feature to allow dereferencing data values for specific steps and/or nodes, but have the current node (for node steps) be a default qualifier in that context.
  • use shared data expansion in place of basic data context expansion (script args, plugin configs, etc)
  • job references args
  • job reference node filter
  • support for uplifting captured data to higher scope(s)
    • step+node scope: auto uplifting to node scope
    • any scope: allow uplifting to global scope
  • support for list/map values (after data capture)
    • use glob expand for all values in a list, ${data.key*}

Data Capture:

  • introduce data capture plugin interfaces, allowing filtering/reading/modifying step log output before passing to log handler
  • allow configuring steps with specific filter plugin types+config values
    • gui support
  • add intercepting of log data for the steps, using configured plugin for the step
  • node steps should intercept logs as well, and emit data for the appropriate contextview
  • after step is complete, use Data Passing to emit captured data values from the filter plugin
    • data passing feature: steps/plugins declare "exported" variables to automatically uplift to wider scope
  • add ability for Job definitions to define exported variables, using capture data values
  • add ability for Job definitions to define global filters?

Misc

  • disallow overriding core ${option.x} values
  • filters only operate on log type?
  • log filter plugin context should emit new log entries in original logging context
  • shared context in ExecutionContext leaky?
  • job import plugin validation
  • documentation
    • new plugin types (log filter, content converter)
    • new bundled plugins
    • data capture and log filtering
  • ant upgrade
  • bug: edit job, remove global filters, does not save
  • data capture: filter to NORMAL log data
  • job refs: enable correct capture plugins for sub jobs