- job copy page should edit log filters
- job XML encode/decode handles workflow and step log filters correctly
- add "highlight output" filter plugin
- add "quiet output" filter plugin
worked on build fixes
prerelease branch created, worked on build completion
Add validation of workflow and step log filters in job save/update actions.
Log filters only operate on 'log' events.
Log filter edit works in job create form.
Experimental: global expand for all node values of a data variable:
${data.key*} #all values for all nodes, separated by comma
${data.key*X} #all values for all nodes, separated by X delimiter
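To picture the semantics, here is a small Python sketch of what the multi-node expansion might do (the data layout and function name are hypothetical, not Rundeck code):

```python
def expand_all_nodes(node_data, key, delimiter=','):
    """Join the captured value of `key` from every node, in node order,
    mimicking ${data.key*} (comma) and ${data.key*X} (custom delimiter)."""
    values = [data[key] for data in node_data.values() if key in data]
    return delimiter.join(values)
```

For example, with values "a" and "b" captured on two nodes, the default expansion yields "a,b".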
Updates to Data type filter plugin.
Add @SelectLabels annotation for Java @PluginProperty definitions.
Add colorized replacement text for Mask Passwords plugin
Improvements to Tabular data plugin.
Add new "Render Datatype" log filter plugin:
- allows log message to mark a start/end to a datatype, e.g. html, markdown, csv, json
Add support for "Workflow level" log filter configuration.
Add a "Log context data" step to stub plugin.
bugfix: rundeck log format: multiline log events should not conjoin first two lines in rendered output
improvements to bundled spring-bean based plugin loading/registration.
More work on GUI support for log filters in job editor.
More work on GUI support for log filters in job editor.
GUI support for log filters in job editor.
bug fix: remove option in job editor causes JS error
Mask Passwords plugin: allow specifying replacement string.
Key/Value capture plugin: allow specifying regex.
- Job editor: add workflow step duplicate button
- begin GUI support for step log filters in job editor
Fixes for content converter plugins, and simple data capture log filter plugin.
fix: log events emitted by log filter plugins should be within correct context.
fix: node-step context data is uplifted to node-context after the step is complete
fix regression: WorkflowEngineExecutor package name reverted due to plugin compatibility requirement.
More Data Capture work.
- log data stored at correct context for node steps
- Created a simple Key/Value capture plugin:
- format:
RUNDECK:DATA:[KEY]=[VALUE]
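A minimal Python sketch of how a line in this format could be matched (the pattern is illustrative, not the plugin's actual regex):

```python
import re

# Matches the "RUNDECK:DATA:[KEY]=[VALUE]" capture format described above.
KV_PATTERN = re.compile(r'^RUNDECK:DATA:(?P<key>[^=]+)=(?P<value>.*)$')

def capture(line):
    """Return a (key, value) pair if the log line matches, else None."""
    m = KV_PATTERN.match(line)
    return (m.group('key'), m.group('value')) if m else None
```

Lines that do not match pass through unchanged, so ordinary log output is unaffected.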
Continued work on Data Capture
- add core components for creating and configuring log filter plugins
- add more data to the context object available to plugins
- add domain class support for storing plugin config for step types
- update the log filter override mechanism: allow a global as well as a step-specific override. This enables project- and job-global filters; one motivation is the MaskPasswords plugin below.
- created a MaskPasswordsLogFilter plugin, as POC for the filter mechanism
- obscures any log text that contains a known password value.
- also handles unix escaped/quoted variants
- can be applied at job/global level, so that e.g. DEBUG level output (env vars, etc), which does not come via specific Step output, can be handled
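To illustrate the masking idea, here is a hedged Python sketch (the real plugin is Java and its handling of escaped variants is more involved); quoted variants are replaced before the bare value so the quotes are masked too:

```python
import shlex

def mask_passwords(line, secrets, replacement='*****'):
    """Replace known secret values in a log line, including simple
    shell-quoted variants, with a replacement string."""
    for secret in secrets:
        # Quoted variants first, then the bare value.
        for variant in (shlex.quote(secret), '"%s"' % secret, secret):
            line = line.replace(variant, replacement)
    return line
```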
Did some refactoring of BaseWorkflowExecutor.
I ended up not using this code, because the result was not as nice as I had hoped. The goal was to reduce the complexity of the executeWorkflowStep method.
I tried creating a "Collaboration" pattern:
- use a common "Collaboration" type to manage multiple actors
- it contains a shared object which can be used to exchange data
- it is configured with multiple "Collaborators" which can then be processed in order
Collaborators allow "prepare","process","postprocess", and "finish" actions, so that they can perform setup steps, as well as wrap-up steps after other collaborators.
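Roughly, the shape I tried looked like this (a Python sketch with hypothetical names; the actual experiment was in Java and was discarded):

```python
class Collaboration:
    """Drives a list of collaborators through fixed phases, sharing
    one mutable object between them."""
    PHASES = ('prepare', 'process', 'postprocess', 'finish')

    def __init__(self, collaborators):
        self.collaborators = collaborators
        self.shared = {}  # shared object used to exchange data

    def run(self):
        for phase in self.PHASES:
            for collaborator in self.collaborators:
                handler = getattr(collaborator, phase, None)
                if handler:  # each phase is optional per collaborator
                    handler(self.shared)
        return self.shared

class Counter:
    """Example collaborator implementing two of the four phases."""
    def prepare(self, shared):
        shared.setdefault('count', 0)
    def process(self, shared):
        shared['count'] += 1
```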
This all mostly worked; however, I did not like several things about it:
- the logic of the entire flow is now convoluted among multiple types, and phases of the operation (prepare,process,postprocess, etc) making it hard to reason about what happens when
- some type safety is lost unless the shared object types are explicit. Making them explicit reduces the "dynamic" flexibility of the collaboration pattern
The things I liked about the idea:
- the logical pieces of the executeWorkflowStep method were broken into composable chunks, ideally making testing each chunk in isolation much simpler; new chunks could be added easily
- potentially the logic or some part of it could be made pluggable
I think the "Collaboration" idea has some merit, but I will have to revisit it and think more about it.
Begin work on Data Capture.
Add thread-bound "logging manager" which can be used to swap in the plugin capture system in place of the normal streaming log writer.
# May 10
Added shared data context variable expansion for:
- Job reference arguments
- Job reference node filter
Added shared data context variable expansion for:
- scripts, using the syntax @step:group.key/node@ instead of ${step:group.key@node}
- script URL steps
- step/nodestep plugins
- update "Data content converter" plugins
- update shared data context expansion
- fix multi data context "resolve" in case intermediate scopes have no data
- add "strict" flag and set it to true when any scope is specified in variable expansion template
Add ContentConverterPlugin implementation to render JSON/CSV/Markdown/HTML into logs directly.
Added ability to specify scope when plugins output data values.
create PR rundeck/rundeck#2482
Did a lot of work on DataContext. The shared data context now has the step data separated into "views" where a view is either global, a specific node step (number+node) or a specific node. When evaluating variable expansion, the current node is assumed for the context, but when loading the data, the scope can be widened to find a variable in a wider scope. Or the variable can specify the exact scope to use.
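A toy Python model of the view/widening behavior (the view keys and widening order are my illustration, not the actual DataContext API):

```python
class SharedDataContext:
    """Stores data under 'views' and resolves by widening scope:
    (step, node) -> node -> global."""
    def __init__(self):
        self.views = {}  # view tuple -> {group: {key: value}}

    def put(self, view, group, key, value):
        self.views.setdefault(view, {}).setdefault(group, {})[key] = value

    def resolve(self, group, key, step=None, node=None):
        """Search from the narrowest matching view outward to global."""
        search = []
        if step is not None and node is not None:
            search.append(('step', step, node))
        if node is not None:
            search.append(('node', node))
        search.append(('global',))
        for view in search:
            value = self.views.get(view, {}).get(group, {}).get(key)
            if value is not None:
                return value
        return None
```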
Todos: (added to goals)
Add lombok to core dependencies, finding it quite useful.
Added tentative syntax for expanding variables in various contexts:
'abc' | 'abc'
'${a.b}' | 'global'
'${a.b@^}' | 'global'
'${a.b@node1}' | 'node1'
'${2:a.b@node1}' | 'step2 node1'
'${2:a.b}' | 'step2'
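The scope resolution in that table can be sketched with a small parser (Python; the regex is my approximation of the tentative syntax, and plain text with no variable reference returns None):

```python
import re

# Approximates the tentative syntax: ${[step:]group.key[@node]},
# where the node '^' explicitly selects the global scope.
VAR_PATTERN = re.compile(
    r'\$\{(?:(?P<step>\d+):)?(?P<group>\w+)\.(?P<key>\w+)'
    r'(?:@(?P<node>[^}]+))?\}')

def parse_scope(text):
    """Return the scope a variable reference selects, or None if the
    text contains no variable reference."""
    m = VAR_PATTERN.search(text)
    if not m:
        return None
    parts = []
    if m.group('step'):
        parts.append('step' + m.group('step'))
    node = m.group('node')
    if node and node != '^':
        parts.append(node)
    return ' '.join(parts) or 'global'
```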
Added a test "stub" step and node step plugin, to inject output variables into the data context, allowing properties, JSON, or YAML input text.
# May 1
todo: DataContext refactor to support qualified shared context (Step, node)
worked on DataContext
9:25am: Some more refactoring so far, mostly cleanup. Not happy with the massive code sprawl in unhelpful package names.
Created dev blog.
10am:
Fixing tests related to refactoring...
12:30pm:
Propagating thread interruption caused inadvertent test failures: fix by calling Future.get on all futures instead of Thread.wait.
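The fix was in Java, but the same pattern in Python terms: wait on each future's result explicitly so worker exceptions (and interrupts) surface in the caller rather than being lost:

```python
from concurrent.futures import ThreadPoolExecutor

def run_all(tasks):
    """Submit all tasks, then block on each Future's result in turn.
    result() re-raises any exception from the worker in this thread."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(task) for task in tasks]
        return [f.result() for f in futures]
```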
4:30pm:
Additional refactoring and cleanup.
Previous work: add workflow engine feature: pass a shared data object between steps, allow results of one step to be merged to the shared data object, and allow the shared object to produce a data value as input to newly executing operations.
Today I did a bunch of refactoring. I spent some time (sidetrack) updating Grails to 2.5.6, so that finally Java 8 lambdas work correctly. The problem was with the "spring loaded" dependency in 2.4.4 not being able to handle lambda/method references in code loaded via spring beans. The update to Grails 2.5.6 fixed this by updating the dependency. Had to fix a number of minor issues for the upgrade, but finally it works.
Introduced some lambdas/method references, replaced guava functional stuff with Java 8.
The main "processOperations" method in WorkflowEngine was too complex: SonarLint flagged it. So I spent some time
trying to reduce that complexity. Extracted a number of methods, and tried to simplify the loop logic.
This left me with a number of methods all passing the same variables around. So I decided to construct a new class
that encapsulates the common variables, which also reduced the complexity a bit.
Not satisfied with the WorkflowEngine/WorkflowEngineOperationsProcessor interdependencies right now, but at least
the complexity is reduced.
Added a test: verify the shared data object is passed to operations as expected, and merges the data result of previous steps.