Currently Chimney attempts to derive as much as possible within a single expression. It inlines results; on Scala 2 it uses an approach where, when an implicit is summoned, it is expected that only a user-provided implicit would be used. This is done because:
if the user provided an implicit to override a transformation, we have no choice, we would just use it
however, if it was an implicit provided by auto-derivation, then the resulting expr would be wrapped in an allocation of the type class instance and a call to its method, while we are able to avoid all that unnecessary allocation and return the derived expression directly
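For illustration, a hand-written sketch of the difference (the case classes below are examples, not from the codebase):

```scala
import io.scalaland.chimney.Transformer

case class Foo(a: Int, b: String)
case class Bar(a: Int, b: String)

val foo = Foo(1, "x")

// if the summoned implicit came from auto-derivation, the expression is wrapped
// in an instantiation of the type class plus a virtual call:
val viaInstance: Bar = new Transformer[Foo, Bar] {
  def transform(src: Foo): Bar = Bar(src.a, src.b)
}.transform(foo)

// while inlining the derived body avoids that allocation entirely:
val inlined: Bar = Bar(foo.a, foo.b)
```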
in versions prior to 0.8.0-M1 it was achieved by summoning Transformer... and checking how it was created, discarding it if it was created by the Transformer.derive implicit. This resulted in potentially exponential complexity, as for each field and each subtype this transformer could be summoned, possibly derived... and then discarded.
So 0.8.0-M1 split Transformer and Transformer.AutoDerived to prevent this. However, it complicated the API in a way that doesn't appear in other derivation libraries, so the proposed solution is to:
require users to import an implicit with automatic derivation themselves
remove the distinction between Transformer and Transformer.AutoDerived
create a very detailed section in the readme about automatic derivation and how it affects performance, because this change would quite often introduce tons of allocations which were not there before, once users import auto._
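A rough sketch of the intended user-facing behaviour (the exact package of auto._ is an assumption):

```scala
import io.scalaland.chimney.dsl._

case class Foo(a: Int)
case class Bar(a: Int)

// without importing automatic derivation this would require a user-provided
// (or semiautomatically derived) Transformer[Foo, Bar] in scope:
// Foo(1).transformInto[Bar]

// users opting into automatic derivation - and its extra allocations - would add:
// import io.scalaland.chimney.auto._
// Foo(1).transformInto[Bar]
```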
Alternatively:
we could preserve the distinction between those 2 types
introduce a separate method for summoning and using ONLY Transformer and not Transformer.AutoDerived, limiting its usage to values explicitly provided by users
create some implicit returning Transformer rather than Transformer.AutoDerived, which could be used when the user writes something like the sketch below:
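A sketch of the kind of call site this extra implicit would serve (the helper below is a hypothetical example, not an existing API):

```scala
import io.scalaland.chimney.Transformer

case class Foo(a: Int)
case class Bar(a: Int)

// user code that explicitly demands the stronger Transformer type:
def mapAll[A, B](as: List[A])(implicit t: Transformer[A, B]): List[B] =
  as.map(t.transform)

// the proposed implicit (returning Transformer, not Transformer.AutoDerived)
// would make this compile even without a hand-written Transformer[Foo, Bar]:
// mapAll[Foo, Bar](List(Foo(1)))
```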
If we managed to separate auto from semiautomatic - one way or another - we could consider separating:
dsl
internal.runtime
internal.compiletime
so that at runtime users would only need the type classes and a few utilities used in generated code, while all macros would be required only for compilation.
we let the user add a custom function which returns the To type
this function is called instead of a constructor in ProductToProductRule
we make sure that all the existing modifiers and flags still work
Necessary steps
Adding custom function to DSL
To avoid gigantic changes to the codebase, API and DSL, we would assume that the function HAS to use the same parameter names as the values that appear in the created type. Only then would we be able to call .withField* modifiers, with IDE support intact and making it work on Scala 2.12, 2.13 and 3. We can relax this requirement to say that each field for which we use an override can only be overridden if there is such a field defined for it (so that we could use it in the DSL).
We would also have to check that types match between the provided function and To's fields. This would require us to store all inputs at the type level as some sort of Params.Value[B, "fieldNameB", Params.Value[A, "fieldNameA", Params.Empty]] type-level list.
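A minimal sketch of how such a type-level list could be encoded (naming follows the example above; the exact encoding is an assumption):

```scala
// phantom types only - never instantiated, used purely to carry information in the config
sealed trait Params
object Params {
  final class Empty extends Params
  final class Value[T, Name <: String, Tail <: Params] extends Params
}

// e.g. a constructor (fieldNameA: A, fieldNameB: B) would be remembered as:
// Params.Value[B, "fieldNameB", Params.Value[A, "fieldNameA", Params.Empty]]
```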
Then we could implement a whitebox macro/transparent inline for .withConstructor((fieldNameA: A, fieldNameB: B) => ...) which would parse the AST, store the value in runtimeDataStore and the params' types in TransformerCfg. This macro (if possible) should also parse things like .withConstructor(new Class(_, _, _)) (an overridden constructor) or .withConstructor(Object.apply).
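Putting it together, the DSL could be used roughly like this (a sketch under the naming proposed above; the types are examples):

```scala
import io.scalaland.chimney.dsl._

case class UserDTO(id: Int, name: String)
class User(val id: Int, val name: String)
object User {
  def parse(id: Int, name: String): User = new User(id, name.trim)
}

// parameter names (id, name) match User's "fields", so .withField* modifiers keep working:
val user: User = UserDTO(7, "  Jane ")
  .into[User]
  .withConstructor(User.parse _)
  .transform
```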
Using custom constructor in derivation
We can refactor TransformProductToProductRule so that matching Type[To] on Product.Constructor(arguments, parameters) could be skipped and we would use the constructor extracted from RuntimeDataStore directly.
Testing
Virtually the whole suite of TotalTransformerProductSpec would have to be copy-pasted as TotalTransformerConstructorSpec and:
each .transformInto would have to be removed
each .into.transform would have to be replaced with .into.withConstructor().transform
each manually provided constructor would have to modify values to make sure that the default constructor wasn't used instead
each of the above would have to be done for { () => ... }, new Target(_, _, _) and HelperObject.apply syntaxes to check that each of these works
Things to consider
how many helpers do we want to add?
if we stay at a.into[B].withConstructor(...).transform things will be pretty simple
if we start considering things like a.transformIntoVia[B](...) then each such feature would call for an explicit helper, e.g. what if we wanted to use a custom constructor with merging of 3 case classes? Where do we draw the line?
as a second step, later on in the future, we could allow .withConstructorPartial to enable support for smart constructors
Patchers piggy-back on top of Transformers' abstractions, so if Transformers supported (Foo, Bar) into Baz, then Patchers could be implemented internally as (Patch, A) into A
currently Patchers check if all fields are used while Transformers don't, so if Transformers implemented policy checking, then Patchers could just use a different default policy than Transformers
making merge Transformers recursive would also make Patchers recursive
Patchers could have an extra rule added for searching for implicit Patchers
Necessary steps
Adding fallback values to derivation
To avoid gigantic changes to the codebase, API and DSL, we would assume that Transformers and PartialTransformers have one main source value - which would be used in .withFieldComputed, .withFieldComputedPartial, .withFieldRenamed - and a list of "fallback" values which would be searched for a field if the main source value is missing it.
TransformationContext would have to introduce val fallbacks: Vector[ExistentialExpr]
as an intermediate step, fallbacks should be cleared when calling updateFromTo
take each value from the list and use Product.Extraction to dissect it (we might consider caching it somehow to avoid running it separately for each field) to see which fields each value could provide
check if the sought fieldName is among available ones
From this moment on, if the DSL allowed adding fallback values and Gateway passed them into TransformationContext, then Chimney would support shallow case class merging.
Adding fallbacks to DSL
We have to keep in mind that Chimney already supports converting a tuple to a case class/another tuple, so we cannot just use (A, B).into[C] as the syntax for merging. Since fallback values would not be used in the DSL in any other .with* method, we can just remember their type in some type-level list of TransformerCfg and allow adding them one by one, e.g. like this:
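For example (the .withFallback name below is only an illustration, not a decided API):

```scala
case class Foo(a: Int)
case class Bar(b: String)
case class Baz(a: Int, b: String)

val foo = Foo(1)
val bar = Bar("x")

// hypothetical DSL - each call would append the fallback's type to a type-level list in the config:
//   val baz = foo.into[Baz].withFallback(bar).transform
// and the derived code would expand to roughly:
val baz = Baz(a = foo.a, b = bar.b)
```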
I'd suggest we don't store these values in runtimeData: RuntimeDataStore - we can have another Vector[Any] in TransformerInto and in PartialTransformerInto, so that:
in cases when [Partial]TransformerInto is not used, then similarly to the From expression, the fallback values could be assigned to some vals so that accessing them would be easier and without wrapping
in cases when [Partial]TransformerInto is used, then similarly to the src: From expression, the fallback values could be extracted from Vector[Any] and assigned to vals
TransformerInto and PartialTransformerInto would have to be extended to contain val fallbacks: RuntimeDataStore and some methods for updating it (similarly to val runtimeData: RuntimeDataStore)
to avoid conflicts and subtle bugs we should create a third type-level list, next to TransformerCfg and TransformerFlags, e.g. as in the sketch after this list
this list could be parsed before parsing configurations so that we could extract these values from val fallbacks (if needed) and cache their values in vals
each of the fallbacks would have to have its "not used" warnings suppressed
for each val we would have to create a reference to it and wrap it in ExistentialExpr to remember its type
finally, these expressions would be used to initialize TransformationContext
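A sketch of such a third type-level list, referenced from the bullet above (names are hypothetical):

```scala
// phantom types tracking how many fallback values were registered and of what types
sealed trait FallbackValues
object FallbackValues {
  final class Empty extends FallbackValues
  final class Value[T, Tail <: FallbackValues] extends FallbackValues
}
```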
Making fallbacks recursive
When calling deriveRecursiveTransformationExpr with a new context we are currently clearing all fallbacks. From now on, we have to modify updateFromTo to take a new list of fallbacks, with .fieldName appended to each value which has such a field, and removing from the list each expression which doesn't have this value. Then we have to update each occurrence of updateFromTo to update these values, or explicitly set them to empty (e.g. for collections and sealed traits).
Introducing policy checking
Because of Patchers' existence and the way we want to implement them with merge transformers, we need at least 3 different policy options:
do not check field usage (we need a catchier name) - the default value for Transformers and PartialTransformers
check if all fields in the main source were used - the default value for Patchers
check if all fields were used - would require using all fields from the source and its fallbacks
Later on we may need to add some option to add an exception, e.g. to not complain if a field was unused in a fallback because it existed in the main source, or to implement #161 Allow to ignore specific field in patchers.
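The three options could be modelled e.g. as a small enumeration (names here are placeholders):

```scala
sealed trait UnusedFieldPolicy
object UnusedFieldPolicy {
  case object DontCheck            extends UnusedFieldPolicy // default for Transformers/PartialTransformers
  case object RequireMainSourceUse extends UnusedFieldPolicy // default for Patchers
  case object RequireAllSourcesUse extends UnusedFieldPolicy // main source and all fallbacks
}
```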
since this is relevant only to TransformProductToProductRule, only this rule has to be updated to remember which values from the source were used
however, we need to pay special attention because we are not currently distinguishing between vals from the class and vals from its parents, and this will be relevant here, so we need to introduce that distinction in SourceType
then we need to create a check working in parallel to generating each value, since we want to provide all errors at once - even if some field's transformation couldn't be derived, we should still display the information that not all values were used
this check should only be performed if ctx.config contains the relevant policy
Rewriting Patchers
At this point Patchers implementation can be rewritten:
config parsing could be left as it is
derivation for Patchers would transform PatchingContext into TransformationContext
patch: Patch would become From, obj: A would become the only ExistentialExpr in fallback values, A would be used as To
unless .ignoreRedundantPatcherFields was used, the TransformationContext would set a policy requiring that all fields from the primary value are used
a rule handling Options in a Patcher-specific way would have to be added
probably remove some (all?) rules for Options used in Transformers derivation
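To illustrate the mapping (the types below are examples), the rewritten Patcher derivation would effectively generate the same code as a merge transformation with the patched object as the fallback:

```scala
case class User(id: Int, email: String)
case class UserPatch(email: String)

val user  = User(1, "old@example.com")
val patch = UserPatch("new@example.com")

// From = UserPatch, To = User, and `user` is the only fallback value,
// so the generated code would be roughly:
val patched = User(id = user.id, email = patch.email)
```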
Testing
We would have to add tests:
to check that an unneeded fallback value doesn't produce warnings
to check that an unneeded fallback value doesn't interfere with field overrides and flags
to check that a used fallback doesn't interfere with overrides
to check that added policy options work and create compilation errors
to check that all of the above works with nested case class transformation with several values
including AnyVals on the path
to check that Patchers work predictably in nested patching
Things to consider
should TransformImplicitRule attempt to summon for fallback values if it fails for the main value?
should we add a flag for recursive fallback values usage? this would let us have flat patching as the default (keeping code backward compatible) and enable recursive patching explicitly
as a next step, should we create specialized extension methods for various arities of tuples?
it looks useful, but if we start adding such helpers for each such feature, what will happen if we e.g. want to merge 3 case classes via a custom constructor? where is the line?
Currently when using .withFieldConst, .withFieldComputed, .withFieldRenamed and their partial counterparts, we have a limitation that each path must be in the form of either:
value => value.field
_.field
If the rename/value provision has to happen in a nested value, we are forced to create an implicit, so that it would be summoned inside the outer value's transformation.
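To make the limitation concrete, a small example of today's workaround versus the nested path we would like to support (the first part uses the current API; the nested-path call is the proposal):

```scala
import io.scalaland.chimney.Transformer
import io.scalaland.chimney.dsl._

case class InnerFrom(a: Int)
case class OuterFrom(inner: InnerFrom)
case class InnerTo(a: Int, b: String)
case class OuterTo(inner: InnerTo)

// today: the nested override has to live in a separate implicit, summoned during the outer transformation
implicit val innerTransformer: Transformer[InnerFrom, InnerTo] =
  Transformer.define[InnerFrom, InnerTo].withFieldConst(_.b, "default").buildTransformer

val outer: OuterTo = OuterFrom(InnerFrom(1)).transformInto[OuterTo]

// proposed: express it inline with a nested path
// OuterFrom(InnerFrom(1)).into[OuterTo].withFieldConst(_.inner.b, "default").transform
```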
To address this we could:
replace FieldName <: String in the DSL with some path type, e.g. as in the sketch after this list
the transformation context and recursive derivation would use a non-empty list of paths and match on the content of a single-element path rather than on a String directly
we would have to replace FieldName <: String in TransformerCfg
we would have to update whitebox macros in both Scala 2 and 3
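A sketch of the path type mentioned in the first bullet (the encoding below is an assumption):

```scala
// phantom types replacing the single FieldName <: String in TransformerCfg
sealed trait Path
object Path {
  final class Root extends Path
  final class Select[Name <: String, Tail <: Path] extends Path
}

// e.g. _.inner.b would be remembered as Path.Select["b", Path.Select["inner", Path.Root]]
```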
Modifying derivation
we would have to update TransformationContext to have a non-empty list of paths rather than a String
ProductToProductRule would have to compare not all values in fieldOverrides but only those which contain a single value (meaning we are at the same nesting level)
new Context creation would have to drop the outermost layer of overrides and filter the remaining overrides, checking if they override the field that is currently being transformed
Testing
We would have to add tests:
to check that an arbitrary number of nesting levels works
to check that several overrides sharing the same prefix work, and that overrides don't propagate to where they shouldn't go
Things to consider
as a future step we could add operations similar to what QuickLens has, to support overriding a particular value in a collection/each value in a collection, or in a particular subtype
this, however, would require us to change the structure of field- and coproductOverrides because they could be interleaved
overrides of particular values in collections could only support built-in collections (too much of a pain to support everything) and they could really complicate the collections code