@intaxwashere
Last active June 8, 2024 12:08
Custom Thunks Unreal Engine TL;DR

Custom thunks TL;DR

This smol post assumes you've worked on a custom thunk implementation before but have no idea how it works, why you're using cursed macros from the 1990s, etc. If you don't have any programming experience or are relatively new to the Unreal world, you likely won't understand much of this post. That's not because the concepts are too difficult to grasp, but because it's written for people who have understood how Unreal works for a while and want to expand their knowledge of the custom thunk implementation.

Part 1:

  • A thunk is a function that you can store and call later. If you had an array of TFunction<void()>s, you would effectively have an array of thunks: you can bind/unbind function pointers to those stored TFunctions and invoke them whenever you like.
  • Custom thunks of Blueprints are the same: they're a fancy array/list of function pointers. Imagine that for each node you place in the graph, Blueprints has a slot for that node in its list of thunks. For example, the + node in Blueprints that sums two floats is the UKismetMathLibrary::Add_FloatFloat function, declared in KismetMathLibrary.h.
  • The engine associates and links UKismetMathLibrary::Add_FloatFloat with the + node automatically because it's a BlueprintCallable function. The BPVM knows that whenever it needs to evaluate the + node, it calls the UKismetMathLibrary::Add_FloatFloat function.
  • Basically, the BP compiler serializes an (initially empty) UFunction* reference into the bytecode; this tells the BPVM which native function to call when it reaches the + node to sum two floats.
  • During loading, the engine links the native functions to those UFunction* pointers, so when the BPVM reaches +, the assigned UKismetMathLibrary::Add_FloatFloat gets executed.
  • When you connect nodes together in the graph, they are baked into a linear function call list at compile time.
  • When you mark a UFUNCTION as CustomThunk, you tell the engine: "Hey, I have this function; please don't automatically generate a thunk for me. I'll have a custom one that I wrote, just link it to the function call from the Blueprint graph please."
  • All UFUNCTION(BlueprintCallable) functions are thunks; their declarations and definitions are generated by UHT.
  • Reading the .gen.cpp file and finding the DEFINE_FUNCTION macro related to your native UFUNCTION(BlueprintCallable) can give you an idea of what custom thunks do, if you're a complete starter.
  • Important: the real difficulty of custom thunks comes from FProperty interactions, managing FFrame's MostRecentPropertyAddress value, etc.
  • A UFUNCTION(BlueprintCallable) knows what type of parameters it has, how many parameters it has and which of them are input/output parameters.
  • At the point your custom thunk starts executing, the BPVM has prepared a local "stack" for you to step through the parameters. More details below.

Part 2:

  • Every function call in the BPVM (whether a native function or a script function/event) is handled the same way:
  • The BPVM reads the related UFunction* that got linked at runtime (as mentioned earlier) and reads its properties: what kind of parameters it has and the total size of those parameters.
  • A memory block is allocated before the function call happens (via the FMemory_Alloca_Aligned macro). The size of this memory block is equal to the total size of your function's parameters. This is what we call the "parms memory".
  • When a function call happens, every value in the stack/graph that feeds a function parameter is copied into said memory block. For example, if you have a local float variable in a BP function, and it's plugged into a pin of a node that is a custom thunk, that float variable is copied into the "parms memory" block the BPVM allocated just before this step.
  • This "parms memory" can be accessed through FFrame::Locals, because FFrame::Locals is a uint8* pointer that points to the allocated "parms memory". When you use the PARAM_PASSED_BY_VAL macros or call Stack.StepCompiledIn<FProperty>(&Thing), you actually walk through Locals. This part is a bit confusing and it's normal if you're lost, but bear with me.
  • So let's say you have a void Function(float A, double B) as BlueprintCallable; the total parameter size of your function is sizeof(float) + sizeof(double), which is 12 (in practice the allocated block can be padded to 16 bytes, because the double is aligned to an 8-byte boundary).
  • Your first PARAM_PASSED_BY_VAL call inside the custom thunk, which reads the float A parameter, advances the walk by 4 bytes: the BP compiler knows a float is 4 bytes, so to read it you need to read 4 bytes from the stack. Then, when you want to read the double B, Locals + 4 (plus any alignment padding) points at that parameter's memory address, and since the cursor already sits past A, reading 8 more bytes gives us the value of the double parameter.
  • To recap: variables from the BP graph are copied into a parms memory block; that memory block is referenced through FFrame::Locals, which is what those unreadable macros like PARAM_PASSED_BY_VAL use to traverse the variables in it.
  • After you process your function, you return values like this: *reinterpret_cast<YourType*>(RESULT_PARAM) = YourReturnValue;. RESULT_PARAM is a special macro that is actually a pointer to your return variable in the stack, similar to the parms memory but provided separately because of how the BPVM is structured. You don't need to worry about the details of this one; what you're doing is practically equal to return YourReturnValue, you just add a simple reinterpret_cast to it. If your function does not return anything, you don't need to interact with RESULT_PARAM.
UFUNCTION(BlueprintCallable, CustomThunk)
float Sum(float A, double B) { check(0); return 0; } // this should NEVER get called; CustomThunks are routed to the special definition we provide via the DEFINE_FUNCTION macro.

DEFINE_FUNCTION(UYourClass::execSum)
{
    // this macro expands to (roughly):
    // float A;
    // Stack.StepCompiledIn<FFloatProperty>(&A);
    PARAM_PASSED_BY_VAL(A, FFloatProperty, float);

    // this macro expands to (roughly):
    // double B;
    // Stack.StepCompiledIn<FDoubleProperty>(&B);
    PARAM_PASSED_BY_VAL(B, FDoubleProperty, double);
    
    // sometimes, depending on your parameter type, you might need to call something other than PARAM_PASSED_BY_VAL;
    // see Script.h and the .gen.cpp files for further details.
    
    // increment instruction pointer unless it's null.
    // this is required to mark we finished traversing parameters through stack.
    P_FINISH;

    P_NATIVE_BEGIN; // this macro lets the profiler/insights know we're now running native logic, so it won't be counted as Blueprint time.
    const float Result = static_cast<float>(A + B);
    P_NATIVE_END; // let the profiler know the native logic has ended.

    // we know our return type is float, and RESULT_PARAM is a void* that points to the return parameter of this function in the "parms memory".
    // its size is the same as a float too, because that's how the BPVM lays it out, so we reinterpret it as a float and write our result into it.
    *reinterpret_cast<float*>(RESULT_PARAM) = Result;
}

Cool information: the P in the P_ prefix stands for "portable". Back in the early days of interpreters, bytecode was called "p-code", because it was more portable than compiled code. After the 90s this slowly evolved into the term "bytecode", because a single instruction is only as big as a single byte, often represented as an enum in the code.

Another fun fact: this specific "parms memory" mechanism, and the fact that all functions are handled the same way inside the VM, is a big part of what makes Blueprints a slow language compared to others. Not to mention that implementing a function-pointer-based interpreter (the thunks) instead of a traditional switch/while-loop-based one doesn't help at all. Read more at this link if you're interested: https://intaxwashere.github.io/blueprint-performance/

There are many details I couldn't mention because this is a TL;DR, but ping me if you need to know anything specific in detail.

For further reading, look at the source code and see how FFrame is declared.

Understanding how the FProperty system works also helps: https://intaxwashere.github.io/blueprint-access/

After you get how the stack logic works (how and why we have to traverse through parameters with cursed macros, etc.), try to understand how to access parameters via FFrame::MostRecentPropertyAddress: when it is valid and when it can be used, and especially FFrame::MostRecentPropertyContainer, because it's essential when you need to do something more complex than what we did above. It's needed when interacting with FProperties manually.

You also might want to check UObject::ProcessEvent. It'll be a tough read; don't expect to understand all of it or try to make sense of everything. Just get an idea of what it does.
