@paulbatum
Last active September 2, 2021 05:30
Creating a Secretless Function App using Managed Identity

My goal is to make a fully secretless function app, and then add triggers for an Azure Storage queue and a Service Bus queue.

Step 1: Create the Function App without Azure Files

The first step is to create the function app and configure it without Azure Files (because Azure Files doesn't support managed identity for SMB file shares). There is some documentation on this here. The portal doesn't provide that option today, but I can take the ARM template generated by the portal and modify it for my needs. So I go through the create wizard, but instead of hitting Create, I hit the template button:

image

This brings up the following screen. I select deploy:

image

But I'm not actually deploying yet! First I'll edit:

image

In the editor, I remove the two Azure Files settings, WEBSITE_CONTENTAZUREFILECONNECTIONSTRING and WEBSITE_CONTENTSHARE:

image
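For reference, the two entries I delete sit in the site resource's properties.siteConfig.appSettings array of the generated template. They look roughly like this (the values are expressions the portal generates from your parameters; I've elided them here):

    { "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING", "value": "<generated connection string expression>" },
    { "name": "WEBSITE_CONTENTSHARE", "value": "<generated share name expression>" }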

I then save my changes and go back to the deploy UX. I make sure the right parameters are selected (I had to re-enter my resource group), then hit Review + create, and then Create.

image

I wait for the deployment to finish and then open the function app in the portal. The first thing I notice is a warning about storage. That's expected because I removed Azure Files; it will go away once I set up run-from-package.

image

Step 2: Enable Managed Identity

To start making this app secretless, I have to enable managed identity. For this walkthrough I'm using the system-assigned identity. The documentation on this topic is here. I select Identity, pick the System assigned tab, and turn it on:

image

Now I am going to start adding role assignments. I want this function app to be able to access blob content in its storage account without a key, so I give it the Storage Blob Data Owner role on that account using the Azure role assignments button in the above screenshot.

image

Note - Storage Blob Data Contributor is likely a better role as its scope is more limited. The docs will likely be updated to recommend this role in the future.

This will also allow me to use managed identity to pull the package from this blob account, so when I configure run-from-package I won't need to use a SAS.

Step 3: Deploy a function using Run From Package

Let's prepare the package. I'll start with just a timer-triggered function to keep things simple and confirm I configured things correctly so that I don't need any secrets for AzureWebJobsStorage. So I go into VS, create a new V3 function app, and select the timer trigger:

image
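The generated function is just the stock Visual Studio timer template, which looks roughly like this (class and function names are whatever the template generates):

    using System;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Host;
    using Microsoft.Extensions.Logging;

    public static class TimerFunction
    {
        // Stock timer template: runs on a CRON schedule (every 5 minutes here).
        [FunctionName("TimerFunction")]
        public static void Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
        {
            log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
        }
    }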

I right-click the project and select Publish, but VS doesn't know how to publish a package to blob storage. I'll need to do that manually. So I select the folder profile and hit Publish:

image

I go to the filesystem and zip the output:

image

I then go to the storage account that is associated with this function app (the one I added Storage Blob Data Owner for), create a container for my packages, and then upload the package:

image

I then go to the blob properties and grab the URL: image

Now I switch back to the function app, go to its configuration, add the WEBSITE_RUN_FROM_PACKAGE app setting, and paste in the URL. The run-from-package feature is documented here.

image
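The setting ends up looking something like this, where the value is just the blob URL from the previous step (account, container, and file names are placeholders):

    WEBSITE_RUN_FROM_PACKAGE = https://<storage-account>.blob.core.windows.net/<container>/<package>.zip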

Next I want to confirm that my function app is able to load the package. The easiest way to do this is to go to the functions list. I should be able to see the timer-triggered function and run it: image

It works!

Step 4: Use Managed Identity for AzureWebJobsStorage

The next step is to switch AzureWebJobsStorage to be secretless. This should already work because the function app has the Storage Blob Data Owner role. To do this, I just need to specify the account name. So I rename the setting to AzureWebJobsStorage__accountName and leave just the account name in the value:

image
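Roughly, the change looks like this (the account name is a placeholder and the key-based value is truncated).

Before (full connection string containing the account key):

    AzureWebJobsStorage = DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=...

After (identity-based, account name only):

    AzureWebJobsStorage__accountName = <account>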

I confirm the timer is still running after making this change, and I'm done. I've successfully removed the secrets from my function app by configuring it without Azure Files and using managed identity to load the application content from blob storage via run-from-package.

Next up, some optional steps.

Step 5: Update the extension bundle

This step applies to JavaScript, Python, PowerShell and Java only.

Some of the steps below cover using identity-based connections for triggers. For .NET users, this involves referencing the new NuGet packages (steps are outlined below). For JavaScript, Python, PowerShell and Java, you don't manage NuGet packages; instead, you use an extension bundle. To use identity-based connections, you'll want to switch over to the preview extension bundle. To do this, edit your host.json and configure the extension bundle as follows:

  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Preview",
    "version": "[3.*, 4.0.0)"
  }

Optional Step A: Use Managed Identity to access Key Vault

There's one more key-like setting in my Function App, and that's my App Insights connection string. Now this is not technically a secret (it contains the instrumentation key, which is not always protectable - see here), but I thought it would be an educational exercise to move this setting to Key Vault and use managed identity to access it.

First I create a Key Vault and add the secret. I call the secret "AppInsights" and paste in the connection string for the value.

image

I then switch the Key Vault over to use Azure role-based access control:

image

I then add a role assignment in my Function App to give it the Key Vault Secrets User role:

image

Now I can use a Key Vault reference in my application settings. Documentation for that is here.

image
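Assuming the standard APPLICATIONINSIGHTS_CONNECTION_STRING setting, the value becomes a Key Vault reference rather than the connection string itself. It takes one of these two forms (the vault name is a placeholder; "AppInsights" is the secret I created above):

    APPLICATIONINSIGHTS_CONNECTION_STRING = @Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/AppInsights/)
    APPLICATIONINSIGHTS_CONNECTION_STRING = @Microsoft.KeyVault(VaultName=<vault-name>;SecretName=AppInsights)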

I switch over to App Insights, open live metrics, and confirm that my Key Vault reference is working and my data is still flowing. Everything seems to be working fine:

image

The Function App is now fully secretless, but it currently only has a timer trigger, i.e. it's not being triggered by an external source. That's what I'll cover next. But first, a screenshot of the app settings in my app with nothing redacted, because there are no secrets:

image

Optional Step B: Add a Storage Queue Trigger

Let's add a storage queue trigger. I'm going to assume that I already have this queue data somewhere else, so I'm going to create a new storage account to represent that. I then go into the role assignments for the function app and add the Storage Queue Data Contributor role for this new storage account:

image

Next, reference the new extensions. For .NET, see the next paragraph. For other languages see the extension bundle section above.

I make sure to update my storage extension to 5.x. This is the new storage extension for functions that uses the newest version of the Azure Storage SDK for .NET. There was a blog post about that here. At the time of writing, the newest version of the library is 5.0.0-beta4:

image

This queue trigger uses a connection called "QueueConnection", so I need to configure the function app with the account name, similar to what I did above for AzureWebJobsStorage:

image

I'm also using an environment variable for the name of the queue within the account (InputQueueName in the above screenshot), but it's just a convenience and not necessary.
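As a sketch, a queue-triggered function wired up this way might look like the following; the QueueConnection connection name and the %InputQueueName% setting match the walkthrough, while the class and function names are just illustrative:

    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class QueueFunction
    {
        // %InputQueueName% resolves the queue name from app settings, and
        // Connection = "QueueConnection" picks up the QueueConnection__* settings.
        [FunctionName("QueueFunction")]
        public static void Run(
            [QueueTrigger("%InputQueueName%", Connection = "QueueConnection")] string message,
            ILogger log)
        {
            log.LogInformation($"Queue message received: {message}");
        }
    }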

For every connection that is used as a trigger, I must also add the credential setting that specifies that managed identity is used. So I add a QueueConnection__credential setting with value managedIdentity.

image

A future update to Azure Functions will remove this requirement when using system-assigned identities. Once this update is live, simply setting __accountName will be enough.

My function app now has the necessary role to access the queue using managed identity, it's been configured to know which account to access, and it has a queue-triggered function that uses the new extension with managed identity support. The only remaining step is to publish the changes. I repeat the folder publishing step, zip the content, upload it to storage as "queue.zip", and update my run-from-package URL:

image

Now I go to the queue in the portal, and I add a message:

image

image

I wait a bit and then refresh and the message has been read:

image

Optional Step C: Add a Service Bus Queue Trigger

Again I start with role assignments, adding the function app as an Azure Service Bus Data Owner:

image

I have to configure the function app with the details of the namespace. My connection is going to be called ServiceBusConnection so I add a ServiceBusConnection__fullyQualifiedNamespace setting:

image
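The value is just the namespace host name, with no key, something like this (the namespace is a placeholder):

    ServiceBusConnection__fullyQualifiedNamespace = <namespace>.servicebus.windows.net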

For every connection that is used as a trigger, I must also add the credential setting that specifies that managed identity is used. So I add a ServiceBusConnection__credential setting with value managedIdentity.

image

I add a Service Bus queue-triggered function, again making sure to use the newest extension, which is Microsoft.Azure.WebJobs.Extensions.ServiceBus version 5.0.0-beta4 at the time of writing:

image
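A minimal sketch of such a function, assuming a hypothetical queue name setting and illustrative class/function names (the ServiceBusConnection connection name matches the walkthrough):

    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class ServiceBusQueueFunction
    {
        // Connection = "ServiceBusConnection" picks up the
        // ServiceBusConnection__fullyQualifiedNamespace setting added above.
        [FunctionName("ServiceBusQueueFunction")]
        public static void Run(
            [ServiceBusTrigger("%ServiceBusQueueName%", Connection = "ServiceBusConnection")] string message,
            ILogger log)
        {
            log.LogInformation($"Service Bus message received: {message}");
        }
    }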

I go through similar publish steps as before. Here are my app settings now:

image

Optional Step D: Use Managed Identity for local development

To test the service bus trigger I added above, I need to drop a message in the service bus queue, but the portal won't let me do that. There are plenty of other ways to write a message to a service bus queue, but this seems like a good opportunity to try using managed identity locally. First I go into VS and make sure that it's configured to use my account:

image

Next I go into the portal and give myself the Azure Service Bus Data Owner role - the same as what I did for my function app above: image

Now I make a separate function app that is going to host an HTTP-triggered function that uses an output binding to write to the queue. Again, making sure to use the newest extension package: image
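A minimal sketch of that helper function, assuming the output binding targets the same ServiceBusConnection connection and a hypothetical QueueName setting for the queue name:

    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class SendMessageFunction
    {
        // HTTP trigger that drops a test message onto the Service Bus queue via
        // the identity-based ServiceBusConnection settings.
        [FunctionName("SendMessageFunction")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequest req,
            [ServiceBus("%QueueName%", Connection = "ServiceBusConnection")] out string outputMessage)
        {
            outputMessage = "Hello from my local debug session";
            return new OkObjectResult("Message sent.");
        }
    }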

I modify my local.settings.json so that my local environment knows which namespace and queue the message has to be written to:

image
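For reference, a local.settings.json along these lines would do it, assuming the hypothetical QueueName setting from the sketch above (namespace and queue name are placeholders):

    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "ServiceBusConnection__fullyQualifiedNamespace": "<namespace>.servicebus.windows.net",
        "QueueName": "<queue-name>"
      }
    }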

I then set this function app as my startup project and hit F5. It outputs the URL for my function, and I call it using my browser:

image

Without configuring any connection string, my local debug session is able to write the message to the queue. I then check App Insights and see that my function ran and picked up the message:

image

This concludes the walkthrough.

Notes on Event Hubs and Blob Trigger

You can follow similar steps for Event Hubs and the blob trigger. Event Hubs follows the same patterns as Service Bus, while blob is a little more complicated. The Functions host uses queues internally to run the blob trigger, so you would need to make sure that the function app has both blob and queue role assignments, for both the account that is configured for AzureWebJobsStorage and the account that contains the blobs you're triggering on.

Links

Managed identity in Azure Functions

Identity based connections in Azure Functions

Connecting to host storage with an Identity

Creating a Function App without Azure Files

Run Azure Functions from a package file

Use Key Vault references in Azure Functions

Azure SDK blog post about the new extensions

Configuring the account used by Visual Studio for local development

Functions documentation for local development

GitHub issue where this scenario is discussed

mdddev commented Sep 1, 2021

Hi Paul, that is very informative, thanks! I am keen to try this with a function app that contains both queue and blob-triggered functions. Unfortunately, the app refuses to load when doing this, having followed this tutorial. I have raised an issue for this with the team here. Do you see any deviations from your methodology?

paulbatum (Author) commented

@mdddev Just to confirm - these steps worked fine for you when doing just blob, or just queue, but the combination is what didn't work?

mdddev commented Sep 2, 2021

Exactly, both work individually but not together.
