Back in August last year, Microsoft announced1 a new feature for their “Entra ID” Identity Management solution, called “API-driven inbound provisioning”. This API enables fully automated provisioning of user accounts from any trusted source, for example an HCM platform, to either Microsoft Entra or an on-premises Active Directory. API-driven provisioning is based on standard SCIM2 schema attributes and is meant to be used in addition to Entra’s lifecycle workflows.3
The provisioning data flow, as outlined in Fig. 1 below, would include:
In this article we will showcase an implementation of API-driven inbound provisioning using Azure Logic App. It is meant as an exemplary starting point for your own workflow, not as a full reference implementation. Our example is loosely based on Microsoft’s “QuickStart with Azure Logic App”.4
Microsoft also provides a pre-built integration for Workday HCM or SuccessFactors, so you don’t have to implement them yourself:
API-driven inbound provisioning requires a Microsoft Entra ID P1 (formerly Azure Active Directory P1) license.
We will need a few Azure resources for our demo:
To create the resources, you can use the Portal, Azure CLI or PowerShell. Creating a Logic App in Azure CLI or PowerShell requires a workflow definition6 file, which you will find in the appendix below.
We use a PowerShell script to create the required resources:
$ProjectName = "HcmProvisioning"
$Location = "westeurope" # Azure region for our resources
$RandomId = Get-Random -Minimum 1000 -Maximum 9999 # We'll use this to make our storage account name globally unique
$StorageName = "st" + $ProjectName.ToLower() + $RandomId
$StorageTableName = "employeedemo"
$ResourceGroupName = "rg-$ProjectName"
$LogicAppName = "logic-$ProjectName"
# Create a resource group for our Azure resources
New-AzResourceGroup -Location $Location -Name $ResourceGroupName
# Create the storage account...
$StorageAccount = New-AzStorageAccount -ResourceGroupName $ResourceGroupName -Name $StorageName -Location $Location -SkuName "Standard_LRS"
# ... and our employee table
$StorageContext = $StorageAccount.Context
New-AzStorageTable -Name $StorageTableName -Context $StorageContext
# Create a Logic App, based on our "empty" workflow definition
$LogicApp = New-AzLogicApp -ResourceGroupName $ResourceGroupName -Name $LogicAppName -Location $Location -DefinitionFilePath "./workflow-definition.json"
For our example, we will use the table “employeedemo” we just created in Azure Storage as the data source. Grab Microsoft’s demo data set from GitHub and use Azure Storage Explorer to import it into the table.
The table contents will look like this (not all columns shown):
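The entities carry the attributes our workflow will reference later (WorkerID, UserID, FirstName, LastName, FullName, WorkerStatus, among others). For illustration, a couple of rows with made-up sample values:

```csv
WorkerID,UserID,FirstName,LastName,FullName,WorkerStatus
1000001,jdoe,John,Doe,John Doe,Active
1000002,msample,Mary,Sample,Mary Sample,Terminated
```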
To enable “inbound provisioning” in our Entra Tenant, we must configure an application from the Enterprise Gallery apps. Log in to the Microsoft Entra admin center and navigate to Identity > Applications > Enterprise applications (Fig. 3). Click on “New application”, enter “API-driven” in the search field, and select “API-driven provisioning to Microsoft Entra ID” (Fig. 4). Keep its default name and click “Create”.
Open the newly created app, go to the “Provisioning” blade (Fig. 5), and click “Get started”. Switch the Provisioning Mode to “Automatic” and click “Save”.
After the initial provisioning configuration is created you will see additional options in the “Provisioning” blade (Fig. 6):
Complete the configuration by navigating to the “Overview” blade and clicking “Start provisioning” to place the provisioning job in listen mode. Expand the “View technical information” section and copy the “Provisioning API Endpoint”, as we will need this URI later.
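The endpoint follows this general pattern; the service principal ID and the synchronization job ID will differ in your tenant:

```text
https://graph.microsoft.com/beta/servicePrincipals/{servicePrincipalId}/synchronization/jobs/{jobId}/bulkUpload
```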
The Logic App will use a “service principal” in Microsoft Entra to authenticate against the services we access. Therefore, we will enable a “system-assigned managed identity”7 for our Logic App and then use PowerShell to assign two MS Graph permissions to the managed identity.
Navigate to the Logic App in Azure Portal and open the “Identity” blade under “Settings” (Fig. 7). Switch Status to “on” and click “Save” to enable the managed identity. This will create a service principal with the same name (“logic-HcmProvisioning”) in Entra.
Next, we will use a PowerShell script and MS Graph to assign permissions8 to the service principal:
Install-Module -Name "Microsoft.Graph" -Scope AllUsers
Connect-MgGraph -Scopes "Application.Read.All","AppRoleAssignment.ReadWrite.All","RoleManagement.ReadWrite.Directory"
$GraphApp = Get-MgServicePrincipal -Filter "AppId eq '00000003-0000-0000-c000-000000000000'" # Well-known AppId of the Microsoft Graph service principal
# Find our Logic App's managed service principal
$ManagedSp = Get-MgServicePrincipal -Filter "DisplayName eq '$LogicAppName'" # the service principal has the same name as the logic app we created earlier
# Search for app role permission "SynchronizationData-User.Upload" and assign it to our service principal
$PermissionName = "SynchronizationData-User.Upload"
$AppRole = $GraphApp.AppRoles | Where-Object {$_.Value -eq $PermissionName -and $_.AllowedMemberTypes -contains "Application"}
New-MgServicePrincipalAppRoleAssignment -PrincipalId $ManagedSp.Id -ServicePrincipalId $ManagedSp.Id -ResourceId $GraphApp.Id -AppRoleId $AppRole.Id
# Search for app role permission "AuditLog.Read.All" and assign it to our service principal
$PermissionName = "AuditLog.Read.All"
$AppRole = $GraphApp.AppRoles | Where-Object {$_.Value -eq $PermissionName -and $_.AllowedMemberTypes -contains "Application"}
New-MgServicePrincipalAppRoleAssignment -PrincipalId $ManagedSp.Id -ServicePrincipalId $ManagedSp.Id -ResourceId $GraphApp.Id -AppRoleId $AppRole.Id
Additionally, we will assign the role “Storage Table Data Reader” for our Azure table to the service principal:
New-AzRoleAssignment -ObjectId $ManagedSp.Id -RoleDefinitionName "Storage Table Data Reader" -ResourceName $StorageName -ResourceType "Microsoft.Storage/storageAccounts" -ResourceGroupName $ResourceGroupName
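Before moving on, you can double-check both assignments from PowerShell as well. A quick sketch, reusing the `$ManagedSp` variable from the script above:

```powershell
# List the MS Graph app roles assigned to our managed identity
Get-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $ManagedSp.Id |
    Select-Object PrincipalDisplayName, AppRoleId

# List the Azure RBAC role assignments for the managed identity
Get-AzRoleAssignment -ObjectId $ManagedSp.Id |
    Select-Object RoleDefinitionName, Scope
```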
You can check the service principal’s permissions in Entra by navigating to “Enterprise applications” > “logic-HcmProvisioning” (remove the default filter “Application type == Enterprise application”), in the Application’s “Permissions” blade (Fig. 8) and in the Azure storage account’s “Access Control” (Fig. 9):
Now that we have dealt with all prerequisites, we can develop our Logic App workflow. Navigate to our Logic App in Azure Portal and open the “Logic app designer” blade. At the moment, our Logic App only has two items:
Edit the “Compose ProvisioningAPI Settings” action and replace the “Uri”-property with your “Provisioning API Endpoint”. You can find the endpoint information in the “API-driven provisioning to Microsoft Entra ID” Enterprise application under Provisioning > Overview > View technical information.
The action inputs now should look like this:
{
"Audience": "https://graph.microsoft.com",
"Uri": "https://graph.microsoft.com/beta/servicePrincipals/abdefghi-1234-5678-jklm-nopqrstuvwxy/synchronization/jobs/API2AAD.b649b126ed68488484d86210336bb33f.21803b3e-5c34-4571-b37a-bc1ce9ec3556/bulkUpload"
}
Add a new action “Get entities (V2)” of type “Azure Table Storage”. Enter a connection name (e.g. “conn-sthcmprovisioning”), choose “Logic Apps Managed Identity” as Authentication Type and click “Create”.
In the “Parameters” section, select your storage account and enter the table name “employeedemo”.
To make our workflow a bit more readable, click the action’s title and rename it to “Get employee data from storage table”.
We need to iterate over our employees and create a SCIM/JSON object for every single one.
To do that, add a “For each” control action, choose the list of entities returned by “Get employee data from storage table” as its input, and rename it to “For each employee”.
Next, add a “Compose” action inside our “For each employee” block, rename it to “Compose employee SCIM”, and enter the following inputs:
{
"bulkId": "@{guid()}",
"data": {
"schemas": [
"urn:ietf:params:scim:schemas:core:2.0:User",
"urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
],
"userName": "@{items('For_each_employee')?['UserID']}",
"displayName": "@{items('For_each_employee')?['FullName']}",
"externalId": "@{items('For_each_employee')?['WorkerID']}",
"name": {
"familyName": "@{items('For_each_employee')?['LastName']}",
"givenName": "@{items('For_each_employee')?['FirstName']}"
},
"active": @{if(equals(items('For_each_employee')?['WorkerStatus'],'Active'),true,false)}
},
"method": "POST",
"path": "/Users"
}
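Note the two expression styles in this payload: `@{...}` is string interpolation and always yields a string, while a value that consists of a single bare expression keeps the expression’s native type. That is why the “active” property is saved in the workflow definition as a whole-value expression, so it resolves to a boolean instead of the string "true":

```json
"active": "@if(equals(items('For_each_employee')?['WorkerStatus'],'Active'),true,false)"
```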
Now we need to create a single “bulk” SCIM object containing all employees.
Add a “Join” action (type “Data Operations”), rename it to “Join employee SCIM”, set “From” to the outputs of “Compose employee SCIM”, and enter “,” as the “Join with” value.
Add another “Compose” action and rename it to “Compose SCIM bulk payload”. We will use a mix of JSON syntax and an expression to construct our final SCIM object:
{
"schemas": [
"urn:ietf:params:scim:api:messages:2.0:BulkRequest"
],
"Operations": @{json(concat('[',body('Join_employee_SCIM'),']'))},
"failOnErrors": null
}
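After the Join and Compose steps, the body that gets posted to the API looks roughly like this (one operation shown, with illustrative values; your “bulkId” will be a generated GUID):

```json
{
  "schemas": ["urn:ietf:params:scim:api:messages:2.0:BulkRequest"],
  "Operations": [
    {
      "method": "POST",
      "path": "/Users",
      "bulkId": "00000000-0000-0000-0000-000000000000",
      "data": {
        "schemas": [
          "urn:ietf:params:scim:schemas:core:2.0:User",
          "urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
        ],
        "externalId": "1000001",
        "userName": "jdoe",
        "displayName": "John Doe",
        "name": { "givenName": "John", "familyName": "Doe" },
        "active": true
      }
    }
  ],
  "failOnErrors": null
}
```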
In our final step we will send our bulk SCIM object to the API-driven inbound provisioning endpoint.
Add an “HTTP” action and configure it as follows: set “Method” to “POST”, “URI” to the “Uri” output of “Compose ProvisioningAPI Settings”, add a “Content-Type” header with the value “application/scim+json”, and use the outputs of “Compose SCIM bulk payload” as “Body”. Under the advanced parameters, enable “Authentication”, choose “Managed identity” as the authentication type, and set “Audience” to the “Audience” output of “Compose ProvisioningAPI Settings”.
Rename the HTTP action to “POST SCIM payload to API”.
You can find the complete workflow definition in the appendix of this article.
We are now ready to execute our Logic App workflow. Select “Run” and wait until the flow has posted the SCIM payload to the API endpoint. You should see HTTP status “202” (Accepted), which means the API has received our request and will start processing it shortly.
Navigate to the API-driven inbound provisioning application in Microsoft Entra, select “Provisioning” and click the “Provisioning logs” blade to see a detailed log of your provisioned users (Fig. 11):
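The provisioning logs are also available via Microsoft Graph, which is what the “AuditLog.Read.All” permission we assigned to the managed identity earlier is for. For example:

```text
GET https://graph.microsoft.com/v1.0/auditLogs/provisioning
```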
Now that you have completed our exemplary implementation, you may ask yourself: how can I use my own employee data, and how do I access my own “system of record”?
Due to the flexible nature of Logic Apps and its extensive number of connectors, you are not limited to using Azure Table storage as your data source. Here is a list of alternative scenarios you could implement in your own environment:
In each case, you only have to replace our action “Get employee data from storage table” with an appropriate action for your own data source.
Although this was just a quick demo, it lays the foundation for other “joiner-mover-leaver” (JML) lifecycle workflows in Microsoft Entra ID.9
If you need assistance with implementing your own Entra ID lifecycle workflows, just give us a call!
When creating a Logic App using PowerShell or Azure CLI, we are required to supply a “workflow definition”. Use this “empty” definition to create a basic workflow containing a trigger and one simple action:
{
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose_ProvisioningAPI_Settings": {
"inputs": {
"Audience": "https://graph.microsoft.com",
"Uri": "<Your API-driven inbound provisioning URI>"
},
"runAfter": {},
"type": "Compose"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"Recurrence_0500am_every_day": {
"evaluatedRecurrence": {
"frequency": "Day",
"interval": 1,
"schedule": {
"hours": [
"5"
]
},
"timeZone": "W. Europe Standard Time"
},
"recurrence": {
"frequency": "Day",
"interval": 1,
"schedule": {
"hours": [
"5"
]
},
"timeZone": "W. Europe Standard Time"
},
"type": "Recurrence"
}
}
}
You can use this anonymized workflow definition as reference for your own workflow.
{
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose_ProvisioningAPI_Settings": {
"inputs": {
"Audience": "https://graph.microsoft.com",
"Uri": "https://graph.microsoft.com/beta/servicePrincipals/abdefghi-1234-5678-jklm-nopqrstuvwxy/synchronization/jobs/API2AAD.b649b126ed68488484d86210336bb33f.21803b3e-5c34-4571-b37a-bc1ce9ec3556/bulkUpload"
},
"runAfter": {},
"type": "Compose"
},
"Compose_SCIM_bulk_payload": {
"inputs": {
"Operations": "@json(\r\n concat('[',body('Join_employee_SCIM'),']')\r\n)",
"failOnErrors": null,
"schemas": [
"urn:ietf:params:scim:api:messages:2.0:BulkRequest"
]
},
"runAfter": {
"Join_employee_SCIM": [
"Succeeded"
]
},
"type": "Compose"
},
"For_each_employee": {
"actions": {
"Compose_employee_SCIM": {
"inputs": {
"bulkId": "@{guid()}",
"data": {
"active": "@if(equals(items('For_each_employee')?['WorkerStatus'],'Active'),true,false)",
"displayName": "@{items('For_each_employee')?['FullName']}",
"externalId": "@{items('For_each_employee')?['WorkerID']}",
"name": {
"familyName": "@{items('For_each_employee')?['LastName']}",
"givenName": "@{items('For_each_employee')?['FirstName']}"
},
"schemas": [
"urn:ietf:params:scim:schemas:core:2.0:User",
"urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"
],
"userName": "@{items('For_each_employee')?['UserID']}"
},
"method": "POST",
"path": "/Users"
},
"type": "Compose"
}
},
"foreach": "@body('Get_employee_data_from_storage_table')?['value']",
"runAfter": {
"Get_employee_data_from_storage_table": [
"Succeeded"
]
},
"type": "Foreach"
},
"Get_employee_data_from_storage_table": {
"inputs": {
"host": {
"connection": {
"name": "@parameters('$connections')['azuretables']['connectionId']"
}
},
"method": "get",
"path": "/v2/storageAccounts/@{encodeURIComponent(encodeURIComponent('sthcmprovisioning1234'))}/tables/@{encodeURIComponent('employeedemo')}/entities"
},
"runAfter": {
"Compose_ProvisioningAPI_Settings": [
"Succeeded"
]
},
"type": "ApiConnection"
},
"Join_employee_SCIM": {
"inputs": {
"from": "@outputs('Compose_employee_SCIM')",
"joinWith": ","
},
"runAfter": {
"For_each_employee": [
"Succeeded"
]
},
"type": "Join"
},
"POST_SCIM_payload_to_API": {
"inputs": {
"authentication": {
"audience": "@{outputs('Compose_ProvisioningAPI_Settings')?['Audience']}",
"type": "ManagedServiceIdentity"
},
"body": "@outputs('Compose_SCIM_bulk_payload')",
"headers": {
"Content-Type": "application/scim+json"
},
"method": "POST",
"uri": "@outputs('Compose_ProvisioningAPI_Settings')?['Uri']"
},
"runAfter": {
"Compose_SCIM_bulk_payload": [
"Succeeded"
]
},
"type": "Http"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"Recurrence_0500am_every_day": {
"evaluatedRecurrence": {
"frequency": "Day",
"interval": 1,
"schedule": {
"hours": [
"5"
]
},
"timeZone": "W. Europe Standard Time"
},
"recurrence": {
"frequency": "Day",
"interval": 1,
"schedule": {
"hours": [
"5"
]
},
"timeZone": "W. Europe Standard Time"
},
"type": "Recurrence"
}
}
}