Top Tips
- Updated on 31 Oct 2024
- 3 Minutes to read
Before going into the samples below, here are a few top tips we think you will find useful.
Include the time
When you are creating a query for the BAM feature, you will need to include one of the following two fields:
- TimeStamp (App Insights)
- TimeGenerated (Log Analytics)
These are default fields on their respective data sources, and Turbo360 uses them to apply the time filters when querying for records.
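As a simple sketch, a Log Analytics query that keeps this field available for filtering might look like the following (the AzureDiagnostics table and the 24-hour window are just illustrative assumptions):

```
//TimeGenerated must remain in the projected columns so the time filter can be applied
AzureDiagnostics
| where TimeGenerated > ago(24h)
| project TimeGenerated, OperationName
```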
Mapping status fields for the transaction
You will often want to show a status which you can map in Turbo360 so it displays a coloured status indicator to the user. This configuration is set at either shape or transaction level, as shown in the image below.
One of the things I like to do is to set up a field in my KQL query where I provide this mapping with a case statement. This means I can control the logic in my query if I want to use different fields to indicate whether the process is complete, failed, or in progress.
Below is an example of how I can do this in KQL:
| extend OverallStatus = case(
EndSuccess == 'True', 'Succeeded',
EndSuccess == 'False', 'Failed',
'In Progress')
Start, End and Duration
In Turbo360 you can map columns to the start, end and duration fields, which are optional default fields that the user interface will display if you provide them. They can be used at both transaction and stage level.
The first tip is that if your stage is very short running, you may choose to skip this bit and just focus on matching a log event to the shape and its status. You are, however, very likely to want to use these fields at transaction level.
At transaction level, I normally use the timestamp on the start event and the timestamp on the end event to work out the overall transaction start and end times.
I would then use the extend operator to produce a new field, as shown below, subtracting the start time from the end time to get a calculated duration which I can return to Turbo360.
| extend Duration = EndTime - StartTime
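To derive the start and end times themselves, the summarize operator can be used first. As a sketch, assuming a RunId field groups the events for one transaction, the earliest and latest event timestamps give the transaction start and end:

```
//Earliest event per run = transaction start, latest event = transaction end
| summarize StartTime = min(TimeGenerated), EndTime = max(TimeGenerated) by RunId
//Duration is a timespan calculated by subtracting start from end
| extend Duration = EndTime - StartTime
```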
Links to Azure
You can add a column to your query containing a URL for the Azure Portal which opens the resource for the right run ID. Below are some examples of that.
Logic App Consumption
In this example I get the base URL for the Logic App run and then dynamically append the run ID to it in the output query.
//This is the url from Azure Portal for this logic app for a run history but with the run id taken off as we will append it later dynamically
let inputLogicAppRunPortalUrl = "https://portal.azure.com/#view/Microsoft_Azure_EMA/DesignerEditorConsumption.ReactView/id/%2Fsubscriptions%2F08a281b8-3b07-4219-a517-b11230e9b34f%2FresourceGroups%2FEAI_App_EmployeeBenefitsFiles%2Fproviders%2FMicrosoft.Logic%2Fworkflows%2FEmployeeBenefits-To-BenefitsManagement-Partner/location/northeurope/showGoBackButton~/true/isReadOnly~/true/isMonitoringView~/true/runId/%2Fsubscriptions%2F08a281b8-3b07-4219-a517-b11230e9b34f%2FresourceGroups%2FEAI_App_EmployeeBenefitsFiles%2Fproviders%2FMicrosoft.Logic%2Fworkflows%2FEmployeeBenefits-To-BenefitsManagement-Partner%2Fruns%2F";
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.LOGIC"
| where ResourceGroup == "EAI_APP_EMPLOYEEBENEFITSFILES"
| where resource_workflowName_s == "EmployeeBenefits-To-BenefitsManagement-Partner"
| where ResourceType == "WORKFLOWS/RUNS/ACTIONS"
| where OperationName == "Microsoft.Logic/workflows/workflowActionCompleted"
| where Resource == "HTTP_-_GET_EMPLOYEE_BENEFITS_DATASET"
| extend FileName = trackedProperties_fileName_s
| extend WorkFlowName = resource_workflowName_s
| extend WorkFlowRunID = resource_runId_s
//Here I concat the url to show the portal url using the earlier parameter
| extend PortalUrl = strcat(inputLogicAppRunPortalUrl, resource_runId_s)
Data Factory
In the example below, I took the URL from a run of my Data Factory and used the strcat function in KQL to inject the run ID dynamically from the query.
If you refer to the Data Factory sample you will see this in use.
| extend PortalUrl = strcat("https://adf.azure.com/en/monitoring/pipelineruns/", runId_g, "?factory=%2Fsubscriptions%2F", inputSubscriptionId, "%2FresourceGroups%2F", inputResourceGroupName, "%2Fproviders%2FMicrosoft.DataFactory%2Ffactories%2F", inputDataFactory)
Sampling
Remember that some of the logging approaches you may be using might not offer guaranteed delivery. Examples of this might be:
- In App Insights you might have sampling turned on
- Log Analytics and App Insights may not guarantee delivery under load
If you need guaranteed delivery of your logs, then consider the Turbo360 BAM API and its push model. If you are happy to reuse the logging you already have in place but are looking to get more from it, use the pull model described here.