
Auxiliary Logs in Azure Log Analytics

Auxiliary Logs - This article is part of a series.
Part 1: This Article

Microsoft provides a new type of Log Analytics table called Auxiliary Logs. It is currently in Public Preview, and I did some lab testing which I document in this blog post.

Introduction

In August 2024, Microsoft announced the public preview of a new type of Log Analytics table: Auxiliary Logs. These tables are low-cost compared to traditional Analytics tables and are very useful for compliance needs if you have to store high-volume logs in your Sentinel environment. But they also have drawbacks: query capabilities are limited, and query performance is much slower. Microsoft is still developing features around these new Auxiliary Logs. In this blog post I share my experiences with this table type.

Preparation and Goal

The goal is to have a cheap log data store for high-volume logs. Later on I want to ingest the high-volume logs of my open-source firewall based on OPNsense. For a better understanding of the mechanism, and to keep the dumps short, I document the process with a simplified table. It is well suited for your own experiments.

I assume that you are familiar with PowerShell 7.x and have an appropriate environment installed.

For sending logs to Sentinel I will use the Log Analytics workspace ingestion API. I will explain the following steps:

  • Create the necessary table with the plan Auxiliary (which leads to the table type “Auxiliary”).
  • Get data into your newly created table (by using a Data Collection Rule, DCR for short).

We will create the following elements:

  • TestDataAuxiliary_CL: Our custom log table (hence the suffix _CL) with the plan Auxiliary. The table is deliberately simple and consists of only three columns: TimeGenerated (datetime), Message (string) and Info (string). You can extend the table with more columns later.
  • DCR-Generic-CollectionRule: Data Collection Rule which sends log data to the custom log table TestDataAuxiliary_CL. This is the only way to send data to this table type.
  • LA-TestData-Log-Ingestion: App registration used to authenticate against Entra ID so we can send data to the log ingestion endpoint.

Auxiliary Log table creation

Right now it is not exactly easy to create such tables. You have to use the API, because the web interface offers no option to create or convert such tables. So let’s get started.

If you are not familiar with defining tables in JSON, just send data to your Log Analytics workspace and a normal table gets generated. Then you can modify the columns in the GUI and, once finished, export the JSON definition over the REST API. Find details below.

To be able to create a new table with the plan Auxiliary you have to use the new API version 2023-01-01-preview (see https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table-auxiliary#create-a-custom-table-with-the-auxiliary-plan).

Besides using the new API version, you have to supply a JSON payload with the right definitions:

$auxiliaryTableParams = @'
{
  "properties": {
    "totalRetentionInDays": 365,
    "plan": "Auxiliary",
    "schema": {
      "name": "TestDataAuxiliary_CL",
      "columns": [
          { "name": "TimeGenerated", "type": "datetime" },
          { "name": "Message", "type": "string" },
          { "name": "Info", "type": "string" }
      ]
    }
  }
}
'@

Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.OperationalInsights/workspaces/log-security-prod-001/tables/TestDataAuxiliary_CL?api-version=2023-01-01-preview" -Method PUT -payload $auxiliaryTableParams

If the command succeeds, you will see the new table type in your environment:

azure-auxiliary-log-custom-table.png
The newly created table TestDataAuxiliary_CL with type Auxiliary.
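
You can also verify the result from the same session. A minimal sketch (the GET call mirrors the table export example later in this post):

# Read the table back and check its plan.
$check = Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.OperationalInsights/workspaces/log-security-prod-001/tables/TestDataAuxiliary_CL?api-version=2023-01-01-preview" -Method GET
($check.Content | ConvertFrom-Json).properties.plan   # expected: "Auxiliary"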

Your table definition must contain the field TimeGenerated; it is essential for working with Auxiliary Log tables.

Ingestion of Logs

Create Data Collection Rule (DCR)

To be able to send log data to your table you have to define a Data Collection Rule (DCR), which validates the incoming data and maps it to the desired table. In this example we name it DCR-Generic-CollectionRule.

$dcrParams = @'
{
  "location": "switzerlandnorth",
  "kind": "Direct",
  "properties": {
    "description": "A direct ingestion rule for TestData logs",
    "streamDeclarations": {
      "Custom-TestDataAuxiliary": {
        "columns": [
          { "name": "TimeGenerated", "type": "datetime" },
          { "name": "Message", "type": "string" },
          { "name": "Info", "type": "string" }
        ]
      }
    },
    "destinations": {
      "logAnalytics": [
        {
          "workspaceResourceId": "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.OperationalInsights/workspaces/log-security-prod-001",
          "name": "log-security-prod-001"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [
          "Custom-TestDataAuxiliary"
        ],
        "destinations": [
          "log-security-prod-001"
        ],
        "outputStream": "Custom-TestDataAuxiliary_CL"
      }
    ]
  }
}
'@

Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.Insights/dataCollectionRules/DCR-Generic-CollectionRule?api-version=2023-03-11" -Method PUT -payload $dcrParams

App Registration & Permission

To be able to send log data to the ingestion endpoint created by your Data Collection Rule (DCR), you need an app registration and must assign the role Monitoring Metrics Publisher to it. Follow these steps (a scripted sketch follows below):

  • Create an app registration with the name LA-TestData-Log-Ingestion
  • Create a client secret
  • Go to your Data Collection Rule and assign the role Monitoring Metrics Publisher to the app registration

app-registration-LA-TestData-Log-Ingestion.png
The app registration that provides access to the Data Collection Rule log ingestion endpoint.
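
If you prefer to script these steps, a minimal sketch with the Az PowerShell module could look like this (the display name and the DCR scope are this post’s example values; adjust them to your environment):

# Sketch: create the app registration, a client secret and the role assignment.
$app = New-AzADApplication -DisplayName "LA-TestData-Log-Ingestion"
$sp  = New-AzADServicePrincipal -ApplicationId $app.AppId

# Create a client secret; SecretText is only shown once, so note it down.
$secret = New-AzADAppCredential -ObjectId $app.Id -EndDate (Get-Date).AddMonths(6)
$secret.SecretText

# Assign Monitoring Metrics Publisher on the Data Collection Rule scope.
New-AzRoleAssignment -ApplicationId $app.AppId `
  -RoleDefinitionName "Monitoring Metrics Publisher" `
  -Scope "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.Insights/dataCollectionRules/DCR-Generic-CollectionRule"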

Get Log Ingestion API Endpoint

As mentioned before, Data Collection Rules (DCR) with kind Direct get a log ingestion API endpoint without a Data Collection Endpoint (DCE) having to be created first. To get the log ingestion API endpoint, view your Data Collection Rule as JSON and make sure you switch the API version in the drop-down to 2023-03-11. As soon as you switch to this version, the JSON contains a key named logsIngestion (path: properties → endpoints).

dcr-definition-json-definition.png
The Data Collection Rule overview provides a link to the JSON definition.

dcr-json-definition.png
The JSON definition of the Data Collection Rule contains the log ingestion URI (but only with API version 2023-03-11).
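
Alternatively, you can read the endpoint (and the immutable ID needed below) over the API. A minimal sketch, using this post’s example resource IDs:

# Fetch the DCR with API version 2023-03-11 so the endpoints section is included.
$dcr = Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.Insights/dataCollectionRules/DCR-Generic-CollectionRule?api-version=2023-03-11" -Method GET
$props = ($dcr.Content | ConvertFrom-Json).properties
$props.endpoints.logsIngestion   # the log ingestion endpoint
$props.immutableId               # needed for the full ingestion URI below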

Full Ingestion URI

To send data to your Log Analytics table you have to assemble a URI from several pieces of information. The final URI looks like {Endpoint}/dataCollectionRules/{DCR Immutable ID}/streams/{Stream Name}?api-version=2023-01-01

dcr-definition-immutable-id.png
The immutable ID of the Data Collection Rule can be found on the overview page of the rule.
dcr-definition-stream.png
The stream name can also be found in the JSON definition of the Data Collection Rule (DCR).

So the final URI would be: https://dcr-generic-collectionrule-5cba-switzerlandnorth.logs.c1.ingest.monitor.azure.com/dataCollectionRules/dcr-1194f5543fa22422b02439b0ca51ad3f/streams/Custom-TestDataAuxiliary?api-version=2023-01-01
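
In PowerShell the same URI can be assembled like this (a small sketch using the example values from above):

# Build the ingestion URI from its three parts.
$endpoint    = "https://dcr-generic-collectionrule-5cba-switzerlandnorth.logs.c1.ingest.monitor.azure.com"
$immutableId = "dcr-1194f5543fa22422b02439b0ca51ad3f"
$streamName  = "Custom-TestDataAuxiliary"
$uri = "$endpoint/dataCollectionRules/$immutableId/streams/$streamName" + "?api-version=2023-01-01"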

Sending test data to your Auxiliary Log table

Now the final step is to test whether we are able to send log data to our newly defined Auxiliary Log table. For this I provide a kind of template for a Linux environment with bash:

tenantid="<your tenant id>"
clientid="<your client id>"
clientsecret="<your client secret>"
endpoint="<your endpoint URL>"
immutableid="<your immutable id>"
streamname="Custom-TestDataAuxiliary"
uri="$endpoint/dataCollectionRules/$immutableid/streams/$streamname?api-version=2023-01-01"

curl -X POST -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials&client_id=$clientid&client_secret=$clientsecret&scope=https://monitor.azure.com/.default" \
  "https://login.microsoftonline.com/$tenatnid/oauth2/v2.0/token"

This will get you the so-called access token. Use it for the next step:

access_token="<your access token>"
Note that this access token is typically only valid for 3599 seconds! Afterwards you have to fetch a new one.

Now, finally, you can define the test data you want to send to your table and send it with curl:

json_payload='[
  {
    "TimeGenerated": "2024-11-20T18:13:06.570354Z",
    "Message": "Houston, we have a problem",
    "Info": "sekureco42.ch"
  }
]'

curl -vvvv -H "Authorization: Bearer $access_token" -H "Content-Type: application/json" --request POST -d "$json_payload" $uri

You will get detailed output in which you should see a line with < HTTP/2 204. This means the data was successfully sent to your Data Collection Rule endpoint, and if everything went fine you will see the data in your table. Note that it can take 5 to 10 minutes until the data shows up.
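
If you prefer to stay in PowerShell 7 instead of bash, a hedged equivalent of the token request and the ingestion call could look like this (same placeholder values as above, and $uri as assembled earlier):

# Acquire an access token via the client credentials flow.
$tenantId     = "<your tenant id>"
$clientId     = "<your client id>"
$clientSecret = "<your client secret>"
$token = Invoke-RestMethod -Method Post `
  -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
  -Body @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = "https://monitor.azure.com/.default"
  }

# Post the test payload to the ingestion URI assembled earlier.
$jsonPayload = '[{ "TimeGenerated": "2024-11-20T18:13:06.570354Z", "Message": "Houston, we have a problem", "Info": "sekureco42.ch" }]'
Invoke-RestMethod -Method Post -Uri $uri `
  -Headers @{ Authorization = "Bearer $($token.access_token)" } `
  -ContentType "application/json" -Body $jsonPayload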

azure-auxiliary-log-query.png
Finally you can query the Auxiliary Log table like other, normal tables. But beware that each query is charged according to the scanned volume.

Troubleshooting

Table cannot be created

If you play around with different table plans, you may suddenly find that you cannot create a table any more and instead get an error like this:

StatusCode : 400
Content    : {
               "error": {
                 "code": "InvalidParameter",
                 "message": "Change in table plan is limited to once a week. Last update was on 11/13/2024 22:06:22.
             Operation Id: '3f111c99831714616e08993bdf54c2bd'"
               }
             }

The only way around that is to create a table with a different name or wait until the time limit is over.

Table cannot be deleted

If you try to delete an Auxiliary Log table with the Data Collection Rule API, you may get the following error:

PS C:\Users\rOger> Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.OperationalInsights/workspaces/log-security-prod-001/tables/OPNsense_CL?api-version=2023-01-01-preview" -Method DELETE

StatusCode : 400
Content    : {
               "error": {
                 "code": "InvalidParameter",
                 "message": "Changing Classic table OPNsense_CL schema by using DataCollectionRuleBased tables api is
             forbidden, please migrate the table first or use the appropriate classic table api. Operation Id:
             '0036afa70f943084b94bac9a03e44dbd'"
               }
             }

Then you have to migrate the table to the new Data Collection Rule API first. This is easily done; see the section How to delete an Auxiliary Log table.

Log data cannot be ingested

You get an error like:

OperationFailed: The authentication token provided does not have access to ingest data for the data collection rule with immutable Id ‘dcr-1234567890abcdef1234567890abcdef’

You have to assign the role Monitoring Metrics Publisher to your app LA-TestData-Log-Ingestion on the Data Collection Rule (see the scripted sketch in the App Registration & Permission section).

Query of Auxiliary Logs table is not supported

If you get an error like:

Query of Auxiliary Logs table is not supported with the API ‘https://api.loganalytics.io/v1/workspaces/{workspaceId}/query' . You can use ‘https://api.loganalytics.io/v1/workspaces/{workspaceId}/search' instead. To learn more on queries on Auxiliary Logs table, read here: https://aka.ms/auxiliaryLogsQueryAPI . If the issue persists, please open a support ticket. Request id: 06fc2f56-b916-460f-b136-f10d82df0344

Most probably you were not patient enough: the data has not yet arrived in the table. Just wait a few minutes before you try your next search.

FAQ

What happens if I do not provide JSON data with all defined fields?

If you do not send all fields you defined in your DCR, the log is still accepted and the missing fields show up as “empty”.

If you omit the field TimeGenerated, the whole row is discarded (without any error)!

What happens if I send additional fields in the JSON which are not defined?

Additional fields that you did not declare in the DCR are simply ignored.

Are fields case sensitive?

YES, they are. For example, if you send a payload like this:

json_payload='[
  {
    "TimeGenerated": "2024-11-21T11:13:06.570354Z",
    "MeSsAgE": "Houston, we have a case sensitive problem",
    "Info": "Case Sensitive Test"
  }
]'

Only the fields TimeGenerated and Info are accepted and saved in your table. The field MeSsAgE is ignored and not saved!

What happens if the referenced table in the DCR does not exist?

From the client’s point of view you get a success response (code 204) saying the data was delivered; but this does not mean that the data was actually saved somewhere…

Can errors during log ingestion to a DCR be detected?

As far as I know it is not possible to detect or inspect which errors happened. The exception is errors surfaced in your Log Analytics workspace under Monitoring → Insights → Health; e.g. I once saw a message there like “The following fields’ values controlScores of type M365SecureScore_CL have been trimmed to the max allowed size, 32766 bytes. Please adjust your input accordingly. (1)”.

How to export an existing table definition

A good starting point for your own data table could be an export of an existing table. This example shows how to export the table named OPNsense_CL over the API:

Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-playground-logs-prod-001/providers/Microsoft.OperationalInsights/workspaces/log-security-prod-001/tables/OPNsense_CL?api-version=2023-01-01-preview" -Method GET

The result will be:

{
  "properties": {
    "totalRetentionInDays": 90,
    "archiveRetentionInDays": 0,
    "plan": "Analytics",
    "retentionInDaysAsDefault": true,
    "totalRetentionInDaysAsDefault": true,
    "schema": {
      "tableSubType": "Classic",
      "name": "OPNsense_CL",
      "tableType": "CustomLog",
      "columns": [
        {
          "name": "action_s",
          "type": "string",
          "displayName": "action_s",
          "isDefaultDisplay": false,
          "isHidden": false
        },
        {
          "name": "appname_s",
          "type": "string",
          "displayName": "appname_s",
          "isDefaultDisplay": false,
          "isHidden": false
        },

        ...

        {
          "name": "icmp_type_s",
          "type": "string",
          "displayName": "icmp_type_s",
          "isDefaultDisplay": false,
          "isHidden": false
        }
      ],
      "standardColumns": [
        {
          "name": "TenantId",
          "type": "guid",
          "isDefaultDisplay": false,
          "isHidden": false
        },

        ...

        {
          "name": "RawData",
          "type": "string",
          "isDefaultDisplay": false,
          "isHidden": false
        }
      ],
      "solutions": [
        "LogManagement"
      ],
      "isTroubleshootingAllowed": true
    },
    "provisioningState": "Succeeded",
    "retentionInDays": 90
  },
  "id": "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-playground-logs-prod-001/prov
iders/Microsoft.OperationalInsights/workspaces/log-security-prod-001/tables/OPNsense_CL",
  "name": "OPNsense_CL"
}

How to delete an Auxiliary Log table

Before we can delete the Auxiliary Log table over the Data Collection Rule API, we have to migrate the table. After this migration we can delete the table with the same API we used to create it.

Migrate the table to the new API (please note the method POST and the added URL segment /migrate):

Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.OperationalInsights/workspaces/log-security-prod-001/tables/OPNsense_CL/migrate?api-version=2023-01-01-preview" -Method POST

Then you can invoke the delete API call:

Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.OperationalInsights/workspaces/log-security-prod-001/tables/OPNsense_CL?api-version=2023-01-01-preview" -Method DELETE
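
As a quick sanity check (a hedged sketch, not an official procedure), reading the table back after the DELETE should fail:

# After a successful DELETE, a GET on the same path should no longer find the table.
(Invoke-AzRestMethod -Path "/subscriptions/a2aeb284-51bd-4807-adbe-94095a10b175/resourceGroups/rg-security-logs-prod-001/providers/Microsoft.OperationalInsights/workspaces/log-security-prod-001/tables/OPNsense_CL?api-version=2023-01-01-preview" -Method GET).StatusCode   # expected: 404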

Summary

Auxiliary Logs are the new kid on the block. You can store high-volume logs at a low price (typically about $0.22/GB/day). This article showed you how to define such tables and how to ingest data into them. Even though this table type is cheap, keep in mind that each query is charged. But for many use cases this limitation still makes it cheaper than sending the data to a normal Analytics table.

Further Reading

Auxiliary Logs - This article is part of a series.
Part 1: This Article