Exporter for Azure Data Lake

This documentation references external sources. Nexthink does not have control over the accuracy of third-party documentation, nor any external updates or changes that might create inconsistencies with the information presented on this page. Please report any errors or inconsistencies to Nexthink Support.

Data Export allows you to export Nexthink data insights into Azure Data Lake as comma-separated CSV files encoded in UTF-8.

Configure Azure Data Lake Storage Gen2 data destinations to store data and create a Data Export in the Nexthink web interface to distribute it.

Prerequisites

Before setting up the data exporter in the Nexthink web interface, you must first:

  • Create a storage account in the Azure portal.

  • Create a data lake container within the storage account.

  • Register an application in the Azure portal.

  • Configure the Azure DL connector credentials in Nexthink.

Creating a Storage Account in Azure Portal

Log in to the Azure portal. To create a file system using a general-purpose v2 storage account (not Data Lake Storage Gen1), follow these steps (a scripted alternative is sketched after the list):

  1. In the Azure portal menu, select All services.

  2. In the list of resources, type Storage Accounts. As you begin typing, the list filters the content based on your input.

  3. Select Storage Accounts.

  4. Select Add in the Storage Accounts window.

  5. Select the subscription for which you want to create the storage account.

  6. Select Create new under the Resource group field. Enter the name of your new resource group. If a Resource group already exists, select it from the dropdown list.

  7. Enter the name of your storage account. The name you choose must be unique across Azure. The name must also be between 3 and 24 characters in length and can include numbers and lowercase letters only.

  8. Select a location for your storage account or use the default location.

  9. Fill in the information in the remaining tabs: Advanced, Networking, Data Protection, Encryption and Tags. In particular:

    • Enable Hierarchical namespace in the Advanced tab.

    • Disable Soft delete for blobs in the Data Protection tab.

  10. Select Review + Create to review your storage account settings and create the account.
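
If you prefer to script this step, the following is a minimal sketch using the Azure SDK for Python (azure-identity and azure-mgmt-storage); the subscription ID, resource group, account name and region are placeholder values, not settings taken from this page.

```python
# Hedged sketch: create a general-purpose v2 storage account with the
# hierarchical namespace enabled (Data Lake Storage Gen2), mirroring the
# portal steps above. All <...> values are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<SUBSCRIPTION_ID>")

poller = client.storage_accounts.begin_create(
    resource_group_name="<RESOURCE_GROUP>",
    account_name="<storageaccountname>",  # 3-24 characters, lowercase letters and numbers only
    parameters=StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",       # general-purpose v2, not Data Lake Storage Gen1
        location="westeurope",  # use your preferred region
        is_hns_enabled=True,    # Hierarchical namespace (Advanced tab)
    ),
)
account = poller.result()
print(account.name, account.provisioning_state)
```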

Creating a data lake container within a storage account

  1. Locate your newly created storage account under Storage accounts.

  2. Select the storage account you want to use. You must create a new container in it.

  3. Select Containers, add a new container and enter a meaningful name for it, for example, openbridge-data-lake.

  4. Make sure access is set to Private (no anonymous access).

  5. Click on Create.
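
As an alternative to the portal, a container (an ADLS Gen2 file system) can also be created with the azure-storage-file-datalake package. This is a minimal sketch assuming the signed-in identity already has data-plane rights on the account; the account and container names are placeholders.

```python
# Hedged sketch: create a private container (file system) in the storage
# account. public_access defaults to None, i.e. Private (no anonymous access).
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<STORAGE_ACCOUNT_NAME>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)

file_system = service.create_file_system(file_system="openbridge-data-lake")
print("Created container:", file_system.file_system_name)
```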

Registering an application in Azure Portal

Register an application with the Microsoft identity platform and assign it the role required for access.

  1. Register a new application in the Azure portal.

  2. Select the account type based on your business requirements.

  3. Assign the Storage Blob Data Owner or Storage Blob Data Contributor role to the service principal, which grants the service principal full access to blob data. Assign other blob data roles according to your business requirements. For details of built-in role permissions, refer to the Azure built-in roles documentation on the Microsoft website.
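
Once the role assignment is in place, a minimal sketch like the one below can verify that the service principal actually reaches the container; all values are placeholders, and role propagation can take a few minutes.

```python
# Hedged sketch: authenticate as the registered application and list the
# container contents to confirm the blob data role assignment is effective.
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential(
    tenant_id="<TENANT_ID>",          # Directory (tenant) ID, app Overview section
    client_id="<CLIENT_ID>",          # Application (client) ID, app Overview section
    client_secret="<CLIENT_SECRET>",  # value from Certificates & secrets
)

service = DataLakeServiceClient(
    account_url="https://<STORAGE_ACCOUNT_NAME>.dfs.core.windows.net",
    credential=credential,
)

# Listing paths only succeeds if the role assignment grants blob data access.
for path in service.get_file_system_client("openbridge-data-lake").get_paths():
    print(path.name)
```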

Configuring Azure DL connector credentials in Nexthink

  1. Select Administration from the main menu.

  2. Click on Connector credentials from the Integrations section of the navigation menu.

  3. Click on the New credential button located in the top-right of the Credentials page.

  4. Fill in the Name.

  5. Select HTTPS as the protocol.

  6. Select OAuth 2.0 - Client Credentials as the authorization mechanism.

  7. Fill in the credential values with the Storage and Application information from the Azure portal, as listed below.

Refer to the Connector credentials documentation for more information.

  • URL address: https://$STORAGE_ACCOUNT_NAME.dfs.core.windows.net

    • $STORAGE_ACCOUNT_NAME: the name of the storage account created in Step 1.

  • Access token URL: https://login.microsoftonline.com/$TENANT_ID/oauth2/v2.0/token

    • $TENANT_ID: the Directory (tenant) ID of the application created in Step 3. Copy it from the Overview section.

  • Client ID: the Application (client) ID from the Overview section of the application created in Step 3.

  • Client secret: the value from the Certificates & secrets section of the application created in Step 3.

  • Scope: https://storage.azure.com/.default
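
To sanity-check these values before entering them in Nexthink, the following minimal sketch performs the same OAuth 2.0 client-credentials exchange against the Microsoft identity platform token endpoint; all values are placeholders.

```python
# Hedged sketch: request an access token for Azure Storage using the
# client-credentials grant, mirroring what the connector credential does.
import requests

TENANT_ID = "<TENANT_ID>"
response = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "<CLIENT_ID>",
        "client_secret": "<CLIENT_SECRET>",
        "scope": "https://storage.azure.com/.default",
    },
)
response.raise_for_status()
print("Token acquired, expires in", response.json()["expires_in"], "seconds")
```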

Creating an Azure DL data exporter in Nexthink

You must have administrative permissions to configure the data exporter. Refer to the Adding users documentation for more information about user roles.

To access Data Export:

  1. Select Administration from the main menu.

  2. Select Outbound connectors from the Integrations section of the navigation panel.

  3. Select Data Exporter from the list of Outbound connectors. When you access the Data exporter page for the first time, there are no elements on the page. Once you create a data exporter, it is listed on the page along with the total number of created data exporters.

  4. Click on the New exporter button located in the upper-right corner of the page to create a new data exporter.

  5. Select the Azure DL data exporter type.

General tab

  • Name: enter a meaningful name for the data exporter.

  • Description: enter a meaningful description of the goal of the data exporter.

  • Active: switch on the toggle to enable the exporter.

  • Credentials: select the credentials of the third-party tool that the data exporter sends the data to. Refer to Connector credentials for more information.

  • Container: enter the name of the container in Azure Data Lake to push the data to.

  • Maximum file size: define the maximum file size generated by Data Export. If the data set from a specific NQL query is larger than the specified file size, the system splits it into several separate files.

Data tab

  • Scheduling frequency: define how often the system executes the NQL query and exports data. The available options are:

    • Hourly: The system triggers the data export based on the value selected in the drop-down menu, for instance, every 1h, 2h, 3h, 4h, 6h or 12h.

    • Daily: The system triggers the data export every day at 00:00 of the timezone where the Nexthink cloud instance is located.

    • Weekly: The system triggers the data export weekly, on the selected day at 00:00 of the timezone where the Nexthink cloud instance is located.

For Hourly and Daily scheduling frequencies, the system waits 20 minutes before executing the data exporter to allow the previous bucket to close properly, ensuring data completeness.

You must select a value for the Recurrence option because the system does not generate a default value and does not indicate that the value is missing during the validation process.

  • NQL query: define the data exported from the Nexthink web interface into the destination using an NQL query.

  • File name: enter the name of the file created in the destination. The underscore is the only supported special character, for example: IT_lake_exporter.

  • Directory (optional): define the directory within the container that the data is exported to. If the directory does not exist, the system creates it automatically.
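
Once an export has run, the resulting files can be read back with the same service principal. This is a minimal sketch assuming a hypothetical directory named exports and a file named IT_lake_exporter.csv; exports are comma-separated CSV in UTF-8.

```python
# Hedged sketch: download one exported CSV file and print its rows.
import csv
import io

from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential("<TENANT_ID>", "<CLIENT_ID>", "<CLIENT_SECRET>")
service = DataLakeServiceClient(
    account_url="https://<STORAGE_ACCOUNT_NAME>.dfs.core.windows.net",
    credential=credential,
)

file_client = service.get_file_system_client("openbridge-data-lake").get_file_client(
    "exports/IT_lake_exporter.csv"  # hypothetical directory and file name
)

data = file_client.download_file().readall().decode("utf-8")
for row in csv.reader(io.StringIO(data)):
    print(row)
```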

Testing the Azure DL data exporter

Click the Test - load up to 20 records button to validate the connection to Azure Data Lake before saving the configuration.

  • If the NQL query and the connection are valid, a message appears indicating that the query results have been successfully delivered.

  • If the NQL query or the connection is invalid, a message appears informing about the error details.

Refer to the Managing data exporters documentation to learn more about creating, editing, deleting and disabling data exporters.
