
Exporter for Azure Data Lake

This documentation references external sources. Nexthink does not have control over the accuracy of third-party documentation, nor any external updates or changes that might create inconsistencies with the information presented on this page. Please report any errors or inconsistencies to Nexthink Support.

Data Export allows you to export Nexthink data insights into Azure Data Lake as comma-separated CSV files encoded in UTF-8.
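Because the export is plain comma-separated UTF-8 text, any standard CSV reader can consume the files. A minimal sketch in Python; the column names below are hypothetical examples, not a documented schema:

```python
import csv
import io

# Hypothetical sample of an exported file: comma-separated, UTF-8.
# Real column names depend on the NQL query you configure.
sample = "device_name,os_version\nLAPTOP-01,Windows 11\nLAPTOP-02,Windows 10\n"

# In practice you would open the downloaded file with encoding="utf-8".
rows = list(csv.DictReader(io.StringIO(sample)))

print(rows[0]["device_name"])  # first exported record: LAPTOP-01
```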

Configure Azure Data Lake Storage Gen2 data destinations to store data and create a Data Export in Nexthink web interface to distribute it.

1. Create a Storage Account in Azure Portal

Log in to the Azure portal:

To create a file system using a general-purpose v2 storage account (not a Data Lake Storage Gen1 account) in the Azure portal, follow these steps:

  • In the Azure portal menu, select All services.

  • In the list of resources, type Storage Accounts. As you begin typing, the list filters the content based on your input.

  • Select Storage Accounts.

  • Select Add in the Storage Accounts window.

  • Select the subscription for which you want to create the storage account.

  • Select Create new under the Resource group field and enter the name of your new resource group. If a resource group already exists, select it from the dropdown list.

  • Enter the name of your storage account. The name you choose must be unique across Azure, between 3 and 24 characters in length, and may contain only lowercase letters and numbers.
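The naming rule above can be checked locally before submitting the form. A small sketch; note that global uniqueness can only be verified by Azure itself:

```python
import re

# Azure storage account names: 3-24 characters, lowercase letters and
# digits only. Global uniqueness is checked by Azure on submission.
NAME_RE = re.compile(r"^[a-z0-9]{3,24}$")

def is_valid_storage_account_name(name: str) -> bool:
    return bool(NAME_RE.fullmatch(name))

print(is_valid_storage_account_name("nexthinkexport01"))  # True
print(is_valid_storage_account_name("Too_Short!"))        # False
```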

  • Select a location for your storage account or use the default location.

  • Fill in the information for the rest of the tabs: Advanced, Networking, Data Protection, Encryption and Tags.

    • Disable Soft delete for blobs in the Data Protection tab

    • Enable Hierarchical namespace in the Advanced tab

  • Select Review + Create to review your storage account settings and create the account.

2. Create a data lake Container within a Storage Account

  • Locate your newly created storage account under Storage accounts.

  • Select the storage account you want to use.

  • Select Containers, add a new container and enter a meaningful name for it, for example openbridge-data-lake.

  • Make sure access is set to Private (no anonymous access).

  • Click on Create.

3. Register an application in Azure Portal

Register an application with the Microsoft identity platform and apply the valid role assignment for access.

  • Register a new application in the Azure portal.

  • Select the account type based on your business requirements.

  • Assign the Storage Blob Data Owner or Storage Blob Data Contributor role to the service principal, which grants it full access to blob data. Assign other blob data roles according to your business requirements. For details of the built-in role permissions, refer to the Azure built-in roles documentation on the Microsoft website.

4. Configure Credentials in the Nexthink web interface

  • Select Administration from the main menu.

  • Click on Connector credentials from the Integrations section of the navigation menu.


  • Click on the New credential button located in the top-right of the Credentials page.

  • Select OAuth 2.0 as the authorization mechanism.

  • Fill in the credential values with the Storage and Application information from the Azure portal.

Instance URL: https://$STORAGE_ACCOUNT_NAME.dfs.core.windows.net
$STORAGE_ACCOUNT_NAME: name of the storage account created in Step 1.

Access token URL: https://login.microsoftonline.com/$TENANT_ID/oauth2/v2.0/token
$TENANT_ID: Directory (tenant) ID of the application created in Step 3, copied from its Overview section.

Client ID: Application (client) ID from the Overview section of the application created in Step 3.

Client secret: secret value from the Certificates & secrets section of the application created in Step 3.

Scope: https://storage.azure.com/.default
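The values above drive a standard OAuth 2.0 client-credentials exchange against the access token URL. The following sketch only builds the request URL and body without sending anything; the tenant ID, client ID and secret shown are placeholders, not real values:

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your own from the Azure portal.
TENANT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_ID = "11111111-1111-1111-1111-111111111111"
CLIENT_SECRET = "your-client-secret"

token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"

# Standard OAuth 2.0 client-credentials grant body.
body = urlencode({
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "scope": "https://storage.azure.com/.default",
})

# POSTing `body` to `token_url` with Content-Type
# application/x-www-form-urlencoded returns JSON containing an access_token.
print(token_url)
```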

Refer to the Connector credentials documentation for more information on how to manage the credentials.

5. Create a new Data Export in the Nexthink web interface

To configure the data exporter, you must have administrative permissions. Refer to the Adding users documentation for more information about user profiles.

To access Data Export:

  • Select Administration from the main menu.

  • Select Outbound connectors from the Integrations section of the navigation panel.


  • Select Data Exporter from the list of Outbound connectors. When you access the Data exporter page for the first time, there are no elements on the page. Once you create a data exporter, it is listed on the page along with the total number of created data exporters.

  • Click on the New exporter button located in the upper-right corner of the page to create a new data exporter.

General tab

  • Name: enter a meaningful name for the data exporter.

  • Description: enter a meaningful description of the goal of the data exporter.

  • Credentials: define credentials from the third-party tool that the data exporter exports the data to. Refer to Connector credentials for more information.

  • Destination: define the third-party tool that the data exporter exports the data to.

When you select Azure Data Lake from the destination drop-down, the following values appear:

  • Container: enter the name of the container in Azure Data Lake to push the data to.

  • Maximum file size: define the maximum file size generated by Data Export. If the data set from a specific NQL query is larger than the specified file size, Data Export splits it into separate files.
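The splitting behavior can be pictured roughly as follows. This is an illustrative sketch, not Nexthink's actual implementation: rows accumulate in a file until adding the next row would exceed the limit, at which point a new file is started.

```python
def split_rows(rows, max_bytes):
    """Group CSV lines into chunks whose UTF-8 size stays within max_bytes.
    Illustrative only -- not Nexthink's actual algorithm."""
    chunks, current, size = [], [], 0
    for row in rows:
        row_size = len(row.encode("utf-8"))
        if current and size + row_size > max_bytes:
            chunks.append(current)  # current file is full, start a new one
            current, size = [], 0
        current.append(row)
        size += row_size
    if current:
        chunks.append(current)
    return chunks

lines = ["device_name,os\n", "LAPTOP-01,Windows 11\n", "LAPTOP-02,Windows 10\n"]
print(len(split_rows(lines, max_bytes=40)))  # 2 with this 40-byte limit
```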

Configuration tab

  • Scheduling frequency: select the frequency to execute the list of queries. The available options are: hourly, daily and weekly.

  • Query Name: enter the name of the query, which is also the name of the file created in the destination. The name can contain only an underscore as a special character, for example, IT_lake_exporter.

  • NQL query: define the data exported from Nexthink web interface into the destination using an NQL query.

  • Directory: define the directory within the destination container that the data is exported to. If the directory doesn’t exist, the system creates it automatically.
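The query-name rule above (letters, digits and underscore as the only special character) can be checked with a simple pattern. A sketch of such a pre-flight check:

```python
import re

# Query names may contain only ASCII letters, digits and underscores --
# underscore being the only allowed special character, per the rule above.
QUERY_NAME_RE = re.compile(r"^\w+$", re.ASCII)

def is_valid_query_name(name: str) -> bool:
    return bool(QUERY_NAME_RE.fullmatch(name))

print(is_valid_query_name("IT_lake_exporter"))  # True
print(is_valid_query_name("IT lake-exporter"))  # False
```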

Scheduling Frequency

This component sets how often the system executes NQL queries. The available options are:

  • Hourly: executes the list of queries based on the value selected in the drop-down menu, for example, every 1, 2 or 3 hours.

  • Daily: executes the list of queries every day at 00:00 UTC+2 time.

  • Weekly: executes the list of queries every Wednesday at 00:00 UTC+2 time. Select only one option at a time.
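To anticipate when a weekly export will run, the "every Wednesday at 00:00 UTC+2" rule can be computed as follows. This is an illustrative helper, not part of the product:

```python
from datetime import datetime, timedelta, timezone

TZ = timezone(timedelta(hours=2))  # weekly exports run at 00:00 UTC+2

def next_weekly_run(now: datetime) -> datetime:
    """Next Wednesday 00:00 in UTC+2 after `now` (illustrative helper)."""
    local = now.astimezone(TZ)
    days_ahead = (2 - local.weekday()) % 7  # Monday=0, so Wednesday=2
    candidate = (local + timedelta(days=days_ahead)).replace(
        hour=0, minute=0, second=0, microsecond=0)
    if candidate <= local:  # this week's slot has already passed
        candidate += timedelta(days=7)
    return candidate

now = datetime(2024, 5, 6, 12, 0, tzinfo=TZ)  # a Monday at noon
print(next_weekly_run(now).strftime("%A %H:%M"))  # Wednesday 00:00
```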

Regardless of the chosen frequency, the execution of the data exporter might be delayed to ensure data completeness. This guarantees there is no data missing at the time of the export process.

You must select a value for the Recurrence option because the system does not generate a default value and currently cannot prompt you that the value is missing during the validation process.

Add Query

Add several NQL queries to the same data exporter by clicking on the Add query button.

Currently, only one NQL query can be visualized at a time. In the future you will be able to visualize multiple NQL queries at once.

Test Data Exporter

Click on the Test Load up to 20 records button to send the first 20 results of executing the NQL query into the destination configured in the General tab.

  • If the NQL query is valid, a message appears indicating that the query has been successfully executed.

  • If the NQL query is invalid, a message appears indicating that the query could not be executed.

The success message indicates that the inserted NQL query is valid, not that the file was sent out successfully.

Delete

Click on the Delete button to remove NQL queries when they are no longer needed. A pop-up appears to confirm the action.

The action of removing an NQL query cannot be undone. If you need to retrieve the information, you will have to recreate the NQL query from scratch.

6. Edit an existing Data Export

  • Navigate to the Data exporters page.

  • Hover over a specific exporter and select the edit icon on the right side of the table row.

  • The exporter edit page opens with a pre-filled form containing information regarding the data exporter. You can modify all available fields. Saving or canceling the action takes you back to the data exporter page.

  • The system validates the fields when you save the form.

7. Remove an existing Data Export

Remove the data exporters when they are no longer necessary. This limits the amount of information that is sent out to third-party tools.

  • Navigate to the Data exporters page.

  • Hover over a specific exporter and select the delete icon on the right side of the table row.

  • A pop-up appears to confirm the action.

The action of removing a data exporter cannot be undone. If you need to retrieve the information, you will have to create the data exporter from scratch.
