# Other GenAI Features

## Overview

In addition to Nexthink Assist and Nexthink Insights, Nexthink leverages GenAI in other parts of Infinity to enhance content development. The following feature is powered by GenAI:

* [AI Localization for Adopt Guides](https://docs.nexthink.com/platform/user-guide/adopt/guide-creation-and-management-from-nexthink-applications/creating-guides#localizing-content-with-ai)

The LLMs used are provided by Amazon Bedrock within the same AWS account and geographical area (continent) where customer data is typically stored and processed. OpenAI is *not* used for the features listed on this page.

## Frequently asked questions

<details>

<summary>Do the LLMs process Personal Data or any type of sensitive information?</summary>

The large language models (LLMs) used for this feature are not intended to process personal data or any type of sensitive information.

</details>

<details>

<summary>Are there third parties involved?</summary>

For the feature listed on this page, Nexthink uses the large language models (LLMs) provided by Amazon Bedrock. OpenAI is not involved.

</details>

<details>

<summary>Where is data processed by the LLMs?</summary>

The large language models (LLMs) provided by Amazon Bedrock are used within the same AWS account and geographical area (continent) where Nexthink's customer data is typically stored and processed.

</details>

<details>

<summary>Can LLMs leverage my organization data to train their models?</summary>

No, the large language models (LLMs) used across the Nexthink Infinity Platform cannot and do not use Customer Data via their APIs to train or improve their models.

</details>

<details>

<summary>Are large language models (LLMs) responding to end-user prompts, or are they powered by predefined prompts from Nexthink?</summary>

The large language models (LLMs) are triggered using predefined logic and prompts built into Nexthink’s backend services running in AWS. End-users cannot view or modify these prompts.

Additionally, the LLM **does not have access to any Nexthink data**. This means that even if a user attempts to query sensitive information, the LLM will not be able to access or infer anything from the platform.

</details>


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.nexthink.com/legal/global-ai-hub/other-genai-features.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
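As a sketch, the GET request described above can be issued from Python using only the standard library. The helper below URL-encodes a natural-language question into the `ask` query parameter and performs the request; the function names (`build_ask_url`, `ask_docs`) are illustrative, not part of any Nexthink API.

```python
from urllib.parse import quote
from urllib.request import urlopen

# Documentation page URL from the example above
DOCS_URL = "https://docs.nexthink.com/legal/global-ai-hub/other-genai-features.md"

def build_ask_url(question: str) -> str:
    # URL-encode the natural-language question into the `ask` query parameter
    return f"{DOCS_URL}?ask={quote(question)}"

def ask_docs(question: str, timeout: float = 10.0) -> str:
    # Perform the GET request and return the response body as text
    with urlopen(build_ask_url(question), timeout=timeout) as resp:
        return resp.read().decode("utf-8")

print(build_ask_url("Which LLM provider powers AI Localization for Adopt Guides?"))
```

Keep the question specific and self-contained, as noted above, so the returned answer and excerpts are relevant.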
