Commit ca416e9

Fixes automatic import and AI Assistant pages (#756)

Minor edits to clean up AI pages: automatic import, AI Assistant, AI Assistant knowledge base.

1 parent 6e607bf commit ca416e9

File tree

3 files changed: +42 −101 lines changed


solutions/security/ai/ai-assistant-knowledge-base.md

Lines changed: 19 additions & 64 deletions
@@ -6,28 +6,16 @@ mapped_urls:
 
 # AI Assistant Knowledge Base
 
-% What needs to be done: Align serverless/stateful
-
-% Use migrated content from existing pages that map to this page:
-
-% - [x] ./raw-migrated-files/security-docs/security/ai-assistant-knowledge-base.md
-% - [ ] ./raw-migrated-files/docs-content/serverless/ai-assistant-knowledge-base.md
-
-% Internal links rely on the following IDs being on this page (e.g. as a heading ID, paragraph ID, etc):
-
-$$$enable-knowledge-base$$$
-
-$$$knowledge-base-add-knowledge-index$$$
-
 AI Assistant’s Knowledge Base feature enables AI Assistant to recall specific documents and other specified information. This information, which can include everything from the location of your datacenters to the latest threat research, provides additional context that can improve the quality of AI Assistant’s responses to your queries. This topic describes how to enable and add information to Knowledge Base.
 
 ::::{note}
-When you upgrade from {{elastic-sec}} version 8.15 to a newer version, information previously stored by AI Assistant will be lost.
+{{stack}} users: when you upgrade from {{elastic-sec}} version 8.15 to a newer version, information previously stored by AI Assistant will be lost.
 ::::
 
 
 ::::{admonition} Requirements
-* To use Knowledge Base, you need the `Elastic AI Assistant: All` privilege. To edit global Knowledge Base entries (information that will affect the AI Assistant experience for other users in the {{kib}} space), you need the `Allow Changes to Global Entries` privilege.
+* To use Knowledge Base, the `Elastic AI Assistant: All` privilege.
+* To edit global Knowledge Base entries (information that will affect the AI Assistant experience for other users in the {{kib}} space), the `Allow Changes to Global Entries` privilege.
 * You must [enable machine learning](/solutions/security/advanced-entity-analytics/machine-learning-job-rule-requirements.md) with a minimum ML node size of 4 GB.
 
 ::::
@@ -114,20 +102,9 @@ Add an individual document to Knowledge Base when you want AI Assistant to remem
 5. In the **Markdown text** field, enter the information you want AI Assistant to remember.
 6. If it should be **Required knowledge**, select the option. Otherwise, leave it blank. Alternatively, you can simply send a message to AI Assistant that instructs it to "Remember" the information. For example, "Remember that I changed my password today, October 24, 2024", or "Remember we always use the Threat Hunting Timeline template when investigating potential threats". Entries created in this way are private to you. By default they are not required knowledge, but you can make them required by instructing AI Assistant to "Always remember", for example "Always remember to address me as madam", or "Always remember that our primary data center is located in Austin, Texas".
 
-Refer to the following video for an example of adding a document to Knowledge Base from the settings menu.
-
-::::{admonition}
-<script type="text/javascript" async src="https://play.vidyard.com/embed/v4.js"></script>
-<img
-style="width: 100%; margin: auto; display: block;"
-class="vidyard-player-embed"
-src="https://play.vidyard.com/rQsTujEfikpx3vv1vrbfde.jpg"
-data-uuid="rQsTujEfikpx3vv1vrbfde"
-data-v="4"
-data-type="inline"
-/>
-</br>
-::::
+Refer to the following video for an example of adding a document to Knowledge Base from the settings menu (click to play video).
+
+[![Add knowledge document video](https://play.vidyard.com/rQsTujEfikpx3vv1vrbfde.jpg)](https://videos.elastic.co/watch/rQsTujEfikpx3vv1vrbfde?)
 
 
 
@@ -154,20 +131,10 @@ Indices added to Knowledge Base must have at least one field mapped as [semantic
 :alt: Knowledge base's Edit index entry menu
 :::
 
-Refer to the following video for an example of adding an index to Knowledge Base.
-
-::::{admonition}
-<script type="text/javascript" async src="https://play.vidyard.com/embed/v4.js"></script>
-<img
-style="width: 100%; margin: auto; display: block;"
-class="vidyard-player-embed"
-src="https://play.vidyard.com/Q5CjXMN4R2GYLGLUy5P177.jpg"
-data-uuid="Q5CjXMN4R2GYLGLUy5P177"
-data-v="4"
-data-type="inline"
-/>
-</br>
-::::
+Refer to the following video for an example of adding an index to Knowledge Base (click to play video).
+
+
+[![Add knowledge index video](https://play.vidyard.com/Q5CjXMN4R2GYLGLUy5P177.jpg)](https://videos.elastic.co/watch/Q5CjXMN4R2GYLGLUy5P177?)
 
 
 
@@ -185,23 +152,22 @@ First, you’ll need to set up a web crawler to add the desired data to an index
 1. From the **Search** section of {{kib}}, find **Web crawlers** in the navigation menu or use the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md).
 2. Click **New web crawler**.
 
-1. Under **Index name**, name the index where the data from your new web crawler will be stored, for example `threat_intelligence_feed_1`. Click **Create index**.
-2. Under **Domain URL**, enter the URL where the web crawler should collect data. Click **Validate Domain** to test it, then **Add domain**.
+* Under **Index name**, name the index where the data from your new web crawler will be stored, for example `threat_intelligence_feed_1`. Click **Create index**.
+* Under **Domain URL**, enter the URL where the web crawler should collect data. Click **Validate Domain** to test it, then **Add domain**.
 
 3. The previous step opens a page with the details of your new index. Go to its **Mappings** tab, then click **Add field**.
 
-::::{note}
-Remember, each index added to Knowledge Base must have at least one semantic text field.
-::::
-
+::::{note}
+Remember, each index added to Knowledge Base must have at least one semantic text field.
+::::
 
-1. Under **Field type**, select `Semantic text`. Under **Select an inference endpoint***, select `elastic-security-ai-assistant-elser2`. Click ***Add field**, then **Save mapping**.
+* Under **Field type**, select `Semantic text`. Under **Select an inference endpoint**, select `elastic-security-ai-assistant-elser2`. Click **Add field**, then **Save mapping**.
 
 4. Go to the **Scheduling** tab. Enable the **Enable recurring crawls with the following schedule** setting, and define your desired schedule.
 5. Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go to its **Crawl rules** tab and click **Add crawl rule**. For more information, refer to [Web crawler content extraction rules](https://www.elastic.co/guide/en/enterprise-search/current/crawler-extraction-rules.html).
 
-1. Click **Add crawl rule** again. Under **Policy***, select `Disallow`. Under ***Rule***, select `Regex`. Under ***Path pattern**, enter `.*`. Click **Save**.
-2. Under **Policy**, select `Allow`. Under **Rule***, select `Contains`. Under ***Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**. Make sure this rule appears below the rule created in the previous step on the list.
+1. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**.
+2. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**. Make sure this rule appears below the rule created in the previous step on the list.
 3. Click **Crawl**, then **Crawl all domains on this index**. A success message appears. The crawl process will take longer for larger data sources. Once it finishes, your new web crawler’s index will contain documents provided by the crawler.
 
 6. Finally, follow the instructions to [add an index to Knowledge Base](/solutions/security/ai/ai-assistant-knowledge-base.md#knowledge-base-add-knowledge-index). Add the index that contains the data from your new web crawler (`threat_intelligence_feed_1` in this example).
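The semantic text step in the hunk above corresponds to a plain index-mapping request body. A minimal sketch in Python, assuming a hypothetical field name `content` (only the inference endpoint ID `elastic-security-ai-assistant-elser2` comes from the documented step; the `PUT` target in the comment is the example index name used above):

```python
import json

# Sketch of the mapping body the "Add field" UI step produces: one
# semantic_text field backed by the AI Assistant inference endpoint.
# The field name "content" is a hypothetical example.
mapping = {
    "properties": {
        "content": {
            "type": "semantic_text",
            "inference_id": "elastic-security-ai-assistant-elser2",
        }
    }
}

# e.g. PUT threat_intelligence_feed_1/_mapping with this JSON body
print(json.dumps(mapping, indent=2))
```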
@@ -210,15 +176,4 @@ Your new threat intelligence data is now included in Knowledge Base and can info
 
 Refer to the following video for an example of creating a web crawler to ingest threat intelligence data and adding it to Knowledge Base.
 
-::::{admonition}
-<script type="text/javascript" async src="https://play.vidyard.com/embed/v4.js"></script>
-<img
-style="width: 100%; margin: auto; display: block;"
-class="vidyard-player-embed"
-src="https://play.vidyard.com/eYo1e1ZRwT2mjfM7Yr9MuZ.jpg"
-data-uuid="eYo1e1ZRwT2mjfM7Yr9MuZ"
-data-v="4"
-data-type="inline"
-/>
-</br>
-::::
+[![Add knowledge via web crawler video](https://play.vidyard.com/eYo1e1ZRwT2mjfM7Yr9MuZ.jpg)](https://videos.elastic.co/watch/eYo1e1ZRwT2mjfM7Yr9MuZ?)

solutions/security/ai/ai-assistant.md

Lines changed: 7 additions & 19 deletions
@@ -6,19 +6,6 @@ mapped_urls:
 
 # AI Assistant
 
-% What needs to be done: Align serverless/stateful
-
-% Use migrated content from existing pages that map to this page:
-
-% - [x] ./raw-migrated-files/security-docs/security/security-assistant.md
-% - [ ] ./raw-migrated-files/docs-content/serverless/security-ai-assistant.md
-
-% Internal links rely on the following IDs being on this page (e.g. as a heading ID, paragraph ID, etc):
-
-$$$configure-ai-assistant$$$
-
-$$$ai-assistant-anonymization$$$
-
 The Elastic AI Assistant utilizes generative AI to bolster your cybersecurity operations team. It allows users to interact with {{elastic-sec}} for tasks such as alert investigation, incident response, and query generation or conversion using natural language and much more.
 
 :::{image} ../../../images/security-assistant-basic-view.png
@@ -32,11 +19,12 @@ The Elastic AI Assistant is designed to enhance your analysis with smart dialogu
 
 
 ::::{admonition} Requirements
-* The Elastic AI Assistant and Generative AI connector are available in {{stack}} versions 8.8.1 and later. The Generative AI connector is renamed to OpenAI connector in 8.11.0.
-* This feature requires an [Enterprise subscription](https://www.elastic.co/pricing).
-* To use AI Assistant, you need at least the **Elastic AI Assistant : All** and **Actions and Connectors : Read** [privileges](/deploy-manage/users-roles/cluster-or-deployment-auth/kibana-privileges.md).
-* To set up AI Assistant, you need the **Actions and Connectors : All** [privilege](/deploy-manage/users-roles/cluster-or-deployment-auth/kibana-privileges.md).
-* You need a [generative AI connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md), which AI Assistant uses to generate responses.
+* {{stack}} users: {{stack}} version 8.8.1 or later. Also note the Generative AI connector was renamed to OpenAI connector in 8.11.0.
+* {{stack}} users: an [Enterprise subscription](https://www.elastic.co/pricing).
+* {{serverless-short}} users: a [Security Analytics Complete subscription](/deploy-manage/deploy/elastic-cloud/project-settings.md).
+* To use AI Assistant, the **Elastic AI Assistant : All** and **Actions and Connectors : Read** [privileges](/deploy-manage/users-roles/cluster-or-deployment-auth/kibana-privileges.md).
+* To set up AI Assistant, the **Actions and Connectors : All** [privilege](/deploy-manage/users-roles/cluster-or-deployment-auth/kibana-privileges.md).
+* A [generative AI connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md), which AI Assistant uses to generate responses.
 
 ::::
 
@@ -148,7 +136,7 @@ To modify Anonymization settings, you need the **Elastic AI Assistant: All** pri
 The **Anonymization** tab of the Security AI settings menu allows you to define default data anonymization behavior for events you send to AI Assistant. Fields with **Allowed** toggled on are included in events provided to AI Assistant. **Allowed** fields with **Anonymized** set to **Yes** are included, but with their values obfuscated.
 
 ::::{note}
-You can access anonymization settings directly from the **Attack Discovery** page by clicking the settings (![Settings icon](../../../images/security-icon-settings.png "")) button next to the model selection dropdown menu.
+You can access anonymization settings directly from the **Attack Discovery** page by clicking the settings (![Settings icon](../../../images/security-icon-settings.png "title=70%")) button next to the model selection dropdown menu.
 ::::
 
 
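The Allowed/Anonymized behavior described in the hunk above can be illustrated with a small sketch. This is not Elastic's implementation; the function, field names, and placeholder scheme are invented for illustration:

```python
# Illustrative sketch only: fields not toggled Allowed are omitted from the
# event sent to AI Assistant; Allowed fields marked Anonymized are kept but
# their values are replaced with placeholders.
def prepare_event(event, allowed, anonymized):
    """Keep only allowed fields; obfuscate values of anonymized fields."""
    out = {}
    for field, value in event.items():
        if field not in allowed:
            continue  # field excluded entirely
        if field in anonymized:
            out[field] = f"<anon_{field}>"  # field included, value obfuscated
        else:
            out[field] = value
    return out

event = {"host.name": "srv-01", "user.name": "alice", "process.pid": 4242}
print(prepare_event(event, allowed={"host.name", "user.name"}, anonymized={"user.name"}))
# {'host.name': 'srv-01', 'user.name': '<anon_user.name>'}
```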
solutions/security/get-started/automatic-import.md

Lines changed: 16 additions & 18 deletions
@@ -29,9 +29,10 @@ Click [here](https://elastic.navattic.com/automatic-import) to access an interac
 
 ::::{admonition} Requirements
 * A working [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md). Recommended models: `Claude 3.5 Sonnet`; `GPT-4o`; `Gemini-1.5-pro-002`.
-* An [Enterprise](https://www.elastic.co/pricing) subscription.
+* {{stack}} users: An [Enterprise](https://www.elastic.co/pricing) subscription.
+* {{serverless-short}} users: a [Security Analytics Complete subscription](/deploy-manage/deploy/elastic-cloud/project-settings.md).
 * A sample of the data you want to import, in a structured or unstructured format (including JSON, NDJSON, and Syslog).
-* To import data from a REST API, have its OpenAPI specification (OAS) file ready.
+* To import data from a REST API: its OpenAPI specification (OAS) file.
 
 ::::
 
@@ -47,32 +48,29 @@ Using Automatic Import allows users to create new third-party data integrations
 1. In {{elastic-sec}}, click **Add integrations**.
 2. Under **Can’t find an integration?** click **Create new integration**.
 
-:::{image} ../../../images/security-auto-import-create-new-integration-button.png
-:alt: The Integrations page with the Create new integration button highlighted
-:::
+:::{image} ../../../images/security-auto-import-create-new-integration-button.png
+:alt: The Integrations page with the Create new integration button highlighted
+:::
 
 3. Click **Create integration**.
 4. Select an [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md).
-5. Define how your new integration will appear on the Integrations page by providing a **Title**, **Description***, and ***Logo**. Click **Next**.
+5. Define how your new integration will appear on the Integrations page by providing a **Title**, **Description**, and **Logo**. Click **Next**.
 6. Define your integration’s package name, which will prefix the imported event fields.
 7. Define your **Data stream title**, **Data stream description**, and **Data stream name**. These fields appear on the integration’s configuration page to help identify the data stream it writes to.
 8. Select your [**Data collection method**](asciidocalypse://docs/beats/docs/reference/filebeat/configuration-filebeat-options.md). This determines how your new integration will ingest the data (for example, from an S3 bucket, an HTTP endpoint, or a file stream).
 
-::::{admonition} Importing CEL data
-:class: note
-
-If you select **API (CEL input)**, you’ll have the additional option to upload the API’s OAS file here. After you do, the LLM will use it to determine which API endpoints (GET only), query parameters, and data structures to use in the new custom integration. You will then select which API endpoints to consume and your authentication method before uploading your sample data.
-
-::::
+::::{admonition} Importing CEL data
+:class: note
+If you select **API (CEL input)**, you’ll have the additional option to upload the API’s OAS file here. After you do, the LLM will use it to determine which API endpoints (GET only), query parameters, and data structures to use in the new custom integration. You will then select which API endpoints to consume and your authentication method before uploading your sample data.
+::::
 
 9. Upload a sample of your data. Make sure to include all the types of events that you want the new integration to handle.
 
-::::{admonition} Best practices for sample data
-* For JSON and NDJSON samples, each object in your sample should represent an event, and you should avoid deeply nested object structures.
-* The more variety in your sample, the more accurate the pipeline will be. Include a wide range of unique log entries instead of just repeating the same type of entry. Automatic Import will select up to 100 different events from your sample to use as the basis for the new integration.
-* Ideally, each field name should describe what the field does.
-
-::::
+::::{admonition} Best practices for sample data
+* For JSON and NDJSON samples, each object in your sample should represent an event, and you should avoid deeply nested object structures.
+* The more variety in your sample, the more accurate the pipeline will be. Include a wide range of unique log entries instead of just repeating the same type of entry. Automatic Import will select up to 100 different events from your sample to use as the basis for the new integration.
+* Ideally, each field name should describe what the field does.
+::::
 
 10. Click **Analyze logs**, then wait for processing to complete. This may take several minutes.
 11. After processing is complete, the pipeline’s field mappings appear, including ECS and custom fields.