
updated asset links and images #23021


Merged · 2 commits · Apr 27, 2023
40 changes: 20 additions & 20 deletions tutorials/cp-aibus-ber-custom-data/cp-aibus-ber-custom-data.md
@@ -32,23 +32,23 @@ In the service key you created for Business Entity Recognition in the previous t

1. To access the Business Entity Recognition Swagger UI, add **`/api/v1`** to the `url` value, paste it into any web browser and press **Enter**.

<!-- border -->![BER](png-files/service-key-details.png)
<!-- border -->![BER](service-key-details.png)

2. To be able to use the Swagger UI endpoints, you need to authorize yourself. In the top right corner, click **Authorize**.

<!-- border -->![BER](png-files/swagger.png)
<!-- border -->![BER](swagger.png)

3. Get the `access_token` value created in the previous tutorial: [Get OAuth Access Token for Business Entity Recognition Using Any Web Browser](cp-aibus-ber-web-oauth-token), add **bearer** in front of it, and enter it in the **Value** field.

```
bearer <access_token>
```

<!-- border -->![BER](png-files/Authorize.png)
<!-- border -->![BER](Authorize.png)

4. Click **Authorize** and then click **Close**.

<!-- border -->![BER](png-files/Authorize2.png)
<!-- border -->![BER](Authorize2.png)
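The authorization value entered above can be sketched in a few lines of Python (the token below is a placeholder, not a real credential):

```python
# Minimal sketch of the value Swagger UI expects in the Authorize dialog for
# Business Entity Recognition: the literal word "bearer" plus the access token.
# The token used here is a placeholder, not a real credential.

def build_auth_header(access_token: str) -> dict:
    # BER expects lowercase "bearer" in front of the token.
    return {"Authorization": f"bearer {access_token}"}

headers = build_auth_header("<access_token>")
print(headers["Authorization"])  # → bearer <access_token>
```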



@@ -61,17 +61,17 @@ Use the **POST /datasets** endpoint to create a dataset that will be used to tra

2. Click **Try it out**.

<!-- border -->![BER](png-files/post-datasets-1.png)
<!-- border -->![BER](post-datasets-1.png)

3. In **payload**, enter a `description` for your dataset, `"Tutorial dataset"`, for example.

4. Click **Execute**.

<!-- border -->![BER](png-files/post-datasets-2.png)
<!-- border -->![BER](post-datasets-2.png)

5. Copy the **`datasetId`** from the **Response body**.

<!-- border -->![BER](png-files/post-datasets-3.png)
<!-- border -->![BER](post-datasets-3.png)
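For reference, steps 1 to 5 roughly correspond to the following request, sketched with Python's standard library (the service URL and token are placeholders from your own service key; only the request is built here, not sent):

```python
import json
import urllib.request

BER_URL = "https://<your-service-url>/api/v1"  # placeholder from your service key

def create_dataset_request(token: str, description: str) -> urllib.request.Request:
    """Build (but don't send) the POST /datasets call that Swagger UI executes."""
    payload = json.dumps({"description": description}).encode("utf-8")
    return urllib.request.Request(
        f"{BER_URL}/datasets",
        data=payload,
        headers={"Authorization": f"bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )

# Sending it would look like this (network call, not executed in this sketch):
# with urllib.request.urlopen(create_dataset_request("<token>", "Tutorial dataset")) as r:
#     body = json.load(r)  # the response body contains the datasetId to copy
```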



@@ -86,11 +86,11 @@ To see the details of your newly created dataset, use the **GET /datasets/{`data

3. Enter the **`datasetId`** obtained in the previous step and click **Execute**.

<!-- border -->![BER](png-files/get-datasets-1.png)
<!-- border -->![BER](get-datasets-1.png)

You should receive a response like the one below. Along with the `datasetId` and the `description`, you see the `documentCount`. The document count includes the training data files that you'll upload in the next step.

<!-- border -->![BER](png-files/get-datasets-2.png)
<!-- border -->![BER](get-datasets-2.png)



@@ -103,9 +103,9 @@ Please bear in mind that Business Entity Recognition requires your data to be in

>As an alternative to uploading your own JSON file to the service, you can use the following sample files (right click on the link, then click ***Save link as*** to download the files locally):

>- [Sample Training Data 1](https://raw.githubusercontent.com/SAPDocuments/Tutorials/master/tutorials/cp-aibus-ber-custom-data/data/Tutorial_training_data_1.json)
>- [Sample Training Data 1](https://raw.githubusercontent.com/sap-tutorials/Tutorials/master/tutorials/cp-aibus-ber-custom-data/Tutorial_training_data_1.json)

>- [Sample Training Data 2](https://raw.githubusercontent.com/SAPDocuments/Tutorials/master/tutorials/cp-aibus-ber-custom-data/data/Tutorial_training_data_2.json)
>- [Sample Training Data 2](https://raw.githubusercontent.com/sap-tutorials/Tutorials/master/tutorials/cp-aibus-ber-custom-data/Tutorial_training_data_2.json)

>Please repeat this step twice and upload one document each time. The more data the model has available, the better its predictions will be.

@@ -120,11 +120,11 @@ To upload documents, do the following:

4. Click **Execute**.

<!-- border -->![BER](png-files/post-datasets-docs-1.png)
<!-- border -->![BER](post-datasets-docs-1.png)

5. Copy the **`documentId`** from the **Response body**.

<!-- border -->![BER](png-files/post-datasets-docs-2.png)
<!-- border -->![BER](post-datasets-docs-2.png)
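Since the service only accepts JSON training files, it can help to check the file locally before uploading. A small hypothetical helper (file names and IDs below are placeholders; the actual file upload is done by Swagger UI):

```python
import json
from pathlib import Path

def document_upload_url(base_url: str, dataset_id: str) -> str:
    # Endpoint used in this step, with your own datasetId filled in.
    return f"{base_url}/datasets/{dataset_id}/documents"

def is_valid_json_file(path: str) -> bool:
    # Quick local sanity check before uploading a training file.
    try:
        json.loads(Path(path).read_text(encoding="utf-8"))
        return True
    except (OSError, ValueError):
        return False

print(document_upload_url("https://<your-service-url>/api/v1", "<datasetId>"))
```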



@@ -143,11 +143,11 @@ Using the `datasetId` obtained in the previous step, you can obtain the details

5. Click **Execute**.

<!-- border -->![BER](png-files/get-datasets-docs-1.png)
<!-- border -->![BER](get-datasets-docs-1.png)

You should receive a response with the document details, like the one below. It shows you the size of the document in bytes.

<!-- border -->![BER](png-files/get-datasets-docs-2.png)
<!-- border -->![BER](get-datasets-docs-2.png)



@@ -178,11 +178,11 @@ Once the training documents are uploaded, you can submit a training job. This tr

4. Click **Execute**.

<!-- border -->![BER](png-files/post-training-jobs-1.png)
<!-- border -->![BER](post-training-jobs-1.png)

5. Copy the **`jobId`** from the **Response body**. This allows you to check the status of the training.

<!-- border -->![BER](png-files/post-training-jobs-2.png)
<!-- border -->![BER](post-training-jobs-2.png)

This indicates that your training job has been successfully submitted.
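The payload sent when submitting the job can be sketched as follows. The field names `modelName` and `datasetId` are assumptions for illustration; treat the template shown in Swagger UI as the source of truth:

```python
import json

# Hypothetical sketch of a POST /training/jobs payload. The field names here
# are assumptions and the values are placeholders; use the template that
# Swagger UI provides for this endpoint.

def training_job_payload(model_name: str, dataset_id: str) -> str:
    return json.dumps({"modelName": model_name, "datasetId": dataset_id})

print(training_job_payload("tutorial_model", "<datasetId>"))
```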

@@ -206,13 +206,13 @@ To check whether your training already succeeded, you can use the **GET /trainin

4. Click **Execute**.

<!-- border -->![BER](png-files/get-training-jobs-1.png)
<!-- border -->![BER](get-training-jobs-1.png)

You should receive a response like the one below. The status `RUNNING` indicates that the training is still in progress. If the status is `PENDING`, the training has not started yet.

<!-- border -->![BER](png-files/get-training-jobs-2.png)
<!-- border -->![BER](get-training-jobs-2.png)

You may check the status every now and then. Note that the training may take up to 5 hours. Once it finishes, the training status changes to `SUCCEEDED`. Along with that, you receive the capabilities of the model, that is, the entities the model can recognize.

<!-- border -->![BER](png-files/get-training-jobs-3.png)
<!-- border -->![BER](get-training-jobs-3.png)
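The check-now-and-then routine above can be sketched as a polling loop. `fetch_status` is a hypothetical stand-in for the GET /training/jobs/{`jobId`} call; since training can take up to 5 hours, the default polling interval is deliberately long:

```python
import time

def wait_for_training(fetch_status, poll_seconds: float = 600, max_polls: int = 40) -> str:
    """Poll until the job leaves PENDING/RUNNING; return the final status."""
    for _ in range(max_polls):
        status = fetch_status()
        if status not in ("PENDING", "RUNNING"):
            return status  # e.g. SUCCEEDED
        time.sleep(poll_seconds)
    raise TimeoutError("training did not finish within the polling budget")

# Demo with a canned status sequence instead of real API calls:
statuses = iter(["PENDING", "RUNNING", "SUCCEEDED"])
print(wait_for_training(lambda: next(statuses), poll_seconds=0))  # → SUCCEEDED
```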

@@ -37,15 +37,15 @@ You'll use Swagger UI, via any web browser, to call the Data Attribute Recommend

In the service key you created for Data Attribute Recommendation in the previous tutorial: [Use Free Tier to Set Up Account for Data Attribute Recommendation and Get Service Key](cp-aibus-dar-booster-free-key) or [Use Trial to Set Up Account for Data Attribute Recommendation and Get Service Key](cp-aibus-dar-booster-key), you find a section called `swagger` (as highlighted in the image below) with three entries, called `dm` (data manager), `mm` (model manager) and `inference`. You'll use all three Swagger UIs throughout the tutorials.

<!-- border -->![Service Key](png-files/service-key-details.png)
<!-- border -->![Service Key](service-key-details.png)

For this tutorial, copy the URL of the Swagger UI for `dm` and open it in a browser tab.

>After finishing this tutorial, keep the Swagger UI for `dm` open to perform the clean up tasks in [Use the Invoice Object Recommendation Business Blueprint to Predict Financial Objects](cp-aibus-dar-swagger-ior-predict).

1. To be able to use the Swagger UI endpoints, you need to authorize yourself. In the top right corner, click **Authorize**.

<!-- border -->![Authorize](png-files/swagger-authorize.png)
<!-- border -->![Authorize](swagger-authorize.png)

2. Get the `access_token` value created in the previous tutorial: [Get OAuth Access Token for Data Attribute Recommendation Using Any Web Browser](cp-aibus-dar-web-oauth-token), add **Bearer** (with a capitalized "B") in front of it, and enter it in the **Value** field.

@@ -55,7 +55,7 @@ For this tutorial, copy the URL of the Swagger UI for `dm` and open it in a brow

3. Click **Authorize** and then click **Close**.

<!-- border -->![Authorize](png-files/swagger-token.png)
<!-- border -->![Authorize](swagger-token.png)



@@ -144,15 +144,15 @@ To create the dataset schema, proceed as follows:

1. In Swagger UI, expand the endpoint `POST /datasetSchemas` by clicking on it. Then click **Try it out**.

<!-- border -->![Dataset Schema Endpoint](png-files/dataset-schema-endpoint.png)
<!-- border -->![Dataset Schema Endpoint](dataset-schema-endpoint.png)

2. Copy the above dataset schema into the text area. Then click **Execute** to create it.

<!-- border -->![Dataset Schema Execute](png-files/dataset-schema-execute.png)
<!-- border -->![Dataset Schema Execute](dataset-schema-execute.png)

3. Further below, you find the response of the service. The response includes a representation of the dataset schema that was just created. Additionally, the dataset schema received an `id`. Copy it locally as you'll need it in the next step.

<!-- border -->![Dataset Schema Response](png-files/dataset-schema-response.png)
<!-- border -->![Dataset Schema Response](dataset-schema-response.png)

You have successfully created a dataset schema.
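The exact schema to paste comes from earlier in this tutorial. Purely as an illustration of the general shape of a Data Attribute Recommendation dataset schema, a sketch might look like this (all field values here are invented placeholders):

```python
import json

# Illustrative shape only; use the schema given in the tutorial, not this one.
example_schema = {
    "features": [
        {"label": "DOCUMENT_CURRENCY", "type": "CATEGORY"},  # placeholder feature
        {"label": "GROSS_AMOUNT", "type": "NUMBER"},         # placeholder feature
    ],
    "labels": [
        {"label": "GL_ACCOUNT", "type": "CATEGORY"},         # placeholder label
    ],
    "name": "ior_tutorial_schema",
}
print(json.dumps(example_schema, indent=2))
```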

@@ -167,15 +167,15 @@ To create the dataset, proceed as follows:

1. Expand the endpoint `POST /datasets` by clicking on it. Then click **Try it out**.

<!-- border -->![Dataset Endpoint](png-files/dataset-endpoint.png)
<!-- border -->![Dataset Endpoint](dataset-endpoint.png)

2. In the text area, replace the parameter `datasetSchemaId` with the `id` that you copied from the previous step and replace the parameter `name` with an appropriate name for your dataset, `ior_tutorial_dataset`, for example. Then click **Execute** to create the dataset.

<!-- border -->![Dataset Execute](png-files/dataset-execute.png)
<!-- border -->![Dataset Execute](dataset-execute.png)

3. In the response of the service, you find the `id` of your dataset. Copy it locally as you'll need it in the next steps and also in the next tutorial: [Use the Invoice Object Recommendation Business Blueprint to Train a Machine Learning Model](cp-aibus-dar-swagger-ior-model). Additionally, you find the `status` of the dataset. The status is `NO_DATA` as no data file has been uploaded yet.

<!-- border -->![Dataset Response](png-files/dataset-response.png)
<!-- border -->![Dataset Response](dataset-response.png)

You have successfully created a dataset.

@@ -187,25 +187,25 @@

The final step of this tutorial is to upload data to your dataset.

In this tutorial, you'll use this [dataset](https://github.com/SAPDocuments/Tutorials/raw/master/tutorials/cp-aibus-dar-swagger-ior-upload/data/Dataset_IOR.csv). Right click on the link, then click ***Save link as*** to open the file dialog. In the dialog, replace the file ending `txt` with `csv` as indicated below. Then save the file.
In this tutorial, you'll use this [dataset](https://raw.githubusercontent.com/sap-tutorials/Tutorials/master/tutorials/cp-aibus-dar-swagger-ior-upload/Dataset_IOR.csv). Right click on the link, then click ***Save link as*** to open the file dialog. In the dialog, replace the file ending `txt` with `csv` as indicated below. Then save the file.

<!-- border -->![Save File Dialog](png-files/save-file-dialog.png)
<!-- border -->![Save File Dialog](save-file-dialog.png)
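The manual rename in the save dialog can also be checked programmatically. A small hypothetical helper:

```python
from pathlib import Path

def ensure_csv_suffix(path: str) -> str:
    """Return the path with a .csv ending, mirroring the manual rename step."""
    p = Path(path)
    return str(p if p.suffix.lower() == ".csv" else p.with_suffix(".csv"))

print(ensure_csv_suffix("Dataset_IOR.txt"))  # → Dataset_IOR.csv
```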

In Swagger UI, proceed as follows to upload the data:

1. Expand the endpoint `POST /datasets/{id}/data` by clicking on it. Then click **Try it out**.

<!-- border -->![Data Endpoint](png-files/data-endpoint.png)
<!-- border -->![Data Endpoint](data-endpoint.png)

2. Fill the parameter `id` with the `id` of your dataset that you previously copied.

3. Click **Choose File** below the parameter `Request body`. In the dialog that opens, select the IOR dataset that you just downloaded. Then click **Execute** to upload the data.

<!-- border -->![Data Execute](png-files/data-execute.png)
<!-- border -->![Data Execute](data-execute.png)

In the response, you'll see that the status of your dataset has changed to `VALIDATING`. The service is now validating the data that you have uploaded.

<!-- border -->![Data Response](png-files/data-response.png)
<!-- border -->![Data Response](data-response.png)

You have successfully uploaded data to your dataset.

@@ -218,15 +218,15 @@ To check the validation status of your data, proceed as follows:

1. Expand the endpoint `GET /datasets/{id}` by clicking on it. Then click **Try it out**.

<!-- border -->![Dataset Status Endpoint](png-files/dataset-status-endpoint.png)
<!-- border -->![Dataset Status Endpoint](dataset-status-endpoint.png)

2. Fill the parameter `id` with the `id` of your dataset. Click **Execute**.

<!-- border -->![Dataset Status Execute](png-files/dataset-status-execute.png)
<!-- border -->![Dataset Status Execute](dataset-status-execute.png)

3. In the response of the service, you find the status of your dataset. If the status is still `VALIDATING`, check back in a few minutes. If the status is `SUCCEEDED`, your data is valid. If the status is either `INVALID_DATA` or `VALIDATION_FAILED`, create a new dataset and upload the data once again.

<!-- border -->![Dataset Status Response](png-files/dataset-status-response.png)
<!-- border -->![Dataset Status Response](dataset-status-response.png)

You have successfully created a dataset and uploaded data. You can now use the dataset to train a machine learning model.
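The decision logic in the status check can be summarized as a small helper (a sketch; the status strings are the ones named above):

```python
def interpret_dataset_status(status: str) -> str:
    """Map the dataset status from GET /datasets/{id} to the next action."""
    if status == "VALIDATING":
        return "check back in a few minutes"
    if status == "SUCCEEDED":
        return "data is valid"
    if status in ("INVALID_DATA", "VALIDATION_FAILED"):
        return "create a new dataset and upload the data again"
    return f"unexpected status: {status}"

print(interpret_dataset_status("SUCCEEDED"))  # → data is valid
```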
