
Commit 6b1fef5

Merge pull request #20721 from rnagweka/master
push changes to Production
2 parents 94a6026 + 4c7b8d0 · commit 6b1fef5

File tree

2 files changed: +11 −11 lines changed


tutorials/data-lake-sof-multiple-directories/data-lake-sof-multiple-directories.md

Lines changed: 6 additions & 6 deletions
@@ -14,7 +14,7 @@ parser: v2
 
 ## Prerequisites
 - Have access to a licensed managed or standalone SAP HANA data lake.
-- Installation of the HDLFSCLI. See tutorial [Getting Started with Data Lake Files HDLFSCLI](developers.sap.com/tutorials/data-lake-file-containers-hdlfscli)
+- Installation of the HDLFSCLI. See tutorial [Getting Started with Data Lake Files HDLFSCLI](data-lake-file-containers-hdlfscli)
 - Load some structured data files inside of a SAP HANA data lake File Container.
 - Already set up HDLFS Connection in Database Explorer - Setting Up HDLFS Connection In Database Explorer.
 
@@ -28,7 +28,7 @@ parser: v2
 
 Querying structured data files (CSV, ORC, Parquet) in a HANA Data Lake file container can be done using SQL on Files. Below you will find all of the steps required to start using SQL on Files.
 If you have not yet provisioned an SAP HANA data lake, here is a great tutorial on how to do so!
-Please go through the entire tutorial on [Use SOF to Query data from Single Directory](developers.sap.com/tutorials/data-lake-sof-single-directory)
+Please go through the entire tutorial on [Use SOF to Query data from Single Directory](data-lake-sof-single-directory)
 
 You will be using the Orders table as a reference.
 
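The Orders table referenced here is first defined in the SQL on Files service. The sketch below is illustrative only: the schema name is taken from the cleanup step later in this file, while the column list and data types are assumptions based on the TPC-H ORDERS layout and the `O_COMMENT varchar(79)` line visible in the next hunk, so verify the exact syntax against the CREATE (Remote) TABLE documentation linked below.

```SQL
-- Illustrative sketch: create a schema and an ORDERS table in the SQL on Files service.
-- Column names and types are assumed (TPC-H ORDERS plus the directory-derived year/month columns).
CREATE SCHEMA HDLADMIN_TPCH_SQLONFILES IN FILES_SERVICE;

CREATE TABLE HDLADMIN_TPCH_SQLONFILES.ORDERS (
    ORDERYEAR       INTEGER,       -- filled from directory level $0 (see the partitioning hunk below)
    ORDERMONTH      INTEGER,       -- filled from directory level $1
    O_ORDERKEY      BIGINT,
    O_CUSTKEY       BIGINT,
    O_ORDERSTATUS   VARCHAR(1),
    O_TOTALPRICE    DECIMAL(12,2),
    O_ORDERDATE     DATE,
    O_ORDERPRIORITY VARCHAR(15),
    O_CLERK         VARCHAR(15),
    O_SHIPPRIORITY  INTEGER,
    O_COMMENT       VARCHAR(79)
) IN FILES_SERVICE;
```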
@@ -64,7 +64,7 @@ O_COMMENT varchar(79)
 ```
 <!-- border --> ![DBX Screenshot](image-1.png)
 
-For information about the parameter definitions and supported data types, see [CREATE (Remote) TABLE Statement for Data Lake Relational Engine (HANA DB-Managed)](help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/24e694b566814ad285cb32fe3e5d3928.html?state=DRAFT&version=2022_1_QRC)
+For information about the parameter definitions and supported data types, see [CREATE (Remote) TABLE Statement for Data Lake Relational Engine (HANA DB-Managed)](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/24e694b566814ad285cb32fe3e5d3928.html?state=DRAFT&version=2022_1_QRC)
 
 
 Next, You will create a virtual table. Notice in the SQL below where the remote servers name goes and where the reference to the table in the Files Service goes. Over here, you will be creating an ORDERS VIRTUAL TABLE in HDLRE that points to the ORDERS table that you just created in SQL On Files service.
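The virtual-table step described above typically boils down to a CREATE EXISTING TABLE statement whose AT clause names the SQL on Files remote server and the table in the Files Service. A minimal sketch, where SOF_REMOTE_SERVER is a placeholder for your own remote server name:

```SQL
-- Illustrative sketch: a virtual table in the data lake Relational Engine (HDLRE) that
-- points at the ORDERS table in the Files Service. SOF_REMOTE_SERVER is a placeholder.
CREATE EXISTING TABLE ORDERS
    AT 'SOF_REMOTE_SERVER..HDLADMIN_TPCH_SQLONFILES.ORDERS';
```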
@@ -132,7 +132,7 @@ LOAD TABLE ORDERS(
 END;
 ```
 
-Make sure you have added your File Container connection in DBX. If not, one can go through the tutorial –[Setting Up HDLFS Connection In Database Explorer](developers.sap.com/tutorials/data-lake-hdlfs-dbx-connection)
+Make sure you have added your File Container connection in DBX. If not, one can go through the tutorial –[Setting Up HDLFS Connection In Database Explorer](data-lake-hdlfs-dbx-connection)
 
 
 <!-- border --> ![DBX Screenshot](image-4.png)
@@ -165,7 +165,7 @@ O_COMMENT FROM COLUMN $8
 
 Notice that directories are located using a 0-index. The `ORDERYEAR` column is directory `$0, ORDERMONTH` column is directory $1, and subsequent directories would be `$1, $2, ... $n`. This tells the parser to look at these directory levels to find the value for the corresponding column name. The value is parsed from what is placed after the **=** in the directory name.
 
-One could also refer the ALTER TABLE ADD DATASOURCE doc for any further reference - [ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed)](help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
+One could also refer the ALTER TABLE ADD DATASOURCE doc for any further reference - [ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed)](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
 
 
 
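To make the 0-indexed directory mapping concrete, a data source definition could look roughly like the sketch below; the path, the file format, and the exact clause keywords are assumptions here, and the ALTER (Remote) TABLE ADD DATASOURCE statement linked above is the authoritative syntax reference.

```SQL
-- Illustrative sketch: directories laid out as .../ORDERYEAR=1996/ORDERMONTH=01/<files> supply
-- ORDERYEAR and ORDERMONTH from directory levels $0 and $1 (values parsed after the '='),
-- while the remaining columns are read from the files themselves ($0 .. $8).
ALTER TABLE HDLADMIN_TPCH_SQLONFILES.ORDERS IN FILES_SERVICE
    ADD DATASOURCE AS PARQUET('hdlfs:///TPCH/ORDERS/')
    ORDERYEAR  FROM DIRECTORY $0,
    ORDERMONTH FROM DIRECTORY $1,
    O_ORDERKEY FROM COLUMN $0,
    -- ... file columns $1 through $7 ...
    O_COMMENT  FROM COLUMN $8;
```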
@@ -218,7 +218,7 @@ DROP SCHEMA HDLADMIN_TPCH_SQLONFILES IN FILES_SERVICE;
 
 ### Command line script to cleanup the file container
 
-Connect to OpenSSL. Make sure you are all set up with the HDLFSCI tutorial with generating the certificates. If not, please go through the tutorial - [Getting Started with Data Lake Files HDLFSCLI](developers.sap.com/tutorials/data-lake-file-containers-hdlfscli)
+Connect to OpenSSL. Make sure you are all set up with the HDLFSCI tutorial with generating the certificates. If not, please go through the tutorial - [Getting Started with Data Lake Files HDLFSCLI](data-lake-file-containers-hdlfscli)
 
 Just run the below command to see the files under your path in the File container
 
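Alongside the file-level cleanup, the SQL-side objects created earlier are usually dropped as well. A minimal sketch, assuming the object names used in this tutorial; only the DROP SCHEMA statement is taken from the hunk header above, and the other statements should be checked against the Relational Engine reference.

```SQL
-- Illustrative sketch of the SQL-side cleanup: drop the HDLRE virtual table, then the
-- SQL on Files table and schema. Confirm the exact drop syntax in the HDLRE documentation.
DROP TABLE ORDERS;                                            -- virtual table in HDLRE
DROP TABLE HDLADMIN_TPCH_SQLONFILES.ORDERS IN FILES_SERVICE;  -- SQL on Files table (assumed syntax)
DROP SCHEMA HDLADMIN_TPCH_SQLONFILES IN FILES_SERVICE;        -- as shown in the hunk above
```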
tutorials/data-lake-sof-single-directory/data-lake-sof-single-directory.md

Lines changed: 5 additions & 5 deletions
@@ -14,7 +14,7 @@ author_profile: https://github.com/rnagweka
 ## Prerequisites
 - Have access to a licensed managed or standalone SAP HANA data lake.
 - Installation of the HDLFSCLI. See tutorial. [Getting Started with Data Lake Files HDLFSCLI | Tutorials for SAP Developers](data-lake-file-containers-hdlfscli).
-- Have some structured data files inside of a [SAP HANA data lake File Container](https://help.sap.com/docs/HANA_CLOUD_ALIBABA_CLOUD/683a53aec4fc408783bbb2dd8e47afeb/f4eae33ffb7a44f7af823ee6b70e3598.).
+- Have some structured data files inside of a [SAP HANA data lake File Container](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a89a80f984f21015b2b2c84d2498d36d/6e1dd06335704f4c96d48279ca1ed555.html?version=2021_4_QRC).
 - Have some data with you, which will be uploaded onto the SAP HANA data lake File container.
 - Already set up HDLFS Connection in Database Explorer.
 
@@ -26,7 +26,7 @@ author_profile: https://github.com/rnagweka
 
 Querying structured data files (CSV, ORC, Parquet) in a HANA Data Lake file container can be done using SQL on Files. Below you will find all of the steps required to start using SQL on Files.
 
-If you have not yet provisioned an SAP HANA data lake, [here](hana-cloud-hdl-getting-started-1.) is a great tutorial on how to do so!
+If you have not yet provisioned an SAP HANA data lake, [here](hana-cloud-hdl-getting-started-1) is a great tutorial on how to do so!
 
 ---
 
@@ -134,7 +134,7 @@ For the full syntax of clauses available to create an existing table, see [CREAT
 ### Upload a file from HDLFS onto the Data Lake File container
 
 
-Make sure that you have everything setup with respect to HDLFSCLI. One can go through the tutorial for getting started with HDLFSCLI - [Getting Started with Data Lake Files HDLFSCLI | Tutorials for SAP Developers](data-lake-file-containers-hdlfscli.).
+Make sure that you have everything setup with respect to HDLFSCLI. One can go through the tutorial for getting started with HDLFSCLI - [Getting Started with Data Lake Files HDLFSCLI | Tutorials for SAP Developers](data-lake-file-containers-hdlfscli).
 
 Use the below command to upload a local file onto the Data Lake -
 
@@ -156,7 +156,7 @@ Verify that the files has been uploaded.
 
 ![Verify Files](image-2.png)
 
-Make sure you have already set up a HDLFS Connection in Database Explorer. It will look something like below. To get to know how to setup a HDLFS Connection In Database Explorer go through the tutorial – [Setting Up HDLFS Connection In Database Explorer](data-lake-hdlfs-dbx-connection.).
+Make sure you have already set up a HDLFS Connection in Database Explorer. It will look something like below. To get to know how to setup a HDLFS Connection In Database Explorer go through the tutorial – [Setting Up HDLFS Connection In Database Explorer](data-lake-hdlfs-dbx-connection).
 
 ![Setting Up HDLFS Connection In Database Explorer](image-3.png)
 
@@ -168,7 +168,7 @@ Add a data source, this can be done multiple times with multiple files.
 
 Note that in this step the file path can lead to an exact file or it can lead to a directory. If it leads to a directory, SQL on Files will try to parse all the data files in that directory. To ensure that there are no parse errors, make sure that all the files in the directory match the schema of the table the data source it is being added to.
 
-One could also refer the ALTER TABLE ADD DATASOURCE doc for any further reference -[ ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed) and SQL on Files](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
+One could also refer the ALTER TABLE ADD DATASOURCE doc for any further reference -[ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed) and SQL on Files](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
 
 
 ```SQL
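-- Illustrative sketch only (schema, table, and path names are placeholders; see the ALTER (Remote)
-- TABLE ADD DATASOURCE documentation linked above for the exact clause syntax): attaching a single
-- CSV data source. Pointing at a directory instead of a file parses every file in that directory,
-- so all files there must match the table schema, as noted above.
ALTER TABLE HDLADMIN_TPCH_SQLONFILES.ORDERS IN FILES_SERVICE
    ADD DATASOURCE AS CSV('hdlfs:///TPCH/ORDERS/orders.csv')
    O_ORDERKEY FROM COLUMN $0,
    O_CUSTKEY  FROM COLUMN $1,
    -- ... file columns $2 through $7 ...
    O_COMMENT  FROM COLUMN $8;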
