tutorials/data-lake-sof-multiple-directories/data-lake-sof-multiple-directories.md
Lines changed: 6 additions & 6 deletions
@@ -14,7 +14,7 @@ parser: v2
## Prerequisites
- Have access to a licensed managed or standalone SAP HANA data lake.
-- Installation of the HDLFSCLI. See tutorial [Getting Started with Data Lake Files HDLFSCLI](developers.sap.com/tutorials/data-lake-file-containers-hdlfscli)
+- Installation of the HDLFSCLI. See tutorial [Getting Started with Data Lake Files HDLFSCLI](data-lake-file-containers-hdlfscli)
- Load some structured data files into an SAP HANA data lake File Container.
- Already set up the HDLFS connection in Database Explorer (see Setting Up HDLFS Connection In Database Explorer).
@@ -28,7 +28,7 @@ parser: v2
Querying structured data files (CSV, ORC, Parquet) in an SAP HANA data lake file container can be done using SQL on Files. Below you will find all of the steps required to get started.
If you have not yet provisioned an SAP HANA data lake, here is a great tutorial on how to do so!
-Please go through the entire tutorial on [Use SOF to Query data from Single Directory](developers.sap.com/tutorials/data-lake-sof-single-directory)
+Please go through the entire tutorial on [Use SOF to Query data from Single Directory](data-lake-sof-single-directory)
You will be using the Orders table as a reference.
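For orientation, the SQL on Files objects this part of the tutorial builds look roughly like the sketch below. This is a hedged sketch rather than the tutorial's exact listing: the schema name HDLADMIN_TPCH_SQLONFILES and the `O_COMMENT varchar(79)` column appear elsewhere in this file, while the remaining column names and types are the standard TPC-H ones and the two partition columns (ORDERYEAR, ORDERMONTH) are assumed from the directory mapping used later.

```sql
-- Hedged sketch: schema and ORDERS table in the SQL on Files service.
-- Column names/types are approximate; take the exact definition from the full tutorial file.
CREATE SCHEMA HDLADMIN_TPCH_SQLONFILES IN FILES_SERVICE;

CREATE TABLE HDLADMIN_TPCH_SQLONFILES.ORDERS (
    ORDERYEAR       integer,      -- assumed partition column (from the directory layout)
    ORDERMONTH      integer,      -- assumed partition column (from the directory layout)
    O_ORDERKEY      bigint,
    O_CUSTKEY       bigint,
    O_ORDERSTATUS   varchar(1),
    O_TOTALPRICE    decimal(12,2),
    O_ORDERDATE     date,
    O_ORDERPRIORITY varchar(15),
    O_CLERK         varchar(15),
    O_SHIPPRIORITY  integer,
    O_COMMENT       varchar(79)
) IN FILES_SERVICE;
```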
@@ -64,7 +64,7 @@ O_COMMENT varchar(79)
```
<!-- border --> 
-For information about the parameter definitions and supported data types, see [CREATE (Remote) TABLE Statement for Data Lake Relational Engine (HANA DB-Managed)](help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/24e694b566814ad285cb32fe3e5d3928.html?state=DRAFT&version=2022_1_QRC)
+For information about the parameter definitions and supported data types, see [CREATE (Remote) TABLE Statement for Data Lake Relational Engine (HANA DB-Managed)](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/24e694b566814ad285cb32fe3e5d3928.html?state=DRAFT&version=2022_1_QRC)
Next, you will create a virtual table. Notice in the SQL below where the remote server's name goes and where the reference to the table in the Files Service goes. Here, you will be creating an ORDERS virtual table in HDLRE that points to the ORDERS table you just created in the SQL on Files service.
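As a rough, hedged illustration of that step (the remote server name SOF_REMOTE_SERVER is a placeholder for whatever Files Service remote server your instance uses, and the location string format may differ slightly from your setup), the statement has this shape:

```sql
-- Hedged sketch: a virtual table in HDLRE that points at the SQL on Files ORDERS table.
-- 'SOF_REMOTE_SERVER' is a placeholder remote server name; the location string follows
-- the server..schema.table pattern.
CREATE EXISTING TABLE ORDERS
AT 'SOF_REMOTE_SERVER..HDLADMIN_TPCH_SQLONFILES.ORDERS';
```

Once the virtual table exists, it can be queried like any other HDLRE table, for example `SELECT COUNT(*) FROM ORDERS;`.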
@@ -132,7 +132,7 @@ LOAD TABLE ORDERS(
END;
```
-Make sure you have added your File Container connection in DBX. If not, one can go through the tutorial – [Setting Up HDLFS Connection In Database Explorer](developers.sap.com/tutorials/data-lake-hdlfs-dbx-connection)
+Make sure you have added your File Container connection in DBX. If not, one can go through the tutorial – [Setting Up HDLFS Connection In Database Explorer](data-lake-hdlfs-dbx-connection)
<!-- border --> 
@@ -165,7 +165,7 @@ O_COMMENT FROM COLUMN $8
Notice that directories are located using a 0-index. The `ORDERYEAR` column is directory `$0`, the `ORDERMONTH` column is directory `$1`, and subsequent directories would be `$2`, `$3`, ... `$n`. This tells the parser to look at these directory levels to find the value for the corresponding column name. The value is parsed from what is placed after the **=** in the directory name.
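To make the 0-indexed mapping concrete, here is a hedged sketch of the ADD DATASOURCE shape for data laid out as `.../ORDERYEAR=YYYY/ORDERMONTH=MM/<files>`; the hdlfs path, data source alias, and delimiter are placeholders, only a few CSV columns are shown, and the exact clause order should be checked against the statement reference linked just below:

```sql
-- Hedged sketch: the hdlfs path, delimiter, and datasource alias are placeholders.
ALTER TABLE HDLADMIN_TPCH_SQLONFILES.ORDERS IN FILES_SERVICE
ADD DATASOURCE AS ORDERS_DATASOURCE
    CSV('hdlfs:///TPCH/ORDERS/') DELIMITED BY '|'
(
    ORDERYEAR  FROM DIRECTORY $0,  -- first directory level, e.g. ORDERYEAR=1996
    ORDERMONTH FROM DIRECTORY $1,  -- second directory level, e.g. ORDERMONTH=01
    O_ORDERKEY FROM COLUMN $0,     -- file columns are also 0-indexed
    -- ... remaining CSV columns ...
    O_COMMENT  FROM COLUMN $8
);
```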
-One could also refer to the ALTER TABLE ADD DATASOURCE doc for further reference - [ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed)](help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
+One could also refer to the ALTER TABLE ADD DATASOURCE doc for further reference - [ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed)](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
@@ -218,7 +218,7 @@ DROP SCHEMA HDLADMIN_TPCH_SQLONFILES IN FILES_SERVICE;
### Command line script to clean up the file container
-Connect to OpenSSL. Make sure you have completed the HDLFSCLI tutorial and generated the certificates. If not, please go through the tutorial - [Getting Started with Data Lake Files HDLFSCLI](developers.sap.com/tutorials/data-lake-file-containers-hdlfscli)
+Connect to OpenSSL. Make sure you have completed the HDLFSCLI tutorial and generated the certificates. If not, please go through the tutorial - [Getting Started with Data Lake Files HDLFSCLI](data-lake-file-containers-hdlfscli)
Just run the command below to see the files under your path in the File container.