tutorials/data-lake-sof-multiple-directories/data-lake-sof-multiple-directories.md (6 additions, 6 deletions)
@@ -14,7 +14,7 @@ parser: v2
## Prerequisites
- Have access to a licensed managed or standalone SAP HANA data lake.
-- Installation of the HDLFSCLI. See tutorial [Getting Started with Data Lake Files HDLFSCLI](developers.sap.com/tutorials/data-lake-file-containers-hdlfscli)
+- Installation of the HDLFSCLI. See tutorial [Getting Started with Data Lake Files HDLFSCLI](data-lake-file-containers-hdlfscli)
- Load some structured data files inside of a SAP HANA data lake File Container.
- Already set up HDLFS Connection in Database Explorer - Setting Up HDLFS Connection In Database Explorer.
@@ -28,7 +28,7 @@ parser: v2
Querying structured data files (CSV, ORC, Parquet) in a HANA Data Lake file container can be done using SQL on Files. Below you will find all of the steps required to start using SQL on Files.
If you have not yet provisioned an SAP HANA data lake, here is a great tutorial on how to do so!
-Please go through the entire tutorial on [Use SOF to Query data from Single Directory](developers.sap.com/tutorials/data-lake-sof-single-directory)
+Please go through the entire tutorial on [Use SOF to Query data from Single Directory](data-lake-sof-single-directory)
You will be using the Orders table as a reference.
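The CREATE statement for the SQL on Files table falls outside this hunk. As a rough, hedged sketch of its shape: the schema name `HDLADMIN_TPCH_SQLONFILES` and the final column `O_COMMENT varchar(79)` appear elsewhere in this diff, while the remaining column names and types are assumed from the standard TPC-H ORDERS layout and should be checked against the tutorial and the CREATE TABLE documentation linked in the next hunk.

```sql
-- Hedged sketch: create a SQL on Files schema and an ORDERS table inside it.
-- Column names/types are assumed from the TPC-H ORDERS layout; the
-- multiple-directories variant may additionally declare the partition
-- columns (ORDERYEAR, ORDERMONTH) that are later mapped from directories.
CREATE SCHEMA HDLADMIN_TPCH_SQLONFILES IN FILES_SERVICE;

CREATE TABLE HDLADMIN_TPCH_SQLONFILES.ORDERS (
    O_ORDERKEY      BIGINT,
    O_CUSTKEY       BIGINT,
    O_ORDERSTATUS   VARCHAR(1),
    O_TOTALPRICE    DECIMAL(12,2),
    O_ORDERDATE     DATE,
    O_ORDERPRIORITY VARCHAR(15),
    O_CLERK         VARCHAR(15),
    O_SHIPPRIORITY  INTEGER,
    O_COMMENT       VARCHAR(79)
) IN FILES_SERVICE;
```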
@@ -64,7 +64,7 @@ O_COMMENT varchar(79)
```
<!-- border --> 
-For information about the parameter definitions and supported data types, see [CREATE (Remote) TABLE Statement for Data Lake Relational Engine (HANA DB-Managed)](help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/24e694b566814ad285cb32fe3e5d3928.html?state=DRAFT&version=2022_1_QRC)
+For information about the parameter definitions and supported data types, see [CREATE (Remote) TABLE Statement for Data Lake Relational Engine (HANA DB-Managed)](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/24e694b566814ad285cb32fe3e5d3928.html?state=DRAFT&version=2022_1_QRC)
Next, you will create a virtual table. Notice in the SQL below where the remote server's name goes and where the reference to the table in the Files Service goes. Here, you will create an ORDERS virtual table in HDLRE that points to the ORDERS table that you just created in the SQL on Files service.
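The statement referred to as "the SQL below" is outside this hunk. A minimal sketch of its likely shape, assuming a remote server named `SOF_REMOTE` and a virtual table named `ORDERS_VT` (both placeholders introduced here, not names from the tutorial); the exact form of the AT clause is described in the CREATE (Remote) TABLE documentation linked above.

```sql
-- Hedged sketch: an HDLRE virtual table pointing at the SQL on Files table.
-- SOF_REMOTE and ORDERS_VT are placeholder names from this write-up.
CREATE EXISTING LOCAL TEMPORARY TABLE ORDERS_VT
AT 'SOF_REMOTE..HDLADMIN_TPCH_SQLONFILES.ORDERS';

-- Once it exists, the virtual table is queried like any other HDLRE table:
SELECT COUNT(*) FROM ORDERS_VT;
```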
@@ -132,7 +132,7 @@ LOAD TABLE ORDERS(
END;
```
-Make sure you have added your File Container connection in DBX. If not, one can go through the tutorial –[Setting Up HDLFS Connection In Database Explorer](developers.sap.com/tutorials/data-lake-hdlfs-dbx-connection)
+Make sure you have added your File Container connection in DBX. If not, one can go through the tutorial –[Setting Up HDLFS Connection In Database Explorer](data-lake-hdlfs-dbx-connection)
<!-- border --> 
@@ -165,7 +165,7 @@ O_COMMENT FROM COLUMN $8
Notice that directories are located using a 0-index. The `ORDERYEAR` column is directory `$0`, the `ORDERMONTH` column is directory `$1`, and subsequent directories would be `$2`, `$3`, ... `$n`. This tells the parser to look at these directory levels to find the value for the corresponding column name. The value is parsed from what is placed after the **=** in the directory name.
-One could also refer the ALTER TABLE ADD DATASOURCE doc for any further reference - [ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed)](help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
+One could also refer the ALTER TABLE ADD DATASOURCE doc for any further reference - [ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed)](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
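For orientation, a hedged sketch of an ADD DATASOURCE statement that maps both directory levels and file columns. Only the `FROM COLUMN $n` form is taken directly from this diff; the hdlfs path, the alias `ORDERS_MULTI_DS`, the delimiter, and the `FROM DIRECTORY $n` spelling are assumptions to confirm against the ADD DATASOURCE documentation above.

```sql
-- Hedged sketch: one datasource over a year/month directory tree such as
-- TPCH/ORDERYEAR=1996/ORDERMONTH=01/orders.csv. Path, alias, and delimiter
-- are placeholders; verify clause spellings against the linked documentation.
ALTER TABLE HDLADMIN_TPCH_SQLONFILES.ORDERS IN FILES_SERVICE
ADD DATASOURCE AS ORDERS_MULTI_DS
CSV('hdlfs:///TPCH/') DELIMITED BY '|'
ORDERYEAR       FROM DIRECTORY $0,  -- value parsed after "=" in the first directory level
ORDERMONTH      FROM DIRECTORY $1,  -- value parsed after "=" in the second directory level
O_ORDERKEY      FROM COLUMN $0,
O_CUSTKEY       FROM COLUMN $1,
O_ORDERSTATUS   FROM COLUMN $2,
O_TOTALPRICE    FROM COLUMN $3,
O_ORDERDATE     FROM COLUMN $4,
O_ORDERPRIORITY FROM COLUMN $5,
O_CLERK         FROM COLUMN $6,
O_SHIPPRIORITY  FROM COLUMN $7,
O_COMMENT       FROM COLUMN $8;
```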
@@ -218,7 +218,7 @@ DROP SCHEMA HDLADMIN_TPCH_SQLONFILES IN FILES_SERVICE;
### Command line script to cleanup the file container
-Connect to OpenSSL. Make sure you are all set up with the HDLFSCI tutorial with generating the certificates. If not, please go through the tutorial - [Getting Started with Data Lake Files HDLFSCLI](developers.sap.com/tutorials/data-lake-file-containers-hdlfscli)
+Connect to OpenSSL. Make sure you are all set up with the HDLFSCI tutorial with generating the certificates. If not, please go through the tutorial - [Getting Started with Data Lake Files HDLFSCLI](data-lake-file-containers-hdlfscli)
Just run the below command to see the files under your path in the File container
- Have access to a licensed managed or standalone SAP HANA data lake.
- Installation of the HDLFSCLI. See tutorial. [Getting Started with Data Lake Files HDLFSCLI | Tutorials for SAP Developers](data-lake-file-containers-hdlfscli).
-- Have some structured data files inside of a [SAP HANA data lake File Container](https://help.sap.com/docs/HANA_CLOUD_ALIBABA_CLOUD/683a53aec4fc408783bbb2dd8e47afeb/f4eae33ffb7a44f7af823ee6b70e3598.).
+- Have some structured data files inside of a [SAP HANA data lake File Container](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a89a80f984f21015b2b2c84d2498d36d/6e1dd06335704f4c96d48279ca1ed555.html?version=2021_4_QRC).
- Have some data with you, which will be uploaded onto the SAP HANA data lake File container.
- Already set up HDLFS Connection in Database Explorer.
Querying structured data files (CSV, ORC, Parquet) in a HANA Data Lake file container can be done using SQL on Files. Below you will find all of the steps required to start using SQL on Files.
-If you have not yet provisioned an SAP HANA data lake, [here](hana-cloud-hdl-getting-started-1.) is a great tutorial on how to do so!
+If you have not yet provisioned an SAP HANA data lake, [here](hana-cloud-hdl-getting-started-1) is a great tutorial on how to do so!
---
@@ -134,7 +134,7 @@ For the full syntax of clauses available to create an existing table, see [CREAT
### Upload a file from HDLFS onto the Data Lake File container
-Make sure that you have everything setup with respect to HDLFSCLI. One can go through the tutorial for getting started with HDLFSCLI - [Getting Started with Data Lake Files HDLFSCLI | Tutorials for SAP Developers](data-lake-file-containers-hdlfscli.).
+Make sure that you have everything setup with respect to HDLFSCLI. One can go through the tutorial for getting started with HDLFSCLI - [Getting Started with Data Lake Files HDLFSCLI | Tutorials for SAP Developers](data-lake-file-containers-hdlfscli).
Use the below command to upload a local file onto the Data Lake -
@@ -156,7 +156,7 @@ Verify that the files has been uploaded.
-Make sure you have already set up a HDLFS Connection in Database Explorer. It will look something like below. To get to know how to setup a HDLFS Connection In Database Explorer go through the tutorial – [Setting Up HDLFS Connection In Database Explorer](data-lake-hdlfs-dbx-connection.).
+Make sure you have already set up a HDLFS Connection in Database Explorer. It will look something like below. To get to know how to setup a HDLFS Connection In Database Explorer go through the tutorial – [Setting Up HDLFS Connection In Database Explorer](data-lake-hdlfs-dbx-connection).
@@ -168,7 +168,7 @@ Add a data source, this can be done multiple times with multiple files.
Note that in this step the file path can lead to an exact file or it can lead to a directory. If it leads to a directory, SQL on Files will try to parse all the data files in that directory. To ensure that there are no parse errors, make sure that all the files in the directory match the schema of the table the data source is being added to.
-One could also refer the ALTER TABLE ADD DATASOURCE doc for any further reference -[ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed) and SQL on Files](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
+One could also refer the ALTER TABLE ADD DATASOURCE doc for any further reference -[ALTER (Remote) TABLE ADD DATASOURCE Statement for Data Lake Relational Engine (HANA DB-Managed) and SQL on Files](https://help.sap.com/docs/SAP_HANA_DATA_LAKE/a898e08b84f21015969fa437e89860c8/e6e7243b09c34d48adf387e96f43c014.html?q=ADD%20DATASOURCE)
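To make the file-versus-directory distinction concrete, a hedged sketch with two alternative datasources. The schema `TPCH_SOF`, the aliases, and both paths are placeholders, the remaining column mappings are omitted for brevity, and the exact syntax should be taken from the ADD DATASOURCE documentation above.

```sql
-- Hedged sketch; TPCH_SOF.ORDERS, the aliases, and both paths are placeholders.

-- Datasource pointing at one exact file:
ALTER TABLE TPCH_SOF.ORDERS IN FILES_SERVICE
ADD DATASOURCE AS ORDERS_FILE_DS
CSV('hdlfs:///TPCH/ORDERS/orders_1996.csv') DELIMITED BY '|'
O_ORDERKEY FROM COLUMN $0;

-- Datasource pointing at a whole directory: every file underneath it must
-- match the table's schema, otherwise parsing fails.
ALTER TABLE TPCH_SOF.ORDERS IN FILES_SERVICE
ADD DATASOURCE AS ORDERS_DIR_DS
CSV('hdlfs:///TPCH/ORDERS/') DELIMITED BY '|'
O_ORDERKEY FROM COLUMN $0;
```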