**`install/ci-vm/installation.md`** (1 addition, 1 deletion)

[...]

Now, create a service account with sufficient permissions (at least "Google Batch Service Agent"):
If you are not the owner of the GCP project you are working on, make sure you have sufficient permissions to create and manage service accounts; if not, ask the project owner to grant them.
- Create a service account [here](https://cloud.google.com/iam/docs/service-accounts-create)
- Choose any name for the service account, but make sure to grant it at least the "Google Batch Service Agent" role.
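
For reference, a rough `gcloud` equivalent of these console steps (a sketch; the account name is arbitrary, `PROJECT_ID` is a placeholder, and `roles/batch.serviceAgent` is assumed to be the role ID behind "Google Batch Service Agent"):

```sh
# Create the service account, then grant it the Batch Service Agent role.
gcloud iam service-accounts create sample-platform-ci \
    --display-name="sample-platform CI"
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:sample-platform-ci@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/batch.serviceAgent"
```
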
You might also want to understand roles in GCP; the official documentation is [here](https://cloud.google.com/iam/docs/understanding-roles).

**`install/installation.md`** (37 additions, 22 deletions)

[...]

* Nginx (others are possible when modifying the sample download section)
* Python 3 (Flask and other dependencies)
* MySQL
* Pure-FTPD with MySQL (optional, only needed for FTP file uploads)
## Configuring Google Cloud Platform

[...]

For deployment of the platform on a Google Cloud VM instance, one would require:

Windows Server 2019 Datacenter
- Boot disk type: Balanced persistent disk
- Size: 50GB
- Navigate to Security and choose the service account you just created for the platform.
- Navigate to Network and select the "Allow HTTP traffic" and "Allow HTTPS traffic" checkboxes.
- Under Network Interfaces -> default, reserve a new static external IPv4 address for the platform.
2. Setting up firewall settings
[...]
Place the service account key file that you generated earlier at the root of the sample-platform folder.
#### Mounting the bucket
Mounting on Linux can be done using [Google Cloud Storage FUSE](https://cloud.google.com/storage/docs/gcs-fuse).
Steps:
- Install gcsfuse using the [official documentation](https://cloud.google.com/storage/docs/cloud-storage-fuse/install) or a script along the lines of the sketch below (based on the official Debian/Ubuntu instructions):
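
```sh
# Sketch of the standard Debian/Ubuntu install steps from the gcsfuse docs:
# add the gcsfuse apt repository, import its signing key, then install.
export GCSFUSE_REPO=gcsfuse-$(lsb_release -c -s)
echo "deb https://packages.cloud.google.com/apt $GCSFUSE_REPO main" | sudo tee /etc/apt/sources.list.d/gcsfuse.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-get update && sudo apt-get install -y gcsfuse
```
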
- There are multiple ways to mount the bucket; see the official documentation [here](https://cloud.google.com/storage/docs/cloud-storage-fuse/mount-bucket).
For Ubuntu and derivatives, assuming `/repository` is the location where the samples will be stored, an entry can be added to the `/etc/fstab` file; replace _GCS_BUCKET_NAME_ with the name of the bucket created for the platform:
```
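# A hypothetical entry; the exact mount options are described in the gcsfuse
# mounting documentation. uid/gid 33 map to the www-data user on Debian/Ubuntu,
# and allow_other requires user_allow_other to be enabled in /etc/fuse.conf.
GCS_BUCKET_NAME /repository gcsfuse rw,_netdev,allow_other,uid=33,gid=33,implicit_dirs
```

The bucket can then be mounted with:

```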
sudo mount /repository
```
Note that **this directory needs to be accessible by the `www-data` user**; you can verify that the mount succeeded by running `sudo -u www-data ls /repository`.
#### Troubleshooting: Mounting the bucket
In case you get "permission denied" for `/repository`, check the following:

1. Check if the service account created has access to the GCS bucket.
2. Check the output of the `sudo mount /repository` command.
3. Check the directory permissions for `/repository`.
#### MySQL installation
The platform has been tested with MySQL v8.0 and Python 3.7 to 3.9.
It is recommended to install Python and MySQL beforehand to avoid any inconvenience. Here is an [installation guide](https://www.digitalocean.com/community/tutorials/how-to-install-mysql-on-ubuntu-22-04) for MySQL on Ubuntu 22.04.
#### Installing the platform
Next, navigate to the `install` folder and run `install.sh` with root
permissions.
```
cd sample-platform/install/
sudo ./install.sh
```
`install.sh` will download and update all the necessary dependencies and then ask for some details in order to set up the sample-platform. After filling in these details, the platform should be ready for use.
When asked for the domain during installation, enter the domain name that will run the platform. E.g., if the platform will run locally, enter `localhost` as the server name.
### Windows
[...]
it's **recommended** to use a valid certificate.
[Let's Encrypt](https://letsencrypt.org/) offers free certificates. For local
testing, a self-signed certificate can be enough.
* In case of a `502 Bad Gateway` response, the platform didn't start
correctly. Manually running `bootstrap_gunicorn.py` (as root!) can help to
determine what goes wrong. The snippet below shows how this can be done:
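
```sh
# Minimal sketch, assuming a Linux install at the default location; run as root.
cd /var/www/sample-platform/
sudo python bootstrap_gunicorn.py
```
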
[...]

After the completion of the automated installation of the platform, the following directories are present:
- `TestResults/` - Directory containing regression test results
- `vm_data/` - Directory containing test-specific subfolders; each folder contains the files required for testing that are passed to the VM instance: test files and the CCExtractor build artefact.
Now, for tests to run, we need to download the [CCExtractor testsuite](https://github.com/CCExtractor/ccx_testsuite) release file, extract it, and put it in the `TestData/ci-linux` and `TestData/ci-windows` folders.
You also need to create a shell script named `ccextractortester` and place it in both `ci-linux` and `ci-windows`. This script is meant to launch the testsuite binary; here is what it should look like for Linux:
```sh
#!/bin/bash
exec mono CCExtractorTester.exe "$@"
```
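
On Linux, the script will likely also need to be marked executable, e.g. `chmod +x TestData/ci-linux/ccextractortester`.
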
## Setting up GitHub webhooks
Now that the server is running, you can queue new tests either manually (via `/custom/`) or automatically through GitHub Actions.
To queue a test whenever a new commit/PR is made, you need to create a GitHub [webhook](https://docs.github.com/en/webhooks/about-webhooks) on the ccextractor repository (or a fork of it); the required settings are listed below, followed by a CLI sketch:
- Set the payload URL to `https://<your_domain>/start-ci`
- Set the content type to JSON.
- Enter the same secret that you used during installation (`GITHUB_CI_KEY`)
- Select the Push, PR and Issue events as triggers.
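
Equivalently, the webhook can be created from the command line with the GitHub CLI (a sketch; `OWNER`, the domain, and the secret value are placeholders):

```sh
# Create the webhook via the GitHub REST API; requires an authenticated gh.
gh api repos/OWNER/ccextractor/hooks --method POST \
  -f name=web \
  -f "config[url]=https://<your_domain>/start-ci" \
  -f "config[content_type]=json" \
  -f "config[secret]=<GITHUB_CI_KEY value>" \
  -f "events[]=push" \
  -f "events[]=pull_request" \
  -f "events[]=issues"
```
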
## Setting up a cron job to run tests
To run the new tests that are being queued up, a cron job is required.
The file `mod_ci/cron.py` is to be run at periodic intervals. To set up a cron job, follow the steps below:
1. Open your terminal and enter the command `sudo crontab -e`.
2. To set up a cron job that runs this file every 10 minutes, append this at the bottom of the file:
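
```
# A sketch; adjust the path and the Python invocation to match your installation.
*/10 * * * * cd /var/www/sample-platform && python mod_ci/cron.py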
```
Change the `/var/www/sample-platform` directory if you have installed the platform in a different location.
## GCS configuration to serve file downloads using Signed URLs
To serve file downloads directly from the private GCS bucket, signed download URLs are used.
The `serve_file_download` function in `utility.py` generates a signed URL for the file to be downloaded, which expires after a configured time limit (maximum: 7 days), and redirects the client to that URL.
For more information about Signed URLs, you can refer to the [official documentation](https://cloud.google.com/storage/docs/access-control/signed-urls).
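
As a quick sanity check, a signed URL can also be generated by hand with `gsutil` (a sketch; the bucket and object names are placeholders, and the key file is the one placed at the platform root):

```sh
# Generate a URL that stays valid for 7 days (the documented maximum).
gsutil signurl -d 7d service-account-key.json gs://GCS_BUCKET_NAME/path/to/sample.ts
```
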
## File upload size for HTTP
In order to accept big files through HTTP uploads, some files need to be