[](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/GoogleCloudPlatform/java-docs-samples&page=editor&open_in_editor=pubsublite/streaming-analytics/README.md)
> _When you enable Cloud Dataflow, which uses Compute Engine, a default
> Compute Engine service account with the Editor role (`roles/editor`) is
> created._

1. You can skip this step if you are trying this example in a Google Cloud
   environment like Cloud Shell. Otherwise,
   [create](https://cloud.google.com/iam/docs/creating-managing-service-accounts#iam-service-accounts-create-gcloud)
   a user-managed service account and grant it the following roles on your
   project:
   - `roles/dataflow.admin`
   - `roles/pubsublite.viewer`
   - `roles/pubsublite.subscriber`
   - `roles/logging.viewer`

   Then
   [create](https://cloud.google.com/iam/docs/creating-managing-service-account-keys#iam-service-account-keys-create-gcloud)
   a service account key and point `GOOGLE_APPLICATION_CREDENTIALS` to your
   downloaded key file.
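
   As a rough sketch, the `gcloud` flow might look like the following. The
   service account name `my-dataflow-sa` is a placeholder, and `$PROJECT_ID`
   is assumed to be set to your project ID:

   ```sh
   # Create a user-managed service account (the name is illustrative).
   gcloud iam service-accounts create my-dataflow-sa

   # Grant the roles listed above on the project.
   for role in dataflow.admin pubsublite.viewer pubsublite.subscriber logging.viewer; do
     gcloud projects add-iam-policy-binding $PROJECT_ID \
       --member="serviceAccount:my-dataflow-sa@$PROJECT_ID.iam.gserviceaccount.com" \
       --role="roles/$role"
   done

   # Create a key and point GOOGLE_APPLICATION_CREDENTIALS at it.
   gcloud iam service-accounts keys create key.json \
     --iam-account="my-dataflow-sa@$PROJECT_ID.iam.gserviceaccount.com"
   export GOOGLE_APPLICATION_CREDENTIALS="$PWD/key.json"
   ```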
[Publish] some messages to your Lite topic.
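
If you want to publish test messages from the command line, one option
(assuming `$TOPIC` and `$LITE_LOCATION` name your Lite topic and its location)
is:

```sh
# Publish a test message to the Pub/Sub Lite topic.
gcloud pubsub lite-topics publish $TOPIC \
  --location=$LITE_LOCATION \
  --message="Hello, Pub/Sub Lite!"
```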
Then check for files in your Cloud Storage bucket.

```sh
gsutil ls "gs://$BUCKET/samples/output*"
```

## (Optional) Creating a custom Dataflow template

With a [`metadata.json`](metadata.json), you can create a [Dataflow Flex template].
Custom Dataflow Flex templates can be shared. You can run them with different
input parameters.

1. Create a fat JAR. You should see
   `target/pubsublite-streaming-bundled-1.0.jar` as an output.

   ```sh
   mvn clean package -DskipTests=true
   ls -lh
   ```

1. Provide names and locations for your template file and template container
   image, then build the Flex template, as sketched below.
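
   A minimal sketch, assuming `$BUCKET` and `$PROJECT_ID` are set; the file
   and image names are illustrative, and the main class value is a placeholder
   you should replace with this sample's pipeline class:

   ```sh
   export TEMPLATE_PATH="gs://$BUCKET/samples/pubsublite-to-gcs.json"
   export TEMPLATE_IMAGE="gcr.io/$PROJECT_ID/pubsublite-to-gcs:latest"

   # Build the Flex template from the fat JAR and metadata.json.
   gcloud dataflow flex-template build $TEMPLATE_PATH \
     --image-gcr-path "$TEMPLATE_IMAGE" \
     --sdk-language "JAVA" \
     --flex-template-base-image JAVA11 \
     --metadata-file "metadata.json" \
     --jar "target/pubsublite-streaming-bundled-1.0.jar" \
     --env FLEX_TEMPLATE_JAVA_MAIN_CLASS="YourMainClass"  # placeholder
   ```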
1. Run a job with the custom Flex template using `gcloud` or in Cloud Console.

   > Note: Pub/Sub Lite allows only one subscriber to pull messages from one
   > partition. If your Pub/Sub Lite topic has only one partition and you use
   > a subscription attached to that topic in more than one Dataflow job, only
   > one of them will get messages.

   ```sh
   # The flag values below are illustrative; the parameter names assume this
   # sample's pipeline options.
   gcloud dataflow flex-template run "pubsublite-to-gcs-`date +%Y%m%d`" \
     --template-file-gcs-location "$TEMPLATE_PATH" \
     --parameters subscription="projects/$PROJECT_ID/locations/$LITE_LOCATION/subscriptions/$SUBSCRIPTION" \
     --parameters output="gs://$BUCKET/samples/output" \
     --parameters windowSize=1 \
     --region "$DATAFLOW_REGION"
   ```

1. Stop the pipeline. If you use `DirectRunner`, `Ctrl+C` to cancel. If you use
   `DataflowRunner`, [click](https://console.cloud.google.com/dataflow/jobs) on
   the job you want to stop, then choose "Cancel".
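
   Alternatively, a hedged CLI equivalent for `DataflowRunner` jobs, assuming
   `$DATAFLOW_REGION` matches the region you launched the job in:

   ```sh
   # Find the active job's ID, then cancel it.
   gcloud dataflow jobs list --region="$DATAFLOW_REGION" --status=active
   gcloud dataflow jobs cancel JOB_ID --region="$DATAFLOW_REGION"
   ```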

`pubsublite/streaming-analytics/metadata.json` (1 addition, 1 deletion) fixes
the template name, "Pub/Sub Lite to Clous Storage" → "Pub/Sub Lite to Cloud
Storage":

    {
      "name": "Pub/Sub Lite to Cloud Storage",
      "description": "An Apache Beam streaming pipeline that reads messages from Pub/Sub Lite, applies fixed windowing on the messages, and writes the results to files on Cloud Storage.",