Commit 95e3b84

update readme 1
1 parent 77661db commit 95e3b84

File tree

1 file changed: +12 −147 lines


README.md

Lines changed: 12 additions & 147 deletions
@@ -1,17 +1,14 @@
-# stac-fastapi-elasticsearch-opensearch (sfeos)
+# stac-fastapi-mongo
 
-## Elasticsearch, Opensearch and Mongo backends for the stac-fastapi project
+## Mongo backend for the stac-fastapi project, built on top of the [sfeos](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch) core API library
+
+### Note: This project is presently in development. For now it is mostly a proof of concept showing that databases other than Elasticsearch and OpenSearch can be plugged into the [sfeos](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch) core API library.
 
-[![PyPI version](https://badge.fury.io/py/stac-fastapi.elasticsearch.svg)](https://badge.fury.io/py/stac-fastapi.elasticsearch)
 
 To install from PyPI:
 
 ```shell
-pip install stac_fastapi.elasticsearch
-```
-or
-```
-pip install stac_fastapi.opensearch
+pip install stac_fastapi.mongo
 ```
 
 #### For changes, see the [Changelog](CHANGELOG.md)
@@ -22,13 +19,7 @@ pip install stac_fastapi.opensearch
 To install the classes in your local Python env, run:
 
 ```shell
-pip install -e 'stac_fastapi/elasticsearch[dev]'
-```
-
-or
-
-```shell
-pip install -e 'stac_fastapi/opensearch[dev]'
+pip install -e 'stac_fastapi/mongo[dev]'
 ```
 
 
@@ -45,30 +36,20 @@ pre-commit run --all-files
 ## Build Mongo API backend
 
 ```shell
-docker-compose up elasticsearch
-docker-compose build app-elasticsearch
+docker-compose up mongo
+docker-compose build app-mongo
 ```
 
-## Running Elasticsearh API on localhost:8080
-
-```shell
-docker-compose up app-elasticsearch
-```
-
-By default, docker-compose uses Elasticsearch 8.x and OpenSearch 2.11.1.
-If you wish to use a different version, put the following in a
-file named `.env` in the same directory you run docker-compose from:
+## Running Mongo API on localhost:8084
 
 ```shell
-ELASTICSEARCH_VERSION=7.17.1
-OPENSEARCH_VERSION=2.11.0
+docker-compose up app-mongo
 ```
-The most recent Elasticsearch 7.x versions should also work. See the [opensearch-py docs](https://github.com/opensearch-project/opensearch-py/blob/main/COMPATIBILITY.md) for compatibility information.
 
 To create a new Collection:
 
 ```shell
-curl -X "POST" "http://localhost:8080/collections" \
+curl -X "POST" "http://localhost:8084/collections" \
 -H 'Content-Type: application/json; charset=utf-8' \
 -d $'{
 "id": "my_collection"
@@ -85,134 +66,18 @@ returned from the `/collections` route contains a `next` link with the token tha
 get the next page of results.
 
 ```shell
-curl -X "GET" "http://localhost:8080/collections?limit=1&token=example_token"
+curl -X "GET" "http://localhost:8084/collections?limit=1&token=example_token"
 ```
 
 ## Testing
 
 ```shell
 make test
 ```
-Test against OpenSearch only
-
-```shell
-make test-opensearch
-```
 
-Test against Elasticsearch only
-
-```shell
-make test-elasticsearch
-```
 
 ## Ingest sample data
 
 ```shell
 make ingest
 ```
-
-## Elasticsearch Mappings
-
-Mappings apply to search index, not source.
-
-
-## Managing Elasticsearch Indices
-
-This section covers how to create a snapshot repository and then create and restore snapshots with this.
-
-Create a snapshot repository. This puts the files in the `elasticsearch/snapshots` in this git repo clone, as
-the elasticsearch.yml and docker-compose files create a mapping from that directory to
-`/usr/share/elasticsearch/snapshots` within the Elasticsearch container and grant permissions on using it.
-
-```shell
-curl -X "PUT" "http://localhost:9200/_snapshot/my_fs_backup" \
--H 'Content-Type: application/json; charset=utf-8' \
--d $'{
-"type": "fs",
-"settings": {
-"location": "/usr/share/elasticsearch/snapshots/my_fs_backup"
-}
-}'
-```
-
-The next step is to create a snapshot of one or more indices into this snapshot repository. This command creates
-a snapshot named `my_snapshot_2` and waits for the action to be completed before returning. This can also be done
-asynchronously, and queried for status. The `indices` parameter determines which indices are snapshotted, and
-can include wildcards.
-
-```shell
-curl -X "PUT" "http://localhost:9200/_snapshot/my_fs_backup/my_snapshot_2?wait_for_completion=true" \
--H 'Content-Type: application/json; charset=utf-8' \
--d $'{
-"metadata": {
-"taken_because": "dump of all items",
-"taken_by": "pvarner"
-},
-"include_global_state": false,
-"ignore_unavailable": false,
-"indices": "items_my-collection"
-}'
-```
-
-To see the status of this snapshot:
-
-```shell
-curl http://localhost:9200/_snapshot/my_fs_backup/my_snapshot_2
-```
-
-To see all the snapshots:
-
-```shell
-curl http://localhost:9200/_snapshot/my_fs_backup/_all
-```
-
-To restore a snapshot, run something similar to the following. This specific command will restore any indices that
-match `items_*` and rename them so that the new index name will be suffixed with `-copy`.
-
-```shell
-curl -X "POST" "http://localhost:9200/_snapshot/my_fs_backup/my_snapshot_2/_restore?wait_for_completion=true" \
--H 'Content-Type: application/json; charset=utf-8' \
--d $'{
-"include_aliases": false,
-"include_global_state": false,
-"ignore_unavailable": true,
-"rename_replacement": "items_$1-copy",
-"indices": "items_*",
-"rename_pattern": "items_(.+)"
-}'
-```
-
-Now the item documents have been restored in to the new index (e.g., `my-collection-copy`), but the value of the
-`collection` field in those documents is still the original value of `my-collection`. To update these to match the
-new collection name, run the following Elasticsearch Update By Query command, substituting the old collection name
-into the term filter and the new collection name into the script parameter:
-
-```shell
-curl -X "POST" "http://localhost:9200/items_my-collection-copy/_update_by_query" \
--H 'Content-Type: application/json; charset=utf-8' \
--d $'{
-"query": {
-"match_all": {}
-},
-"script": {
-"lang": "painless",
-"params": {
-"collection": "my-collection-copy"
-},
-"source": "ctx._source.collection = params.collection"
-}
-}'
-```
-
-Then, create a new collection through the api with the new name for each of the restored indices:
-
-```shell
-curl -X "POST" "http://localhost:8080/collections" \
--H 'Content-Type: application/json' \
--d $'{
-"id": "my-collection-copy"
-}'
-```
-
-Voila! You have a copy of the collection now that has a resource URI (`/collections/my-collection-copy`) and can be
-correctly queried by collection name.
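The `limit`/`token` flow that the `/collections` route uses can be sketched in plain Python. This is a simulation only: the real token produced by the service is opaque and backend-specific, while here it is just a stringified offset.

```python
# Simulate token-based pagination over an in-memory list of collections.
# The token here is a stringified offset; real service tokens are opaque.
COLLECTIONS = [{"id": f"collection-{i}"} for i in range(5)]


def list_collections(limit, token=None):
    """Return one page of collections plus a `next` link when more remain."""
    offset = int(token) if token else 0
    page = COLLECTIONS[offset:offset + limit]
    result = {"collections": page, "links": []}
    if offset + limit < len(COLLECTIONS):
        # Mirror the `next` link returned from the /collections route.
        result["links"].append({"rel": "next", "token": str(offset + limit)})
    return result


first = list_collections(limit=2)
token = first["links"][0]["token"]
second = list_collections(limit=2, token=token)
print([c["id"] for c in second["collections"]])  # → ['collection-2', 'collection-3']
```

A client walks pages by repeating the request with each returned token until the response no longer carries a `next` link.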
