Commit 48ff87f

Merge branch 'pre/beta' of https://github.com/VinciGit00/Scrapegraph-ai into pre/beta
2 parents 03ffebc + c016efd

29 files changed: +599 −256 lines

CHANGELOG.md

Lines changed: 12 additions & 0 deletions
@@ -1,3 +1,15 @@
+## [1.7.0-beta.11](https://github.com/VinciGit00/Scrapegraph-ai/compare/v1.7.0-beta.10...v1.7.0-beta.11) (2024-06-17)
+
+
+### Features
+
+* **telemetry:** add telemetry module ([080a318](https://github.com/VinciGit00/Scrapegraph-ai/commit/080a318ff68652a3c81a6890cd40fd20c48ac6d0))
+
+
+### Docs
+
+* refactor graph section and added telemetry ([39bf4c9](https://github.com/VinciGit00/Scrapegraph-ai/commit/39bf4c960d703a321af64e3b1b41ca9a1a15794e))
+
 ## [1.7.0-beta.10](https://github.com/VinciGit00/Scrapegraph-ai/compare/v1.7.0-beta.9...v1.7.0-beta.10) (2024-06-17)

docs/source/conf.py

Lines changed: 4 additions & 1 deletion
@@ -36,4 +36,7 @@
     "source_repository": "https://github.com/VinciGit00/Scrapegraph-ai/",
     "source_branch": "main",
     "source_directory": "docs/source/",
-}
+    'navigation_with_keys': True,
+    'sidebar_hide_name': False,
+}
+

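For context, the hunk above extends a theme-options dictionary. A minimal sketch of how that block in docs/source/conf.py might read after the change (the enclosing dict name, Sphinx's html_theme_options, is an assumption; it is not visible in the hunk):

    # Sketch only: "html_theme_options" as the enclosing dict is an assumption.
    html_theme_options = {
        "source_repository": "https://github.com/VinciGit00/Scrapegraph-ai/",
        "source_branch": "main",
        "source_directory": "docs/source/",
        'navigation_with_keys': True,   # added: arrow-key navigation in the rendered docs
        'sidebar_hide_name': False,     # added: keep the project name visible in the sidebar
    }
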
docs/source/index.rst

Lines changed: 0 additions & 3 deletions
@@ -22,9 +22,6 @@
    :caption: Scrapers
 
    scrapers/graphs
-   scrapers/llm
-   scrapers/graph_config
-   scrapers/benchmarks
 
 .. toctree::
    :maxdepth: 2

docs/source/scrapers/graphs.rst

Lines changed: 8 additions & 221 deletions
@@ -3,224 +3,11 @@ Graphs
 
 Graphs are scraping pipelines aimed at solving specific tasks. They are composed of nodes which can be configured individually to address different aspects of the task (fetching data, extracting information, etc.).
 
-There are several types of graphs available in the library, each with its own purpose and functionality. The most common ones are:
-
-- **SmartScraperGraph**: one-page scraper that requires a user-defined prompt and a URL (or local file) to extract information using LLM.
-- **SearchGraph**: multi-page scraper that only requires a user-defined prompt to extract information from a search engine using LLM. It is built on top of SmartScraperGraph.
-- **SpeechGraph**: text-to-speech pipeline that generates an answer as well as a requested audio file. It is built on top of SmartScraperGraph and requires a user-defined prompt and a URL (or local file).
-- **ScriptCreatorGraph**: script generator that creates a Python script to scrape a website using the specified library (e.g. BeautifulSoup). It requires a user-defined prompt and a URL (or local file).
-
-There are also two additional graphs that can handle multiple sources:
-
-- **SmartScraperMultiGraph**: similar to `SmartScraperGraph`, but with the ability to handle multiple sources.
-- **ScriptCreatorMultiGraph**: similar to `ScriptCreatorGraph`, but with the ability to handle multiple sources.
-
-With the introduction of `GPT-4o`, two new powerful graphs have been created:
-
-- **OmniScraperGraph**: similar to `SmartScraperGraph`, but with the ability to scrape images and describe them.
-- **OmniSearchGraph**: similar to `SearchGraph`, but with the ability to scrape images and describe them.
-
-
-.. note::
-
-   They all use a graph configuration to set up LLM models and other parameters. To find out more about the configurations, check the :ref:`LLM` and :ref:`Configuration` sections.
-
-
-.. note::
-
-   We can pass an optional `schema` parameter to the graph constructor to specify the output schema. If not provided or set to `None`, the schema will be generated by the LLM itself.
-
-OmniScraperGraph
-^^^^^^^^^^^^^^^^
-
-.. image:: ../../assets/omniscrapergraph.png
-   :align: center
-   :width: 90%
-   :alt: OmniScraperGraph
-|
-
-First we define the graph configuration, which includes the LLM model and other parameters. Then we create an instance of the OmniScraperGraph class, passing the prompt, source, and configuration as arguments. Finally, we run the graph and print the result.
-It will fetch the data from the source and extract the information based on the prompt in JSON format.
-
-.. code-block:: python
-
-   from scrapegraphai.graphs import OmniScraperGraph
-
-   graph_config = {
-       "llm": {...},
-   }
-
-   omni_scraper_graph = OmniScraperGraph(
-       prompt="List me all the projects with their titles and image links and descriptions.",
-       source="https://perinim.github.io/projects",
-       config=graph_config,
-       schema=schema
-   )
-
-   result = omni_scraper_graph.run()
-   print(result)
-
-OmniSearchGraph
-^^^^^^^^^^^^^^^
-
-.. image:: ../../assets/omnisearchgraph.png
-   :align: center
-   :width: 80%
-   :alt: OmniSearchGraph
-|
-
-Similar to OmniScraperGraph, we define the graph configuration, create an instance of the OmniSearchGraph class, and run the graph.
-It will create a search query, fetch the first n results from the search engine, run n OmniScraperGraph instances, and return the results in JSON format.
-
-.. code-block:: python
-
-   from scrapegraphai.graphs import OmniSearchGraph
-
-   graph_config = {
-       "llm": {...},
-   }
-
-   # Create the OmniSearchGraph instance
-   omni_search_graph = OmniSearchGraph(
-       prompt="List me all Chioggia's famous dishes and describe their pictures.",
-       config=graph_config,
-       schema=schema
-   )
-
-   # Run the graph
-   result = omni_search_graph.run()
-   print(result)
-
-SmartScraperGraph & SmartScraperMultiGraph
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. image:: ../../assets/smartscrapergraph.png
-   :align: center
-   :width: 90%
-   :alt: SmartScraperGraph
-|
-
-First we define the graph configuration, which includes the LLM model and other parameters. Then we create an instance of the SmartScraperGraph class, passing the prompt, source, and configuration as arguments. Finally, we run the graph and print the result.
-It will fetch the data from the source and extract the information based on the prompt in JSON format.
-
-.. code-block:: python
-
-   from scrapegraphai.graphs import SmartScraperGraph
-
-   graph_config = {
-       "llm": {...},
-   }
-
-   smart_scraper_graph = SmartScraperGraph(
-       prompt="List me all the projects with their descriptions",
-       source="https://perinim.github.io/projects",
-       config=graph_config,
-       schema=schema
-   )
-
-   result = smart_scraper_graph.run()
-   print(result)
-
-**SmartScraperMultiGraph** is similar to SmartScraperGraph, but it can handle multiple sources. We define the graph configuration, create an instance of the SmartScraperMultiGraph class, and run the graph.
-
-SearchGraph
-^^^^^^^^^^^
-
-.. image:: ../../assets/searchgraph.png
-   :align: center
-   :width: 80%
-   :alt: SearchGraph
-|
-
-Similar to SmartScraperGraph, we define the graph configuration, create an instance of the SearchGraph class, and run the graph.
-It will create a search query, fetch the first n results from the search engine, run n SmartScraperGraph instances, and return the results in JSON format.
-
-
-.. code-block:: python
-
-   from scrapegraphai.graphs import SearchGraph
-
-   graph_config = {
-       "llm": {...},
-       "embeddings": {...},
-   }
-
-   # Create the SearchGraph instance
-   search_graph = SearchGraph(
-       prompt="List me all the traditional recipes from Chioggia",
-       config=graph_config,
-       schema=schema
-   )
-
-   # Run the graph
-   result = search_graph.run()
-   print(result)
-
-
-SpeechGraph
-^^^^^^^^^^^
-
-.. image:: ../../assets/speechgraph.png
-   :align: center
-   :width: 90%
-   :alt: SpeechGraph
-|
-
-Similar to SmartScraperGraph, we define the graph configuration, create an instance of the SpeechGraph class, and run the graph.
-It will fetch the data from the source, extract the information based on the prompt, and generate an audio file with the answer, as well as the answer itself, in JSON format.
-
-.. code-block:: python
-
-   from scrapegraphai.graphs import SpeechGraph
-
-   graph_config = {
-       "llm": {...},
-       "tts_model": {...},
-   }
-
-   # ************************************************
-   # Create the SpeechGraph instance and run it
-   # ************************************************
-
-   speech_graph = SpeechGraph(
-       prompt="Make a detailed audio summary of the projects.",
-       source="https://perinim.github.io/projects/",
-       config=graph_config,
-       schema=schema
-   )
-
-   result = speech_graph.run()
-   print(result)
-
-
-ScriptCreatorGraph & ScriptCreatorMultiGraph
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. image:: ../../assets/scriptcreatorgraph.png
-   :align: center
-   :width: 90%
-   :alt: ScriptCreatorGraph
-
-First we define the graph configuration, which includes the LLM model and other parameters.
-Then we create an instance of the ScriptCreatorGraph class, passing the prompt, source, and configuration as arguments. Finally, we run the graph and print the result.
-
-.. code-block:: python
-
-   from scrapegraphai.graphs import ScriptCreatorGraph
-
-   graph_config = {
-       "llm": {...},
-       "library": "beautifulsoup4"
-   }
-
-   script_creator_graph = ScriptCreatorGraph(
-       prompt="Create a Python script to scrape the projects.",
-       source="https://perinim.github.io/projects/",
-       config=graph_config,
-       schema=schema
-   )
-
-   result = script_creator_graph.run()
-   print(result)
-
-**ScriptCreatorMultiGraph** is similar to ScriptCreatorGraph, but it can handle multiple sources. We define the graph configuration, create an instance of the ScriptCreatorMultiGraph class, and run the graph.
+.. toctree::
+   :maxdepth: 4
+
+   types
+   llm
+   graph_config
+   benchmarks
+   telemetry
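The removed page describes SmartScraperMultiGraph and ScriptCreatorMultiGraph as multi-source variants but never shows them in use. A minimal sketch, assuming the multi-source constructor mirrors SmartScraperGraph with a list of URLs as the source (the exact signature is an assumption; check the library's own examples):

    from scrapegraphai.graphs import SmartScraperMultiGraph

    graph_config = {
        "llm": {...},  # same LLM configuration placeholder used throughout the removed page
    }

    # Assumption: the multi-source graph takes a list of URLs where the
    # single-source graphs take one URL or local file.
    smart_scraper_multi_graph = SmartScraperMultiGraph(
        prompt="List me all the projects with their descriptions",
        source=[
            "https://perinim.github.io/projects",
            "https://perinim.github.io/",
        ],
        config=graph_config,
    )

    result = smart_scraper_multi_graph.run()
    print(result)

ScriptCreatorMultiGraph would be used the same way, adding the "library" key to the config as in the single-source ScriptCreatorGraph example.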

docs/source/scrapers/telemetry.rst

Lines changed: 72 additions & 0 deletions
@@ -0,0 +1,72 @@
+===============
+Usage Analytics
+===============
+
+ScrapeGraphAI collects **anonymous** usage data by default to improve the library and guide development efforts.
+
+**Events Captured**
+
+We capture events in the following scenarios:
+
+1. When a ``Graph`` finishes running.
+2. When an exception is raised in one of the nodes.
+
+**Data Collected**
+
+The data captured is limited to:
+
+- Operating System and Python version
+- A persistent UUID to identify the session, stored in ``~/.scrapegraphai.conf``
+
+Additionally, the following properties are collected:
+
+.. code-block:: python
+
+   properties = {
+       "graph_name": graph_name,
+       "llm_model": llm_model_name,
+       "embedder_model": embedder_model_name,
+       "source_type": source_type,
+       "execution_time": execution_time,
+       "error_node": error_node_name,
+   }
+
+For more details, refer to the `telemetry.py <https://github.com/VinciGit00/Scrapegraph-ai/blob/main/scrapegraphai/telemetry/telemetry.py>`_ module.
+
+**Opting Out**
+
+If you prefer not to participate in telemetry, you can opt out using any of the following methods:
+
+1. **Programmatically Disable Telemetry**:
+
+   Add the following code at the beginning of your script:
+
+   .. code-block:: python
+
+      from scrapegraphai import telemetry
+      telemetry.disable_telemetry()
+
+2. **Configuration File**:
+
+   Set the ``telemetry_enabled`` key to ``false`` in ``~/.scrapegraphai.conf`` under the ``[DEFAULT]`` section:
+
+   .. code-block:: ini
+
+      [DEFAULT]
+      telemetry_enabled = False
+
+3. **Environment Variable**:
+
+   - **For a Shell Session**:
+
+     .. code-block:: bash
+
+        export SCRAPEGRAPHAI_TELEMETRY_ENABLED=false
+
+   - **For a Single Command**:
+
+     .. code-block:: bash
+
+        SCRAPEGRAPHAI_TELEMETRY_ENABLED=false python my_script.py
+
+By following any of these methods, you can easily opt out of telemetry and ensure your usage data is not collected.
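The new page documents an environment-variable opt-out for the shell. A minimal sketch of setting that same variable from Python instead, assuming it is read when the library first initializes telemetry (so it must be set before the first import):

    import os

    # Assumption: SCRAPEGRAPHAI_TELEMETRY_ENABLED is checked when scrapegraphai
    # initializes, so set it before the first import of the library.
    os.environ["SCRAPEGRAPHAI_TELEMETRY_ENABLED"] = "false"

    from scrapegraphai.graphs import SmartScraperGraph  # telemetry disabled for this process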
