ScrapeGraphAI is a *web scraping* Python library that uses LLMs and direct graph logic to create scraping pipelines for websites and local documents (XML, HTML, JSON, etc.).
The documentation for ScrapeGraphAI can be found [here](https://scrapegraph-ai.r
Also check out the Docusaurus [here](https://scrapegraph-doc.onrender.com/).
## 💻 Usage

There are multiple standard scraping pipelines that can be used to extract information from a website (or local file):

- `SmartScraperGraph`: single-page scraper that only needs a user prompt and an input source;
- `SearchGraph`: multi-page scraper that extracts information from the top n search results of a search engine;
- `SpeechGraph`: single-page scraper that extracts information from a website and generates an audio file;
- `ScriptCreatorGraph`: single-page scraper that extracts information from a website and generates a Python script;
- `SmartScraperMultiGraph`: multi-page scraper that extracts information from multiple pages given a single prompt and a list of sources;
- `ScriptCreatorMultiGraph`: multi-page scraper that generates a Python script for extracting information from multiple pages given a single prompt and a list of sources.
It is possible to use different LLMs through APIs, such as **OpenAI**, **Groq**, **Azure**, and **Gemini**, or local models using **Ollama**.
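As an illustration, a graph configuration for the two cases might look like the sketch below. The key names (`api_key`, `model`, `base_url`) and the model identifiers are assumptions for illustration only; check the documentation for the exact options supported by each provider.

```python
# Hypothetical configuration sketches: the keys "api_key", "model", and
# "base_url" are assumptions, not a definitive API reference.

# Remote model accessed through an API (e.g. OpenAI):
openai_config = {
    "llm": {
        "api_key": "YOUR_OPENAI_API_KEY",
        "model": "gpt-4o",
    },
}

# Local model served by Ollama:
ollama_config = {
    "llm": {
        "model": "ollama/llama3",
        "base_url": "http://localhost:11434",
    },
}
```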

`docs/source/scrapers/graphs.rst`:
Graphs
======

Graphs are scraping pipelines aimed at solving specific tasks. They are composed of nodes which can be configured individually to address different aspects of the task (fetching data, extracting information, etc.).
There are several types of graphs available in the library, each with its own purpose and functionality. The most common ones are:

- **SmartScraperGraph**: one-page scraper that requires a user-defined prompt and a URL (or local file) to extract information using an LLM.
- **SmartScraperMultiGraph**: multi-page scraper that requires a user-defined prompt and a list of URLs (or local files) to extract information using an LLM. It is built on top of SmartScraperGraph.
- **SearchGraph**: multi-page scraper that only requires a user-defined prompt to extract information from a search engine using an LLM. It is built on top of SmartScraperGraph.
- **SpeechGraph**: text-to-speech pipeline that generates an answer as well as a requested audio file. It is built on top of SmartScraperGraph and requires a user-defined prompt and a URL (or local file).
- **ScriptCreatorGraph**: script generator that creates a Python script to scrape a website using the specified library (e.g. BeautifulSoup). It requires a user-defined prompt and a URL (or local file).
With the introduction of `GPT-4o`, two new powerful graphs have been created:

- **OmniScraperGraph**: similar to `SmartScraperGraph`, but with the ability to scrape images and describe them.
- **OmniSearchGraph**: similar to `SearchGraph`, but with the ability to scrape images and describe them.
.. note::

   They all use a graph configuration to set up LLM models and other parameters. To find out more about the configurations, check the :ref:`LLM` and :ref:`Configuration` sections.

.. note::

   We can pass an optional `schema` parameter to the graph constructor to specify the output schema. If not provided or set to `None`, the schema will be generated by the LLM itself.
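The examples below pass a ``schema`` argument that is never defined on this page. As a sketch of what it might look like, assuming the library accepts a Pydantic model describing the desired output (the ``Project``/``Projects`` names and fields here are hypothetical, chosen to match the "projects" examples):

```python
from typing import List

from pydantic import BaseModel


# Hypothetical output schema for the "projects" examples below;
# the field names are illustrative, not prescribed by the library.
class Project(BaseModel):
    title: str
    description: str


class Projects(BaseModel):
    projects: List[Project]


schema = Projects
```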
OmniScraperGraph
^^^^^^^^^^^^^^^^

.. image:: ../../assets/omniscrapergraph.png
   :align: center
   :width: 90%
   :alt: OmniScraperGraph

|
First we define the graph configuration, which includes the LLM model and other parameters. Then we create an instance of the OmniScraperGraph class, passing the prompt, source, and configuration as arguments. Finally, we run the graph and print the result.
It will fetch the data from the source and extract the information based on the prompt, returning the result in JSON format.
.. code-block:: python

   from scrapegraphai.graphs import OmniScraperGraph

   graph_config = {
       "llm": {...},
   }

   omni_scraper_graph = OmniScraperGraph(
       prompt="List me all the projects with their titles and image links and descriptions.",
       source="https://perinim.github.io/projects",
       config=graph_config,
       schema=schema
   )

   result = omni_scraper_graph.run()
   print(result)
OmniSearchGraph
^^^^^^^^^^^^^^^

.. image:: ../../assets/omnisearchgraph.png
   :align: center
   :width: 80%
   :alt: OmniSearchGraph

|
Similar to OmniScraperGraph, we define the graph configuration, create an instance of the OmniSearchGraph class, and run the graph.
It will create a search query, fetch the first n results from the search engine, run n OmniScraperGraph instances, and return the results in JSON format.
.. code-block:: python

   from scrapegraphai.graphs import OmniSearchGraph

   graph_config = {
       "llm": {...},
   }

   # Create the OmniSearchGraph instance
   omni_search_graph = OmniSearchGraph(
       prompt="List me all Chioggia's famous dishes and describe their pictures.",
       config=graph_config,
       schema=schema
   )

   # Run the graph
   result = omni_search_graph.run()
   print(result)
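The fan-out described above (one scraper instance per search result) can be sketched generically with Python's ``concurrent.futures``. This illustrates the pattern only, not the library's internals; ``scrape_one`` is a hypothetical stand-in for a per-result scraper.

```python
from concurrent.futures import ThreadPoolExecutor


# Hypothetical stand-in for running one scraper per fetched result.
def scrape_one(url: str) -> dict:
    return {"source": url, "data": f"extracted from {url}"}


# Pretend these came back from a search-engine query.
urls = [f"https://example.com/result/{i}" for i in range(3)]

# Run the per-result scrapers concurrently and merge the answers.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(scrape_one, urls))

print(results)
```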
SmartScraperGraph & SmartScraperMultiGraph
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. image:: ../../assets/smartscrapergraph.png
   :align: center
   :width: 90%
   :alt: SmartScraperGraph

|
First we define the graph configuration, which includes the LLM model and other parameters. Then we create an instance of the SmartScraperGraph class, passing the prompt, source, and configuration as arguments. Finally, we run the graph and print the result.
It will fetch the data from the source and extract the information based on the prompt, returning the result in JSON format.
.. code-block:: python

   from scrapegraphai.graphs import SmartScraperGraph

   graph_config = {
       "llm": {...},
   }

   smart_scraper_graph = SmartScraperGraph(
       prompt="List me all the projects with their descriptions",
       source="https://perinim.github.io/projects",
       config=graph_config,
       schema=schema
   )

   result = smart_scraper_graph.run()
   print(result)
**SmartScraperMultiGraph** is similar to SmartScraperGraph, but it can handle multiple sources. We define the graph configuration, create an instance of the SmartScraperMultiGraph class, and run the graph.
SearchGraph
^^^^^^^^^^^

.. image:: ../../assets/searchgraph.png
   :align: center
   :width: 80%
   :alt: SearchGraph

|
Similar to SmartScraperGraph, we define the graph configuration, create an instance of the SearchGraph class, and run the graph.
It will create a search query, fetch the first n results from the search engine, run n SmartScraperGraph instances, and return the results in JSON format.
.. code-block:: python

   from scrapegraphai.graphs import SearchGraph

   graph_config = {
       "llm": {...},
       "embeddings": {...},
   }

   # Create the SearchGraph instance
   search_graph = SearchGraph(
       prompt="List me all the traditional recipes from Chioggia",
       config=graph_config,
       schema=schema
   )

   # Run the graph
   result = search_graph.run()
   print(result)
SpeechGraph
^^^^^^^^^^^

.. image:: ../../assets/speechgraph.png
   :align: center
   :width: 90%
   :alt: SpeechGraph

|
Similar to SmartScraperGraph, we define the graph configuration, create an instance of the SpeechGraph class, and run the graph.
It will fetch the data from the source, extract the information based on the prompt, and generate an audio file with the answer, as well as the answer itself, in JSON format.

Telemetry
=========

ScrapeGraphAI collects **anonymous** usage data by default to improve the library and guide development efforts.
**Events Captured**

We capture events in the following scenarios:

1. When a ``Graph`` finishes running.
2. When an exception is raised in one of the nodes.
**Data Collected**

The data captured is limited to:

- Operating System and Python version
- A persistent UUID to identify the session, stored in ``~/.scrapegraphai.conf``
Additionally, the following properties are collected:

.. code-block:: python

   properties = {
       "graph_name": graph_name,
       "llm_model": llm_model_name,
       "embedder_model": embedder_model_name,
       "source_type": source_type,
       "execution_time": execution_time,
       "error_node": error_node_name,
   }
For more details, refer to the `telemetry.py <https://github.com/VinciGit00/Scrapegraph-ai/blob/main/scrapegraphai/telemetry/telemetry.py>`_ module.
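Because ``~/.scrapegraphai.conf`` is a standard INI file, the current telemetry setting can be inspected with Python's ``configparser``. A minimal sketch, assuming only the path and key described above; a missing file simply yields the default:

```python
import configparser
import os

# Read the ScrapeGraphAI user configuration, if present.
path = os.path.expanduser("~/.scrapegraphai.conf")
config = configparser.ConfigParser()
config.read(path)  # a missing file is silently skipped

# Fall back to True (telemetry on by default) when the key is absent.
enabled = config.getboolean("DEFAULT", "telemetry_enabled", fallback=True)
print(f"telemetry enabled: {enabled}")
```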
**Opting Out**

If you prefer not to participate in telemetry, you can opt out using any of the following methods:
1. **Programmatically Disable Telemetry**:

   Add the following code at the beginning of your script:

   .. code-block:: python

      from scrapegraphai import telemetry
      telemetry.disable_telemetry()
2. **Configuration File**:

   Set the ``telemetry_enabled`` key to ``false`` in ``~/.scrapegraphai.conf`` under the ``[DEFAULT]`` section:
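For example, the configuration file might then contain (a sketch; only the key shown above is relevant here):

```ini
[DEFAULT]
telemetry_enabled = false
```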