
Commit deb7c2d

Merge pull request #750 from ScrapeGraphAI/main
2 parents f17089c + 931b975, commit deb7c2d

20 files changed: +145 -37 lines

CHANGELOG.md

Lines changed: 99 additions & 0 deletions
@@ -1,3 +1,102 @@
+## [1.26.5](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.4...v1.26.5) (2024-10-13)
+
+
+### Bug Fixes
+
+* async invocation ([c2179ab](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/c2179abc60d1242f272067eaca4750019b6f1d7e))
+
+## [1.26.4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.3...v1.26.4) (2024-10-13)
+
+
+### Bug Fixes
+
+* csv_node ([b208ef7](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/b208ef792c9347ab608fdbe0913066343c3019ff))
+
+## [1.26.3](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.2...v1.26.3) (2024-10-13)
+
+
+### Bug Fixes
+
+* generate answer node ([431b209](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/431b2093bee2ef5eea8292e804044b06c73585d7))
+
+## [1.26.2](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.1...v1.26.2) (2024-10-13)
+
+
+### Bug Fixes
+
+* add new dipendency ([35c44e4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/35c44e4d2ca3f6f7f27c8c5efd3381e8fc3acc82))
+
+## [1.26.1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.0...v1.26.1) (2024-10-13)
+
+
+### Bug Fixes
+
+* async tim ([7b07368](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/7b073686ef1ff743defae5a2af3e740650f658d2))
+* typo ([9c62f24](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/9c62f24e7396c298f16470bac9f548e8fe51ca5f))
+* typo ([c9d6ef5](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/c9d6ef5915b2155379fba5132c8640635eb7da06))
+
+## [1.26.0](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.25.2...v1.26.0) (2024-10-13)
+
+
+### Features
+
+* add deep scraper implementation ([4b371f4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4b371f4d94dae47986aad751508813d89ce87b93))
+* add google proxy support ([a986523](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/a9865238847e2edccde579ace7ba226f7012e95d))
+* add html_mode to smart_scraper ([bdcffd6](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/bdcffd6360237b27797546a198ceece55ce4bc81))
+* add reasoning integration ([b2822f6](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/b2822f620a610e61d295cbf4b670aa08fde9de24))
+* async invocation ([257f393](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/257f393761e8ff823e37c72659c8b55925c4aecb))
+* conditional_node ([f837dc1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/f837dc16ce6db0f38fd181822748ca413b7ab4b0))
+* finished basic version of deep scraper ([85cb957](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/85cb9572971719f9f7c66171f5e2246376b6aed2))
+* prompt refactoring ([5a2f6d9](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/5a2f6d9a77a814d5c3756e85cabde8af978f4c06))
+* refactoring fetch_node ([39a029e](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/39a029ed9a8cd7c2277ba1386b976738e99d231b))
+* refactoring of mdscraper ([3b7b701](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/3b7b701a89aad503dea771db3f043167f7203d46))
+* refactoring of research web ([26f89d8](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/26f89d895d547ef2463492f82da7ac21b57b9d1b))
+* refactoring of the conditional node ([420c71b](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/420c71ba2ca0fc77465dd533a807b887c6a87f52))
+* undected_chromedriver support ([80ece21](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/80ece2179ac47a7ea42fbae4b61504a49ca18daa))
+* update chromium loader ([4f816f3](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4f816f3b04974e90ca4208158f05724cfe68ffb8))
+
+
+### Bug Fixes
+
+* bugs ([026a70b](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/026a70bd3a01b0ebab4d175ae4005e7f3ba3a833))
+* import error ([37b6ba0](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/37b6ba08ae9972240fc00a15efe43233fd093f3b))
+* integration with html_mode ([f87ffa1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/f87ffa1d8db32b38c47d9f5aa2ae88f1d7978a04))
+* nodes prompt ([8753537](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/8753537ecd2a0ba480cda482b6dc50c090b418d6))
+* pyproject.toml ([3b27c5e](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/3b27c5e88c0b0744438e8b604f40929e22d722bc))
+* refactoring prompts ([c655642](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/c65564257798a5ccdc2bdf92487cd9b069e6d951))
+* removed pdf_scraper graph and created document scraper ([a57da96](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/a57da96175a09a16d990eeee679988d10832ce13))
+* search_on_web paremter ([7f03ec1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/7f03ec15de20fc2d6c2aad2655cc5348cced1951))
+* typo ([e285127](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/e28512720c3d47917814cf388912aef0e2230188))
+
+
+### Perf
+
+* Proxy integration in googlesearch ([e828c70](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/e828c7010acb1bd04498e027da69f35d53a37890))
+
+
+### CI
+
+* **release:** 1.22.0-beta.4 [skip ci] ([4330179](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4330179cb65674d65423c1763f90182e85c15a74))
+* **release:** 1.22.0-beta.5 [skip ci] ([6d8f543](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/6d8f5435d1ecd2d90b06aade50abc064f75c9d78))
+* **release:** 1.22.0-beta.6 [skip ci] ([39f7815](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/39f78154a6f1123fa8aca5e169c803111c175473))
+* **release:** 1.26.0-beta.1 [skip ci] ([ac31d7f](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/ac31d7f7101ba6d7251131aa010d9ef948fa611f))
+* **release:** 1.26.0-beta.10 [skip ci] ([0c7ebe2](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/0c7ebe28ac32abeab9b55bca2bceb7c4e591028e))
+* **release:** 1.26.0-beta.11 [skip ci] ([6d8828a](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/6d8828aa62a8026cc874d84169a5bcb600b1a389))
+* **release:** 1.26.0-beta.12 [skip ci] ([44d10aa](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/44d10aa1c035efe5b71d4394e702ff2592eac18d))
+* **release:** 1.26.0-beta.13 [skip ci] ([12f2b99](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/12f2b9946be0b68b59a25cbd71f675ac705198cc))
+* **release:** 1.26.0-beta.14 [skip ci] ([eb25725](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/eb257259f8880466bf9a01416e0c9366d3d55a3b))
+* **release:** 1.26.0-beta.15 [skip ci] ([528a974](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/528a9746fed50c1ca1c1a572951d6a7044bf85fc))
+* **release:** 1.26.0-beta.16 [skip ci] ([04bd2a8](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/04bd2a87fbd482c92cf35398127835205d8191f0))
+* **release:** 1.26.0-beta.17 [skip ci] ([f17089c](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/f17089c123d96ae9e1407e2c008209dc630b45da))
+* **release:** 1.26.0-beta.2 [skip ci] ([5cedeb8](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/5cedeb8486f5ca30586876be0c26f81b43ce8031))
+* **release:** 1.26.0-beta.3 [skip ci] ([4f65be4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4f65be44b50b314a96bb746830070e79095b713c))
+* **release:** 1.26.0-beta.4 [skip ci] ([84d7937](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/84d7937472513d140d1a2334f974a571cbf42a45))
+* **release:** 1.26.0-beta.5 [skip ci] ([ea9ed1a](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/ea9ed1a9819f1c931297743fb69ee4ee1bf6665a))
+* **release:** 1.26.0-beta.6 [skip ci] ([4cd21f5](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4cd21f500d545852a7a17328586a45306eac7419))
+* **release:** 1.26.0-beta.7 [skip ci] ([482f060](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/482f060c9ad2a0fd203a4e47ac7103bf8040550d))
+* **release:** 1.26.0-beta.8 [skip ci] ([38b795e](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/38b795e48a1e568a823571a3c2f9fdeb95d0266e))
+* **release:** 1.26.0-beta.9 [skip ci] ([4dc0699](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4dc06994832c561eeebca172c965a42aee661f3e))
+
 ## [1.26.0-beta.17](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.0-beta.16...v1.26.0-beta.17) (2024-10-12)

pyproject.toml

Lines changed: 7 additions & 2 deletions
@@ -1,7 +1,7 @@
 [project]
 name = "scrapegraphai"

-version = "1.26.0b17"
+version = "1.26.5"

 description = "A web scraping library based on LangChain which uses LLM and direct graph logic to create scraping pipelines."
 authors = [
@@ -33,7 +33,12 @@ dependencies = [
     "fastembed>=0.3.6",
     "semchunk>=2.2.0",
     "transformers>=4.44.2",
-    "googlesearch-python>=1.2.5"
+    "transformers>=4.44.2",
+    "googlesearch-python>=1.2.5",
+    "async-timeout>=4.0.3",
+    "transformers>=4.44.2",
+    "googlesearch-python>=1.2.5",
+    "simpleeval>=1.0.0"
 ]

 license = "MIT"

requirements-dev.lock

Lines changed: 5 additions & 3 deletions
@@ -32,6 +32,7 @@ astroid==3.2.4
 async-timeout==4.0.3
     # via aiohttp
     # via langchain
+    # via scrapegraphai
 attrs==24.2.0
     # via aiohttp
     # via jsonschema
@@ -40,7 +41,7 @@ babel==2.16.0
     # via sphinx
 beautifulsoup4==4.12.3
     # via furo
-    # via google
+    # via googlesearch-python
     # via scrapegraphai
 blinker==1.8.2
     # via streamlit
@@ -108,8 +109,6 @@ gitdb==4.0.11
     # via gitpython
 gitpython==3.1.43
     # via streamlit
-google==3.0.0
-    # via scrapegraphai
 google-ai-generativelanguage==0.6.6
     # via google-generativeai
 google-api-core==2.19.1
@@ -131,6 +130,8 @@ google-generativeai==0.7.2
 googleapis-common-protos==1.63.2
     # via google-api-core
     # via grpcio-status
+googlesearch-python==1.2.5
+    # via scrapegraphai
 graphviz==0.20.3
     # via burr
 greenlet==3.0.3
@@ -417,6 +418,7 @@ requests==2.32.3
     # via fastembed
     # via free-proxy
     # via google-api-core
+    # via googlesearch-python
     # via huggingface-hub
     # via langchain
     # via langchain-community

requirements.lock

Lines changed: 5 additions & 3 deletions
@@ -21,12 +21,13 @@ anyio==4.4.0
 async-timeout==4.0.3
     # via aiohttp
     # via langchain
+    # via scrapegraphai
 attrs==23.2.0
     # via aiohttp
     # via jsonschema
     # via referencing
 beautifulsoup4==4.12.3
-    # via google
+    # via googlesearch-python
     # via scrapegraphai
 boto3==1.34.146
     # via langchain-aws
@@ -65,8 +66,6 @@ frozenlist==1.4.1
     # via aiosignal
 fsspec==2024.6.1
     # via huggingface-hub
-google==3.0.0
-    # via scrapegraphai
 google-ai-generativelanguage==0.6.6
     # via google-generativeai
 google-api-core==2.19.1
@@ -88,6 +87,8 @@ google-generativeai==0.7.2
 googleapis-common-protos==1.63.2
     # via google-api-core
     # via grpcio-status
+googlesearch-python==1.2.5
+    # via scrapegraphai
 greenlet==3.0.3
     # via playwright
 grpcio==1.65.1
@@ -306,6 +307,7 @@ requests==2.32.3
     # via fastembed
     # via free-proxy
     # via google-api-core
+    # via googlesearch-python
     # via huggingface-hub
     # via langchain
     # via langchain-community
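
The lock-file changes above swap the unmaintained `google` package for `googlesearch-python` as the search backend. A rough sketch of how that package is typically used, assuming its documented interface; the query string and result count are illustrative, not taken from this commit:

```python
# Rough usage sketch for googlesearch-python (illustrative values).
from googlesearch import search

# search() yields result URLs for a query.
for url in search("scrapegraphai documentation", num_results=5):
    print(url)
```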

scrapegraphai/builders/graph_builder.py

Lines changed: 1 addition & 1 deletion
@@ -119,7 +119,7 @@ def build_graph(self):
         Returns:
             dict: A JSON representation of the graph configuration.
         """
-        return self.chain.ainvoke(self.prompt)
+        return self.chain.invoke(self.prompt)

     @staticmethod
     def convert_json_to_graphviz(json_data, format: str = 'pdf'):
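
The change above, repeated across the node files below, replaces `ainvoke` calls made from synchronous methods with `invoke`. A minimal sketch, not from this commit, of why that matters for any LangChain runnable: calling `.ainvoke()` without awaiting it returns a coroutine object rather than a result, while `.invoke()` runs the chain and returns the value directly.

```python
# Minimal sketch (not from this commit): sync vs. async invocation of a runnable.
import asyncio
from langchain_core.runnables import RunnableLambda

chain = RunnableLambda(lambda prompt: f"graph config for: {prompt}")

result = chain.invoke("scrape article titles")    # runs now, returns the string
pending = chain.ainvoke("scrape article titles")  # a coroutine object, no result yet
result_async = asyncio.run(pending)               # must be awaited/run to get the value
```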

scrapegraphai/nodes/generate_answer_csv_node.py

Lines changed: 3 additions & 3 deletions
@@ -60,7 +60,7 @@ def __init__(

         self.additional_info = node_config.get("additional_info")

-    async def execute(self, state):
+    def execute(self, state):
         """
         Generates an answer by constructing a prompt from the user's input and the scraped
         content, querying the language model, and parsing its response.
@@ -126,7 +126,7 @@ async def execute(self, state):
         )

         chain = prompt | self.llm_model | output_parser
-        answer = chain.ainvoke({"question": user_prompt})
+        answer = chain.invoke({"question": user_prompt})
         state.update({self.output[0]: answer})
         return state
@@ -157,7 +157,7 @@ async def execute(self, state):
         )

         merge_chain = merge_prompt | self.llm_model | output_parser
-        answer = await merge_chain.ainvoke({"context": batch_results, "question": user_prompt})
+        answer = merge_chain.invoke({"context": batch_results, "question": user_prompt})

         state.update({self.output[0]: answer})
         return state

scrapegraphai/nodes/generate_answer_node.py

Lines changed: 4 additions & 4 deletions
@@ -57,7 +57,7 @@ def __init__(
         self.is_md_scraper = node_config.get("is_md_scraper", False)
         self.additional_info = node_config.get("additional_info")

-    async def execute(self, state: dict) -> dict:
+    def execute(self, state: dict) -> dict:
         """
         Executes the GenerateAnswerNode.

@@ -123,7 +123,7 @@ async def execute(self, state: dict) -> dict:
         chain = prompt | self.llm_model
         if output_parser:
             chain = chain | output_parser
-        answer = await chain.ainvoke({"question": user_prompt})
+        answer = chain.invoke({"question": user_prompt})

         state.update({self.output[0]: answer})
         return state
@@ -143,7 +143,7 @@ async def execute(self, state: dict) -> dict:
                 chains_dict[chain_name] = chains_dict[chain_name] | output_parser

         async_runner = RunnableParallel(**chains_dict)
-        batch_results = await async_runner.ainvoke({"question": user_prompt})
+        batch_results = async_runner.invoke({"question": user_prompt})

         merge_prompt = PromptTemplate(
             template=template_merge_prompt,
@@ -154,7 +154,7 @@ async def execute(self, state: dict) -> dict:
         merge_chain = merge_prompt | self.llm_model
         if output_parser:
             merge_chain = merge_chain | output_parser
-        answer = await merge_chain.ainvoke({"context": batch_results, "question": user_prompt})
+        answer = merge_chain.invoke({"context": batch_results, "question": user_prompt})

         state.update({self.output[0]: answer})
         return state
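
In `generate_answer_node.py`, the per-chunk fan-out is now also driven synchronously. A rough sketch with placeholder chains instead of the node's real prompt/LLM pipeline, showing how `RunnableParallel(...).invoke(...)` returns one result per named chain, which the merge chain can then consume:

```python
# Rough sketch with placeholder chains (not the node's actual prompts or LLM).
from langchain_core.runnables import RunnableLambda, RunnableParallel

chains_dict = {
    "chunk1": RunnableLambda(lambda x: f"answer from chunk 1 to: {x['question']}"),
    "chunk2": RunnableLambda(lambda x: f"answer from chunk 2 to: {x['question']}"),
}

runner = RunnableParallel(**chains_dict)
# .invoke() runs every chain on the same input and returns a dict keyed by chain name.
batch_results = runner.invoke({"question": "What is on the page?"})
print(batch_results)  # {'chunk1': '...', 'chunk2': '...'}
```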

scrapegraphai/nodes/generate_answer_node_k_level.py

Lines changed: 1 addition & 1 deletion
@@ -143,7 +143,7 @@ def execute(self, state: dict) -> dict:
         merge_chain = merge_prompt | self.llm_model
         if output_parser:
             merge_chain = merge_chain | output_parser
-        answer = merge_chain.ainvoke({"context": batch_results, "question": user_prompt})
+        answer = merge_chain.invoke({"context": batch_results, "question": user_prompt})

         state["answer"] = answer

scrapegraphai/nodes/generate_answer_omni_node.py

Lines changed: 2 additions & 2 deletions
@@ -117,7 +117,7 @@ def execute(self, state: dict) -> dict:
         )

         chain = prompt | self.llm_model | output_parser
-        answer = chain.ainvoke({"question": user_prompt})
+        answer = chain.invoke({"question": user_prompt})

         state.update({self.output[0]: answer})
         return state
@@ -149,7 +149,7 @@ def execute(self, state: dict) -> dict:
         )

         merge_chain = merge_prompt | self.llm_model | output_parser
-        answer = merge_chain.ainvoke({"context": batch_results, "question": user_prompt})
+        answer = merge_chain.invoke({"context": batch_results, "question": user_prompt})

         state.update({self.output[0]: answer})
         return state

scrapegraphai/nodes/generate_code_node.py

Lines changed: 2 additions & 2 deletions
@@ -325,7 +325,7 @@ def generate_initial_code(self, state: dict) -> str:
         output_parser = StrOutputParser()

         chain = prompt | self.llm_model | output_parser
-        generated_code = chain.ainvoke({})
+        generated_code = chain.invoke({})
         return generated_code

     def semantic_comparison(self, generated_result: Any, reference_result: Any) -> Dict[str, Any]:
@@ -368,7 +368,7 @@ def semantic_comparison(self, generated_result: Any, reference_result: Any) -> D
         )

         chain = prompt | self.llm_model | output_parser
-        return chain.ainvoke({
+        return chain.invoke({
             "generated_result": json.dumps(generated_result, indent=2),
             "reference_result": json.dumps(reference_result_dict, indent=2)
         })

scrapegraphai/nodes/generate_scraper_node.py

Lines changed: 1 addition & 1 deletion
@@ -130,7 +130,7 @@ def execute(self, state: dict) -> dict:
         )
         map_chain = prompt | self.llm_model | StrOutputParser()

-        answer = map_chain.ainvoke({"question": user_prompt})
+        answer = map_chain.invoke({"question": user_prompt})

         state.update({self.output[0]: answer})
         return state

scrapegraphai/nodes/html_analyzer_node.py

Lines changed: 1 addition & 1 deletion
@@ -93,7 +93,7 @@ def execute(self, state: dict) -> dict:
         output_parser = StrOutputParser()

         chain = prompt | self.llm_model | output_parser
-        html_analysis = chain.ainvoke({})
+        html_analysis = chain.invoke({})

         state.update({self.output[0]: html_analysis, self.output[1]: reduced_html})
         return state

scrapegraphai/nodes/merge_answers_node.py

Lines changed: 1 addition & 1 deletion
@@ -95,7 +95,7 @@ def execute(self, state: dict) -> dict:
         )

         merge_chain = prompt_template | self.llm_model | output_parser
-        answer = merge_chain.ainvoke({"user_prompt": user_prompt})
+        answer = merge_chain.invoke({"user_prompt": user_prompt})
         answer["sources"] = state.get("urls", [])

         state.update({self.output[0]: answer})

scrapegraphai/nodes/merge_generated_scripts_node.py

Lines changed: 1 addition & 1 deletion
@@ -74,7 +74,7 @@ def execute(self, state: dict) -> dict:
         )

         merge_chain = prompt_template | self.llm_model | StrOutputParser()
-        answer = merge_chain.ainvoke({"user_prompt": user_prompt})
+        answer = merge_chain.invoke({"user_prompt": user_prompt})

         state.update({self.output[0]: answer})
         return state

scrapegraphai/nodes/prompt_refiner_node.py

Lines changed: 1 addition & 1 deletion
@@ -96,7 +96,7 @@ def execute(self, state: dict) -> dict:
         output_parser = StrOutputParser()

         chain = prompt | self.llm_model | output_parser
-        refined_prompt = chain.ainvoke({})
+        refined_prompt = chain.invoke({})

         state.update({self.output[0]: refined_prompt})
         return state

scrapegraphai/nodes/reasoning_node.py

Lines changed: 1 addition & 1 deletion
@@ -91,7 +91,7 @@ def execute(self, state: dict) -> dict:
         output_parser = StrOutputParser()

         chain = prompt | self.llm_model | output_parser
-        refined_prompt = chain.ainvoke({})
+        refined_prompt = chain.invoke({})

         state.update({self.output[0]: refined_prompt})
         return state

scrapegraphai/nodes/robots_node.py

Lines changed: 1 addition & 1 deletion
@@ -108,7 +108,7 @@ def execute(self, state: dict) -> dict:
         )

         chain = prompt | self.llm_model | output_parser
-        is_scrapable = chain.ainvoke({"path": source})[0]
+        is_scrapable = chain.invoke({"path": source})[0]

         if "no" in is_scrapable:
             self.logger.warning(

scrapegraphai/nodes/search_link_node.py

Lines changed: 1 addition & 1 deletion
@@ -142,7 +142,7 @@ def execute(self, state: dict) -> dict:
                 input_variables=["content", "user_prompt"],
             )
             merge_chain = merge_prompt | self.llm_model | output_parser
-            answer = merge_chain.ainvoke(
+            answer = merge_chain.invoke(
                 {"content": chunk.page_content}
             )
             relevant_links += answer
