allignment #750

Merged (18 commits, Oct 14, 2024)
99 changes: 99 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,102 @@
## [1.26.5](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.4...v1.26.5) (2024-10-13)


### Bug Fixes

* async invocation ([c2179ab](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/c2179abc60d1242f272067eaca4750019b6f1d7e))

## [1.26.4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.3...v1.26.4) (2024-10-13)


### Bug Fixes

* csv_node ([b208ef7](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/b208ef792c9347ab608fdbe0913066343c3019ff))

## [1.26.3](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.2...v1.26.3) (2024-10-13)


### Bug Fixes

* generate answer node ([431b209](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/431b2093bee2ef5eea8292e804044b06c73585d7))

## [1.26.2](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.1...v1.26.2) (2024-10-13)


### Bug Fixes

* add new dipendency ([35c44e4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/35c44e4d2ca3f6f7f27c8c5efd3381e8fc3acc82))

## [1.26.1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.0...v1.26.1) (2024-10-13)


### Bug Fixes

* async tim ([7b07368](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/7b073686ef1ff743defae5a2af3e740650f658d2))
* typo ([9c62f24](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/9c62f24e7396c298f16470bac9f548e8fe51ca5f))
* typo ([c9d6ef5](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/c9d6ef5915b2155379fba5132c8640635eb7da06))

## [1.26.0](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.25.2...v1.26.0) (2024-10-13)


### Features

* add deep scraper implementation ([4b371f4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4b371f4d94dae47986aad751508813d89ce87b93))
* add google proxy support ([a986523](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/a9865238847e2edccde579ace7ba226f7012e95d))
* add html_mode to smart_scraper ([bdcffd6](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/bdcffd6360237b27797546a198ceece55ce4bc81))
* add reasoning integration ([b2822f6](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/b2822f620a610e61d295cbf4b670aa08fde9de24))
* async invocation ([257f393](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/257f393761e8ff823e37c72659c8b55925c4aecb))
* conditional_node ([f837dc1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/f837dc16ce6db0f38fd181822748ca413b7ab4b0))
* finished basic version of deep scraper ([85cb957](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/85cb9572971719f9f7c66171f5e2246376b6aed2))
* prompt refactoring ([5a2f6d9](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/5a2f6d9a77a814d5c3756e85cabde8af978f4c06))
* refactoring fetch_node ([39a029e](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/39a029ed9a8cd7c2277ba1386b976738e99d231b))
* refactoring of mdscraper ([3b7b701](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/3b7b701a89aad503dea771db3f043167f7203d46))
* refactoring of research web ([26f89d8](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/26f89d895d547ef2463492f82da7ac21b57b9d1b))
* refactoring of the conditional node ([420c71b](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/420c71ba2ca0fc77465dd533a807b887c6a87f52))
* undected_chromedriver support ([80ece21](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/80ece2179ac47a7ea42fbae4b61504a49ca18daa))
* update chromium loader ([4f816f3](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4f816f3b04974e90ca4208158f05724cfe68ffb8))


### Bug Fixes

* bugs ([026a70b](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/026a70bd3a01b0ebab4d175ae4005e7f3ba3a833))
* import error ([37b6ba0](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/37b6ba08ae9972240fc00a15efe43233fd093f3b))
* integration with html_mode ([f87ffa1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/f87ffa1d8db32b38c47d9f5aa2ae88f1d7978a04))
* nodes prompt ([8753537](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/8753537ecd2a0ba480cda482b6dc50c090b418d6))
* pyproject.toml ([3b27c5e](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/3b27c5e88c0b0744438e8b604f40929e22d722bc))
* refactoring prompts ([c655642](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/c65564257798a5ccdc2bdf92487cd9b069e6d951))
* removed pdf_scraper graph and created document scraper ([a57da96](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/a57da96175a09a16d990eeee679988d10832ce13))
* search_on_web paremter ([7f03ec1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/7f03ec15de20fc2d6c2aad2655cc5348cced1951))
* typo ([e285127](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/e28512720c3d47917814cf388912aef0e2230188))


### Perf

* Proxy integration in googlesearch ([e828c70](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/e828c7010acb1bd04498e027da69f35d53a37890))


### CI

* **release:** 1.22.0-beta.4 [skip ci] ([4330179](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4330179cb65674d65423c1763f90182e85c15a74))
* **release:** 1.22.0-beta.5 [skip ci] ([6d8f543](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/6d8f5435d1ecd2d90b06aade50abc064f75c9d78))
* **release:** 1.22.0-beta.6 [skip ci] ([39f7815](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/39f78154a6f1123fa8aca5e169c803111c175473))
* **release:** 1.26.0-beta.1 [skip ci] ([ac31d7f](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/ac31d7f7101ba6d7251131aa010d9ef948fa611f))
* **release:** 1.26.0-beta.10 [skip ci] ([0c7ebe2](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/0c7ebe28ac32abeab9b55bca2bceb7c4e591028e))
* **release:** 1.26.0-beta.11 [skip ci] ([6d8828a](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/6d8828aa62a8026cc874d84169a5bcb600b1a389))
* **release:** 1.26.0-beta.12 [skip ci] ([44d10aa](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/44d10aa1c035efe5b71d4394e702ff2592eac18d))
* **release:** 1.26.0-beta.13 [skip ci] ([12f2b99](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/12f2b9946be0b68b59a25cbd71f675ac705198cc))
* **release:** 1.26.0-beta.14 [skip ci] ([eb25725](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/eb257259f8880466bf9a01416e0c9366d3d55a3b))
* **release:** 1.26.0-beta.15 [skip ci] ([528a974](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/528a9746fed50c1ca1c1a572951d6a7044bf85fc))
* **release:** 1.26.0-beta.16 [skip ci] ([04bd2a8](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/04bd2a87fbd482c92cf35398127835205d8191f0))
* **release:** 1.26.0-beta.17 [skip ci] ([f17089c](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/f17089c123d96ae9e1407e2c008209dc630b45da))
* **release:** 1.26.0-beta.2 [skip ci] ([5cedeb8](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/5cedeb8486f5ca30586876be0c26f81b43ce8031))
* **release:** 1.26.0-beta.3 [skip ci] ([4f65be4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4f65be44b50b314a96bb746830070e79095b713c))
* **release:** 1.26.0-beta.4 [skip ci] ([84d7937](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/84d7937472513d140d1a2334f974a571cbf42a45))
* **release:** 1.26.0-beta.5 [skip ci] ([ea9ed1a](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/ea9ed1a9819f1c931297743fb69ee4ee1bf6665a))
* **release:** 1.26.0-beta.6 [skip ci] ([4cd21f5](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4cd21f500d545852a7a17328586a45306eac7419))
* **release:** 1.26.0-beta.7 [skip ci] ([482f060](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/482f060c9ad2a0fd203a4e47ac7103bf8040550d))
* **release:** 1.26.0-beta.8 [skip ci] ([38b795e](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/38b795e48a1e568a823571a3c2f9fdeb95d0266e))
* **release:** 1.26.0-beta.9 [skip ci] ([4dc0699](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/4dc06994832c561eeebca172c965a42aee661f3e))

## [1.26.0-beta.17](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.26.0-beta.16...v1.26.0-beta.17) (2024-10-12)


9 changes: 7 additions & 2 deletions pyproject.toml
@@ -1,7 +1,7 @@
[project]
name = "scrapegraphai"

version = "1.26.0b17"
version = "1.26.5"

description = "A web scraping library based on LangChain which uses LLM and direct graph logic to create scraping pipelines."
authors = [
@@ -33,7 +33,12 @@ dependencies = [
"fastembed>=0.3.6",
"semchunk>=2.2.0",
"transformers>=4.44.2",
"googlesearch-python>=1.2.5"
"transformers>=4.44.2",
"googlesearch-python>=1.2.5",
"async-timeout>=4.0.3",
"transformers>=4.44.2",
"googlesearch-python>=1.2.5",
"simpleeval>=1.0.0"
]

license = "MIT"
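Besides the version bump to 1.26.5, the dependency block above pins two new runtime requirements: async-timeout (previously only a transitive dependency of aiohttp and langchain, now declared directly) and simpleeval, a small sandboxed expression evaluator. How ScrapeGraphAI wires simpleeval in (presumably for ConditionalNode condition strings) is outside this diff; the following is a minimal, hypothetical usage sketch for orientation only.

```python
# Hypothetical sketch (not code from this PR): evaluating a condition string
# with simpleeval, the sandboxed evaluator added to the dependency list.
from simpleeval import simple_eval

state = {"link_urls": ["https://example.com/a", "https://example.com/b"]}

# Evaluate a user-supplied condition against graph state without using eval().
condition = "len(link_urls) > 0"
result = simple_eval(condition, names=state, functions={"len": len})
print(result)  # -> True
```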
8 changes: 5 additions & 3 deletions requirements-dev.lock
@@ -32,6 +32,7 @@ astroid==3.2.4
async-timeout==4.0.3
# via aiohttp
# via langchain
# via scrapegraphai
attrs==24.2.0
# via aiohttp
# via jsonschema
@@ -40,7 +41,7 @@ babel==2.16.0
# via sphinx
beautifulsoup4==4.12.3
# via furo
# via google
# via googlesearch-python
# via scrapegraphai
blinker==1.8.2
# via streamlit
@@ -108,8 +109,6 @@ gitdb==4.0.11
# via gitpython
gitpython==3.1.43
# via streamlit
google==3.0.0
# via scrapegraphai
google-ai-generativelanguage==0.6.6
# via google-generativeai
google-api-core==2.19.1
@@ -131,6 +130,8 @@ google-generativeai==0.7.2
googleapis-common-protos==1.63.2
# via google-api-core
# via grpcio-status
googlesearch-python==1.2.5
# via scrapegraphai
graphviz==0.20.3
# via burr
greenlet==3.0.3
@@ -417,6 +418,7 @@ requests==2.32.3
# via fastembed
# via free-proxy
# via google-api-core
# via googlesearch-python
# via huggingface-hub
# via langchain
# via langchain-community
8 changes: 5 additions & 3 deletions requirements.lock
@@ -21,12 +21,13 @@ anyio==4.4.0
async-timeout==4.0.3
# via aiohttp
# via langchain
# via scrapegraphai
attrs==23.2.0
# via aiohttp
# via jsonschema
# via referencing
beautifulsoup4==4.12.3
# via google
# via googlesearch-python
# via scrapegraphai
boto3==1.34.146
# via langchain-aws
@@ -65,8 +66,6 @@ frozenlist==1.4.1
# via aiosignal
fsspec==2024.6.1
# via huggingface-hub
google==3.0.0
# via scrapegraphai
google-ai-generativelanguage==0.6.6
# via google-generativeai
google-api-core==2.19.1
@@ -88,6 +87,8 @@ google-generativeai==0.7.2
googleapis-common-protos==1.63.2
# via google-api-core
# via grpcio-status
googlesearch-python==1.2.5
# via scrapegraphai
greenlet==3.0.3
# via playwright
grpcio==1.65.1
@@ -306,6 +307,7 @@ requests==2.32.3
# via fastembed
# via free-proxy
# via google-api-core
# via googlesearch-python
# via huggingface-hub
# via langchain
# via langchain-community
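Both lock files drop google==3.0.0 and instead list googlesearch-python==1.2.5 as the package ScrapeGraphAI depends on for Google search results (note the "# via scrapegraphai" entries moving accordingly). A minimal, hypothetical usage sketch follows, assuming the package's documented top-level search helper; it is not code from this PR.

```python
# Hypothetical sketch (not from this PR): fetching result URLs with
# googlesearch-python, the package that replaces the old `google` dependency.
# Assumes the documented helper signature search(term, num_results=10, ...).
from googlesearch import search

# num_results caps how many result links are scraped from Google.
for url in search("ScrapeGraphAI documentation", num_results=5):
    print(url)
```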
2 changes: 1 addition & 1 deletion scrapegraphai/builders/graph_builder.py
@@ -119,7 +119,7 @@ def build_graph(self):
Returns:
dict: A JSON representation of the graph configuration.
"""
return self.chain.ainvoke(self.prompt)
return self.chain.invoke(self.prompt)

@staticmethod
def convert_json_to_graphviz(json_data, format: str = 'pdf'):
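The change above and the node diffs that follow all make the same fix: synchronous methods (GraphBuilder.build_graph and the nodes' execute methods) were calling LangChain's ainvoke, which is a coroutine and only yields a result when awaited, so the un-awaited calls handed coroutine objects back to the graph. The PR switches these call sites to invoke and drops the stray async qualifiers. The sketch below is illustrative only, using a RunnableLambda stand-in rather than the project's prompt/LLM chains.

```python
# Illustrative sketch (not code from this PR): .invoke() vs .ainvoke()
# on a LangChain Runnable, with a RunnableLambda standing in for a chain
# such as `prompt | self.llm_model | output_parser`.
import asyncio

from langchain_core.runnables import RunnableLambda

chain = RunnableLambda(lambda inputs: inputs["question"].upper())

# Synchronous call site (like the node execute methods after this PR):
# .invoke() returns the result directly.
answer = chain.invoke({"question": "is the scraper configured?"})
print(answer)  # -> "IS THE SCRAPER CONFIGURED?"

# .ainvoke() returns a coroutine; without `await` you get a coroutine
# object instead of the answer, which is the bug being fixed here.
async def main() -> str:
    return await chain.ainvoke({"question": "is the scraper configured?"})

print(asyncio.run(main()))  # same result, but only inside an event loop
```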
6 changes: 3 additions & 3 deletions scrapegraphai/nodes/generate_answer_csv_node.py
@@ -60,7 +60,7 @@ def __init__(

self.additional_info = node_config.get("additional_info")

async def execute(self, state):
def execute(self, state):
"""
Generates an answer by constructing a prompt from the user's input and the scraped
content, querying the language model, and parsing its response.
@@ -126,7 +126,7 @@ async def execute(self, state):
)

chain = prompt | self.llm_model | output_parser
answer = chain.ainvoke({"question": user_prompt})
answer = chain.invoke({"question": user_prompt})
state.update({self.output[0]: answer})
return state

@@ -157,7 +157,7 @@ async def execute(self, state):
)

merge_chain = merge_prompt | self.llm_model | output_parser
answer = await merge_chain.ainvoke({"context": batch_results, "question": user_prompt})
answer = merge_chain.invoke({"context": batch_results, "question": user_prompt})

state.update({self.output[0]: answer})
return state
8 changes: 4 additions & 4 deletions scrapegraphai/nodes/generate_answer_node.py
@@ -57,7 +57,7 @@ def __init__(
self.is_md_scraper = node_config.get("is_md_scraper", False)
self.additional_info = node_config.get("additional_info")

async def execute(self, state: dict) -> dict:
def execute(self, state: dict) -> dict:
"""
Executes the GenerateAnswerNode.

@@ -123,7 +123,7 @@ async def execute(self, state: dict) -> dict:
chain = prompt | self.llm_model
if output_parser:
chain = chain | output_parser
answer = await chain.ainvoke({"question": user_prompt})
answer = chain.invoke({"question": user_prompt})

state.update({self.output[0]: answer})
return state
@@ -143,7 +143,7 @@ async def execute(self, state: dict) -> dict:
chains_dict[chain_name] = chains_dict[chain_name] | output_parser

async_runner = RunnableParallel(**chains_dict)
batch_results = await async_runner.ainvoke({"question": user_prompt})
batch_results = async_runner.invoke({"question": user_prompt})

merge_prompt = PromptTemplate(
template=template_merge_prompt,
@@ -154,7 +154,7 @@ async def execute(self, state: dict) -> dict:
merge_chain = merge_prompt | self.llm_model
if output_parser:
merge_chain = merge_chain | output_parser
answer = await merge_chain.ainvoke({"context": batch_results, "question": user_prompt})
answer = merge_chain.invoke({"context": batch_results, "question": user_prompt})

state.update({self.output[0]: answer})
return state
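In GenerateAnswerNode the same change also covers the fan-out over document chunks: the RunnableParallel is now driven with invoke, and LangChain's synchronous path still runs the branches concurrently on a thread pool, so the node needs no event loop. The following is a hypothetical sketch of that fan-out/merge pattern with RunnableLambda stand-ins for the per-chunk LLM chains; it is not code from the project.

```python
# Hypothetical sketch (not from this PR) of the chunk fan-out and merge used
# in GenerateAnswerNode, with RunnableLambda stand-ins for the LLM chains.
from langchain_core.runnables import RunnableLambda, RunnableParallel

# One chain per document chunk, keyed like the node's chains_dict.
chains_dict = {
    f"chunk{i}": RunnableLambda(lambda x, i=i: f"partial answer {i} to {x['question']!r}")
    for i in range(1, 4)
}

# .invoke() on RunnableParallel runs the branches concurrently (thread pool),
# returning a dict of results keyed by chain name.
async_runner = RunnableParallel(**chains_dict)
batch_results = async_runner.invoke({"question": "what does the page say?"})

# Merge step, mirroring `merge_prompt | self.llm_model | output_parser`.
merge_chain = RunnableLambda(lambda x: " / ".join(x["context"].values()))
answer = merge_chain.invoke({"context": batch_results, "question": "what does the page say?"})
print(answer)
```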
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/generate_answer_node_k_level.py
@@ -143,7 +143,7 @@ def execute(self, state: dict) -> dict:
merge_chain = merge_prompt | self.llm_model
if output_parser:
merge_chain = merge_chain | output_parser
answer = merge_chain.ainvoke({"context": batch_results, "question": user_prompt})
answer = merge_chain.invoke({"context": batch_results, "question": user_prompt})

state["answer"] = answer

4 changes: 2 additions & 2 deletions scrapegraphai/nodes/generate_answer_omni_node.py
@@ -117,7 +117,7 @@ def execute(self, state: dict) -> dict:
)

chain = prompt | self.llm_model | output_parser
answer = chain.ainvoke({"question": user_prompt})
answer = chain.invoke({"question": user_prompt})

state.update({self.output[0]: answer})
return state
@@ -149,7 +149,7 @@ def execute(self, state: dict) -> dict:
)

merge_chain = merge_prompt | self.llm_model | output_parser
answer = merge_chain.ainvoke({"context": batch_results, "question": user_prompt})
answer = merge_chain.invoke({"context": batch_results, "question": user_prompt})

state.update({self.output[0]: answer})
return state
4 changes: 2 additions & 2 deletions scrapegraphai/nodes/generate_code_node.py
@@ -325,7 +325,7 @@ def generate_initial_code(self, state: dict) -> str:
output_parser = StrOutputParser()

chain = prompt | self.llm_model | output_parser
generated_code = chain.ainvoke({})
generated_code = chain.invoke({})
return generated_code

def semantic_comparison(self, generated_result: Any, reference_result: Any) -> Dict[str, Any]:
@@ -368,7 +368,7 @@ def semantic_comparison(self, generated_result: Any, reference_result: Any) -> D
)

chain = prompt | self.llm_model | output_parser
return chain.ainvoke({
return chain.invoke({
"generated_result": json.dumps(generated_result, indent=2),
"reference_result": json.dumps(reference_result_dict, indent=2)
})
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/generate_scraper_node.py
@@ -130,7 +130,7 @@ def execute(self, state: dict) -> dict:
)
map_chain = prompt | self.llm_model | StrOutputParser()

answer = map_chain.ainvoke({"question": user_prompt})
answer = map_chain.invoke({"question": user_prompt})

state.update({self.output[0]: answer})
return state
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/html_analyzer_node.py
@@ -93,7 +93,7 @@ def execute(self, state: dict) -> dict:
output_parser = StrOutputParser()

chain = prompt | self.llm_model | output_parser
html_analysis = chain.ainvoke({})
html_analysis = chain.invoke({})

state.update({self.output[0]: html_analysis, self.output[1]: reduced_html})
return state
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/merge_answers_node.py
@@ -95,7 +95,7 @@ def execute(self, state: dict) -> dict:
)

merge_chain = prompt_template | self.llm_model | output_parser
answer = merge_chain.ainvoke({"user_prompt": user_prompt})
answer = merge_chain.invoke({"user_prompt": user_prompt})
answer["sources"] = state.get("urls", [])

state.update({self.output[0]: answer})
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/merge_generated_scripts_node.py
@@ -74,7 +74,7 @@ def execute(self, state: dict) -> dict:
)

merge_chain = prompt_template | self.llm_model | StrOutputParser()
answer = merge_chain.ainvoke({"user_prompt": user_prompt})
answer = merge_chain.invoke({"user_prompt": user_prompt})

state.update({self.output[0]: answer})
return state
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/prompt_refiner_node.py
@@ -96,7 +96,7 @@ def execute(self, state: dict) -> dict:
output_parser = StrOutputParser()

chain = prompt | self.llm_model | output_parser
refined_prompt = chain.ainvoke({})
refined_prompt = chain.invoke({})

state.update({self.output[0]: refined_prompt})
return state
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/reasoning_node.py
@@ -91,7 +91,7 @@ def execute(self, state: dict) -> dict:
output_parser = StrOutputParser()

chain = prompt | self.llm_model | output_parser
refined_prompt = chain.ainvoke({})
refined_prompt = chain.invoke({})

state.update({self.output[0]: refined_prompt})
return state
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/robots_node.py
@@ -108,7 +108,7 @@ def execute(self, state: dict) -> dict:
)

chain = prompt | self.llm_model | output_parser
is_scrapable = chain.ainvoke({"path": source})[0]
is_scrapable = chain.invoke({"path": source})[0]

if "no" in is_scrapable:
self.logger.warning(
2 changes: 1 addition & 1 deletion scrapegraphai/nodes/search_link_node.py
@@ -142,7 +142,7 @@ def execute(self, state: dict) -> dict:
input_variables=["content", "user_prompt"],
)
merge_chain = merge_prompt | self.llm_model | output_parser
answer = merge_chain.ainvoke(
answer = merge_chain.invoke(
{"content": chunk.page_content}
)
relevant_links += answer