Commit ffcbeb6

Add additional MCP examples (#430)
# Motivation

# Content

# Testing

# Please check the following before marking your PR as ready for review
- [ ] I have added tests for my changes
- [ ] I have updated the documentation or added new documentation as needed

Co-authored-by: rushilpatel0 <[email protected]>
1 parent 451002e commit ffcbeb6

File tree

15 files changed: +468 additions, −66 deletions


docs/introduction/ide-usage.mdx

Lines changed: 3 additions & 3 deletions

@@ -59,8 +59,8 @@ it will allow an agent to:
 - improve a codemod
 - get setup instructions

-### Configuration
-#### Usage with Cline:
+### IDE Configuration
+#### Cline
 Add this to your cline_mcp_settings.json:
 ```
 {
@@ -79,7 +79,7 @@ Add this to your cline_mcp_settings.json:
 ```

-#### Usage with Cursor:
+#### Cursor:
 Under the `Settings` > `Feature` > `MCP Servers` section, click "Add New MCP Server" and add the following:

 ```
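The JSON body of the Cline settings lies outside this hunk (original lines 66–79 are not shown). For orientation only, a typical stdio MCP server entry in `cline_mcp_settings.json` has roughly this shape; the server name, command, and args below are hypothetical placeholders, not the values from the docs page:

```json
{
  "mcpServers": {
    "codegen-mcp": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/server", "server.py"]
    }
  }
}
```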

docs/mint.json

Lines changed: 2 additions & 1 deletion

@@ -100,7 +100,8 @@
         "tutorials/sqlalchemy-1.6-to-2.0",
         "tutorials/fixing-import-loops-in-pytorch",
         "tutorials/python2-to-python3",
-        "tutorials/flask-to-fastapi"
+        "tutorials/flask-to-fastapi",
+        "tutorials/build-mcp"
       ]
     },
     {
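The mint.json edit appends one page id to the tutorials navigation group, adding the comma the previously-last entry now needs. A quick stdlib check that the resulting fragment parses and ends with the new entry (group abbreviated; structure assumed from the diff):

```python
import json

# Abbreviated navigation group after the edit, with the new trailing entry.
group = json.loads("""
{
  "pages": [
    "tutorials/python2-to-python3",
    "tutorials/flask-to-fastapi",
    "tutorials/build-mcp"
  ]
}
""")

print(group["pages"][-1])  # tutorials/build-mcp
```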

docs/tutorials/build-mcp.mdx

Lines changed: 75 additions & 0 deletions (new file)

---
title: "Building a Model Context Protocol server with Codegen"
sidebarTitle: "MCP Server"
icon: "boxes-stacked"
iconType: "solid"
---

Learn how to build a Model Context Protocol (MCP) server that enables AI models to understand and manipulate code using Codegen's powerful tools.

This guide will walk you through creating an MCP server that provides semantic code search.

<Info>View the full code in our [examples repository](https://github.com/codegen-sh/codegen-sdk/tree/develop/src/codegen/extensions/mcp)</Info>

## Setup

Install the MCP Python library:

```
uv pip install mcp
```

## Step 1: Setting Up Your MCP Server

First, let's create a basic MCP server using Codegen's MCP tools:

server.py
```python
from typing import Annotated

from mcp.server.fastmcp import FastMCP

from codegen import Codebase

# Initialize the codebase
codebase = Codebase.from_repo(".")

# Create the MCP server using FastMCP
mcp = FastMCP(name="demo-mcp", instructions="Use this server for semantic search of codebases")


if __name__ == "__main__":
    # Initialize and run the server
    print("Starting demo MCP server...")
    mcp.run(transport="stdio")
```

## Step 2: Create the search tool

Let's implement the semantic search tool.

server.py
```python
from codegen.extensions.tools.semantic_search import semantic_search

...

@mcp.tool("codebase_semantic_search", "search codebase with the provided query")
def search(query: Annotated[str, "search query to run against codebase"]):
    codebase = Codebase("provide location to codebase", programming_language="provide codebase language")
    # Use the semantic_search tool from codegen.extensions.tools OR write your own
    results = semantic_search(codebase=codebase, query=query)
    return results

...
```

## Run Your MCP Server

You can run and inspect your MCP server with:

```
mcp dev server.py
```

If you'd like to integrate this into an IDE, check out this [setup guide](/introduction/ide-usage#mcp-server-setup).

And that's a wrap! Chime in at our [community Slack](https://community.codegen.com) if you have questions or ideas for additional MCP tools/capabilities.
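For context on what `mcp dev` exercises: the stdio transport carries newline-delimited JSON-RPC 2.0 messages, so invoking a tool like the search tool in this tutorial travels as a `tools/call` request. A minimal sketch of that message shape (field values are illustrative, not output produced by the tutorial code):

```python
import json

# Shape of a JSON-RPC 2.0 request an MCP client would send over stdio
# to invoke the "codebase_semantic_search" tool from the tutorial.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "codebase_semantic_search",
        "arguments": {"query": "where is the Codebase class defined?"},
    },
}

# Messages are serialized as single-line JSON on the wire.
wire = json.dumps(request)
decoded = json.loads(wire)
print(decoded["params"]["name"])  # codebase_semantic_search
```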

pyproject.toml

Lines changed: 1 addition & 0 deletions

@@ -68,6 +68,7 @@ dependencies = [
     "langchain_core",
     "langchain_openai",
     "numpy>=2.2.2",
+    "mcp[cli]",
 ]

 license = { text = "Apache-2.0" }
src/codegen/cli/mcp/agent/docs_expert.py

Lines changed: 63 additions & 0 deletions (new file)

```python
"""Demo implementation of an agent with Codegen tools."""

from langchain_core.messages import BaseMessage
from langchain_core.runnables.history import RunnableWithMessageHistory

from codegen.extensions.langchain.agent import create_codebase_agent
from codegen.sdk.core.codebase import Codebase

AGENT_INSTRUCTIONS = """
Instruction Set for Codegen SDK Expert Agent

Overview:
This instruction set is designed for an agent that is an expert on the Codegen SDK, specifically the Python library. The agent will be asked questions about the SDK, including classes, utilities, properties, and how to accomplish tasks using the SDK. The goal is to provide helpful responses that assist users in achieving their tasks with the SDK.

Key Responsibilities:
1. Expertise in Codegen SDK:
   - The agent is an expert on the Codegen SDK, with a deep understanding of its components and functionalities.
   - It should be able to provide detailed explanations of classes, utilities, and properties defined in the SDK.

2. Answering Questions:
   - The agent will be asked questions about the Codegen SDK, such as:
     - "Find all imports"
     - "How do I add an import for a symbol?"
     - "What is a statement object?"
   - Responses should be clear, concise, and directly address the user's query.

3. Task-Oriented Responses:
   - The user is typically accomplishing a task using the Codegen SDK.
   - Responses should be helpful toward that goal, providing guidance and solutions that facilitate task completion.

4. Python Library Focus:
   - Assume that questions are related to the Codegen SDK Python library.
   - Provide Python-specific examples and explanations when applicable.

Use the provided agent tools to look up additional information if needed.
By following this instruction set, the agent will be well-equipped to assist users in effectively utilizing the Codegen SDK for their projects.
"""


def create_sdk_expert_agent(
    codebase: Codebase,
    model_name: str = "gpt-4o",
    temperature: float = 0,
    verbose: bool = True,
) -> RunnableWithMessageHistory:
    """Create an agent with all codebase tools.

    Args:
        codebase: The codebase to operate on
        model_name: Name of the model to use (default: gpt-4o)
        temperature: Model temperature (default: 0)
        verbose: Whether to print agent's thought process (default: True)

    Returns:
        Initialized agent with message history
    """
    system_message: BaseMessage = BaseMessage(content=AGENT_INSTRUCTIONS, type="SYSTEM")

    agent = create_codebase_agent(chat_history=[system_message], codebase=codebase, model_name=model_name, temperature=temperature, verbose=verbose)

    return agent
```

src/codegen/cli/mcp/server.py

Lines changed: 15 additions & 0 deletions

@@ -3,8 +3,10 @@
 from mcp.server.fastmcp import Context, FastMCP

 from codegen.cli.api.client import RestAPI
+from codegen.cli.mcp.agent.docs_expert import create_sdk_expert_agent
 from codegen.cli.mcp.resources.system_prompt import SYSTEM_PROMPT
 from codegen.cli.mcp.resources.system_setup_instructions import SETUP_INSTRUCTIONS
+from codegen.sdk.core.codebase import Codebase
 from codegen.shared.enums.programming_language import ProgrammingLanguage

 # Initialize FastMCP server
@@ -39,6 +41,19 @@ def get_service_config() -> dict[str, Any]:
 # ----- TOOLS -----


+@mcp.tool()
+def ask_codegen_sdk(query: Annotated[str, "Ask a question to an expert agent for details about any aspect of the codegen sdk core set of classes and utilities"]):
+    codebase = Codebase("../../sdk/core")
+    agent = create_sdk_expert_agent(codebase=codebase)
+
+    result = agent.invoke(
+        {"input": query},
+        config={"configurable": {"session_id": "demo"}},
+    )
+
+    return result["output"]
+
+
 @mcp.tool()
 def generate_codemod(
     title: Annotated[str, "The title of the codemod (hyphenated)"],
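FastMCP derives each tool parameter's description from the string attached via `Annotated[...]`, as seen in `ask_codegen_sdk` and `generate_codemod` above. The underlying mechanism is plain `typing` metadata, which can be sketched with the stdlib alone (the `ask` function below is hypothetical, not from the commit):

```python
from typing import Annotated, get_type_hints


def ask(query: Annotated[str, "search query to run against the codebase"]) -> str:
    return query


# include_extras=True preserves the Annotated metadata instead of
# collapsing the hint down to the bare `str`.
hints = get_type_hints(ask, include_extras=True)
description = hints["query"].__metadata__[0]
print(description)  # search query to run against the codebase
```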

src/codegen/extensions/langchain/__init__.py

Lines changed: 0 additions & 54 deletions
This file was deleted.

src/codegen/extensions/langchain/agent.py

Lines changed: 68 additions & 4 deletions

@@ -1,9 +1,10 @@
 """Demo implementation of an agent with Codegen tools."""

-from langchain import hub
 from langchain.agents import AgentExecutor
 from langchain.agents.openai_functions_agent.base import OpenAIFunctionsAgent
-from langchain_community.chat_message_histories import ChatMessageHistory
+from langchain.hub import pull
+from langchain_core.chat_history import InMemoryChatMessageHistory
+from langchain_core.messages import BaseMessage
 from langchain_core.runnables.history import RunnableWithMessageHistory
 from langchain_openai import ChatOpenAI

@@ -30,6 +31,7 @@ def create_codebase_agent(
     model_name: str = "gpt-4o",
     temperature: float = 0,
     verbose: bool = True,
+    chat_history: list[BaseMessage] = [],
 ) -> RunnableWithMessageHistory:
     """Create an agent with all codebase tools.

@@ -65,7 +67,7 @@
     ]

     # Get the prompt to use
-    prompt = hub.pull("hwchase17/openai-functions-agent")
+    prompt = pull("hwchase17/openai-functions-agent")

     # Create the agent
     agent = OpenAIFunctionsAgent(
@@ -82,7 +84,69 @@
     )

     # Create message history handler
-    message_history = ChatMessageHistory()
+    message_history = InMemoryChatMessageHistory(messages=chat_history)
+
+    # Wrap with message history
+    return RunnableWithMessageHistory(
+        agent_executor,
+        lambda session_id: message_history,
+        input_messages_key="input",
+        history_messages_key="chat_history",
+    )
+
+
+def create_codebase_inspector_agent(
+    codebase: Codebase,
+    model_name: str = "gpt-4o",
+    temperature: float = 0,
+    verbose: bool = True,
+    chat_history: list[BaseMessage] = [],
+) -> RunnableWithMessageHistory:
+    """Create an agent with all codebase tools.
+
+    Args:
+        codebase: The codebase to operate on
+        model_name: Name of the model to use (default: gpt-4o)
+        temperature: Model temperature (default: 0)
+        verbose: Whether to print agent's thought process (default: True)
+
+    Returns:
+        Initialized agent with message history
+    """
+    # Initialize language model
+    llm = ChatOpenAI(
+        model_name=model_name,
+        temperature=temperature,
+    )
+
+    # Get all codebase tools
+    tools = [
+        ViewFileTool(codebase),
+        ListDirectoryTool(codebase),
+        SearchTool(codebase),
+        DeleteFileTool(codebase),
+        RevealSymbolTool(codebase),
+    ]
+
+    # Get the prompt to use
+    prompt = pull("codegen-agent/codebase-agent")
+
+    # Create the agent
+    agent = OpenAIFunctionsAgent(
+        llm=llm,
+        tools=tools,
+        prompt=prompt,
+    )
+
+    # Create the agent executor
+    agent_executor = AgentExecutor(
+        agent=agent,
+        tools=tools,
+        verbose=verbose,
+    )
+
+    # Create message history handler
+    message_history = InMemoryChatMessageHistory(messages=chat_history)

     # Wrap with message history
     return RunnableWithMessageHistory(
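One detail worth noting in both agent factories: `lambda session_id: message_history` returns the same `InMemoryChatMessageHistory` object regardless of session id, so every session shares one transcript. The per-session alternative can be sketched with a plain dict (a stdlib-only illustration of the pattern, not code from this commit):

```python
class InMemoryHistory:
    """Minimal stand-in for a chat history store (illustrative only)."""

    def __init__(self) -> None:
        self.messages: list[str] = []


# One shared store: every session id maps to the same object.
shared = InMemoryHistory()


def get_shared(session_id: str) -> InMemoryHistory:
    return shared


# Per-session stores: each session id lazily gets its own history.
_stores: dict[str, InMemoryHistory] = {}


def get_per_session(session_id: str) -> InMemoryHistory:
    return _stores.setdefault(session_id, InMemoryHistory())


get_shared("a").messages.append("hello")
print(get_shared("b").messages)  # ['hello']  (leaks across sessions)

get_per_session("a").messages.append("hello")
print(get_per_session("b").messages)  # []  (isolated)
```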
