README.md (+12 lines: 12 additions & 0 deletions)
@@ -470,6 +470,18 @@ This allows us to generate better solutions at the cost of longer runtime.
We also show the work done in each round of the search, including a comparison of the query plans before and after the addition of each index.
This gives the LLM additional context that it can use when responding to the indexing recommendations.
### Experimental: Index Tuning by LLM
Postgres MCP Pro includes an experimental index tuning feature based on [Optimization by LLM](https://arxiv.org/abs/2309.03409).
Instead of using heuristics to explore possible index configurations, we provide the database schema and query plans to an LLM and ask it to propose index configurations.
We then use `hypopg` to predict the performance of the proposed indexes and feed those results back into the LLM to produce a new set of suggestions.
We repeat this process until multiple rounds of iteration produce no further improvements.
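
To make the loop concrete, here is a minimal Python sketch of this iterative process. It is illustrative only: `propose_indexes` and `estimate_cost` are hypothetical stand-ins for the LLM call and the `hypopg`-based what-if cost prediction, not part of Postgres MCP Pro's actual API.

```python
# Minimal sketch of an LLM-driven index tuning loop (illustrative only).
# `propose_indexes` and `estimate_cost` are hypothetical callables standing in
# for the LLM proposal step and the hypopg-based cost estimation.

def tune_indexes(schema, queries, propose_indexes, estimate_cost,
                 max_rounds=10, patience=2):
    best_config, best_cost = [], estimate_cost(queries, [])
    history = []                      # feedback shown to the LLM each round
    rounds_without_improvement = 0

    for _ in range(max_rounds):
        # Ask the LLM for a new index configuration, given the schema,
        # the workload, and the results of earlier rounds.
        candidate = propose_indexes(schema, queries, history)

        # Predict the workload cost under the candidate configuration using
        # hypothetical (hypopg-style) indexes, without actually building them.
        cost = estimate_cost(queries, candidate)
        history.append({"indexes": candidate, "estimated_cost": cost})

        if cost < best_cost:
            best_config, best_cost = candidate, cost
            rounds_without_improvement = 0
        else:
            rounds_without_improvement += 1

        # Stop once multiple consecutive rounds produce no further improvement.
        if rounds_without_improvement >= patience:
            break

    return best_config, best_cost
```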
Index optimization by LLM has advantages when the index search space is large or when indexes with many columns need to be considered.
Like traditional search-based approaches, it relies on the accuracy of the `hypopg` performance predictions.
In order to perform index optimization by LLM, you must provide an OpenAI API key by setting the `OPENAI_API_KEY` environment variable.