Class: Agent
Description:
Orchestrates query processing by integrating configuration, memory management, LLM interactions, and tool execution.
Constructor:
Agent(config_path: str, llm_provider: str = "openai")
- config_path: Path to the JSON configuration file.
- llm_provider: Identifier for the LLM provider (e.g., "openai", "anthropic", "custom").
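Usage Example (a minimal sketch, assuming the Agent class is importable from the agent module):
from agent import Agent

agent = Agent("agent_config.json", llm_provider="openai")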
Key Methods:
_load_config(config_path: str) -> Dict[str, Any]
Reads and parses the configuration JSON file.
Usage Example:
config = agent._load_config("agent_config.json")
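A sketch of what the method body might look like; the exact file handling is an assumption:
import json
from typing import Any, Dict

def _load_config(self, config_path: str) -> Dict[str, Any]:
    # Read and parse the JSON configuration file into a dictionary.
    with open(config_path, "r", encoding="utf-8") as f:
        return json.load(f)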
_load_tools(tool_names: List[str]) -> Dict[str, Dict[str, Any]]
Loads tool configuration files from the tools/ directory based on the list of tool names.
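A sketch of the tool-loading step; the per-tool file naming convention (tools/<name>.json) is an assumption:
import json
import os
from typing import Any, Dict, List

def _load_tools(self, tool_names: List[str]) -> Dict[str, Dict[str, Any]]:
    # Map each tool name to its parsed configuration from the tools/ directory.
    tools: Dict[str, Dict[str, Any]] = {}
    for name in tool_names:
        path = os.path.join("tools", f"{name}.json")  # assumed naming convention
        with open(path, "r", encoding="utf-8") as f:
            tools[name] = json.load(f)
    return tools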
_execute_tool(tool_name: str, parameters: Dict[str, Any]) -> Any
Dynamically imports and executes a tool function with the provided parameters.
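A sketch of the dynamic import and execution; the module path (tools.<name>) and entry-point name (run) are assumptions:
import importlib
from typing import Any, Dict

def _execute_tool(self, tool_name: str, parameters: Dict[str, Any]) -> Any:
    # Import the tool module at call time and invoke its entry point with the given parameters.
    module = importlib.import_module(f"tools.{tool_name}")  # assumed module layout
    tool_fn = getattr(module, "run")                        # assumed entry-point name
    return tool_fn(**parameters)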
_create_system_prompt() -> str
Generates a system prompt using configuration data and tool information.
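One possible shape of the prompt assembly; the config keys (instructions, description) and the self.config / self.tools attributes are assumptions:
def _create_system_prompt(self) -> str:
    # Combine the agent's instructions with a short description of each available tool.
    lines = [self.config.get("instructions", "You are a helpful assistant.")]
    for name, tool in self.tools.items():
        lines.append(f"- {name}: {tool.get('description', '')}")
    return "\n".join(lines)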
_format_tools_for_llm() -> List[Dict[str, Any]]
Formats tool configurations into the structure required by the LLM provider.
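A sketch of mapping tool configurations to an OpenAI-style function-calling schema; the exact target structure depends on the provider and is an assumption here:
from typing import Any, Dict, List

def _format_tools_for_llm(self) -> List[Dict[str, Any]]:
    # Convert each tool configuration to the provider's expected tool schema.
    formatted = []
    for name, tool in self.tools.items():
        formatted.append({
            "type": "function",
            "function": {
                "name": name,
                "description": tool.get("description", ""),
                "parameters": tool.get("parameters", {"type": "object", "properties": {}}),
            },
        })
    return formatted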
process_query(query: str) -> str
Main method to process a user query:
- Builds message context (system prompt, conversation memory, and query).
- Sends messages to the LLM.
- Detects and executes tool calls.
- Returns the final response.
Usage Example:
from agent import create_agent, run_agent
agent = create_agent("agent_config.json", llm_provider="openai")
response = run_agent(agent, "Calculate 3 * 5")
print("Agent Response:", response)
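A simplified sketch of the process_query flow; the LLM client interface (self.llm.chat), the memory attribute, and the tool-call response shape are assumptions:
def process_query(self, query: str) -> str:
    # Build the message context: system prompt, conversation memory, then the new query.
    messages = [{"role": "system", "content": self._create_system_prompt()}]
    messages.extend(self.memory)
    messages.append({"role": "user", "content": query})

    response = self.llm.chat(messages, tools=self._format_tools_for_llm())  # assumed client API

    # If the model requested a tool, execute it and ask the LLM for a final answer.
    if response.get("tool_call"):
        call = response["tool_call"]
        result = self._execute_tool(call["name"], call["parameters"])
        messages.append({"role": "tool", "content": str(result)})
        response = self.llm.chat(messages)

    return response["content"]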
Helper Functions:
create_agent(config_path: str, llm_provider: str = "openai") -> Agent
Factory function to instantiate an Agent.
run_agent(agent: Agent, query: str) -> str
Executes the agent's query processing and returns the response.
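A minimal sketch of both helpers as thin wrappers around the Agent class (their internals are assumptions):
def create_agent(config_path: str, llm_provider: str = "openai") -> Agent:
    # Thin factory wrapper around the Agent constructor.
    return Agent(config_path, llm_provider=llm_provider)

def run_agent(agent: Agent, query: str) -> str:
    # Delegate to the agent's main query-processing method.
    return agent.process_query(query)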