This section provides solutions to common issues you might encounter when using or extending the framework.
1. Agent Configuration Issues
- Verify that the configuration file (e.g., `agent_config.json`) exists in the specified location (a quick standalone check is sketched after this list).
- Check file permissions and ensure the path is correct.
- Use a JSON validator (such as jsonlint.com) to check your configuration file for syntax errors.
- Ensure proper formatting (e.g., commas between entries, correct quotation marks).
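To narrow down which of these conditions is failing, you can load the file directly and let Python report the exact problem. This is a minimal sketch, assuming the configuration lives in `agent_config.json` in the current working directory; the framework itself may resolve the path differently.

```python
import json
from pathlib import Path

# Assumed location -- adjust to wherever your framework expects the config file.
CONFIG_PATH = Path("agent_config.json")

def load_config(path: Path) -> dict:
    """Load the agent configuration, surfacing the most common failure modes."""
    if not path.exists():
        raise FileNotFoundError(f"Config file not found: {path.resolve()}")
    try:
        with path.open("r", encoding="utf-8") as f:
            return json.load(f)
    except json.JSONDecodeError as exc:
        # Points at the offending line/column, e.g. a missing comma or stray quote.
        raise ValueError(
            f"Invalid JSON in {path}: line {exc.lineno}, column {exc.colno}: {exc.msg}"
        ) from exc
    except PermissionError as exc:
        raise PermissionError(f"Cannot read {path}; check file permissions.") from exc

config = load_config(CONFIG_PATH)
print("Loaded configuration keys:", sorted(config))
```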
2. LLM Provider and API Key Errors
- Set the API key in your environment: `export OPENAI_API_KEY="your_openai_api_key_here"`
- On Windows, use: `set OPENAI_API_KEY=your_openai_api_key_here`
- Install the missing module using pip, e.g. `pip install openai`.
- Verify that the module is available in your current Python environment (the sketch below checks both the key and the package).
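A common pitfall is that the key or the package is visible in one shell or virtual environment but not in the one that actually runs the agent. The following sketch checks both from the same interpreter; `OPENAI_API_KEY` and `openai` are just the defaults used above, so substitute whichever provider and package your agent is configured to use.

```python
import importlib.util
import os

# Assumed names -- replace with the environment variable and package for your provider.
REQUIRED_ENV_VAR = "OPENAI_API_KEY"
REQUIRED_PACKAGE = "openai"

api_key = os.environ.get(REQUIRED_ENV_VAR)
if not api_key:
    print(f"{REQUIRED_ENV_VAR} is not set in this environment.")
else:
    # Avoid printing the key itself; show only enough to confirm it is present.
    print(f"{REQUIRED_ENV_VAR} is set ({len(api_key)} characters).")

if importlib.util.find_spec(REQUIRED_PACKAGE) is None:
    print(f"Package '{REQUIRED_PACKAGE}' is not importable here; "
          f"install it with: pip install {REQUIRED_PACKAGE}")
else:
    print(f"Package '{REQUIRED_PACKAGE}' is importable in this interpreter.")
```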
3. Tool Loading and Execution Errors
- Confirm that the tool configuration file exists in the `tools/` directory.
- Verify that the tool name in your configuration file is spelled correctly.
- Check the `function_path` in the tool configuration to ensure it points to the correct module and function (see the sketch after this list).
- Confirm that the corresponding Python file exists and that the function is defined.
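The framework's own loader may differ, but resolving a dotted `function_path` generally comes down to importing the module and looking up the function, and reproducing that step in isolation usually reveals the problem. This is a sketch; the example value `tools.web_search.run` is hypothetical, so use the `function_path` from your own tool configuration.

```python
import importlib

def resolve_function_path(function_path: str):
    """Resolve a dotted path like 'tools.web_search.run' to a callable, with clear errors."""
    module_path, _, func_name = function_path.rpartition(".")
    if not module_path:
        raise ValueError(f"'{function_path}' must look like 'package.module.function'.")
    module = importlib.import_module(module_path)  # raises ModuleNotFoundError if the file is missing
    func = getattr(module, func_name, None)
    if not callable(func):
        raise AttributeError(f"Module '{module_path}' has no callable named '{func_name}'.")
    return func

# 'tools.web_search.run' is a hypothetical example -- substitute your configured function_path.
try:
    tool_fn = resolve_function_path("tools.web_search.run")
    print("Resolved tool function:", tool_fn)
except (ModuleNotFoundError, AttributeError, ValueError) as exc:
    print("Tool resolution failed:", exc)
```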
4. Memory Issues
- Verify that memory is enabled in your configuration (`"memory": true`).
- For semantic memory, check that dependencies (SentenceTransformers, FAISS) are installed and that the embedding model is loading correctly; the sketch below runs a quick end-to-end check.
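When semantic memory fails to initialize, the cause is usually one of these dependencies or the embedding model download. The following sketch exercises all three in a few lines; the model name `all-MiniLM-L6-v2` is an assumption, so substitute the embedding model named in your configuration.

```python
# Sanity check for the semantic-memory dependencies: SentenceTransformers + FAISS.
from sentence_transformers import SentenceTransformer
import faiss
import numpy as np

# Assumed model name -- use the embedding model from your configuration.
model = SentenceTransformer("all-MiniLM-L6-v2")      # downloads the model on first use
embedding = model.encode(["hello memory"])           # shape: (1, embedding_dim)
dim = embedding.shape[1]

index = faiss.IndexFlatL2(dim)                       # simplest FAISS index type
index.add(np.asarray(embedding, dtype="float32"))
print(f"Embedding dimension: {dim}, vectors indexed: {index.ntotal}")
```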
5. RAG (Retrieval-Augmented Generation) Issues
- Ensure you have correctly indexed documents using the `index_documents` method.
- Check that the documents provided are non-empty and in plain text.
- Validate that the FAISS index is built correctly and that the embeddings are computed (the sketch after this list shows a minimal end-to-end check).
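If retrieval returns nothing useful, it helps to reproduce the indexing pipeline outside the framework: embed a couple of documents, build a FAISS index, and run one query. This is a sketch of the general pattern, not the framework's own `index_documents` implementation, and the model name is again an assumption.

```python
# Minimal end-to-end RAG check: embed documents, build a FAISS index, run one query.
from sentence_transformers import SentenceTransformer
import faiss
import numpy as np

documents = [
    "The agent framework supports configurable tools.",
    "Semantic memory stores embeddings of past interactions.",
]
assert all(doc.strip() for doc in documents), "Documents must be non-empty plain text."

model = SentenceTransformer("all-MiniLM-L6-v2")      # assumed model -- use the one from your config
doc_vectors = np.asarray(model.encode(documents), dtype="float32")

index = faiss.IndexFlatL2(doc_vectors.shape[1])
index.add(doc_vectors)
print("Indexed vectors:", index.ntotal)              # should equal len(documents)

query_vector = np.asarray(model.encode(["How is memory handled?"]), dtype="float32")
distances, ids = index.search(query_vector, k=1)
print("Top match:", documents[ids[0][0]], f"(distance: {float(distances[0][0]):.3f})")
```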