# AGENTS.md

This document outlines the autonomous and human agents involved in the LLM-Powered Monitoring Agent project.

## Human Agents

### Inanis

- **Role**: Primary Operator, Project Owner
- **Responsibilities**:
  - Defines project goals and requirements.
  - Provides high-level guidance and approval for major changes.
  - Reviews agent outputs and provides feedback.
  - Manages overall project direction.
- **Contact**: [If Inanis wants to provide contact info, it would go here]

## Autonomous Agents

### Blight (LLM-Powered Monitoring Agent)

- **Role**: Autonomous Monitoring and Anomaly Detection Agent
- **Type**: Large Language Model (LLM) based agent
- **Capabilities**:
  - Collects system and network metrics (logs, temperatures, network performance, Nmap scans).
  - Analyzes collected data against historical baselines.
  - Detects anomalies using an integrated LLM (Llama3.1).
  - Generates actionable reports on detected anomalies.
  - Sends alerts via Discord and Google Home.
  - Provides daily recaps of events.
- **Interaction**:
  - Receives instructions and context from Inanis via the CLI.
  - Provides analysis and reports in JSON format.
  - Operates continuously in the background (unless in test mode).
- **Dependencies**:
  - `ollama` (for LLM inference)
  - `nmap`
  - `lm-sensors`
  - Python libraries (as listed in `requirements.txt`)
- **Configuration**: Configured via `config.py`, `CONSTRAINTS.md`, and `known_issues.json`.
- **Status**: Operational and continuously evolving.

## Agent Interactions

- **Inanis -> Blight**: Inanis provides high-level tasks, reviews Blight's output, and refines its behavior through code modifications and configuration updates.
- **Blight -> Inanis**: Blight reports detected anomalies, system status, and daily summaries to Inanis through the configured alerting channels (Discord, Google Home) and logs.
- **Blight <-> System**: Blight interacts with the local operating system to collect data (reading logs, running commands such as `sensors` and `nmap`); see the collection sketch in the illustrative examples below.
- **Blight <-> LLM**: Blight sends collected and processed data to the local Ollama LLM for intelligent analysis and receives anomaly reports in return; see the analysis sketch in the illustrative examples below.
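
## Illustrative Sketches (Non-Normative)

The following sketch illustrates the "Blight <-> System" collection step described above. It is a minimal example only: the function names, the `192.168.1.0/24` target subnet, and the plain-text capture format are assumptions for illustration and are not taken from the project code, which defines its own collectors.

```python
"""Minimal sketch of the "Blight <-> System" collection step.

Assumptions (illustrative, not from the repository): function names,
the 192.168.1.0/24 subnet, and plain-text capture of command output.
"""

import subprocess


def run_command(args: list[str], timeout: int = 120) -> str:
    """Run a local command and return its stdout, or an error marker on failure."""
    try:
        result = subprocess.run(
            args, capture_output=True, text=True, timeout=timeout, check=True
        )
        return result.stdout
    except (subprocess.SubprocessError, FileNotFoundError) as exc:
        return f"<collection failed: {exc}>"


def collect_metrics() -> dict[str, str]:
    """Gather raw inputs of the kind Blight analyzes: temperatures and a network scan."""
    return {
        "temperatures": run_command(["sensors"]),                        # lm-sensors output
        "network_scan": run_command(["nmap", "-sn", "192.168.1.0/24"]),  # ping scan of a hypothetical subnet
    }


if __name__ == "__main__":
    for name, output in collect_metrics().items():
        print(f"--- {name} ---\n{output[:200]}")
```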
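
The next sketch illustrates the "Blight <-> LLM" analysis step: sending collected data to a locally running Ollama instance and reading back a JSON report. The prompt wording and the report fields (`anomalies`, `severity`, `summary`) are assumptions for illustration; the real prompt and report schema are defined in the project code and configuration.

```python
"""Minimal sketch of the "Blight <-> LLM" analysis step.

Assumptions (illustrative, not from the repository): the prompt wording
and the JSON report fields. Uses the `ollama` Python client's chat() call
against a locally running Ollama server.
"""

import json

import ollama  # Python client for a local Ollama server


def analyze_metrics(metrics: dict[str, str], baseline: str) -> dict:
    """Ask the local LLM to compare fresh metrics against a historical baseline."""
    prompt = (
        "You are a monitoring assistant. Compare the current metrics to the "
        "baseline and report anomalies as JSON with keys "
        "'anomalies', 'severity', and 'summary'.\n\n"
        f"Baseline:\n{baseline}\n\nCurrent metrics:\n{json.dumps(metrics, indent=2)}"
    )
    response = ollama.chat(
        model="llama3.1",
        messages=[{"role": "user", "content": prompt}],
        format="json",  # ask Ollama to constrain the reply to valid JSON
    )
    return json.loads(response["message"]["content"])
```

In this sketch the heavy lifting stays inside the local LLM, which matches the design described above: Blight only prepares context (current metrics plus a baseline) and parses the structured reply before alerting Inanis.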