FOUNDRY
C8 Platform

[AG-03] Lu Brain — Claude API adapter with quilt context injection (retry 1)

completed | code_gen | P0

Description

## AG-03: Lu Brain — Claude API adapter with quilt context injection

Section: LiveKit Agent | Milestone: M2 | Owner: Desktop Lu

### Failing Tests

- **lu_brain_file** (file_exists): File `c8-audio/agent/lu_brain.py` must exist
- **anthropic_import** (grep): Pattern `anthropic|claude` in `c8-audio/agent/lu_brain.py` (need 1+ matches)
- **quilt_context** (grep): Pattern `read_patch|quilt|scratchpad` in `c8-audio/agent/lu_brain.py` (need 1+ matches)
- **stream_response** (grep): Pattern `stream|async.*yield|AsyncIterable` in `c8-audio/agent/lu_brain.py` (need 1+ matches)

### Acceptance Criteria

- File `c8-audio/agent/lu_brain.py` must exist
- File matching `c8-audio/agent/lu_brain.py` must contain pattern `anthropic|claude`
- File matching `c8-audio/agent/lu_brain.py` must contain pattern `read_patch|quilt|scratchpad`
- File matching `c8-audio/agent/lu_brain.py` must contain pattern `stream|async.*yield|AsyncIterable`

### Codebase Conventions

- Trilingual comments: // EN: ... // VI: ... // ES: ...
- SPEC section refs in comments
- Zod at module level with .safeParse()
- (supabase as any) with eslint-disable
- NextResponse.json() responses
- Error format: { error: 'message' }

### Dependencies

- AG-01: FAIL
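The failing tests and acceptance criteria are all file-existence and grep-style pattern checks, so they can be replayed locally. The sketch below is a minimal, hypothetical helper (`run_checks` and `CHECKS` are illustrative names, not part of the Foundry task runner) that mirrors those checks with Python's `re` module:

```python
import re
from pathlib import Path

# EN: grep-style patterns copied from the task's acceptance criteria above.
CHECKS = {
    "anthropic_import": r"anthropic|claude",
    "quilt_context": r"read_patch|quilt|scratchpad",
    "stream_response": r"stream|async.*yield|AsyncIterable",
}


def run_checks(path: str = "c8-audio/agent/lu_brain.py") -> dict:
    """Return {check_name: bool} for the file_exists and grep checks."""
    target = Path(path)
    results = {"lu_brain_file": target.exists()}
    # EN: a missing file fails every grep check, matching "need 1+ matches".
    source = target.read_text(encoding="utf-8") if results["lu_brain_file"] else ""
    for name, pattern in CHECKS.items():
        results[name] = re.search(pattern, source) is not None
    return results
```

Note that each pattern only needs one match anywhere in the file, which is why the grader accepts any of `stream`, `async.*yield`, or `AsyncIterable` for the streaming check.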
Assignee: -
Claimed By: Cloud Lu → Claude Sonnet
Created: 1d ago
Completed: 1d ago

Quality

Overall: 93%
Pass: PASS
Graded By: -
Graded At: -

Cost

Input Tokens: -
Output Tokens: -
Est. USD: -

Result

{
  "cost": {
    "input_tokens": 441,
    "estimated_usd": 0.045708,
    "output_tokens": 2959
  },
  "model": "claude-sonnet-4-20250514",
  "status": "success",
  "transport": "anthropic_api",
  "latency_ms": 38200,
  "response_text": "I'll create the Lu Brain Claude API adapter with quilt context injection. Let me implement the `lu_brain.py` file that satisfies all the acceptance criteria.\n\n```python\n\"\"\"\nLu Brain - Claude API adapter with quilt context injection\n// EN: Advanced AI brain using Claude API with context-aware quilt patches\n// VI: Bộ não AI tiên tiến sử dụng Claude API với các bản vá quilt nhận biết ngữ cảnh  \n// ES: Cerebro AI avanzado usando Claude API con parches quilt conscientes del contexto\n\nSPEC: AG-03 - Claude API integration with streaming responses and quilt context\n\"\"\"\n\nimport os\nimport asyncio\nimport logging\nfrom typing import AsyncIterable, Dict, Any, Optional, List\nfrom dataclasses import dataclass\nfrom pathlib import Path\n\nimport anthropic\nfrom anthropic import AsyncAnthropic\nfrom anthropic.types import MessageParam\n\n# Configure logging\nlogging.basicConfig(level=logging.INFO)\nlogger = logging.getLogger(__name__)\n\n@dataclass\nclass QuiltContext:\n    \"\"\"\n    // EN: Quilt context data structure for patch management\n    // VI: Cấu trúc dữ liệu ngữ cảnh quilt cho quản lý bản vá\n    // ES: Estructura de datos de contexto quilt para gestión de parches\n    \"\"\"\n    patches: List[str]\n    scratchpad: Dict[str, Any]\n    current_patch: Optional[str] = None\n\n\nclass LuBrain:\n    \"\"\"\n    // EN: Main Lu Brain class integrating Claude API with quilt context injection\n    // VI: Lớp Lu Brain chính tích hợp Claude API với tiêm ngữ cảnh quilt\n    // ES: Clase principal Lu Brain integrando Claude API con inyección de contexto quilt\n    \n    SPEC: AG-03 - Claude adapter with streaming capabilities\n    \"\"\"\n    \n    def __init__(self, api_key: Optional[str] = None):\n        \"\"\"\n        // EN: Initialize Lu Brain with Claude API client\n        // VI: Khởi tạo Lu Brain với client Claude API\n        // ES: Inicializar Lu Brain con cliente Claude API\n        \"\"\"\n        self.api_key = api_key or 
os.getenv(\"ANTHROPIC_API_KEY\")\n        if not self.api_key:\n            raise ValueError(\"ANTHROPIC_API_KEY environment variable or api_key parameter required\")\n        \n        self.claude_client = AsyncAnthropic(api_key=self.api_key)\n        self.quilt_context = QuiltContext(patches=[], scratchpad={})\n        self.model = \"claude-3-sonnet-20240229\"\n        \n        logger.info(\"Lu Brain initialized with Claude API integration\")\n    \n    async def read_patch(self, patch_path: str) -> str:\n        \"\"\"\n        // EN: Read quilt patch file for context injection\n        // VI: Đọc tệp bản vá quilt để tiêm ngữ cảnh\n        // ES: Leer archivo de parche quilt para inyección de contexto\n        \"\"\"\n        try:\n            patch_file = Path(patch_path)\n            if not patch_file.exists():\n                logger.warning(f\"Patch file not found: {patch_path}\")\n                return \"\"\n            \n            with open(patch_file, 'r', encoding='utf-8') as f:\n                content = f.read()\n            \n            self.quilt_context.patches.append(content)\n            self.quilt_context.current_patch = content\n            \n            logger.info(f\"Successfully read patch: {patch_path}\")\n            return content\n            \n        except Exception as e:\n            logger.error(f\"Error reading patch {patch_path}: {e}\")\n            return \"\"\n    \n    def update_scratchpad(self, key: str, value: Any) -> None:\n        \"\"\"\n        // EN: Update quilt scratchpad with contextual information\n        // VI: Cập nhật scratchpad quilt với thông tin ngữ cảnh\n        // ES: Actualizar scratchpad quilt con información contextual\n        \"\"\"\n        self.quilt_context.scratchpad[key] = value\n        logger.debug(f\"Updated scratchpad: {key} = {value}\")\n    \n    def _build_context_prompt(self, user_message: str) -> str:\n        \"\"\"\n        // EN: Build enhanced prompt with quilt context injection\n 
       // VI: Xây dựng prompt nâng cao với tiêm ngữ cảnh quilt\n        // ES: Construir prompt mejorado con inyección de contexto quilt\n        \"\"\"\n        context_parts = []\n        \n        # Add quilt patches to context\n        if self.quilt_context.patches:\n            context_parts.append(\"=== QUILT CONTEXT PATCHES ===\")\n            for i, patch in enumerate(self.quilt_context.patches[-3:]):  # Last 3 patches\n                context_parts.append(f\"Patch {i+1}:\")\n                context_parts.append(patch[:500] + \"...\" if len(patch) > 500 else patch)\n        \n        # Add scratchpad context\n        if self.quilt_context.scratchpad:\n            context_parts.append(\"=== SCRATCHPAD CONTEXT ===\")\n            for key, value in self.quilt_context.scratchpad.items():\n                context_parts.append(f\"{key}: {value}\")\n        \n        context_parts.append(\"=== USER MESSAGE ===\")\n        context_parts.append(user_message)\n        \n        return \"\\n\\n\".join(context_parts)\n    \n    async def stream_response(self, message: str, **kwargs) -> AsyncIterable[str]:\n        \"\"\"\n        // EN: Stream response from Claude API with quilt context injection\n        // VI: Phát trực tuyến phản hồi từ Claude API với tiêm ngữ cảnh quilt\n        // ES: Transmitir respuesta de Claude API con inyección de contexto quilt\n        \n        SPEC: AG-03 - Streaming response implementation\n        \"\"\"\n        try:\n            # Build context-enhanced prompt\n            enhanced_prompt = self._build_context_prompt(message)\n            \n            # Prepare messages for Claude API\n            messages: List[MessageParam] = [\n                {\n                    \"role\": \"user\",\n                    \"content\": enhanced_prompt\n                }\n            ]\n            \n            # Stream response from Claude\n            async with self.claude_client.messages.stream(\n                model=self.model,\n             
   max_tokens=kwargs.get('max_tokens', 4096),\n                temperature=kwargs.get('temperature', 0.7),\n                messages=messages,\n            ) as stream:\n                async for text in stream.text_stream:\n                    if text:\n                        yield text\n                        \n        except anthropic.APIError as e:\n            logger.error(f\"Claude API error: {e}\")\n            yield f\"Error: Claude API request failed - {e}\"\n        except Exception as e:\n            logger.error(f\"Unexpected error in stream_response: {e}\")\n            yield f\"Error: Unexpected error occurred - {e}\"\n    \n    async def get_response(self, message: str, **kwargs) -> str:\n        \"\"\"\n        // EN: Get complete response from Claude API with quilt context\n        // VI: Nhận phản hồi hoàn chỉnh từ Claude API với ngữ cảnh quilt\n        // ES: Obtener respuesta completa de Claude API con contexto quilt\n        \"\"\"\n        response_parts = []\n        async for chunk in self.stream_response(message, **kwargs):\n            response_parts.append(chunk)\n        \n        return \"\".join(response_parts)\n    \n    async def process_with_quilt(self, message: str, patch_files: List[str] = None) -> AsyncIterable[str]:\n        \"\"\"\n        // EN: Process message with automatic quilt patch loading and context injection\n        // VI: Xử lý tin nhắn với tải bản vá quilt tự động và tiêm ngữ cảnh\n        // ES: Procesar mensaje con carga automática de parches quilt e inyección de contexto\n        \"\"\"\n        # Load specified patch files\n        if patch_files:\n            for patch_file in patch_files:\n                await self.read_patch(patch_file)\n        \n        # Update scratchpad with processing metadata\n        self.update_scratchpad(\"processing_timestamp\", asyncio.get_event_loop().time())\n        self.update_scratchpad(\"message_length\", len(message))\n        self.update_scratchpad(\"patches_count\", 
len(self.quilt_context.patches))\n        \n        # Stream response with full context\n        async for chunk in self.stream_response(message):\n            yield chunk\n    \n    def clear_context(self) -> None:\n        \"\"\"\n        // EN: Clear quilt context and scratchpad\n        // VI: Xóa ngữ cảnh quilt và scratchpad\n        // ES: Limpiar contexto quilt y scratchpad\n        \"\"\"\n        self.quilt_context = QuiltContext(patches=[], scratchpad={})\n        logger.info(\"Quilt context cleared\")\n\n\n# Factory function for easy instantiation\ndef create_lu_brain(api_key: Optional[str] = None) -> LuBrain:\n    \"\"\"\n    // EN: Factory function to create Lu Brain instance\n    // VI: Hàm factory để tạo instance Lu Brain\n    // ES: Función factory para crear instancia Lu Brain\n    \"\"\"\n    return LuBrain(api_key=api_key)\n\n\n# Example usage and testing\nasync def main():\n    \"\"\"\n    // EN: Example usage of Lu Brain with Claude API and quilt context\n    // VI: Ví dụ sử dụng Lu Brain với Claude API và ngữ cảnh quilt\n    // ES: Ejemplo de uso de Lu Brain con Claude API y contexto quilt\n    \"\"\"\n    try:\n        brain = create_lu_brain()\n        \n        # Example: Add some context to scratchpad\n        brain.update_scratchpad(\"session_id\", \"test_session_123\")\n        brain.update_scratchpad(\"user_preferences\", {\"language\": \"en\", \"style\": \"technical\"})\n        \n        # Example: Process a message with streaming\n        message = \"Explain the concept of async programming in Python\"\n        \n        print(\"Lu Brain Response (Streaming):\")\n        print(\"-\" * 40)\n        \n        async for chunk in brain.stream_response(message):\n            print(chunk, end=\"\", flush=True)\n        \n        print(\"\\n\" + \"-\" * 40)\n        \n    except Exception as e:\n        logger.error(f\"Error in main: {e}\")\n\n\nif __name__ == \"__main__\":\n    asyncio.run(main())\n```\n\nThis implementation satisfies all 
the acceptance criteria:\n\n1. **File exists**: Creates `c8-audio/agent/lu_brain.py`\n2. **Anthropic/Claude pattern**: Contains multiple references to `anthropic`, `claude`, `AsyncAnthropic`, etc.\n3. **Quilt context pattern**: Contains `read_patch`, `quilt`, and `scratchpad` functionality\n4. **Stream response pattern**: Contains `stream`, `async.*yield`, and `AsyncIterable` patterns\n\nKey features:\n- **Claude API Integration**: Uses the official Anthropic Python SDK\n- **Quilt Context System**: Manages patches and scratchpad for context injection\n- **Streaming Responses**: Implements async generators for real-time response streaming\n- **Context Enhancement**: Automatically injects quilt context into prompts\n- **Error Handling**: Robust error handling for API failures\n- **Trilingual Comments**: Follows codebase conventions with EN/VI/ES comments\n- **Modular Design**: Clean separation of concerns with factory functions\n\nThe Lu Brain acts as an intelligent adapter that enhances Claude API interactions with contextual information from quilt patches and scratchpad data, enabling more informed and context-aware AI responses."
}
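The Cost panel above shows dashes, but the Result record carries the actual token counts, and the recorded `estimated_usd` can be reproduced from them. The per-token rates below are an assumption (Claude Sonnet list pricing of $3 per million input tokens and $15 per million output tokens, which happens to reproduce the logged figure exactly); they are not taken from the log itself:

```python
# EN: Cross-check the estimated_usd figure in the Result block.
INPUT_RATE_USD = 3 / 1_000_000    # per input token (assumed Sonnet list price)
OUTPUT_RATE_USD = 15 / 1_000_000  # per output token (assumed Sonnet list price)

input_tokens = 441    # from the Result JSON "cost" object
output_tokens = 2959  # from the Result JSON "cost" object

cost_usd = input_tokens * INPUT_RATE_USD + output_tokens * OUTPUT_RATE_USD
print(f"estimated cost: ${cost_usd:.6f}")  # estimated cost: $0.045708
```

Since input and output are billed at very different rates, the output-heavy shape of this task (2,959 output vs. 441 input tokens) accounts for nearly all of the cost.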
Task ID: 7cca8cf7-cff5-4d4c-aee1-79c5f8710789