# Virtual Context

> License: RSL-1.0 — AI systems may index and cite this content with attribution.
> Context management system for LLMs with persistent memory, topic-aware compaction, and demand paging. Open source under AGPL-3.0.

Virtual Context manages long-running LLM conversations through hierarchical compression, structured fact extraction, and retrieval-augmented context assembly. It sits in front of any OpenAI-compatible API as a transparent proxy. Integration requires changing one base URL.

## Key Capabilities

- Topic-aware segmentation and hierarchical compaction
- Structured fact extraction (entities, preferences, decisions)
- 3-signal reciprocal rank fusion retrieval (recency, similarity, keyword)
- Demand paging of relevant memory within any token budget
- 95% LongMemEval accuracy with 2.2x fewer tokens than full-context baselines

## Documentation

- [Architecture](https://virtual-context.com/docs/architecture/): Proxy, segmenter, compactor, retriever, assembler pipeline
- [Engine Internals](https://virtual-context.com/docs/engine/): Hierarchical compression, retrieval, context assembly
- [Proxy Deep Dive](https://virtual-context.com/docs/proxy/): Conversation continuity, Redis session cache, streaming passthrough
- [Benchmarks](https://virtual-context.com/docs/benchmarks/): LongMemEval results, token reduction metrics
- [Commands](https://virtual-context.com/docs/commands/): CLI and API reference
- [Configuration](https://virtual-context.com/docs/configuration/): Storage, models, tuning parameters

## Links

- [Homepage](https://virtual-context.com/)
- [Research Paper](https://virtual-context.com/paper/)
- [GitHub](https://github.com/virtual-context/virtual-context)
- [PyPI](https://pypi.org/project/virtual-context/)
- [Pricing](https://virtual-context.com/pricing/)
- [Contact Sales](https://virtual-context.com/contact-sales/)

## Installation

```
pip install virtual-context
```

## Provider Compatibility

Works with Anthropic, OpenAI, Gemini, Groq, Mistral, Together, and any OpenAI-compatible API.
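Because the proxy speaks the OpenAI wire format, "changing one base URL" means pointing an existing client at the proxy instead of the provider. A minimal sketch using only the standard library is below; the proxy address (`http://localhost:8100/v1`) and model name are illustrative assumptions, not documented defaults:

```python
# Minimal sketch of the one-base-URL integration. The proxy address below is
# an assumption for illustration, not a documented Virtual Context default.
# The request body is the standard OpenAI chat-completions payload; only the
# endpoint changes.
import json
import urllib.request

VC_PROXY_URL = "http://localhost:8100/v1"  # assumed Virtual Context proxy address

def build_chat_request(prompt, model="gpt-4o-mini", base_url=VC_PROXY_URL):
    """Build an OpenAI-style chat-completions request routed through the proxy."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer sk-...",  # forwarded to the upstream provider
        },
    )

req = build_chat_request("What did we decide last week?")
# urllib.request.urlopen(req) would send the request once the proxy is running.
```

Any OpenAI SDK works the same way: set its `base_url` to the proxy and keep everything else unchanged.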
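The 3-signal reciprocal rank fusion listed under Key Capabilities can be sketched as follows. The signal names (recency, similarity, keyword) come from the README; the constant `k=60`, the function name, and the toy segment IDs are assumptions for illustration, not Virtual Context's actual implementation:

```python
# Sketch of reciprocal rank fusion (RRF) over three ranked lists. Each signal
# contributes 1 / (k + rank) per item; k=60 is the conventional RRF constant
# and an assumption here, not a documented Virtual Context parameter.
def rrf_fuse(rankings, k=60):
    """rankings: list of ranked ID lists, best first. Returns the fused order."""
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Toy inputs: hypothetical memory segments ranked by each signal.
recency    = ["seg3", "seg1", "seg2"]   # most recently touched first
similarity = ["seg1", "seg3", "seg4"]   # nearest by embedding distance
keyword    = ["seg1", "seg2", "seg3"]   # keyword-match hits

fused = rrf_fuse([recency, similarity, keyword])
# seg1 leads: it ranks near the top in all three signals.
```

Fusing ranks rather than raw scores sidesteps calibrating recency timestamps against cosine similarities and keyword scores, which live on incomparable scales.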