All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog,
and this project adheres to Semantic Versioning.
- Google Gemini Support - Full integration with Google Gemini models (see the usage sketch after this list)
- Gemini Pro, Gemini Pro Vision, and Gemini Ultra support
- Streaming responses support
- Vision/multimodal capabilities
- Safety settings configuration
- Generation parameters (temperature, top-k, top-p)
- Gemini provider implementation with comprehensive error handling
- Gemini-specific test suite
- Gemini usage examples
- Updated provider support matrix to show Gemini as ✅ Supported
- Enhanced documentation with Gemini configuration and examples
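As a rough illustration of the new provider, a Gemini completion call might look like the sketch below. The `litellm.core` namespace alias and the request-map keys are assumptions for illustration, not the confirmed public API; only the `create-system` and `completion` function names come from this changelog.

```clojure
;; Hypothetical sketch: the namespace alias and request-map keys are assumptions.
(require '[litellm.core :as llm])

(def system
  (llm/create-system
   {:providers {:gemini {:api-key (System/getenv "GEMINI_API_KEY")}}}))

;; Generation parameters (temperature, top-k, top-p) as listed above.
(llm/completion system
                {:provider    :gemini
                 :model       "gemini-pro"
                 :messages    [{:role :user :content "Summarize this release in one sentence."}]
                 :temperature 0.7
                 :top-k       40
                 :top-p       0.95})
```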
0.1.0 - 2025-05-10
- Initial release of LiteLLM Clojure
- Unified API for multiple LLM providers
- Support for OpenAI (GPT-3.5-Turbo, GPT-4, GPT-4o)
- Support for Anthropic (Claude 3 Opus, Sonnet, Haiku, Claude 2.x)
- Support for OpenRouter (access to multiple providers)
- Support for Ollama (local models)
- Async operation support with proper thread pool management
- Streaming response support for OpenAI and Anthropic
- Function calling support for OpenAI models
- Provider abstraction layer for easy extension
- Health monitoring and system checks
- Cost tracking and token estimation
- Request/response schema validation with Malli
- Comprehensive error handling
- Thread pool management with Claypoole
- Configuration system with Aero support (see the config sketch after this list)
- Built-in caching support
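Because configuration goes through Aero, a config file can use Aero's reader tags. A minimal sketch follows, assuming hypothetical config keys; only `#env` and `#profile` are standard Aero tags, and `aero.core/read-config` is Aero's standard entry point.

```clojure
;; config.edn sketch -- the keys are illustrative assumptions, not the
;; library's documented schema. #env and #profile are standard Aero tags.
{:providers   {:openai    {:api-key       #env OPENAI_API_KEY
                           :default-model #profile {:dev  "gpt-3.5-turbo"
                                                    :prod "gpt-4o"}}
               :anthropic {:api-key #env ANTHROPIC_API_KEY}}
 :thread-pool {:size 8}
 :cache       {:enabled? true}}

;; Loaded with Aero, e.g.:
;; (aero.core/read-config "config.edn" {:profile :dev})
```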
- `create-system` - Create and configure a LiteLLM system
- `shutdown-system!` - Gracefully shut down the system
- `completion` - Main completion API
- `make-request` - Direct request API
- `health-check` - System health monitoring
- `system-info` - Get system information
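Taken together, a typical lifecycle with these functions might look like the sketch below. The function names come from the list above; the `litellm.core` namespace alias and the request-map keys are assumptions.

```clojure
;; Lifecycle sketch: create a system, query it, and shut it down cleanly.
(require '[litellm.core :as llm])

(let [system (llm/create-system
              {:providers {:openai {:api-key (System/getenv "OPENAI_API_KEY")}}})]
  (try
    (println (llm/health-check system))  ; system health monitoring
    (println (llm/system-info system))   ; system information
    (llm/completion system {:provider :openai
                            :model    "gpt-4o"
                            :messages [{:role :user :content "Hello!"}]})
    (finally
      (llm/shutdown-system! system))))   ; graceful shutdown
```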
- ✅ OpenAI - Full support with streaming and function calling
- ✅ Anthropic - Full support with streaming
- ✅ OpenRouter - Full support with multiple model access
- ✅ Ollama - Basic support for local models
- 🔄 Azure OpenAI - Planned
- 🔄 Google (Gemini) - Planned
- 🔄 Cohere - Planned
- 🔄 Mistral - Planned
- Test suite needs refinement for configuration validation
- Provider name extraction logic needs improvement for some edge cases
- Integration tests require API keys and are tagged separately
- Streaming API implementation is partial
- Some advanced OpenAI features not yet supported
- Comprehensive README with installation and usage examples
- Provider-specific configuration guides
- API reference examples
- Quick start guide
- Clojure 1.11.1
- Hato 0.9.0 (HTTP client)
- Cheshire 5.12.0 (JSON)
- Claypoole 1.1.4 (Thread pools)
- Malli 0.13.0 (Schema validation)
- Core.async 1.6.681 (Async operations)
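For reference, these versions correspond roughly to the following `deps.edn` coordinates; the versions come from the list above, but the group/artifact IDs are the commonly published ones and should be treated as assumptions.

```clojure
;; deps.edn sketch -- coordinates are the usual published ones (assumption).
{:deps {org.clojure/clojure    {:mvn/version "1.11.1"}
        hato/hato              {:mvn/version "0.9.0"}
        cheshire/cheshire      {:mvn/version "5.12.0"}
        com.climate/claypoole  {:mvn/version "1.1.4"}
        metosin/malli          {:mvn/version "0.13.0"}
        org.clojure/core.async {:mvn/version "1.6.681"}}}
```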
- Maven/Clojars-compatible build system
- tools.build integration
- Group ID: `tech.unravel`
- Artifact ID: `litellm-clj`
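Combining the group and artifact IDs above, a consumer would depend on the library with a coordinate along these lines (the version shown is the 0.1.0 release noted above; check Clojars for the latest):

```clojure
;; deps.edn
{:deps {tech.unravel/litellm-clj {:mvn/version "0.1.0"}}}

;; Leiningen :dependencies entry
;; [tech.unravel/litellm-clj "0.1.0"]
```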