ai-servers/llm-gateway/internal/proxy

Latest commit 90adf6f3a8 by Ray Andrew (2026-02-15 04:21:40 -06:00):

feat(gateway): add circuit breaker, retry, and concurrency limit support
feat(gateway): add debug logging with file storage and retention
feat(gateway): add audit logging for user actions
feat(gateway): add request ID tracking and rate limit headers
feat(gateway): add model aliases and load balancing strategies
feat(gateway): add config hot-reload via SIGHUP
feat(gateway): add CORS support
feat(gateway): add data export API and dashboard endpoints
feat(gateway): add dashboard pages for audit and debug logs
feat(gateway): add concurrent request limiting per token
feat(gateway): add streaming timeout support
feat(gateway): add migration support for new schema fields
auth.go               feat(gateway): fix static token initialized in DB                          2026-02-15 02:27:43 -06:00
concurrency.go        feat(gateway): add circuit breaker, retry, and concurrency limit support   2026-02-15 04:21:40 -06:00
concurrency_test.go   feat(gateway): add circuit breaker, retry, and concurrency limit support   2026-02-15 04:21:40 -06:00
handler.go            feat(gateway): add circuit breaker, retry, and concurrency limit support   2026-02-15 04:21:40 -06:00
models.go             feat(gateway): add llm-gateway service                                      2026-02-15 01:23:50 -06:00
ratelimit.go          feat(gateway): add circuit breaker, retry, and concurrency limit support   2026-02-15 04:21:40 -06:00
ratelimit_test.go     feat(gateway): add circuit breaker, retry, and concurrency limit support   2026-02-15 04:21:40 -06:00
stream.go             feat(gateway): add circuit breaker, retry, and concurrency limit support   2026-02-15 04:21:40 -06:00
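The commit list also mentions per-token concurrent request limiting, presumably the concern of `concurrency.go` above. A common way to cap in-flight requests per API token in Go is a map of buffered-channel semaphores; the sketch below shows that idea under assumed names (`tokenLimiter`, `Acquire`, `Release`), not the file's actual contents.

```go
package main

import (
	"fmt"
	"sync"
)

// tokenLimiter caps the number of in-flight requests per API token.
// Each token gets a buffered channel acting as a counting semaphore.
type tokenLimiter struct {
	mu    sync.Mutex
	slots map[string]chan struct{}
	limit int
}

func newTokenLimiter(limit int) *tokenLimiter {
	return &tokenLimiter{slots: make(map[string]chan struct{}), limit: limit}
}

// Acquire reserves a slot for the token and returns true, or returns
// false without blocking when the token is already at its limit
// (the caller would typically respond 429 Too Many Requests).
func (l *tokenLimiter) Acquire(token string) bool {
	l.mu.Lock()
	sem, ok := l.slots[token]
	if !ok {
		sem = make(chan struct{}, l.limit)
		l.slots[token] = sem
	}
	l.mu.Unlock()
	select {
	case sem <- struct{}{}:
		return true
	default:
		return false
	}
}

// Release frees a slot previously reserved by Acquire.
func (l *tokenLimiter) Release(token string) {
	l.mu.Lock()
	sem := l.slots[token]
	l.mu.Unlock()
	if sem != nil {
		<-sem
	}
}

func main() {
	lim := newTokenLimiter(2)
	fmt.Println(lim.Acquire("tok")) // true
	fmt.Println(lim.Acquire("tok")) // true
	fmt.Println(lim.Acquire("tok")) // false: both slots in use
	lim.Release("tok")
	fmt.Println(lim.Acquire("tok")) // true: a slot was freed
}
```

The non-blocking `select` is the design choice that matters for a gateway: rejecting immediately keeps a saturated token from queueing requests and tying up proxy resources.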