Mirror of https://github.com/Mintplex-Labs/anything-llm.git
Synced 2025-03-28 00:24:44 +00:00
* WIP performance metric tracking
* fix: patch UI trying to `.toFixed()` a null metric; Anthropic tracking migration cleanup logs
* Apipie implementation, not tested
* Cleanup Anthropic notes, add support for AzureOpenAI tracking
* Bedrock token metric tracking
* Cohere support
* feat: improve default stream handler to track usage for providers that are actually OpenAI-compliant in usage reporting; add DeepSeek support
* feat: add FireworksAI tracking reporting; fix: improve handler when `usage: null` is reported (why?)
* Add token reporting for GenericOpenAI
* Token reporting for koboldcpp + LM Studio
* lint
* Support Groq token tracking
* HF token tracking
* Token tracking for TogetherAI
* LiteLLM token tracking
* Linting + Mistral token tracking support
* XAI token metric reporting
* Native provider runner
* LocalAI token tracking
* Novita token tracking
* OpenRouter token tracking
* Apipie stream metrics
* textwebgenui token tracking
* Perplexity token reporting
* Ollama token reporting
* lint
* Put back comment
* Rip out LC Ollama wrapper and use the official library
* Patch images with new Ollama lib
* Improve Ollama offline message
* Fix image handling in Ollama LLM provider
* lint
* NVIDIA NIM token tracking
* Update OpenAI compatibility responses
* UI/UX: show/hide metrics on click for user preference
* Update Bedrock client

---------

Co-authored-by: shatfield4 <seanhatfield5@gmail.com>
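The null-safety fixes above (patching the UI's `.toFixed()` call on a null metric, and handling providers that report `usage: null` in a stream chunk) can be sketched as follows. This is a minimal, hypothetical illustration, not the project's actual code; the function names `extractUsage` and `formatTokensPerSecond` and the exact field defaults are assumptions.

```javascript
// Hypothetical sketch: some OpenAI-compatible providers report `usage: null`
// (or omit it entirely) in streamed chunks, so default every field to 0.
function extractUsage(chunk) {
  const usage = (chunk && chunk.usage) || {};
  return {
    prompt_tokens: usage.prompt_tokens ?? 0,
    completion_tokens: usage.completion_tokens ?? 0,
  };
}

// Hypothetical sketch of the UI-side fix: never call .toFixed() on a
// null/undefined metric -- fall back to a zero string instead of throwing.
function formatTokensPerSecond(tokens, seconds) {
  if (!tokens || !seconds) return "0.00";
  return (tokens / seconds).toFixed(2);
}
```

For example, `extractUsage({ usage: null })` yields zeroed counters rather than crashing downstream arithmetic, and `formatTokensPerSecond(null, 2)` returns `"0.00"` instead of throwing a TypeError.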
Directories:

* agents
* AiProviders
* BackgroundWorkers
* boot
* chats
* collectorApi
* comKey
* database
* DocumentManager
* EmbeddingEngines
* EncryptionManager
* files
* helpers
* http
* logger
* middleware
* PasswordRecovery
* prisma
* telemetry
* TextSplitter
* TextToSpeech
* vectorDbProviders
* vectorStore