anything-llm/server/utils/chats

Latest commit e0a0a8976d by Timothy Carambat, 2023-12-27 17:21:47 -08:00
    Add Ollama as LLM provider option ()
    * Add support for Ollama as LLM provider
      resolves 
commands    [FEATURE] Enable the ability to have multi user instances ()    2023-07-25 10:37:04 -07:00
index.js    Enable chat streaming for LLMs ()                                2023-11-13 15:07:30 -08:00
stream.js   Add Ollama as LLM provider option ()                             2023-12-27 17:21:47 -08:00
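The commit messages above reference token-by-token chat streaming (index.js, stream.js) and pluggable LLM providers such as Ollama. As a rough illustration only, and not the repository's actual implementation, the sketch below shows how an Express-style handler might stream chat chunks to the UI over Server-Sent Events; the route path, the fakeProviderStream generator, and the textResponse/close field names are all assumptions for the example.

    // Minimal SSE chat-streaming sketch (hypothetical, not anything-llm's real API).
    const express = require("express");

    const app = express();
    app.use(express.json());

    app.post("/api/chat/stream", async (req, res) => {
      // Standard SSE headers so the client can consume tokens as they arrive.
      res.setHeader("Content-Type", "text/event-stream");
      res.setHeader("Cache-Control", "no-cache");
      res.setHeader("Connection", "keep-alive");
      res.flushHeaders();

      // Placeholder async generator standing in for whichever LLM provider is
      // configured (OpenAI, Ollama, etc.) yielding partial text chunks.
      async function* fakeProviderStream(_prompt) {
        for (const token of ["Hello", ", ", "world", "!"]) {
          yield token;
        }
      }

      for await (const token of fakeProviderStream(req.body.message)) {
        // Each chunk is emitted as one SSE event the chat UI can append incrementally.
        res.write(`data: ${JSON.stringify({ textResponse: token, close: false })}\n\n`);
      }

      // Final event signals the client that the stream is complete.
      res.write(`data: ${JSON.stringify({ textResponse: "", close: true })}\n\n`);
      res.end();
    });

    app.listen(3001);

Provider-specific streaming (for example, against a local Ollama server) would replace fakeProviderStream with the provider's own streaming client, keeping the SSE framing toward the browser unchanged.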