shamoon  33f0cc2373  2025-05-24 11:47:18 -07:00  Fix openai api key, config settings saving
shamoon  db4433281c  2025-05-24 11:47:17 -07:00  Just use the built-in ollama LLM class of course
shamoon  97b129ca3d  2025-05-24 11:47:16 -07:00  Fix naming
shamoon  4c91f43f5a  2025-05-24 11:47:16 -07:00  Trim nodes
shamoon  d5f1a8e7f7  2025-05-24 11:47:16 -07:00  Backend streaming chat
shamoon  1622ea046c  2025-05-24 11:47:15 -07:00  Fixup some tests
shamoon  ba4149f92b  2025-05-24 11:47:15 -07:00  Unify, respect perms [ci skip]
shamoon  15b3b17c5e  2025-05-24 11:47:14 -07:00  Individual doc chat [ci skip]
shamoon  a4bc0836eb  2025-05-24 11:47:14 -07:00  Super basic doc chat [ci skip]
shamoon  a2b9ac8878  2025-05-24 11:47:14 -07:00  Better encapsulate backends, use llama_index OpenAI
shamoon  53f78a692d  2025-05-24 11:47:13 -07:00  Tweak ollama timeout, prompt [ci skip]
shamoon  2300323741  2025-05-24 11:47:13 -07:00  Fix ollama, fix RAG [ci skip]
shamoon  b045dc1b87  2025-05-24 11:47:12 -07:00  RAG into suggestions
shamoon  1aef7200ab  2025-05-24 11:47:12 -07:00  llamaindex vector index, llmindex mangement command
shamoon  2cecdd9f9e  2025-05-24 11:47:11 -07:00  Use a frontend config
shamoon  a0dc7e21f9  2025-05-24 11:47:08 -07:00  Fix
shamoon  7a01f9fc34  2025-05-24 11:47:08 -07:00  Backend tests
shamoon  385bfa5f78  2025-05-24 11:47:08 -07:00  Correct object retrieval
shamoon  4fb08deedc  2025-05-24 11:47:07 -07:00  Refactor
shamoon  527de02fdd  2025-05-24 11:47:07 -07:00  Move module