8ef7a03895 | with every commit, Miku grows stronger. changed some defaults; added and then decided to drop repetition penalty related hyperparameters; fixed prompt formatting | 2025-01-29 11:25:33 +00:00
21a2b1d4d0 | Huggingface api mode, support for editing system prompts, config refactor | 2025-01-29 09:39:59 +00:00
a361f110ec | Add min_new_tokens flag | 2025-01-29 02:08:02 +00:00
411d458549 | Add back repetition penalty | 2025-01-20 00:55:39 +00:00
8c3c68f384 | Update for compatibility with Langchain server | 2025-01-19 04:31:46 +00:00
c0d48a92dc | Fix bug where unauthorized users could change llmconf lol | 2024-07-31 04:46:43 +00:00
c55e613a4a | Use llama-3 model, new defaults | 2024-05-28 18:34:09 +00:00
0adc21d73e | Error handling for automatic TTS request | 2024-05-11 05:15:45 +00:00
09e3c4307c | Add message context option | 2024-05-11 02:55:23 +00:00
cf601a72fb | Add TTS | 2024-05-09 23:20:14 +00:00
a8efab7788 | MikuAI features: LLM and RVC | 2024-03-31 21:36:09 +00:00
8346f52f23 | Change chat cmd max tokens, replies | 2024-02-06 18:50:55 -08:00 | James Shiffer
19346fb2c3 | Fix command options | 2024-02-06 18:26:50 -08:00 | James Shiffer
20129cc3ef | Move command to its own folder | 2024-02-06 18:12:00 -08:00 | James Shiffer
c37d2eace8 | Implement LLaMA chat client as interaction | 2024-02-06 17:57:10 -08:00 | James Shiffer