README.md: 4 additions & 2 deletions
@@ -73,16 +73,18 @@ A super efficient and powerful, yet friendly LLM Chat system, featuring:
 * Business-ready: supports a `hosted` provider whose billing you control.
 * Supports other add-in **BYOK** providers, like `OpenAI`, `OpenRouter`, ...
 * Seamless authentication integration with the main `backend`.
-* Beautiful syntax highlighting. 🌈
+* Beautiful syntax highlighting 🌈.
 * Thread branching, freezing, and sharing.
 * Real-time, multi-agent, multi-user support ¹.
   * Invite your family and friends, and play with the Agents together in real time.
   * Or invite your colleagues, and brainstorm together with the help and power of AI.
 * Resumable and multi-stream ¹.
-  * Ask follow-up questions while the previous one isn't done; the model picks up what's available so far. 🍳🍳
+  * Ask follow-up questions while the previous one isn't done; the model picks up what's available so far 🍳🍳.
+  * Multiple users can send messages at the same time 😲😲.
 * Easy and private: anonymous, guest usage supported.
 * Mobile-friendly.
 * Fully internationalized, with AI-powered translations and smooth switching between languages.
+* Blazingly fast ⚡ with local caching and optimistic updates.
 * Designed to be scalable.
   > Things are isolated, and common interfaces are defined and used where possible; there are no tightly coupled hacks that prevent future scaling. Things just work, elegantly.
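The "local caching and optimistic updates" bullet can be sketched roughly as follows. This is a minimal illustration of the general technique, not the project's actual implementation; every name here (`MessageCache`, `sendMessage`, the `post` callback) is hypothetical:

```typescript
// Hypothetical sketch of an optimistic update over a local cache:
// the UI applies the change locally first for instant feedback,
// then confirms or rolls back once the server responds.

type Message = { id: string; text: string; pending: boolean };

class MessageCache {
  private messages: Message[] = [];

  list(): Message[] {
    return [...this.messages];
  }

  // Apply the update locally before the server has confirmed it.
  optimisticAdd(id: string, text: string): void {
    this.messages.push({ id, text, pending: true });
  }

  // Server accepted: mark the optimistic entry as confirmed.
  confirm(id: string): void {
    const m = this.messages.find((m) => m.id === id);
    if (m) m.pending = false;
  }

  // Server rejected: undo the optimistic entry.
  rollback(id: string): void {
    this.messages = this.messages.filter((m) => m.id !== id);
  }
}

async function sendMessage(
  cache: MessageCache,
  id: string,
  text: string,
  post: (text: string) => Promise<void>, // stand-in for the real network call
): Promise<void> {
  cache.optimisticAdd(id, text); // instant local echo
  try {
    await post(text);
    cache.confirm(id);
  } catch {
    cache.rollback(id); // failure: remove the optimistic entry
  }
}
```

The point of the pattern is that the local cache, not the network round-trip, drives what the user sees, which is what makes the UI feel fast even on a slow connection.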