> Work started on 2025-06-12 for the [**T3 Chat Cloneathon**](https://cloneathon.t3.chat/) competition, with no prior experience with the AI SDK or chat streams, but I think I did an amazing job 🫡!
A super efficient and powerful, yet friendly LLM Chat system, featuring:
* Business-ready: supports a `hosted` provider whose billing you control.
* Supports other add-in **BYOK** providers, like `OpenAI`, `OpenRouter`,...
* Seamless authentication integration with the main `backend`.
* Beautiful syntax highlighting. 🌈
* Thread branching, freezing, and sharing.
* Real-time, multi-agent, multi-user support ¹.
  * Invite your family and friends, and play with the Agents together in real time.
  * Or maybe invite your colleagues, and brainstorm together with the help and power of AI.
* Resumable and multi-stream ¹.
  * Ask follow-up questions while the previous one isn't done; the model can pick up what's currently available. 🍳🍳
* Easy and private: anonymous, guest usage supported.
* Mobile-friendly.
* Fully internationalized, with AI-powered translations and smooth switching between languages.
* Designed to be scalable.
  > Things are isolated and common interfaces are defined and utilized where possible; there are no tightly coupled hacks that prevent future scaling. Things just work, elegantly.
`*1`: currently, the "stream" received when resuming, or by other real-time users in the same thread, is implemented via a custom polling mechanism rather than SSE. This is intentional: it keeps the infrastructure setup more minimal and widens hosting support, so smaller user groups can easily host their own instance, and it is still very performant and efficient.
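As a rough illustration of how such polling can stay efficient, here is a minimal TypeScript sketch of cursor-based catch-up: each poll asks only for chunks newer than the last sequence number the client has seen. All names here (`StreamChunk`, `chunksAfter`, `pollOnce`) are hypothetical and not this project's actual API.

```typescript
// A minimal sketch of cursor-based polling, assuming chunks carry a
// monotonically increasing sequence number. Illustrative only.

interface StreamChunk {
  seq: number;   // monotonically increasing sequence number
  text: string;  // partial model output
}

// Server side (simulated in memory): return every chunk newer than the cursor.
function chunksAfter(all: StreamChunk[], cursor: number): StreamChunk[] {
  return all.filter((c) => c.seq > cursor);
}

// Client side: one poll cycle appends the fresh text and advances the cursor.
function pollOnce(
  all: StreamChunk[],
  state: { cursor: number; text: string },
): { cursor: number; text: string } {
  const fresh = chunksAfter(all, state.cursor);
  return {
    cursor: fresh.length ? fresh[fresh.length - 1].seq : state.cursor,
    text: state.text + fresh.map((c) => c.text).join(""),
  };
}

// Usage: a client resuming from cursor 0 catches up in one poll,
// and later polls transfer only what is new.
const stream: StreamChunk[] = [
  { seq: 1, text: "Hello" },
  { seq: 2, text: ", " },
  { seq: 3, text: "world" },
];
let state = { cursor: 0, text: "" };
state = pollOnce(stream, state);     // catches up on everything so far
stream.push({ seq: 4, text: "!" });  // the model streams another token
state = pollOnce(stream, state);     // this poll picks up only seq 4
console.log(state.text);             // "Hello, world!"
console.log(state.cursor);           // 4
```

Because each response carries only the delta past the cursor, a plain HTTP polling loop like this stays cheap for small groups while avoiding any long-lived connection infrastructure.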