* [Overview](#overview)
* [What's inside?](#whats-inside)
  * [Overview of the tech](#overview-of-the-tech)
  * [Highlight Features / Components](#highlight-features--components)
    * [AI / LLM Chat](#ai--llm-chat)
  * [Apps and Libraries](#apps-and-libraries)
    * [`frontend`: a Nuxt app, compatible with v4 structure.](#frontend-a-nuxt-app-compatible-with-v4-structure)
    * [`backend`: a Hono🔥 app.](#backend-a-hono-app)
This is a base monorepo starter template to kick-start your beautifully organized project, whether it's a fullstack project, a monorepo of multiple libraries and applications, or even just one API server with its related infrastructure deployment and utilities.

Out-of-the-box with the included apps, we have a fullstack project: a `frontend` Nuxt 4 app, a main `backend` using Hono, and a `backend-convex` Convex app.

* General APIs, such as authentication, are handled by the main `backend`, which is designed to be serverless-compatible and can be deployed anywhere, allowing for the best possible latency, performance, and cost according to your needs.
* `backend-convex` is a modular, add-in backend used to power components like `AI Chat`.

It is recommended to use an AI Agent ([`Roo Code`](https://github.com/RooVetGit/Roo-Code) recommended) to help you set up the monorepo according to your needs; see [Utilities](#utilities).
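Based on the apps described above (the exact layout may differ; the `packages/` directory is an assumption for shared libraries, as is typical in monorepos):

```text
.
├── apps/
│   ├── frontend/        # Nuxt 4 app
│   ├── backend/         # Hono app: auth and other general APIs
│   └── backend-convex/  # Convex app: powers components like AI Chat
└── packages/            # shared libraries (assumed)
```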
## What's inside?
💯 JS is always [**TypeScript**](https://www.typescriptlang.org/) where possible.
### Highlight Features / Components

#### AI / LLM Chat

> Work started on 2025-06-12 for the [**T3 Chat Cloneathon**](https://cloneathon.t3.chat/) competition, with no prior experience with AI SDKs or chat streams, but I think I did an amazing job 🫡!

A super efficient and powerful LLM Chat system, featuring:

* Business-ready: supports a `hosted` provider whose billing you control.
* Supports other add-in **BYOK** providers, such as `OpenAI`, `OpenRouter`, and more.
* Seamless authentication integration with the main `backend`.
* Beautiful syntax highlighting. 🌈
* Thread branching, freezing, and sharing.
* Real-time, multi-agent, multi-user support ¹.
  * Invite your family and friends, and play with the Agents together in real time.
  * Or invite your colleagues, and brainstorm together with the help of AI.
* Resumable, multi-stream responses ¹.
  * Ask follow-up questions while the previous response is still generating; the model picks up whatever is currently available. 🍳🍳
* Easy and private: anonymous guest usage supported.
* Mobile-friendly.
* Fully internationalized, with AI-powered translations and smooth switching between languages.

¹ Currently, the "stream" received when resuming, or by other real-time users in the same thread, is implemented via a custom polling mechanism rather than SSE. This was chosen intentionally: it keeps the infrastructure setup minimal and widens hosting support, so smaller user groups can easily host their own instance. It is still very performant and efficient.
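To illustrate the polling idea behind resumable and shared streams, here is a minimal sketch (names like `ThreadBuffer` and `poll` are illustrative only, not this repo's actual API): the server appends chunks to a per-thread buffer, and each client polls with a cursor, receiving only the chunks it hasn't seen yet.

```typescript
type PollResult = { chunks: string[]; cursor: number; done: boolean };

// Hypothetical server-side buffer for one chat thread.
class ThreadBuffer {
  private chunks: string[] = [];
  private finished = false;

  append(chunk: string) { this.chunks.push(chunk); }
  finish() { this.finished = true; }

  // Each poll returns only the chunks after the caller's cursor,
  // plus the new cursor to send on the next poll.
  poll(cursor: number): PollResult {
    return {
      chunks: this.chunks.slice(cursor),
      cursor: this.chunks.length,
      done: this.finished,
    };
  }
}

// Simulated usage: a stream produces tokens while a client
// (e.g. a resuming user, or another participant in the thread)
// polls independently and catches up from its own cursor.
const buf = new ThreadBuffer();
buf.append("Hello");
buf.append(", ");

let cursor = 0;
const received: string[] = [];
let res = buf.poll(cursor);
received.push(...res.chunks);
cursor = res.cursor;

buf.append("world!");
buf.finish();

res = buf.poll(cursor);
received.push(...res.chunks);

console.log(received.join("")); // prints "Hello, world!"
```

Because every reader keeps its own cursor, any number of clients can follow the same thread, and a disconnected client resumes simply by polling again with its last cursor — no long-lived SSE connection required.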
### Apps and Libraries
#### [`frontend`](./apps/frontend): a [Nuxt](https://nuxt.com/) app, compatible with v4 structure.