Commit abe3324
docs(README): update
1 parent 1098018 commit abe3324

1 file changed: README.md (29 additions & 0 deletions)
@@ -13,6 +13,8 @@
 * [Overview](#overview)
 * [What's inside?](#whats-inside)
   * [Overview of the tech](#overview-of-the-tech)
+  * [Highlight Features / Components](#highlight-features--components)
+    * [AI / LLM Chat](#ai--llm-chat)
   * [Apps and Libraries](#apps-and-libraries)
     * [`frontend`: a Nuxt app, compatible with v4 structure.](#frontend-a-nuxt-app-compatible-with-v4-structure)
     * [`backend`: a Hono🔥 app.](#backend-a-hono-app)
@@ -32,6 +34,10 @@
 
 This is a base monorepo starter template to kick-start your beautifully organized project, whether it's a fullstack project, a monorepo of multiple libraries and applications, or even just one API server with its related infrastructure deployment and utilities.
 
+Out of the box, the included apps form a fullstack project: a `frontend` Nuxt 4 app, a main `backend` using Hono, and a `backend-convex` Convex app.
+* General APIs, such as authentication, are handled by the main `backend`, which is designed to be serverless-compatible and deployable anywhere, letting you optimize latency, performance, and cost to your needs.
+* `backend-convex` is a modular, add-in backend used to power components like `AI Chat`.
+
 It is recommended to use an AI Agent ([`Roo Code`](https://github.com/RooVetGit/Roo-Code) recommended) to help you set up the monorepo according to your needs; see [Utilities](#utilities).
 
 ## What's inside?
@@ -57,6 +63,29 @@ So if you use SSR, you could use the official [Nuxt Kinde](https://nuxt.com/modu
 
 💯 JS is always [**TypeScript**](https://www.typescriptlang.org/) where possible.
 
+### Highlight Features / Components
+
+#### AI / LLM Chat
+
+> Work started on 2025-06-12 for the [**T3 Chat Cloneathon**](https://cloneathon.t3.chat/) competition, with no prior AI SDK or chat-streaming experience, but I think I did an amazing job 🫡!
+
+A super efficient and powerful LLM Chat system, featuring:
+* Business-ready: supports a `hosted` provider whose billing you control.
+  * Also supports add-in **BYOK** providers, like `OpenAI`, `OpenRouter`, ...
+* Seamless authentication integration with the main `backend`.
+* Beautiful syntax highlighting. 🌈
+* Thread branching, freezing, and sharing.
+* Real-time, multi-agent, multi-user support ¹.
+  * Invite your family and friends, and play with the Agents together in real time.
+  * Or invite your colleagues, and brainstorm together with the help of AI.
+* Resumable, multi-stream support ¹.
+  * Ask follow-up questions while the previous one isn't done; the model picks up whatever is currently available. 🍳🍳
+* Easy and private: anonymous, guest usage supported.
+* Mobile-friendly.
+* Fully internationalized, with AI-powered translations and smooth switching between languages.
+
+¹: Currently, the "stream" received when resuming, or by other real-time users in the same thread, is implemented via a custom polling mechanism rather than SSE. This is intentional: it keeps the infrastructure setup minimal and hosting support wide, so smaller user groups can easily host their own version, and it is still very performant and efficient.
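The polling-based resume described in the footnote could be sketched roughly like this; all names here (`pollStream`, `fetchChunks`, `StreamPage`) are hypothetical illustrations, not the project's actual API. The assumption is only that the backend exposes a cursor-based query returning chunks appended since the last poll.

```typescript
// Hypothetical sketch of a polling-based resumable stream (not the
// project's real API). The server is assumed to return all chunks
// appended after `cursor`, plus a new cursor and a `done` flag.
interface StreamPage {
  chunks: string[]; // new text chunks since `cursor`
  cursor: number;   // position to resume from on the next poll
  done: boolean;    // true once the model has finished this message
}

type FetchChunks = (threadId: string, cursor: number) => Promise<StreamPage>;

// Poll until the stream is done, starting from any cursor. This is what
// makes the stream resumable: a reconnecting client, or another user in
// the same thread, simply starts polling from its last known cursor.
async function pollStream(
  fetchChunks: FetchChunks,
  threadId: string,
  cursor = 0,
  intervalMs = 500,
): Promise<string> {
  let text = "";
  for (;;) {
    const page = await fetchChunks(threadId, cursor);
    text += page.chunks.join("");
    cursor = page.cursor;
    if (page.done) return text;
    // Wait before polling again; interval trades latency for request volume.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

Compared with SSE, this needs no long-lived connections or streaming-capable infrastructure, which is why it hosts easily on minimal setups.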
 
 ### Apps and Libraries
 
 #### [`frontend`](./apps/frontend): a [Nuxt](https://nuxt.com/) app, compatible with v4 structure.