
Commit 1f4e715

Placeholder doc
1 parent f0ec65c commit 1f4e715

1 file changed: 100 additions & 0 deletions
<!--Copyright 2025 The HuggingFace Team and the Swiss AI Initiative. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.

⚠️ Note that this file is in Markdown but contains specific syntax for our doc-builder (similar to MDX) that may not be
rendered properly in your Markdown viewer.

-->

<div style="float: right;">
    <div class="flex flex-wrap space-x-1">
        <img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-DE3412?style=flat&logo=pytorch&logoColor=white">
        <img alt="FlashAttention" src="https://img.shields.io/badge/%E2%9A%A1%EF%B8%8E%20FlashAttention-eae0c8?style=flat">
        <img alt="SDPA" src="https://img.shields.io/badge/SDPA-DE3412?style=flat&logo=pytorch&logoColor=white">
        <img alt="Tensor parallelism" src="https://img.shields.io/badge/Tensor%20parallelism-06b6d4?style=flat&logoColor=white">
    </div>
</div>

# Apertus

[Apertus](https://www.swiss-ai.org) is a family of large language models from the Swiss AI Initiative.

> [!TIP]
> Coming soon

The example below demonstrates how to generate text with [`Pipeline`] or [`AutoModel`], and from the command line.

<hfoptions id="usage">
<hfoption id="Pipeline">

```py
import torch
from transformers import pipeline

# build a text-generation pipeline with the Apertus checkpoint in bfloat16 on GPU 0
pipeline = pipeline(
    task="text-generation",
    model="swiss-ai/Apertus-8B",
    torch_dtype=torch.bfloat16,
    device=0
)
pipeline("Plants create energy through a process known as")
```

</hfoption>
<hfoption id="AutoModel">

```py
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "swiss-ai/Apertus-8B",
)
model = AutoModelForCausalLM.from_pretrained(
    "swiss-ai/Apertus-8B",
    torch_dtype=torch.bfloat16,
    device_map="auto",
    attn_implementation="sdpa"
)
# tokenize the prompt and move it to the same device as the model
input_ids = tokenizer("Plants create energy through a process known as", return_tensors="pt").to("cuda")

output = model.generate(**input_ids, cache_implementation="static")
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

</hfoption>
<hfoption id="transformers CLI">

```bash
echo -e "Plants create energy through a process known as" | transformers run --task text-generation --model swiss-ai/Apertus-8B --device 0
```

</hfoption>
</hfoptions>

## ApertusConfig

[[autodoc]] ApertusConfig
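
[`ApertusConfig`] follows the usual Transformers configuration pattern, so it can also be instantiated directly to build a randomly initialized model. The snippet below is a minimal sketch of that pattern; the default constructor values are an assumption here, so override them to match the checkpoint you want to reproduce.

```py
from transformers import ApertusConfig, ApertusModel

# create a configuration with default values (assumed defaults; override hidden_size, num_hidden_layers, etc. as needed)
configuration = ApertusConfig()

# initialize a model with random weights from that configuration
model = ApertusModel(configuration)

# the configuration can be read back from the model
configuration = model.config
```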

## ApertusModel

[[autodoc]] ApertusModel
    - forward

## ApertusForCausalLM

[[autodoc]] ApertusForCausalLM
    - forward

## ApertusForTokenClassification

[[autodoc]] ApertusForTokenClassification
    - forward
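
[`ApertusForTokenClassification`] adds a token-level classification head on top of the base model. The snippet below is a minimal sketch of how such a head is typically used in Transformers; `num_labels=2` and the use of the base `swiss-ai/Apertus-8B` checkpoint (whose classification head would be newly initialized) are assumptions for illustration only.

```py
import torch
from transformers import AutoTokenizer, ApertusForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("swiss-ai/Apertus-8B")
# the classification head is newly initialized here (assumption: the base checkpoint ships without one),
# so predictions are only meaningful after fine-tuning on labeled tokens
model = ApertusForTokenClassification.from_pretrained(
    "swiss-ai/Apertus-8B",
    num_labels=2,
)

inputs = tokenizer("Plants create energy through a process known as", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# one predicted class id per input token
predicted_ids = logits.argmax(dim=-1)
```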
