2 files changed, +2 -2 lines changed

@@ -365,7 +365,7 @@ Text Embedding
 
 .. note::
     Unlike base Qwen2, :code:`Alibaba-NLP/gte-Qwen2-7B-instruct` uses bi-directional attention.
-    You can set `--hf-overrides '{"is_causal": false}'` to change the attention mask accordingly.
+    You can set :code:`--hf-overrides '{"is_causal": false}'` to change the attention mask accordingly.
 
     On the other hand, its 1.5B variant (:code:`Alibaba-NLP/gte-Qwen2-1.5B-instruct`) uses causal attention
     despite being described otherwise on its model card.
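
For reference, the :code:`--hf-overrides` flag in the note above passes a JSON object that is merged into the model's Hugging Face config before loading. Below is a minimal offline-inference sketch of the same override; it assumes a recent vLLM where :code:`LLM` accepts :code:`task` and :code:`hf_overrides` keyword arguments and exposes an :code:`embed()` method. Those names are assumptions on my part; only the :code:`--hf-overrides '{"is_causal": false}'` flag itself comes from the note in the diff.

    # Sketch only: assumes LLM(...) accepts task= and hf_overrides= and that embed()
    # is available in this vLLM version. The equivalent server launch, using the flag
    # from the note above, would look like:
    #   vllm serve Alibaba-NLP/gte-Qwen2-7B-instruct --hf-overrides '{"is_causal": false}'
    from vllm import LLM

    llm = LLM(
        model="Alibaba-NLP/gte-Qwen2-7B-instruct",
        task="embed",                        # assumed task name (older builds used "embedding")
        hf_overrides={"is_causal": False},   # JSON false -> Python False; use bi-directional attention
    )

    # Embed a single prompt and inspect the dimensionality of the returned vector.
    outputs = llm.embed(["What is the capital of France?"])
    print(len(outputs[0].outputs.embedding))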
@@ -393,7 +393,7 @@ Feature x Hardware
       - ✅
       - ✅
       - ✅
-      - ✗
+      - ?
     * - :abbr:`enc-dec (Encoder-Decoder Models)`
       - ✅
       - ✅