[Doc] Fix typos in docs (#10636)
Signed-off-by: DarkLight1337 <tlleungac@connect.ust.hk>
@@ -365,7 +365,7 @@ Text Embedding
 
 .. note::
     Unlike base Qwen2, :code:`Alibaba-NLP/gte-Qwen2-7B-instruct` uses bi-directional attention.
-    You can set `--hf-overrides '{"is_causal": false}'` to change the attention mask accordingly.
+    You can set :code:`--hf-overrides '{"is_causal": false}'` to change the attention mask accordingly.
 
     On the other hand, its 1.5B variant (:code:`Alibaba-NLP/gte-Qwen2-1.5B-instruct`) uses causal attention
     despite being described otherwise on its model card.