Merge pull request #582 from vipulaSD/patch-2
docs: fix broken links
osanseviero authored Dec 20, 2023
2 parents 60f7702 + ceef93d commit 4c8adfa
Showing 1 changed file with 1 addition and 1 deletion.

chapters/en/chapter2/5.mdx (1 addition, 1 deletion)
@@ -329,7 +329,7 @@ With Transformer models, there is a limit to the lengths of the sequences we can
- Use a model with a longer supported sequence length.
- Truncate your sequences.

-Models have different supported sequence lengths, and some specialize in handling very long sequences. [Longformer](https://huggingface.co/transformers/model_doc/longformer.html) is one example, and another is [LED](https://huggingface.co/transformers/model_doc/led.html). If you're working on a task that requires very long sequences, we recommend you take a look at those models.
+Models have different supported sequence lengths, and some specialize in handling very long sequences. [Longformer](https://huggingface.co/docs/transformers/model_doc/longformer) is one example, and another is [LED](https://huggingface.co/docs/transformers/model_doc/led). If you're working on a task that requires very long sequences, we recommend you take a look at those models.

Otherwise, we recommend you truncate your sequences by specifying the `max_sequence_length` parameter:
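Truncation as described above can be sketched in plain Python (a minimal illustration: `max_sequence_length` and `token_ids` are placeholder names, not part of any library API; in practice, 🤗 Transformers tokenizers handle this for you via `truncation=True` and `max_length=...`):

```python
# Illustrative only: max_sequence_length is a placeholder for your
# model's actual limit (often found in the model config, e.g. 512).
max_sequence_length = 512

# A toy list of token ids standing in for a tokenized input that is
# longer than the model supports.
token_ids = list(range(1000))

# Truncate by keeping only the first max_sequence_length tokens.
truncated = token_ids[:max_sequence_length]
print(len(truncated))  # 512
```

Note that naive truncation discards everything past the cutoff, which may drop information the task needs; that trade-off is why long-sequence models like Longformer and LED exist.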

