
Can RWKV's context length be extrapolated infinitely? #179

Hello! The context length of RWKV is theoretically unbounded, since the model keeps a fixed-size recurrent state rather than a growing attention cache. In practice, though, it is limited by the context length it was trained on, by numerical precision, and by the capacity of that hidden state.
As a result, you may see quality degrade once the context grows beyond the length used in training.
See this 128k-context model:
https://huggingface.co/xiaol/rwkv-7B-world-novel-128k
(possibly the open-source model with the longest context length)
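To illustrate why the architecture itself does not cap the context, here is a rough, simplified sketch of the RWKV-4 "WKV" recurrence (not the official implementation; the function name and toy shapes are made up for illustration, and the real kernels add a running-maximum trick for numerical stability, which is exactly the precision limit mentioned above):

```python
import numpy as np

def wkv_recurrence(keys, values, w, u):
    """Simplified (non-stabilized) sketch of the RWKV-4 WKV recurrence.

    keys, values: arrays of shape (T, C) -- one k/v vector per token.
    w: per-channel decay (positive); u: per-channel bonus for the current token.
    The running state (num, den) is a fixed 2*C floats no matter how many
    tokens are processed, which is why context length is not architecturally
    bounded.
    """
    T, C = keys.shape
    num = np.zeros(C)            # running decayed sum of exp(k_i) * v_i
    den = np.zeros(C)            # running decayed sum of exp(k_i)
    outputs = np.empty((T, C))
    for t in range(T):
        k, v = keys[t], values[t]
        # output mixes the accumulated past with the current token
        outputs[t] = (num + np.exp(u + k) * v) / (den + np.exp(u + k))
        # decay the past and fold the current token into the state
        num = np.exp(-w) * num + np.exp(k) * v
        den = np.exp(-w) * den + np.exp(k)
    return outputs

# Toy usage: the state stays the same size whether T is 16 or 16 million,
# but the exp() terms can over/underflow for long sequences -- the
# "numerical precision" limitation described above.
T, C = 16, 8
rng = np.random.default_rng(0)
out = wkv_recurrence(rng.normal(size=(T, C)), rng.normal(size=(T, C)),
                     w=np.full(C, 0.5), u=np.zeros(C))
print(out.shape)  # (16, 8)
```

So the memory cost per step is constant; what degrades at very long contexts is what the fixed-size state can retain and how stable the accumulators remain, plus how far beyond the training length you push the model.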

Answer selected by BrightXiaoHan