We present DeepSeek-V3, a strong Mixture-of-Experts (MoE) language model with 671B total parameters with 37B activated for each token.
Regarding the sentence above: how is the 37B figure calculated? Is it worked out by hand directly from the number of experts?
For example, the total parameter count can be obtained by calling p.numel(); is there a way to obtain this 37B figure programmatically?
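For what it's worth, the activated count follows arithmetically from the routing config rather than from a runtime measurement: always-active parameters (attention, embeddings, gates, shared experts) count in full, while routed-expert parameters count at the fraction top_k / n_routed_experts. A minimal sketch below, assuming DeepSeek-V3's published routing config (256 routed experts, top-8 per token) and assuming routed experts can be identified by an `.experts.` substring in the parameter name, as in the reference model.py where they sit in an `experts` ModuleList:

```python
import torch.nn as nn

# Routing config for DeepSeek-V3 (256 routed experts, top-8
# selected per token; shared experts are always active).
N_ROUTED_EXPERTS = 256
TOP_K = 8

def count_activated_params(model: nn.Module) -> int:
    """Estimate the parameters touched by one token's forward pass.

    Dense weights (attention, embeddings, gates, shared experts) are
    always active and counted in full; routed-expert weights are
    scaled by the fraction of experts a token actually visits,
    i.e. TOP_K / N_ROUTED_EXPERTS.
    """
    activated = 0
    for name, p in model.named_parameters():
        # Assumption: routed-expert weights live under an `experts`
        # ModuleList, so their names contain ".experts."; the shared
        # expert ("shared_experts") does not match and counts in full.
        if ".experts." in name:
            activated += p.numel() * TOP_K // N_ROUTED_EXPERTS
        else:
            activated += p.numel()
    return activated
```

Summing p.numel() over all parameters gives the 671B total; the weighted sum above should land near the reported 37B.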