Hi,
There are several questions I would like to ask:
1. In the reader, the pretrained weight you used for BertForQuestionAnswering is 'bert-base-uncased'. Is it okay to use a different pretrained weight, such as 'bert-large-uncased-whole-word-masking-finetuned-squad'? Would it improve the results?
2. When I used 'bert-large-uncased', it raised a 'CUDA out of memory' error. Is there a solution for that? Would setting fp16=True help?
3. I haven't trained the model on SQuAD 2.0; would doing so increase the accuracy of the responses?
Thanks in advance for your response.
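One standard workaround for the out-of-memory question above (an assumption on my part, not something stated in this thread) is to reduce the per-device batch size and accumulate gradients over several micro-batches, so the effective batch size seen by the optimizer stays unchanged; fp16=True can also help, since half-precision roughly halves the memory used by weights and activations. A minimal sketch of the batch-size arithmetic, with hypothetical numbers:

```python
# Sketch: trading per-step batch size for gradient accumulation.
# Gradients are summed over `accumulation_steps` micro-batches before
# each optimizer update, so the update sees the same number of examples
# as the original (larger) batch would have.

def effective_batch_size(per_device_batch: int, accumulation_steps: int) -> int:
    """Number of examples contributing to each optimizer update."""
    return per_device_batch * accumulation_steps

# Hypothetical numbers: if a run with batch size 32 hits CUDA OOM on
# 'bert-large-uncased', micro-batches of 8 accumulated over 4 steps
# keep the effective batch size at 32.
assert effective_batch_size(8, 4) == 32
```

If the reader is trained with Hugging Face's Trainer, this corresponds (if I recall the API correctly) to the `per_device_train_batch_size` and `gradient_accumulation_steps` fields of `TrainingArguments`, alongside `fp16=True`.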