First, thanks for sharing the code, it's really helpful!
I have a question about using pretrained BERT on my own dataset for sentence classification. As I understand it, BERT's input representation should consist of token embeddings, segment embeddings, and position embeddings, but I don't see the position embeddings anywhere in your code. In run_model:
inputs = {'input_ids':      batch[0],
          'attention_mask': batch[1],
          'token_type_ids': batch[2] if args['model_type'] in ['bert', 'xlnet'] else None,  # XLM don't use segment_ids
          'labels':         batch[3]}
outputs = model(**inputs)
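For what it's worth, a likely explanation is that the position embeddings are built inside the model's embedding layer rather than passed in through the inputs dict. A minimal PyTorch sketch of a BERT-style embedding module (the class name and default sizes here are illustrative, not taken from this repo) shows how position ids can be generated internally:

```python
import torch
import torch.nn as nn

class BertStyleEmbeddings(nn.Module):
    """Sketch of BERT's input representation: the sum of token,
    segment (token-type), and position embeddings. Because position
    ids are derived from the sequence length inside forward(), the
    caller never has to supply them in the inputs dict."""

    def __init__(self, vocab_size=30522, hidden=768, max_pos=512, type_vocab=2):
        super().__init__()
        self.word_embeddings = nn.Embedding(vocab_size, hidden)
        self.position_embeddings = nn.Embedding(max_pos, hidden)
        self.token_type_embeddings = nn.Embedding(type_vocab, hidden)

    def forward(self, input_ids, token_type_ids):
        seq_len = input_ids.size(1)
        # Position ids are generated internally: 0, 1, ..., seq_len-1,
        # broadcast to every sequence in the batch.
        position_ids = torch.arange(seq_len, device=input_ids.device)
        position_ids = position_ids.unsqueeze(0).expand_as(input_ids)
        return (self.word_embeddings(input_ids)
                + self.position_embeddings(position_ids)
                + self.token_type_embeddings(token_type_ids))
```

So if the pretrained model follows this pattern, only input_ids, attention_mask, token_type_ids, and labels need to appear in the inputs dict.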
Or perhaps I missed this detail; could you please tell me whether position embeddings are implemented, and if so, where exactly?
Thanks again, and I look forward to your reply!