Conversation

BunnyNoBugs

Add RoBERTa support in squad_torch_bert config.

The changes handle the shape of token_type_ids that is specific to RoBERTa:

b_input_masks = torch.cat(input_masks, dim=0).to(self.device)
b_input_type_ids = torch.cat(input_type_ids, dim=0).to(self.device)

if self.pretrained_bert == 'roberta-base':
Contributor

it seems like we should generalize this condition (not only to roberta-base)

Author

updated
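
For illustration, a minimal sketch of one way the condition could be generalized, placed inside the same method as the snippet above. It assumes HuggingFace transformers; self.pretrained_bert and b_input_type_ids come from the diff, while the type_vocab_size check is a hypothetical approach rather than the PR's actual change: configs that declare fewer than two token types (RoBERTa has type_vocab_size = 1; DistilBERT and BART declare none at all) can only accept all-zero token_type_ids.

import torch
from transformers import AutoConfig

# Hypothetical generalization of `if self.pretrained_bert == 'roberta-base'`:
# inspect the model config instead of hard-coding one model name. Configs
# without a type_vocab_size attribute (DistilBERT, BART) default to 0 here.
config = AutoConfig.from_pretrained(self.pretrained_bert)
if getattr(config, "type_vocab_size", 0) < 2:
    # the model has no (or a single) segment embedding, so zero the ids out
    b_input_type_ids = torch.zeros_like(b_input_type_ids)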

@BunnyNoBugs

Now DistilBERT-, RoBERTa-, XLM-, BART- and Longformer-like models can be trained correctly (confirmed on a GPU)
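
A quick way to sanity-check that rule across model families (the model names below are illustrative examples, not taken from the PR):

from transformers import AutoConfig

# Print the number of token types each config declares; anything below 2
# means token_type_ids must stay all zeros for that model family.
for name in ["bert-base-uncased", "distilbert-base-uncased", "roberta-base"]:
    config = AutoConfig.from_pretrained(name)
    print(name, getattr(config, "type_vocab_size", 0))
# expected output:
# bert-base-uncased 2
# distilbert-base-uncased 0
# roberta-base 1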

@IgnatovFedor changed the base branch from master to dev on April 6, 2022.