Commit e568e07

fix get_offset_mapping error for Ernie tokenizer (#2857)
1 parent 4d18e4d commit e568e07

File tree

1 file changed, +1 −1 lines changed


paddlenlp/transformers/tokenizer_utils.py

Lines changed: 1 addition & 1 deletion
@@ -1320,7 +1320,7 @@ def get_offset_mapping(self, text):
         if text is None:
             return None
         split_tokens = []
-        if self.do_basic_tokenize:
+        if hasattr(self, "basic_tokenizer"):
             for token in self.basic_tokenizer.tokenize(
                     text, never_split=self.all_special_tokens):
                 # If the token is part of the never_split set
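The intent of the change can be sketched outside PaddleNLP with toy stand-in classes (all names below are hypothetical, not the real PaddleNLP API): a tokenizer subclass that never sets the `do_basic_tokenize` flag would raise `AttributeError` under the old check, whereas probing for the `basic_tokenizer` attribute with `hasattr` degrades gracefully.

```python
class BasicTokenizer:
    """Toy stand-in for a basic tokenizer: splits on whitespace."""
    def tokenize(self, text, never_split=None):
        return text.split()

class BertLikeTokenizer:
    """Hypothetical tokenizer that defines both attributes."""
    def __init__(self):
        self.do_basic_tokenize = True
        self.basic_tokenizer = BasicTokenizer()

class ErnieLikeTokenizer:
    """Hypothetical tokenizer that defines neither attribute,
    mimicking the failure mode the commit message describes."""
    pass

def split_for_offset_mapping(tokenizer, text):
    # New-style check: probe for the attribute instead of reading a
    # flag that some tokenizer subclasses never define.
    if hasattr(tokenizer, "basic_tokenizer"):
        return tokenizer.basic_tokenizer.tokenize(text)
    # Fall back to treating the whole text as one piece.
    return [text]

print(split_for_offset_mapping(BertLikeTokenizer(), "hello world"))   # ['hello', 'world']
print(split_for_offset_mapping(ErnieLikeTokenizer(), "hello world"))  # ['hello world']
```

With the old flag-based check, calling the helper on `ErnieLikeTokenizer` would instead fail with `AttributeError: 'ErnieLikeTokenizer' object has no attribute 'do_basic_tokenize'`.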
