Flair framework with PyTorch - OverflowError: int too big to convert

I am trying to train a named entity recognition model with the Flair framework (https://github.com/flairNLP/flair) using the following embeddings: TransformerWordEmbeddings('emilyalsentzer/Bio_ClinicalBERT'). However, it always fails with OverflowError: int too big to convert. The same thing happens with some other transformer word embeddings, such as XLNet. BERT and RoBERTa, on the other hand, work fine.

Here is the full traceback of the error:

2021-04-15 09:34:48,106 ----------------------------------------------------------------------------------------------------
2021-04-15 09:34:48,106 Corpus: "Corpus: 778 train + 259 dev + 260 test sentences"
2021-04-15 09:34:48,106 ----------------------------------------------------------------------------------------------------
2021-04-15 09:34:48,106 Parameters:
2021-04-15 09:34:48,106  - learning_rate: "0.1"
2021-04-15 09:34:48,106  - mini_batch_size: "32"
2021-04-15 09:34:48,106  - patience: "3"
2021-04-15 09:34:48,106  - anneal_factor: "0.5"
2021-04-15 09:34:48,106  - max_epochs: "200"
2021-04-15 09:34:48,106  - shuffle: "True"
2021-04-15 09:34:48,106  - train_with_dev: "False"
2021-04-15 09:34:48,106  - batch_growth_annealing: "False"
2021-04-15 09:34:48,107 ----------------------------------------------------------------------------------------------------
2021-04-15 09:34:48,107 Model training base path: "/home/xxx/data/xxx-clinical-bert"
2021-04-15 09:34:48,107 ----------------------------------------------------------------------------------------------------
2021-04-15 09:34:48,107 Device: cuda:0
2021-04-15 09:34:48,107 ----------------------------------------------------------------------------------------------------
2021-04-15 09:34:48,107 Embeddings storage mode: gpu
2021-04-15 09:34:48,116 ----------------------------------------------------------------------------------------------------
Traceback (most recent call last):
  File "train_medical_2.py", line 144, in <module>
    train_ner(d + '-base-ent',corpus_base)
  File "train_medical_2.py", line 136, in train_ner
    max_epochs=200)
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/flair/trainers/trainer.py", line 381, in train
    loss = self.model.forward_loss(batch_step)
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/flair/models/sequence_tagger_model.py", line 637, in forward_loss
    features = self.forward(data_points)
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/flair/models/sequence_tagger_model.py", line 642, in forward
    self.embeddings.embed(sentences)
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/flair/embeddings/token.py", line 81, in embed
    embedding.embed(sentences)
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/flair/embeddings/base.py", line 60, in embed
    self._add_embeddings_internal(sentences)
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/flair/embeddings/token.py", line 923, in _add_embeddings_internal
    self._add_embeddings_to_sentence(sentence)
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/flair/embeddings/token.py", line 999, in _add_embeddings_to_sentence
    truncation=True,
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/transformers/tokenization_utils_base.py", line 2438, in encode_plus
    **kwargs,
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/transformers/tokenization_utils_fast.py", line 472, in _encode_plus
    **kwargs,
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/transformers/tokenization_utils_fast.py", line 379, in _batch_encode_plus
    pad_to_multiple_of=pad_to_multiple_of,
  File "/home/d111199102201607101/flair/lib/python3.6/site-packages/transformers/tokenization_utils_fast.py", line 330, in set_truncation_and_padding
    self._tokenizer.enable_truncation(max_length, stride=stride, strategy=truncation_strategy.value)
OverflowError: int too big to convert

I have already tried changing embedding_storage_mode, hidden_size, and mini_batch_size, but none of these solved the problem.
Has anyone run into the same issue? Is there any way to fix it?
Thanks.

I ran into the same problem. Did you manage to solve it? - sissythem
1 Answer


You can limit the token sequence length with the following parameters:

embedding = TransformerWordEmbeddings('emilyalsentzer/Bio_ClinicalBERT')
# Cap the number of subtokens per sequence, and process longer
# sentences with an overlapping sliding window (stride = half the cap)
embedding.max_subtokens_sequence_length = 512
embedding.stride = 512 // 2
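For context, a likely cause (my reading of the traceback, not confirmed in this thread): tokenizers for models that declare no real maximum length report a huge sentinel value instead (transformers defines VERY_LARGE_INTEGER = int(1e30) for this purpose), and Flair passes that value straight through to the Rust tokenizer's enable_truncation, where it no longer fits in a native 64-bit integer. A minimal, Flair-free sketch of the arithmetic:

```python
# Sentinel used by transformers when a model has no declared max length
# (tokenization_utils_base: VERY_LARGE_INTEGER = int(1e30)).
SENTINEL_MAX_LENGTH = int(1e30)

# The Rust tokenizer backend stores max_length as a native unsigned
# 64-bit integer, so any value >= 2**64 cannot be converted and
# raises OverflowError when passed to enable_truncation().
fits_in_u64 = SENTINEL_MAX_LENGTH < 2**64
print(fits_in_u64)  # False: the sentinel overflows a u64
```

Setting an explicit cap such as 512 (as above) keeps the value well within range, which is why the workaround removes the error.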
