Added two pre-trained models and one new fine-tuning class

@thomwolf released this 30 Nov 22:15 · 66d50ca

This release comprises the following improvements and updates:

  • added two new pre-trained models from Google: bert-large-cased and bert-base-multilingual-cased,
  • added a model that can be fine-tuned for token-level classification: BertForTokenClassification,
  • added tests for every model class, with and without labels,
  • fixed tokenizer loading function BertTokenizer.from_pretrained() when loading from a directory containing a pretrained model,
  • fixed typos in model docstrings and completed the docstrings,
  • improved the examples (added a do_lower_case argument).