Added two pre-trained models and one new fine-tuning class
This release comprises the following improvements and updates:
- Added two new pre-trained models from Google: `bert-large-cased` and `bert-base-multilingual-cased`.
- Added a model that can be fine-tuned for token-level classification: `BertForTokenClassification`.
- Added tests for every model class, with and without labels.
- Fixed the tokenizer loading function `BertTokenizer.from_pretrained()` when loading from a directory containing a pretrained model.
- Fixed typos in model docstrings and completed the docstrings.
- Improved examples (added a `do_lower_case` argument).
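
A minimal sketch of how the new additions can be combined: a cased checkpoint loaded with lowercasing disabled, plus `BertForTokenClassification` for a token-level task such as NER. The function name, the `num_labels` default, and the choice of checkpoints are illustrative assumptions, not part of this release; the import and model download are deferred into the function body because fetching pretrained weights requires network access.

```python
def load_cased_ner_model(num_labels: int = 9):
    """Illustrative only: load a cased tokenizer and a token-level
    classification head. The checkpoint names and num_labels default
    (9 = CoNLL-2003 NER tag count) are assumptions for this sketch."""
    # Deferred import: downloading weights needs network access.
    from pytorch_pretrained_bert import BertTokenizer, BertForTokenClassification

    # Cased checkpoints preserve capitalization, so the new
    # do_lower_case argument is set to False here.
    tokenizer = BertTokenizer.from_pretrained(
        "bert-base-multilingual-cased", do_lower_case=False
    )
    model = BertForTokenClassification.from_pretrained(
        "bert-large-cased", num_labels=num_labels
    )
    return tokenizer, model
```

Keeping case intact matters for token-level classification, where surface form (e.g. "Apple" vs. "apple") is often the strongest signal for entity labels.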