# BERT with CoLA

This example demonstrates how to retrain a BERT (Bidirectional Encoder Representations from Transformers) model on the CoLA (Corpus of Linguistic Acceptability) dataset.

## Setup

To begin, you'll need the latest version of Swift for TensorFlow installed. Make sure the correct version of `swift` is on your `PATH`.
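As a minimal sketch of that last step, assuming you have unpacked a Swift for TensorFlow toolchain to `~/swift-tensorflow` (a hypothetical location; substitute your actual install path), you can prepend its `bin` directory to your `PATH` and confirm which `swift` is picked up:

```shell
# Hypothetical install location -- replace with your actual toolchain path.
export PATH="$HOME/swift-tensorflow/usr/bin:$PATH"

# Confirm the shell now resolves swift from that toolchain.
which swift
swift --version
```

Putting the toolchain directory first on `PATH` ensures it shadows any system-installed Swift.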

To train the model, run:

```shell
cd swift-models
swift run -c release BERT-CoLA
```