Mini N-gram Model Text Generation

Using a Custom PyTorch Model with Only 64K Params

shivamCode0/ngram

This model is a simple 7-gram character model: it uses the last 7 characters to predict the next one. It is trained on the Tiny Shakespeare dataset and has only 64,150 parameters, making it extremely small for a language model. See the repo for more details.
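To illustrate the idea, here is a minimal sketch of a character-level n-gram model in PyTorch: embed each of the last 7 characters, concatenate the embeddings, and map them through a linear layer to logits over the next character. The class name, embedding size, and vocabulary size below are illustrative assumptions, not the actual architecture from the repo, so the parameter count will differ from 64,150.

```python
import torch
import torch.nn as nn

CONTEXT = 7   # number of previous characters fed to the model
VOCAB = 65    # Tiny Shakespeare uses roughly 65 distinct characters
EMBED = 16    # embedding width (illustrative choice)

class CharNGram(nn.Module):
    """Predict the next character from the last CONTEXT characters."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMBED)
        self.head = nn.Linear(CONTEXT * EMBED, VOCAB)

    def forward(self, x):
        # x: (batch, CONTEXT) tensor of character indices
        e = self.embed(x).flatten(1)   # (batch, CONTEXT * EMBED)
        return self.head(e)            # logits over the next character

model = CharNGram()
context = torch.randint(0, VOCAB, (1, CONTEXT))
logits = model(context)
# Sample the next character index from the predicted distribution
next_char = torch.distributions.Categorical(logits=logits).sample()
```

Generation then proceeds autoregressively: append the sampled character to the context, drop the oldest one, and repeat.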