A word game based on BERT - "What would BERT say?"
TL;DR: I created a simple word game using the natural language model BERT.[1] Click here to play it.
I like word puzzles, especially ones based on meanings of words (rather than just forming words from a set of given letters). A while back I had the idea of using a language model that does masked prediction (like BERT) to create a word game where the player tries to guess the missing word in a sentence.[2] After an initial attempt to implement it as a mobile app, I decided to do a web version and finally published it today: “What would BERT say?”.
For the source texts, I downloaded some books from Project Gutenberg, including the 50 most downloaded titles. After parsing the books into sentences, I pick one word in each sentence, mask it, and ask BERT to predict it. The top 20 words returned by BERT, together with the original word in the sentence, constitute the accepted answers. If you guess the original word, you get 3 points; if you guess any of the other words, you get 1 point.
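To make that pipeline concrete, here is a minimal sketch of how a single question could be generated and scored, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; the function names and scoring helper are illustrative, not the game's actual code.

```python
import random
from transformers import pipeline

# Fill-mask pipeline: given a sentence with a [MASK] token, BERT returns
# its most likely candidates for the masked position.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def make_question(sentence: str, top_k: int = 20):
    """Mask one randomly chosen word and collect BERT's top-k predictions."""
    words = sentence.split()
    idx = random.randrange(len(words))
    original = words[idx].strip(".,;:!?").lower()
    masked = " ".join(words[:idx] + [fill_mask.tokenizer.mask_token] + words[idx + 1:])
    predictions = [p["token_str"].strip().lower() for p in fill_mask(masked, top_k=top_k)]
    return masked, original, predictions

def score(guess: str, original: str, predictions: list) -> int:
    """3 points for the original word, 1 point for any other predicted word."""
    guess = guess.lower()
    if guess == original:
        return 3
    if guess in predictions:
        return 1
    return 0

# Example usage on a sentence from a Project Gutenberg classic.
masked, original, predictions = make_question(
    "It was the best of times, it was the worst of times."
)
print(masked)
print(score("best", original, predictions))
```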
Given my limited web programming skills, the game itself is rather minimalistic, and there is certainly a lot of room for improvement. Because there can be preprocessing and parsing issues, and because the missing word is picked at random, some questions turn out too easy or too difficult, so don’t feel bad about skipping them.
Hope you enjoy the game! Please send me an email if you have any questions or comments.
Bibliography
1. Devlin, Jacob; Chang, Ming-Wei; Lee, Kenton; Toutanova, Kristina (11 October 2018). “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding”. arXiv:1810.04805v2
2. One could do something similar with a thesaurus, but a language model trained on large amounts of text provides much better predictions for the missing word.