Played around with Hugging Face's implementations of BERT and GPT-2 to build my understanding. The notebook includes:
- BERT as an encoder, GPT-2 as a decoder.
- Simple examples of what vector representations look like, including the length of each vector.
- How similar representations correspond to spatial proximity.
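The last two points can be sketched in a few lines. This is a minimal illustration, not the notebook itself: it assumes the Hugging Face `transformers` library and uses `bert-base-uncased` as an example checkpoint. It extracts per-token vectors (768-dimensional for BERT-base) and uses cosine similarity as the measure of spatial proximity between two of them.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
with torch.no_grad():
    # Shape: (batch, num_tokens, hidden_size); hidden_size is 768 for BERT-base.
    hidden = model(**inputs).last_hidden_state

print(hidden.shape)

# Spatial proximity: cosine similarity between two token vectors.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
cat_vec = hidden[0, tokens.index("cat")]
mat_vec = hidden[0, tokens.index("mat")]
print(torch.nn.functional.cosine_similarity(cat_vec, mat_vec, dim=0).item())
```

Note that these are contextual representations, so the same word gets a different vector in a different sentence.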
Open BERT_GPT2_Play.ipynb in Colab from the link and create your own copy. Further comments are inline.