Visualization

The visualization code is based on bertviz, a tool for visualizing attention in BERT-style models.

Prepare

  • Change the working directory to this directory:

    cd ./viz
  • Create a symbolic link to the data folder (on Windows, instead edit the data path inside the Jupyter notebook manually):

    ln -s ../data ./
  • Download and unzip the COCO val2017 images and annotations, and place them under ./data/coco.

  • (Optional) If you want to precompute all attention maps yourself, download the pre-trained models as described in PREPARE_PRETRAINED_MODELS.md.
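
The download-and-unzip step above can be sketched as a small shell script. The URLs are the standard COCO release endpoints, assumed here rather than taken from this document, and the multi-GB downloads are guarded behind an environment variable so the directory layout can be dry-run:

```shell
# Sketch: fetch COCO val2017 images and annotations into ./data/coco.
# The URLs are the usual COCO download endpoints (an assumption; verify them).
set -e
mkdir -p ./data/coco
# Guard the large downloads so the layout step can run without a network.
if [ "${DOWNLOAD_COCO:-0}" = "1" ]; then
  wget -c -P ./data/coco http://images.cocodataset.org/zips/val2017.zip
  wget -c -P ./data/coco http://images.cocodataset.org/annotations/annotations_trainval2017.zip
  unzip -q ./data/coco/val2017.zip -d ./data/coco                   # -> ./data/coco/val2017/
  unzip -q ./data/coco/annotations_trainval2017.zip -d ./data/coco  # -> ./data/coco/annotations/
fi
```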

Pre-compute attention maps

  • Pre-compute all attention maps on COCO val2017:

    python pretrain/vis_attention_maps.py --cfg cfgs/pretrain/vis_attention_maps_coco.yaml --save-dir ./vl-bert_viz
  • Alternatively, we provide 100 pre-computed attention maps of vl-bert-base-e2e on COCO val2017 (GoogleDrive, BaiduPan); download and unzip the archive into ./data.

Visualization on Jupyter Notebook

  • Start Jupyter Notebook in this directory and open model_view_vl-bert_coco.ipynb:

    jupyter notebook
  • Run all cells in the notebook in order.

  • Browse the attention maps in the last cell; change the image id to visualize other examples.
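
How the last cell maps an image id to its attention data is specific to the notebook; the sketch below is purely hypothetical (the variable names, toy values, and storage format are invented, not the notebook's actual code), but it illustrates the kind of lookup changing the image id triggers:

```python
# Hypothetical sketch of browsing precomputed attention maps by COCO image id.
# The real notebook (model_view_vl-bert_coco.ipynb) may store maps differently.
attention_maps = {
    397133: [[0.7, 0.3], [0.2, 0.8]],  # toy 2x2 attention matrices keyed by image id
    37777:  [[0.5, 0.5], [0.9, 0.1]],
}

def get_attention(image_id):
    """Return the attention matrix for a COCO image id, or raise if absent."""
    try:
        return attention_maps[image_id]
    except KeyError:
        raise KeyError(f"no precomputed attention map for image {image_id}") from None

# Change image_id here to visualize another example.
image_id = 397133
attn = get_attention(image_id)
print(f"image {image_id}: {len(attn)}x{len(attn[0])} attention matrix")
```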