[CSE URP] BEiT: BERT Pre-Training of Image Transformers (1)
BEiT: BERT Pre-Training of Image Transformers
Hangbo Bao, Li Dong, Songhao Piao, Furu Wei
* https://github.com/microsoft/unilm/tree/master/beit

Abstract
The name BEiT (Bidirectional Encoder representation from Image Transformers) is borrowed from the BERT model. The input image is represented as two views for masked image modeling (MIM) pre-training:
- ViT + blockwise masking: image patches (such as 16 x 16 pixels)
- DALL-E tokenizer: visual tokens (i.e...
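The two views above can be sketched in code. Below is a minimal, hedged illustration (not the authors' implementation): an image is split into non-overlapping 16 x 16 patches, and contiguous rectangular blocks of patches are masked until a target ratio is reached, roughly in the spirit of BEiT's blockwise masking. The function names, block-size range, and mask ratio here are illustrative assumptions.

```python
import numpy as np

def patchify(image, patch_size=16):
    # Split an (H, W, C) image into non-overlapping patch_size x patch_size patches.
    h, w, c = image.shape
    assert h % patch_size == 0 and w % patch_size == 0
    patches = image.reshape(h // patch_size, patch_size, w // patch_size, patch_size, c)
    patches = patches.transpose(0, 2, 1, 3, 4).reshape(-1, patch_size, patch_size, c)
    return patches  # shape: (num_patches, patch_size, patch_size, c)

def blockwise_mask(grid_h, grid_w, mask_ratio=0.4, seed=0):
    # Mask rectangular blocks of patches until at least mask_ratio of the grid is masked.
    # Block side lengths of 2..4 patches are an illustrative choice, not the paper's exact rule.
    rng = np.random.default_rng(seed)
    mask = np.zeros((grid_h, grid_w), dtype=bool)
    target = int(grid_h * grid_w * mask_ratio)
    while mask.sum() < target:
        bh = int(rng.integers(2, 5))
        bw = int(rng.integers(2, 5))
        top = int(rng.integers(0, grid_h - bh + 1))
        left = int(rng.integers(0, grid_w - bw + 1))
        mask[top:top + bh, left:left + bw] = True
    return mask

# A 224 x 224 image with 16 x 16 patches yields a 14 x 14 grid of 196 patches.
patches = patchify(np.zeros((224, 224, 3)))
mask = blockwise_mask(14, 14, mask_ratio=0.4)
```

In the actual model, the masked patch positions are replaced by a learnable embedding, and the pre-training objective is to predict the DALL-E tokenizer's visual tokens at those positions rather than raw pixels.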
2023. 8. 24. 17:53