[CNN] ImageNet Classification with Deep Convolutional Neural Networks
Title: ImageNet Classification with Deep Convolutional Neural Networks
Authors: Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton
Link: https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf
This week I read ImageNet Classification with Deep Convolutional Neural Networks, the paper also known as AlexNet. As of November 2021, it has been cited more than 90,000 times, a truly remarkable paper. What I felt while reading it was that reading the paper..
Paper Review
2021. 11. 13. 23:56