Posts: ImageNet Classification with Deep Convolutional Neural Networks Review (1)

Title: ImageNet Classification with Deep Convolutional Neural Networks
Authors: Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton
Link: https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf

This week I read ImageNet Classification with Deep Convolutional Neural Networks, better known as the AlexNet paper. As of November 2021 it had accumulated more than 90,000 citations, which is a remarkable number. What I felt while reading it was that reading the paper..
Paper Review
2021. 11. 13. 23:56