Posts tagged "ImageNet Classification with Deep Convolutional Neural Networks Review" (1)
one by one ◼◻◼◻

Title: ImageNet Classification with Deep Convolutional Neural Networks Authors: Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton Link: https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf This week I read ImageNet Classification with Deep Convolutional Neural Networks, the paper also known as AlexNet. It is a remarkable paper, with more than 90,000 citations as of November 2021. What struck me while reading it was that the experience of reading the paper..
Paper Review
2021. 11. 13. 23:56