This repo contains all my work for this specialization. The code and images are taken from the Deep Learning Specialization on Coursera.
In five courses, you will learn the foundations of Deep Learning, understand how to build neural networks, and learn how to lead successful machine learning projects. You will learn about convolutional networks, RNNs, LSTMs, Adam, Dropout, BatchNorm, Xavier/He initialization, and more. You will work on case studies from healthcare, autonomous driving, sign language reading, music generation, and natural language processing. You will master not only the theory, but also see how it is applied in industry. You will practice all these ideas in Python and in TensorFlow, which we will teach.
This GitHub repository is designed to help viewers improve their programming skills and revisit basic Deep Learning knowledge.
Please follow and respect the Coursera Honor Code if you are enrolled in any Coursera Deep Learning course. It is fine to use this repo as a reference to debug your program, but it is wrong to copy-paste code from the repo just to get through the Lab Assignments. Knowledge and practical experience are more important than certificates.
Personally speaking, even with a PhD in computer science, I still find a few Lab Assignments quite difficult to get completely right on the first run, especially the labs on the Deep ConvNets and bidirectional LSTM models. One typo will cost you extra debugging time, but it is definitely worth it: you are improving your programming skills, learning new material, and getting to know yourself at the same time.
Comments and Recommendations
There are also some disadvantages to this DL course. For example, the TensorFlow package used in the Labs is version 1.x, while the latest TensorFlow release is already 2.4.x. This means that in real practice all the code you learned in these courses has to be updated, although the math stays the same. Keras today is part of TensorFlow 2.x rather than an independent framework.
Moreover, I highly recommend the Papers With Code website (https://paperswithcode.com/), a free and open resource with Machine Learning papers, code, and evaluation tables.
For example, its trends page shows paper implementations grouped by framework. It is clear that most authors prefer PyTorch, so you know what to do.
- Understand the major technology trends driving Deep Learning.
- Be able to build, train and apply fully connected deep neural networks.
- Know how to implement efficient (vectorized) neural networks.
- Understand the key parameters in a neural network’s architecture.
- Week 2 – Python Basics with Numpy
- Week 2 – Logistic Regression with a Neural Network mindset
- Week 3 – Planar data classification with a hidden layer
- Week 4 – Building your Deep Neural Network: Step by Step
- Week 4 – Deep Neural Network: Application
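The "vectorized" objective above is the core trick of Course 1: replace per-example Python loops with one matrix operation over the whole batch. Here is a minimal NumPy sketch of a fully connected layer's forward pass (my own illustration, not assignment code; all names are mine):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dense_forward(X, W, b):
    """Forward pass of one fully connected layer for an entire batch.

    X: (n_features, m) -- m examples stored as columns
    W: (n_units, n_features) weight matrix
    b: (n_units, 1) bias, broadcast across the batch
    """
    Z = W @ X + b   # one matrix multiply replaces a loop over the m examples
    return sigmoid(Z)

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 5))   # 5 examples, 3 features each
W = rng.standard_normal((2, 3))   # a layer with 2 units
b = np.zeros((2, 1))
A = dense_forward(X, W, b)
print(A.shape)                    # (2, 5): 2 activations per example
```

Stacking examples as columns is the convention the labs use; with it, the same two lines handle one example or a million.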
- Understand industry best-practices for building deep learning applications.
- Be able to effectively use the common neural network “tricks”, including initialization, L2 and dropout regularization, Batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop and Adam, and check for their convergence.
- Understand new best-practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.
- Week 1 – Initialization
- Week 1 – Regularization
- Week 1 – Gradient Checking
- Week 2 – Optimization
- Week 3 – TensorFlow
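The optimizers listed above combine two ideas: Momentum (a moving average of gradients) and RMSprop (scaling by a moving average of squared gradients). A hedged NumPy sketch of a single Adam step, with my own variable names rather than the course's:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: Momentum on the gradient plus RMSprop-style scaling."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (Momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (RMSprop)
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
grad = np.array([0.5, -0.5])                  # a pretend gradient of the loss
theta, m, v = adam_step(theta, grad, m, v, t=1)
print(theta)                                  # ≈ [0.999, -1.999]
```

Note how after bias correction the very first step has size roughly `lr` in each coordinate, regardless of the gradient's scale; that robustness is why Adam is the default in so many labs.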
- Understand how to diagnose errors in a machine learning system.
- Be able to prioritize the most promising directions for reducing error.
- Understand complex ML settings, such as mismatched training/test sets, and comparing to and/or surpassing human-level performance.
- Know how to apply end-to-end learning, transfer learning, and multi-task learning.
- There are no Programming Assignments for this course, but it comes with very interesting case-study quizzes.
- Understand how to build a convolutional neural network, including recent variations such as residual networks.
- Know how to apply convolutional networks to visual detection and recognition tasks.
- Know how to use neural style transfer to generate art.
- Be able to apply these algorithms to a variety of image, video, and other 2D or 3D data.
- Week 1 – Convolutional Model: step by step
- Week 1 – Convolutional Model: application
- Week 2 – Keras – Tutorial – Happy House
- Week 2 – Residual Networks
- Week 3 – Autonomous driving application – Car detection
- Week 4 – Face Recognition for the Happy House
- Week 4 – Art Generation with Neural Style Transfer
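The Week 1 "step by step" lab builds the convolution operation by hand before any framework is involved. A minimal NumPy sketch of that idea (my own illustration, not the assignment's code; note that deep learning "convolution" is technically cross-correlation, since the kernel is not flipped):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 'valid' 2D cross-correlation of a single-channel image."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # slide the kernel window and take the elementwise sum-product
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)   # a tiny image with a constant gradient
edge = np.array([[1.0, -1.0]])          # horizontal edge-detector kernel
out = conv2d_valid(image, edge)
print(out)                              # a (4, 3) array, every entry -1.0
```

Real layers add channels, strides, and padding on top of this loop; the labs then show how frameworks fuse it all into one fast primitive.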
- Understand how to build and train Recurrent Neural Networks (RNNs), and commonly-used variants such as GRUs and LSTMs.
- Be able to apply sequence models to natural language problems, including text synthesis.
- Be able to apply sequence models to audio applications, including speech recognition and music synthesis.
- Week 1 – Building a Recurrent Neural Network – Step by Step
- Week 1 – Dinosaur Island – Character-Level Language Modeling
- Week 1 – Improvise a Jazz Solo with an LSTM Network
- Week 2 – Operations on word vectors
- Week 2 – Emojify
- Week 3 – Neural machine translation with attention
- Week 3 – Trigger word detection
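The "step by step" RNN lab above reduces to one recurrence applied at every time step. A hedged NumPy sketch of a single vanilla-RNN cell step (my own naming, loosely following the course's `Wax`/`Waa`/`Wya` convention):

```python
import numpy as np

def rnn_cell_step(x_t, a_prev, Wax, Waa, Wya, ba, by):
    """One time step of a vanilla RNN cell.

    a_t = tanh(Waa @ a_prev + Wax @ x_t + ba)   -- new hidden state
    y_t = softmax(Wya @ a_t + by)               -- per-step prediction
    """
    a_t = np.tanh(Waa @ a_prev + Wax @ x_t + ba)
    z = Wya @ a_t + by
    y_t = np.exp(z - z.max(axis=0)) / np.exp(z - z.max(axis=0)).sum(axis=0)
    return a_t, y_t

rng = np.random.default_rng(1)
n_x, n_a, n_y, m = 3, 5, 2, 4          # input, hidden, output sizes; batch of 4
x_t = rng.standard_normal((n_x, m))
a_prev = np.zeros((n_a, m))            # hidden state starts at zero
a_t, y_t = rnn_cell_step(
    x_t, a_prev,
    Wax=rng.standard_normal((n_a, n_x)),
    Waa=rng.standard_normal((n_a, n_a)),
    Wya=rng.standard_normal((n_y, n_a)),
    ba=np.zeros((n_a, 1)), by=np.zeros((n_y, 1)),
)
print(a_t.shape, y_t.shape)            # (5, 4) (2, 4); y_t columns sum to 1
```

A full forward pass just calls this cell in a loop over time steps, carrying `a_t` forward; GRUs and LSTMs swap in gated versions of the same recurrence.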
This is a list of research papers referenced by the Deep Learning Specialization course.
- LeCun et al., 1998 – Gradient-based learning applied to document recognition
- Krizhevsky et al., 2012 – ImageNet classification with deep convolutional neural networks
- Simonyan & Zisserman, 2015 – Very deep convolutional networks for large-scale image recognition
- He et al., 2015 – Deep residual learning for image recognition
- Sermanet et al., 2014 – OverFeat: Integrated recognition, localization and detection using convolutional networks
- Redmon et al., 2016 – You Only Look Once: Unified, real-time object detection
- Redmon et al., 2016 – YOLO9000: Better, Faster, Stronger
- Girshick et al., 2013 – Rich feature hierarchies for accurate object detection and semantic segmentation
- Girshick, 2015 – Fast R-CNN
- Ren et al., 2016 – Faster R-CNN: Towards real-time object detection with region proposal networks
- Gatys et al., 2015 – A neural algorithm of artistic style
- Harish Narayanan – Convolutional neural networks for artistic style transfer
- Log0, TensorFlow Implementation of “A Neural Algorithm of Artistic Style”
- Karen Simonyan and Andrew Zisserman (2015). Very deep convolutional networks for large-scale image recognition
- On the Properties of Neural Machine Translation: Encoder-Decoder Approaches – Kyunghyun Cho, Bart van Merrienboer, Dzmitry Bahdanau, Yoshua Bengio
- Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling – Junyoung Chung, Caglar Gulcehre, KyungHyun Cho, Yoshua Bengio
- Linguistic Regularities in Continuous Space Word Representations – Tomas Mikolov, Wen-tau Yih, Geoffrey Zweig
- A Neural Probabilistic Language Model – Bengio et al
- Distributed Representations of Words and Phrases and their Compositionality – Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, Jeffrey Dean
- GloVe: Global Vectors for Word Representation – Jeffrey Pennington, Richard Socher, Christopher D. Manning
- Sequence to Sequence Learning with Neural Networks – Ilya Sutskever, Oriol Vinyals, Quoc V. Le
- Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation – Kyunghyun Cho, Bart van Merrienboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, Yoshua Bengio
- Deep Captioning with Multimodal Recurrent Neural Networks (m-RNN) – Junhua Mao, Wei Xu, Yi Yang, Jiang Wang, Zhiheng Huang, Alan Yuille
- Show and Tell: A Neural Image Caption Generator – Oriol Vinyals, Alexander Toshev, Samy Bengio, Dumitru Erhan
- Deep Visual-Semantic Alignments for Generating Image Descriptions – Andrej Karpathy, Li Fei-Fei
- BLEU: a Method for Automatic Evaluation of Machine Translation – Kishore Papineni, Salim Roukos, Todd Ward, and Wei-Jing Zhu
- Neural Machine Translation by Jointly Learning to Align and Translate – Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio
- Show, Attend and Tell: Neural Image Caption Generation with Visual Attention – Kelvin Xu, Jimmy Ba, Ryan Kiros, Kyunghyun Cho, Aaron Courville, Ruslan Salakhutdinov, Richard Zemel, Yoshua Bengio
- Connectionist Temporal Classification: Labelling Unsegmented Sequence Data with Recurrent Neural Networks – Alex Graves, Santiago Fernández, Faustino Gomez, Jürgen Schmidhuber
Master Deep Learning, and Break into AI
- Github: Reference List
- Organization: https://www.deeplearning.ai
- Course: https://www.coursera.org/specializations/deep-learning
- Instructor: Andrew Ng
- Chinese notes (recommended): GitBook notes on Andrew Ng's Deep Learning course series: https://kyonhuang.top/Andrew-Ng-Deep-Learning-notes/#/