Completed

Semi-private class and training course in deep learning (Deep Learning Fundamentals) with Professor مهدی شکری زاده

Register for the semi-private online deep learning class (Deep Learning Fundamentals), offered at a reasonable price and taught by Professor مهدی شکری زاده.

Instructor: Professor مهدی شکری زاده
Registration period: 1399/05/15 13:30 - 1399/06/28 03:30
Course dates: 1399/06/28 13:30 - 1399/10/26 03:30
Schedule: Fridays, 9 AM to 2 PM
Course fee: 2,500,000 Tomans
This course has already been held.
Course Description
Format: in person
Maximum capacity: 20 participants.
This course is suitable for:

1 - Enthusiastic people who have just begun their journey in the realm of AI

2 - Professionals who want deeper insight into deep learning

3 - Any student who wants to become an expert

Syllabus

Introduction to Deep Learning
- History
- How 'Deep' is Deep?

Tools of the Trade
- Python, Keras, MXNet
- What About TensorFlow?
- Do I Need OpenCV?

Image Fundamentals
- Pixels
- Forming Images
- Image Coordinate System
- Images as NumPy Arrays
- RGB or BGR?
- Scaling and Aspect Ratio
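
As a rough illustration of the "Images as NumPy Arrays", "RGB or BGR?" and "Scaling and Aspect Ratio" topics above, a minimal sketch (assuming OpenCV is installed; example.jpg is only a placeholder file name) could look like this:

```python
import cv2

# OpenCV loads images as NumPy arrays in BGR channel order.
image = cv2.imread("example.jpg")            # placeholder file name
print(image.shape)                           # (height, width, 3)

# Convert BGR -> RGB, e.g. before displaying with matplotlib (which expects RGB).
rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

# Resize to a fixed width while preserving the aspect ratio.
new_width = 256
scale = new_width / image.shape[1]
resized = cv2.resize(image, (new_width, int(image.shape[0] * scale)))
print(resized.shape)
```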

Image Classification Basics
- What is Classification?
- Semantics
- Supervised / Unsupervised / Semi-Supervised Learning
- Deep Learning Classification Pipeline
- Gather / Split / Train / Evaluate
- Feature-based Learning vs. Deep Learning

DataSets for Image Classification
- MNIST
- Animals
- CIFAR-10
- SMILES
- Kaggle
- Flowers-17
- CALTECH-101
- Tiny ImageNet 200
- Adience
- ImageNet
- CVPR
- Stanford Cars
- LISA

Configure Your Own Development Environment

First Image Classifier
- Introducing the Animal Dataset
- Build Your Own Deep Learning Toolkit
- Basic Image PreProcessor
- Building an Image Loader
- KNN
- KNN Hyperparameters
- KNN: Pros and Cons
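
A minimal sketch of the KNN topics above, using scikit-learn's bundled digits dataset as a stand-in (the course's Animals dataset is not included here); n_neighbors (k) is the main hyperparameter mentioned in the outline:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Flattened 8x8 digit images stand in for the course's Animals dataset.
X, y = load_digits(return_X_y=True)
trainX, testX, trainY, testY = train_test_split(X, y, test_size=0.25, random_state=42)

# k (n_neighbors) controls how many neighbors vote for each prediction.
model = KNeighborsClassifier(n_neighbors=3)
model.fit(trainX, trainY)
print("accuracy:", model.score(testX, testY))
```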

Parameterized Learning
- The Famous Four
- Linear Classification
- From Images to Labels
- Simple Linear Classifier with Python
- Loss Function
- Multi-Class SVM Loss
- Cross-Entropy Loss
- Softmax Classifiers
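
To illustrate the "Simple Linear Classifier with Python", softmax, and cross-entropy topics above, a small NumPy sketch with made-up random weights (purely illustrative, not course code):

```python
import numpy as np

rng = np.random.default_rng(0)

# The "famous four": input data x, weight matrix W, bias vector b, and a loss function.
W = rng.standard_normal((3, 4))   # 3 classes, 4-dimensional inputs
b = rng.standard_normal(3)
x = rng.standard_normal(4)
true_label = 1

scores = W @ x + b                # raw class scores

# Softmax turns scores into probabilities; cross-entropy penalizes a low probability
# assigned to the true class.
exp_scores = np.exp(scores - scores.max())   # subtract the max for numerical stability
probs = exp_scores / exp_scores.sum()
loss = -np.log(probs[true_label])
print("cross-entropy loss:", loss)
```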

Optimizing Methods and Regularization
- Gradient Descent
- Bias trick
- Implementing Basic Gradient Descent in Python
- Stochastic Gradient Descent (SGD)
- Mini-batch SGD
- Extensions of SGD: Momentum, Nesterov
- What is Regularization?
- Updating Loss and Weight with Regularization
- Types of Regularization
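
A toy sketch of what the "Implementing Basic Gradient Descent in Python" topic covers, here fitting a single weight with an added L2 regularization term; the data and hyperparameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: fit y = w * x by gradient descent on mean squared error plus an L2 penalty.
X = rng.standard_normal(100)
y = 3.0 * X + 0.1 * rng.standard_normal(100)

w = 0.0          # single weight
lr = 0.1         # learning rate
lam = 0.01       # L2 regularization strength

for epoch in range(100):
    preds = w * X
    grad = 2 * np.mean((preds - y) * X) + 2 * lam * w   # gradient of (MSE + lam * w^2)
    w -= lr * grad                                      # vanilla gradient descent update

print("learned w:", w)   # should end up close to 3
```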

Neural Network Fundamentals
- Introduction
- The Famous Perceptron
- Backpropagation and Multi-layer Networks
- Multi-layer Networks with Keras
- The Four Ingredients in a Neural Network
- Weight Initialization
- Constant Initialization
- Uniform and Normal Distributions
- LeCun Uniform and Normal
- Glorot / Xavier Uniform and Normal
- He et al. / Kaiming / MSRA Uniform and Normal
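
For the "Multi-layer Networks with Keras" and weight-initialization topics above, a minimal Keras sketch; the layer sizes and initializer choices are illustrative assumptions, not the course's exact network:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# A small fully-connected network; kernel_initializer selects the weight-initialization
# scheme (e.g. "glorot_uniform", "he_normal", "lecun_uniform").
model = Sequential([
    Dense(64, activation="relu", input_shape=(784,), kernel_initializer="he_normal"),
    Dense(32, activation="relu", kernel_initializer="glorot_uniform"),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="sgd", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```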

Convolutional Neural Networks
- Understanding Convolutions
- Convolution Versus Cross-correlation
- The Big and Small Matrices
- Kernels
- Implementing Convolution with Python
- Deep Learning and Convolutions
- CNN Building Blocks
- Layer Types
- Convolutional Layers
- Activation Layers
- Pooling Layers
- Fully-connected Layers
- Batch Normalization
- Dropout
- Common Architectures
- Layer Patterns
- Rules of Thumb
- Translation, Rotation and Scaling in CNNs
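
A naive sketch of the "Implementing Convolution with Python" and "Convolution Versus Cross-correlation" topics, using a made-up 5x5 image and a sharpening kernel:

```python
import numpy as np

def convolve2d(image, kernel):
    # Naive sliding-window "convolution" (strictly speaking, cross-correlation,
    # which is what deep-learning libraries actually compute).
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)
sharpen = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float)
print(convolve2d(image, sharpen))
```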

Training Your First CNN
- Keras Configuration
- Converting Images to Arrays
- Understanding keras.json File
- Image to Array PreProcessor
- Implementing ShallowNet
- ShallowNet on Animals Dataset
- ShallowNet on CIFAR-10
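
A sketch of a ShallowNet-style model in Keras, assuming the usual single CONV => RELU block followed by a softmax classifier; the filter count and input size are illustrative assumptions:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Activation, Flatten, Dense

def build_shallownet(width, height, depth, classes):
    # One CONV => RELU block, then a softmax classifier.
    return Sequential([
        Conv2D(32, (3, 3), padding="same", input_shape=(height, width, depth)),
        Activation("relu"),
        Flatten(),
        Dense(classes),
        Activation("softmax"),
    ])

model = build_shallownet(32, 32, 3, 10)   # CIFAR-10-sized inputs
model.compile(optimizer="sgd", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```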

Saving and Loading Models
- Serializing a Model to Disk
- Loading a Pre-trained Model
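
A minimal sketch of serializing a Keras model to disk and loading it back; the tiny model and the model.hdf5 file name are placeholders:

```python
from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import Dense

model = Sequential([Dense(10, activation="softmax", input_shape=(784,))])
model.compile(optimizer="sgd", loss="categorical_crossentropy")

# Serialize architecture + weights to disk, then load the pre-trained model back later.
model.save("model.hdf5")              # placeholder file name
restored = load_model("model.hdf5")
restored.summary()
```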

LeNet: Recognizing Handwritten Digits
- The LeNet Architecture
- Implementing LeNet
- LeNet on MNIST

MiniVGGNet: Going Deeper with CNNs
- The VGG Family
- The Mini VGGNet Architecture
- Implementing MiniVGGNet
- MiniVGGNet on CIFAR-10
- With and Without Batch Normalization

Learning Rate Schedulers
- Dropping the Learning Rate
- The Standard Decay Schedule in Keras
- Step-based Decay
- Implementing Custom Learning Rate in Keras
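
A sketch of a step-based decay schedule wired into Keras via the LearningRateScheduler callback; the initial rate, drop factor, and drop interval are illustrative values:

```python
from tensorflow.keras.callbacks import LearningRateScheduler

def step_decay(epoch):
    # Step-based decay: halve the learning rate every 5 epochs, starting from 0.01.
    init_lr, factor, drop_every = 0.01, 0.5, 5
    return init_lr * (factor ** (epoch // drop_every))

scheduler = LearningRateScheduler(step_decay)
# Then pass it to training, e.g.: model.fit(X, y, epochs=30, callbacks=[scheduler])
```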

Spotting Underfitting and Overfitting
- What are Underfitting and Overfitting?
- Effects of Learning Rates
- Pay Attention to Training Curve
- Validation Loss is Lower Than Training Loss?
- Monitoring the Training Process
- Creating a Training Monitor
- Babysit Your Trainer
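
A small sketch of a training monitor in the spirit of the topics above, plotting the training and validation loss curves from the History object returned by model.fit; the output file name is a placeholder:

```python
import matplotlib
matplotlib.use("Agg")            # write plots to disk instead of opening a window
import matplotlib.pyplot as plt

def plot_training(history, path="training_curve.png"):
    # Diverging train/validation loss curves signal overfitting, while two
    # high, flat curves signal underfitting.
    plt.figure()
    plt.plot(history.history["loss"], label="train loss")
    plt.plot(history.history["val_loss"], label="val loss")
    plt.xlabel("epoch")
    plt.ylabel("loss")
    plt.legend()
    plt.savefig(path)
```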

Checkpointing Models
- Checkpointing Neural Network Models
- Improvements
- Checkpointing Best Models
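
A sketch of checkpointing only the best model with Keras's ModelCheckpoint callback; best_model.hdf5 is a placeholder path:

```python
from tensorflow.keras.callbacks import ModelCheckpoint

# Keep only the single best model (lowest validation loss) seen during training.
checkpoint = ModelCheckpoint("best_model.hdf5", monitor="val_loss",
                             save_best_only=True, verbose=1)
# Then pass it to training, e.g.:
# model.fit(trainX, trainY, validation_data=(valX, valY), epochs=30, callbacks=[checkpoint])
```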

Visualizing Network Architecture
- Importance of Visualization
- Installing graphviz and pydot
- Visualizing Keras Networks
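
A minimal sketch of visualizing a Keras network with plot_model, which depends on the graphviz system package and the pydot Python package; the tiny model is only a placeholder:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.utils import plot_model

model = Sequential([Dense(10, activation="softmax", input_shape=(784,))])

# Writes a diagram of the network architecture (with tensor shapes) to disk.
plot_model(model, to_file="model.png", show_shapes=True)
```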

Out-of-the-box CNNs for Classification
- State-of-the-art CNNs in Keras
- VGG16 and VGG19
- ResNet
- Inception V3
- Xception
- Pre-trained ImageNet CNNs
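
A sketch of classifying a single image with a pre-trained ImageNet CNN (VGG16 here); example.jpg is a placeholder file name:

```python
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input, decode_predictions
from tensorflow.keras.preprocessing.image import load_img, img_to_array

# Load VGG16 with weights pre-trained on ImageNet and classify one image.
model = VGG16(weights="imagenet")
image = load_img("example.jpg", target_size=(224, 224))   # placeholder file name
x = preprocess_input(np.expand_dims(img_to_array(image), axis=0))
preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])
```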

Case Study: Breaking Captchas
Case Study: Smile Detection

Comments
No comments have been posted yet.