
File Name    Size (bytes)    Date
001. Chapter 5. Deep learning for computer vision.mp4    11577570    2023-09-06 18:50:10
002. Chapter 5. The convolution operation.mp4    19902279    2023-09-06 18:50:13
003. Chapter 5. The max-pooling operation.mp4    16458833    2023-09-06 18:50:15
004. Chapter 5. Training a convnet from scratch on a small dataset.mp4    30934968    2023-09-06 18:50:18
005. Chapter 5. Data preprocessing.mp4    32021911    2023-09-06 18:50:22
006. Chapter 5. Using a pretrained convnet.mp4    45225742    2023-09-06 18:50:27
007. Chapter 5. Fine-tuning.mp4    20006671    2023-09-06 18:50:30
008. Chapter 5. Visualizing what convnets learn.mp4    28361553    2023-09-06 18:50:33
009. Chapter 5. Visualizing convnet filters.mp4    36803886    2023-09-06 18:50:38
010. Chapter 6. Deep learning for text and sequences.mp4    33510560    2023-09-06 18:50:41
011. Chapter 6. Using word embeddings.mp4    32746342    2023-09-06 18:50:45
012. Chapter 6. Putting it all together from raw text to word embeddings.mp4    21330469    2023-09-06 18:50:48
013. Chapter 6. Understanding recurrent neural networks.mp4    25276283    2023-09-06 18:50:51
014. Chapter 6. Understanding the LSTM and GRU layers.mp4    30216713    2023-09-06 18:50:55
015. Chapter 6. Advanced use of recurrent neural networks.mp4    23247102    2023-09-06 18:50:58
016. Chapter 6. A common-sense, non-machine-learning baseline.mp4    20587310    2023-09-06 18:51:01
017. Chapter 6. Using recurrent dropout to fight overfitting.mp4    35269054    2023-09-06 18:51:05
018. Chapter 6. Going even further.mp4    12543225    2023-09-06 18:51:07
019. Chapter 6. Sequence processing with convnets.mp4    15835691    2023-09-06 18:51:09
020. Chapter 6. Combining CNNs and RNNs to process long sequences.mp4    17801586    2023-09-06 18:51:12
021. Chapter 7. Advanced deep-learning best practices.mp4    18932403    2023-09-06 18:51:14
022. Chapter 7. Multi-input models.mp4    11883719    2023-09-06 18:51:16
023. Chapter 7. Directed acyclic graphs of layers.mp4    31727093    2023-09-06 18:51:20
024. Chapter 7. Layer weight sharing.mp4    12273375    2023-09-06 18:51:22
025. Chapter 7. Inspecting and monitoring deep-learning models using Keras callbacks and TensorBoard.mp4    19718735    2023-09-06 18:51:26
026. Chapter 7. Introduction to TensorBoard the TensorFlow visualization framework.mp4    16607174    2023-09-06 18:51:28
027. Chapter 7. Getting the most out of your models.mp4    19571642    2023-09-06 18:51:31
028. Chapter 7. Hyperparameter optimization.mp4    20490391    2023-09-06 18:51:34
029. Chapter 7. Model ensembling.mp4    31316273    2023-09-06 18:51:38
030. Chapter 8. Generative deep learning.mp4    26476091    2023-09-06 18:51:41
031. Chapter 8. A brief history of generative recurrent networks.mp4    30136414    2023-09-06 18:51:46
032. Chapter 8. Implementing character-level LSTM text generation.mp4    20080418    2023-09-06 18:51:48
033. Chapter 8. DeepDream.mp4    30871163    2023-09-06 18:51:52
034. Chapter 8. Neural style transfer.mp4    19134414    2023-09-06 18:51:54
035. Chapter 8. Neural style transfer in Keras.mp4    21302708    2023-09-06 18:51:57
036. Chapter 8. Generating images with variational autoencoders.mp4    16102138    2023-09-06 18:52:00
038. Chapter 8. Introduction to generative adversarial networks.mp4    19351182    2023-09-06 18:52:03
039. Chapter 8. A bag of tricks.mp4    33085621    2023-09-06 18:52:07
040. Chapter 9. Conclusions.mp4    20350515    2023-09-06 18:52:09
041. Chapter 9. How to think about deep learning.mp4    37540657    2023-09-06 18:52:14
042. Chapter 9. Key network architectures.mp4    26568327    2023-09-06 18:52:17
043. Chapter 9. The space of possibilities.mp4    11753919    2023-09-06 18:52:19
044. Chapter 9. The limitations of deep learning.mp4    19646839    2023-09-06 18:52:21
045. Chapter 9. Local generalization vs. extreme generalization.mp4    14327299    2023-09-06 18:52:24
046. Chapter 9. The future of deep learning.mp4    39691770    2023-09-06 18:52:29
047. Chapter 9. Automated machine learning.mp4    34014474    2023-09-06 18:52:33
048. Chapter 9. Staying up to date in a fast-moving field.mp4    19119154    2023-09-06 18:52:35
