Setting Sail Again in 2020

2020 learning goals:

  • Finish reading Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow
  • Finish CMU CS 11-747, Neural Networks for NLP
  • Do 50 push-ups in one go

Papers & books to read closely

  • 201712 - Transformer - Attention Is All You Need
  • 201810 - BERT - Pre-training of Deep Bidirectional Transformers for Language Understanding
  • 201907 - RoBERTa - A Robustly Optimized BERT Pretraining Approach
  • 202001 - ALBERT - A Lite BERT for Self-supervised Learning of Language Representations

Courses to study in depth

One lecture per week

Master TensorFlow 1.12

  • Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow
    • 1. The Machine Learning Landscape
    • 2. End-to-End Machine Learning Project
    • 3. Classification
    • 4. Training Models
    • 12. Custom Models and Training with TensorFlow
    • 13. Loading and Preprocessing Data with TensorFlow
    • 11. Training Deep Neural Networks
    • 10. Introduction to Artificial Neural Networks with Keras
    • 16. Natural Language Processing with RNNs and Attention
    • 8. Dimensionality Reduction
    • 9. Unsupervised Learning Techniques
