2020: Setting Sail Again

Learning goals for 2020:

  • Finish reading Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow
  • Complete CMU CS 11-747, Neural Networks for NLP
  • Do 50 push-ups in one go

Papers & Books to Read Closely

  • 201712 - Transformer - Attention Is All You Need
  • 201810 - BERT - Pre-training of Deep Bidirectional Transformers for Language Understanding
  • 201907 - RoBERTa - A Robustly Optimized BERT Pretraining Approach
  • 202001 - ALBERT - A Lite BERT for Self-supervised Learning of Language Representations

Courses to Study Closely

One lecture per week

Master TensorFlow 1.12

  • Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow
    • 1. The Machine Learning Landscape
    • 2. End-to-End Machine Learning Project
    • 3. Classification
    • 4. Training Models
    • 12. Custom Models and Training with TensorFlow
    • 13. Loading and Preprocessing Data with TensorFlow
    • 11. Training Deep Neural Networks
    • 10. Introduction to Artificial Neural Networks with Keras
    • 16. Natural Language Processing with RNNs and Attention
    • 8. Dimensionality Reduction
    • 9. Unsupervised Learning Techniques
