The concepts of datasets, samples, and labels in Machine Learning.
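A minimal sketch of the terminology, using made-up toy data: a dataset is a collection of samples (feature vectors), each paired with a label.

```python
# Illustrative toy dataset: each entry is one sample (features) plus its label.
dataset = [
    ([5.1, 3.5], "setosa"),      # sample 1: feature vector + label
    ([6.2, 2.9], "versicolor"),  # sample 2
]

# Splitting the dataset into samples and labels.
samples = [features for features, label in dataset]
labels = [label for features, label in dataset]
```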

Sources:

  1. Mu Li et al. 1. Introduction. Dive into Deep Learning.

  2. Cross-validation: evaluating estimator performance

Read more »

Decorators are a significant part of Python. In simple terms, a decorator is a function that takes another function as input and returns a modified version of it.
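As a quick sketch of the idea, here is an illustrative `timed` decorator (the name is mine, not from the article) that wraps a function and reports its runtime:

```python
import functools
import time

def timed(func):
    """Decorator: return a wrapper that times each call to func."""
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.6f}s")
        return result
    return wrapper

@timed  # equivalent to: square = timed(square)
def square(x):
    return x * x
```

The `@timed` syntax is just sugar for reassigning `square` to the wrapper returned by `timed(square)`.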

Sources:

  1. Python Tips: Decorators
Read more »

Resources for essential topics of Machine Learning and Deep learning, including Natural language processing (NLP), Computer Vision (CV), Reinforcement Learning (RL), Self-Supervised Learning (SSL), etc.

After several years of learning, I no longer need these resources myself, but I keep them here to share with those who are new to the field.

Read more »

Sources:

  1. The spelled-out intro to neural networks and backpropagation: building micrograd
  2. Stanford CS231N, Lecture 4

Other useful resources:

  1. StatQuest's Neural Networks videos

  2. Efficient backprop. You need to download it via

     ```shell
     wget http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf
     ```

The Python script and Jupyter notebook used in this article can be found here.

This article is a step-by-step explanation of neural networks, which are used extensively in machine learning. It only covers the most basic case, where the input of a neuron is a vector (1-D tensor) and the output of a neuron is a scalar (0-D tensor). The same idea holds for higher-dimensional tensors; please refer to Derivatives, Backpropagation, and Vectorization for details.
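The basic case described above can be sketched as follows. This is an illustrative single sigmoid neuron (the choice of activation and all names are my assumptions, not taken from the article): the forward pass maps a vector input to a scalar output, and the backward pass computes gradients by the chain rule.

```python
import math

def neuron_forward(w, b, x):
    """Forward pass: y = sigmoid(w . x + b); vector in, scalar out."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # affine combination
    return 1.0 / (1.0 + math.exp(-z))             # sigmoid activation

def neuron_backward(w, b, x):
    """Backward pass: gradients of the output y w.r.t. w and b."""
    y = neuron_forward(w, b, x)
    dz = y * (1.0 - y)             # d(sigmoid)/dz = sigma(z) * (1 - sigma(z))
    dw = [dz * xi for xi in x]     # dy/dw_i = dz * x_i  (chain rule)
    db = dz                        # dy/db = dz
    return dw, db
```

A finite-difference check (perturb one weight by a small epsilon and compare the slope) is a handy way to verify such hand-derived gradients.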

# TODO

Read more »

Sources:

  1. EE 376A: Information Theory. Winter 2018. Lecture 6. - Stanford University
  2. EE 376A: Information Theory. Winter 2017. Lecture 4. - Stanford University
  3. Elements of Information Theory
Read more »