1.1. Precision & recall with an example
1.2. Evaluating a model using Accuracy
1.3. Precision & recall trade-off in binary classification with an example
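Section 1's metrics can be sketched in a few lines. This is a minimal illustration, not the course's own code; the confusion counts below are made-up example values.

```python
# Hypothetical confusion counts (illustrative values, not from the course).
def precision_recall(tp, fp, fn):
    """Compute precision and recall from confusion counts."""
    precision = tp / (tp + fp)  # of everything predicted positive, how much was right
    recall = tp / (tp + fn)     # of everything actually positive, how much was found
    return precision, recall

# 8 true positives, 2 false positives, 4 false negatives
p, r = precision_recall(tp=8, fp=2, fn=4)
print(p, r)  # 0.8 and ~0.667
```

Raising the decision threshold typically trades recall for precision, which is the tension lesson 1.3 explores.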
3.1. Gradient of a function with a single variable - the basis for training neural nets
3.2. Gradient of a function with multiple variables #math #genai #neuralnet #chatgpt
3.3. Mathematical model of a neuron. Basis for all LLMs like #chatgpt #claude #anthropic
3.4. Neuron vs neural network. An axon terminal contains various neurotransmitters.
3.5. A neuron with a data point that has two features - X1 & X2. Expected output is Y.
3.6. Neural network forward pass in action with two data points each with two features
3.7. Backpropagation example - #neuron #ai #genai #training #calculus
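Lessons 3.5 through 3.7 (forward pass and backpropagation for a single neuron with features X1, X2 and expected output Y) can be sketched as below. This is a hedged illustration under assumed values: the weights, inputs, learning rate, and squared-error loss are all made up for the example, not taken from the course.

```python
import math

def sigmoid(z):
    """Standard logistic activation."""
    return 1.0 / (1.0 + math.exp(-z))

# One data point with two features X1, X2 and expected output Y
# (illustrative values, not from the course).
x1, x2, y = 0.5, -1.0, 1.0
w1, w2, b = 0.1, 0.2, 0.0
lr = 0.1

# Forward pass: weighted sum of inputs, then activation.
z = w1 * x1 + w2 * x2 + b
y_hat = sigmoid(z)

# Backward pass: chain rule for squared-error loss L = (y_hat - y)^2.
dL_dyhat = 2 * (y_hat - y)
dyhat_dz = y_hat * (1 - y_hat)   # derivative of sigmoid at z
dz_dw1, dz_dw2 = x1, x2

# Gradient-descent update for each parameter.
w1 -= lr * dL_dyhat * dyhat_dz * dz_dw1
w2 -= lr * dL_dyhat * dyhat_dz * dz_dw2
b -= lr * dL_dyhat * dyhat_dz
```

After one update the prediction moves toward Y, which is exactly the single-variable gradient intuition of lesson 3.1 applied parameter by parameter.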
4.1. Transformer architecture from the 'Attention Is All You Need' paper.
4.2. How encoders function in transformer models. #chatgpt #mistral #transformer #llm
4.3. How decoders function in transformer models #chatgpt #mistral #transformer #llms
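One concrete difference between the encoder (4.2) and decoder (4.3) is the causal mask: a decoder token may only attend to itself and earlier tokens. The sketch below illustrates that idea with invented attention scores; it is not the paper's or the course's code.

```python
import math

def causal_softmax(scores):
    """Softmax over each row with positions after the diagonal masked out,
    so token i attends only to tokens 0..i."""
    n = len(scores)
    out = []
    for i, row in enumerate(scores):
        masked = [row[j] if j <= i else float("-inf") for j in range(n)]
        m = max(masked[: i + 1])
        exps = [math.exp(v - m) if v != float("-inf") else 0.0 for v in masked]
        s = sum(exps)
        out.append([e / s for e in exps])
    return out

# Made-up 3x3 score matrix for three tokens.
weights = causal_softmax([[0.1, 0.9, 0.3],
                          [0.4, 0.2, 0.8],
                          [0.5, 0.7, 0.6]])
# Row 0 attends only to itself; row 2 may attend to all three tokens.
```

An encoder would skip the masking step and let every token attend to every other token.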
5.1. Hyperparameters of Nano GPT (simplified GPT). #nanogpt #chatgpt #genai
5.2. Input to Nano GPT. How generated labeled data is organized and fed to Nano GPT.
5.3. What is self-attention in transformers. #nanogpt #chatgpt #genai
5.4. How information is aggregated at each token in the sentence 'Sunshine Brings Hope'
5.5. How dimensions of input tensors change as they pass through different layers of Nano GPT
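The self-attention aggregation of lessons 5.3 and 5.4 can be sketched for the three-token sentence 'Sunshine Brings Hope'. This is a minimal single-head illustration under assumptions: the 2-dimensional embeddings and random projection matrices are invented for the example and are far smaller than Nano GPT's actual dimensions.

```python
import numpy as np

np.random.seed(0)

# Invented 2-d embeddings for the three tokens (illustrative only).
X = np.array([[1.0, 0.0],   # Sunshine
              [0.0, 1.0],   # Brings
              [1.0, 1.0]])  # Hope

d = X.shape[1]
Wq, Wk, Wv = (np.random.randn(d, d) for _ in range(3))

# Project embeddings into queries, keys, and values.
Q, K, V = X @ Wq, X @ Wk, X @ Wv

# Scaled dot-product similarity of every token with every other token.
scores = Q @ K.T / np.sqrt(d)
weights = np.exp(scores)
weights /= weights.sum(axis=1, keepdims=True)  # softmax per token

# Each output row is a weighted mix of all value vectors: the
# "information aggregated at each token" from lesson 5.4.
out = weights @ V
```

Tracking the shapes here ((3, 2) in, (3, 2) out, with a (3, 3) attention matrix in between) is a toy version of the dimension walkthrough in lesson 5.5.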