Quiz: Attention Mechanisms: Teaching Neural Networks Where to Look

1. What is the fundamental problem with RNNs that attention mechanisms solve?
2. In the translation example, how many numbers does the RNN encoder compress 12 words into?
3. What is the key difference between embeddings and hidden state vectors?
4. What mathematical operation is used to measure similarity between Query and Keys in the attention mechanism?
5. What function converts attention scores into weights that sum to 1.0?
6. How much faster is attention compared to RNNs when training on 1000 words?
7. How many attention heads do transformers typically use per layer?
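The mechanics behind questions 4 and 5 can be sketched in a few lines. This is an illustrative single-head example, not code from the article; the function name and toy vectors are assumptions:

```python
import numpy as np

def attention(query, keys, values):
    """Single-head dot-product attention over a toy sequence."""
    # Question 4: similarity between Query and each Key is a dot product.
    scores = keys @ query
    # Question 5: softmax turns raw scores into positive weights summing to 1.0.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # The output is the weighted average of the value vectors.
    return weights @ values, weights

query = np.array([1.0, 0.0])
keys = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
values = np.array([[1.0], [2.0], [3.0]])
output, weights = attention(query, keys, values)
print(weights.sum())  # the softmax weights sum to 1.0
```

The key matching the query most closely receives the largest weight, so its value dominates the output.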