Quiz: Training Neural Networks: The Complete Learning Loop
Question 1 of 8
What is the first step in the complete training loop?
Initialize weights with random values
Forward propagation to make prediction
Calculate loss to measure error
Backward propagation to find responsible weights
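For reference while reviewing Question 1, here is a minimal sketch of one iteration of the loop, assuming a PyTorch-style setup; the model, loss function, and optimizer settings are hypothetical placeholders, not code from the newsletter.

```python
import torch
import torch.nn as nn

# Hypothetical model, loss, and optimizer; any small classifier would do.
model = nn.Linear(784, 10)                # weights start out as random values
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def train_step(images, labels):
    """One iteration: forward pass, loss, backward pass, weight update."""
    optimizer.zero_grad()                 # clear gradients from the previous step
    predictions = model(images)           # forward propagation: make a prediction
    loss = loss_fn(predictions, labels)   # calculate loss: measure the error
    loss.backward()                       # backward propagation: compute gradients
    optimizer.step()                      # update the weights
    return loss.item()
```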
Question 2 of 8
What does one epoch represent?
One complete pass through the entire training dataset
Processing one batch of images
One weight update
One forward propagation step
Question 3 of 8
In the example with 60,000 images and a batch size of 32, how many batches are there per epoch?
1,875
32
60,000
18,750
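To check the arithmetic behind Questions 2 and 3, the calculation fits in a couple of lines of Python; the numbers are the ones given in the question.

```python
dataset_size = 60_000   # training images, as stated in the question
batch_size = 32         # batch size, as stated in the question

# One epoch is one full pass over all 60,000 images, taken 32 at a time.
batches_per_epoch = dataset_size // batch_size   # 60,000 / 32 = 1,875 (divides evenly)
print(batches_per_epoch)                         # 1875
```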
Question 4 of 8
According to the newsletter, what is the sweet-spot range for batch sizes? (Read through https://infolia.ai/archive/40 to answer.)
32-128
1-10
256-512
1000-10000
Question 5 of 8
Which technique stops training once the validation loss stops improving?
Early stopping
Gradient clipping
Batch normalization
Dropout
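For Question 5, here is a minimal sketch of the early-stopping pattern in plain Python; `train_one_epoch`, `evaluate`, and the patience value are hypothetical stand-ins, not code from the newsletter.

```python
def train_with_early_stopping(train_one_epoch, evaluate, patience=3, max_epochs=100):
    """Stop once validation loss has not improved for `patience` epochs.

    `train_one_epoch` runs one pass of training; `evaluate` returns the
    current validation loss. Both are placeholder callbacks.
    """
    best_val_loss = float("inf")
    epochs_without_improvement = 0
    for epoch in range(max_epochs):
        train_one_epoch()
        val_loss = evaluate()
        if val_loss < best_val_loss:
            best_val_loss = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                print(f"Early stopping at epoch {epoch}")
                break
```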
Question 6 of 8
Which training/validation accuracy pattern is the classic sign of overfitting?
Training: 99%, Validation: 75%
Training: 60%, Validation: 58%
Training: 96%, Validation: 94%
Training: 50%, Validation: 50%
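For Question 6, a tiny helper that flags a large gap between training and validation accuracy; the 10-point threshold is an arbitrary illustration, not a figure from the newsletter.

```python
def looks_overfit(train_acc: float, val_acc: float, gap_threshold: float = 0.10) -> bool:
    """Flag a large train/validation accuracy gap. Threshold is illustrative only."""
    return (train_acc - val_acc) > gap_threshold

print(looks_overfit(0.99, 0.75))  # True  -> big gap between training and validation
print(looks_overfit(0.96, 0.94))  # False -> small gap, generalizing similarly
```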
Question 7 of 8
What does it typically indicate when the loss shoots up to NaN during training?
Learning rate is too high and the algorithm overshoots, hitting numerical instability
Learning rate is too low and training is too slow
The model is underfitting
The batch size is too small
Question 8 of 8
What is the recommended starting value for the learning rate hyperparameter?
0.001
0.1
0.0001
1.0
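For Questions 7 and 8, a small guard you might drop into a training loop; the error message and the 0.001 starting point (a common default, and one of the options above) are illustrative assumptions rather than the newsletter's own code.

```python
import math

def check_loss(loss_value: float, step: int) -> None:
    """Abort when the loss has blown up to NaN or infinity, the usual sign that
    the learning rate is too high and updates have overshot into instability."""
    if math.isnan(loss_value) or math.isinf(loss_value):
        raise RuntimeError(
            f"Loss became {loss_value} at step {step}: "
            "try a smaller learning rate (a common starting point is 0.001)."
        )
```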