10 questions
Our model predicts which people have COVID-19. It correctly identifies 100 people who have COVID-19, but it misses 50 people who have the illness and falsely flags 10 people as having the illness when they don't. What is the precision of the model?
91%
66%
Not enough information
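Worked check: precision depends only on true positives and false positives, so the 50 missed cases (false negatives) never enter the formula. A minimal Python sketch using the counts from the question above:

```python
# Precision = TP / (TP + FP); false negatives do not affect precision.
tp, fp, fn = 100, 10, 50  # counts from the question

precision = tp / (tp + fp)
print(f"precision = {precision:.3f}")  # ~0.909, i.e. about 91%
```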
A diagnostic test is used to identify individuals with a certain disease. The test correctly identifies 120 individuals with the disease, fails to identify 30 individuals who actually have it, and incorrectly flags 15 healthy individuals as having the disease. What is the recall of the test?
80%
75%
Not enough information
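Worked check: recall depends only on true positives and false negatives, so the 15 false positives are irrelevant here. A quick sketch with the counts from the question above:

```python
# Recall = TP / (TP + FN); false positives do not affect recall.
tp, fn, fp = 120, 30, 15  # counts from the question

recall = tp / (tp + fn)
print(f"recall = {recall:.3f}")  # 0.800, i.e. 80%
```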
If we predict True for every observation, what will our model's recall be?
Not enough information
0%
100%
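One way to convince yourself: if every prediction is True, there are no false negatives, so recall = TP / (TP + 0). A small sketch with scikit-learn on made-up labels (the labels below are illustrative, not from the question):

```python
from sklearn.metrics import recall_score

y_true = [1, 0, 1, 1, 0, 1]          # illustrative ground-truth labels
y_pred = [1] * len(y_true)           # predict True for every observation

# With no false negatives, recall = TP / (TP + FN) = TP / TP = 1.0
print(recall_score(y_true, y_pred))  # 1.0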
Which of the following do we use a test set for?
We use it to evaluate new features that we could add to our model
We use it for hyper-parameter tuning
We use it for held-out performance evaluation
None of the above
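As a reminder of the usual split: feature choices and hyper-parameter tuning are made on training/validation data, while the test set is held out for a final performance estimate. A minimal sketch with scikit-learn (the dataset and model choices are illustrative):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Hold out a test set once; it is only touched for the final evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # held-out accuracy
```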
When is cross-validation particularly helpful?
When we have a small dataset
When we have a large dataset
When we are particularly constrained on computational resources
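Cross-validation lets every observation serve in both training and validation across folds, which is what makes it attractive when data is scarce, at the cost of extra compute. A minimal sketch with scikit-learn (dataset and model are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)  # a deliberately small dataset

# 5-fold CV: each fold is used as validation data exactly once,
# so all 150 samples contribute to both fitting and evaluation.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(scores.mean(), scores.std())
```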
Which of the following does an F1 score depend on?
TP, FP, FN
TP, TN, FP, FN
TN, FP, TP
TP, FN
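F1 is the harmonic mean of precision and recall, so it is a function of TP, FP, and FN only; true negatives never appear. A quick check (the counts below are illustrative):

```python
# F1 = 2 * precision * recall / (precision + recall)
#    = 2*TP / (2*TP + FP + FN)   -> TN never appears
tp, fp, fn = 100, 10, 50  # illustrative counts

precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(f1, 2 * tp / (2 * tp + fp + fn))  # both give the same value
```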
Which of the following could you use to evaluate performance on a regression task?
Precision
Recall
Accuracy
MSE
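Precision, recall, and accuracy all assume discrete class labels; a regression target needs an error measure over continuous predictions, such as mean squared error. A minimal sketch (the numbers are illustrative):

```python
from sklearn.metrics import mean_squared_error

y_true = [3.0, -0.5, 2.0, 7.0]   # illustrative continuous targets
y_pred = [2.5,  0.0, 2.0, 8.0]   # illustrative predictions

# MSE = mean of squared residuals; defined for continuous outputs,
# unlike precision/recall/accuracy, which need class labels.
print(mean_squared_error(y_true, y_pred))  # 0.375
```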
What is the accuracy of a model that correctly predicts 150 cases out of 200 total cases, with 30 false positives?
75%
85%
Not enough information
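Accuracy is simply correct predictions over total predictions; the false-positive count in the question is extra detail that accuracy does not need once you know 150 of 200 cases were predicted correctly. A quick check:

```python
correct, total = 150, 200  # counts from the question

accuracy = correct / total
print(f"accuracy = {accuracy:.0%}")  # 75%
```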
Which metric is most appropriate for evaluating a model when the cost of false negatives is very high?
Precision
Recall
F1 Score
Accuracy
In a classification task, if a model has a high precision but low recall, what does this indicate?
The model is good at identifying all positive cases
The model is good at identifying positive cases but misses many of them
The model is not good at identifying positive cases
The model has a balanced performance
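To make the trade-off concrete: high precision with low recall means that when the model says "positive" it is almost always right, but it only catches a small share of the actual positives. A sketch with illustrative counts:

```python
# Illustrative counts for a high-precision, low-recall model.
tp, fp, fn = 20, 1, 80

precision = tp / (tp + fp)   # ~0.95 -> high: positive calls are trustworthy
recall = tp / (tp + fn)      #  0.20 -> low: most positives are missed
print(precision, recall)
```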