What is Precision?

Precision measures the proportion of observations the algorithm predicts as positive that are actually positive. In general, precision is calculated as:

Precision = True Positives / (True Positives + False Positives)

Using an example: suppose a classifier's confusion matrix shows 150 true positives and 60 false positives. Then:

Precision = 150 / (150 + 60) ≈ 0.714
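As a minimal sketch of how this plays out in code, the snippet below builds hypothetical label arrays that reproduce the counts from the worked example (150 true positives, 60 false positives; the negative and false-negative counts are arbitrary padding), computes precision directly from the definition, and cross-checks the result with scikit-learn's precision_score:

```python
import numpy as np
from sklearn.metrics import precision_score

# Hypothetical labels mirroring the worked example:
# 150 true positives, 60 false positives,
# plus 200 true negatives and 40 false negatives as filler.
y_true = np.array([1] * 150 + [0] * 60 + [0] * 200 + [1] * 40)
y_pred = np.array([1] * 150 + [1] * 60 + [0] * 200 + [0] * 40)

# Manual computation from the definition.
tp = np.sum((y_pred == 1) & (y_true == 1))   # 150
fp = np.sum((y_pred == 1) & (y_true == 0))   # 60
print(tp / (tp + fp))                        # 0.7142...

# The same value via scikit-learn.
print(precision_score(y_true, y_pred))       # 0.7142...
```

Note that precision only looks at the predicted-positive column of the confusion matrix; the 200 true negatives and 40 false negatives above have no effect on the result.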
