The second chart provides a day-by-day breakdown of this data.
For example, on March 10, 2007, the Spamhaus blacklist correctly tagged 71% of spam received and incorrectly tagged no non-spam mail. The Fiveten blacklist correctly tagged 66% of the spam, but incorrectly reported one third of the non-spam mail as spam. My conclusion from this data is that Spamhaus blocks more spam than Fiveten, and does it more accurately. If I had used the Fiveten list, I would have blocked a significant amount of desired mail. With Spamhaus, no desired mail would have been blocked on that day.
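The daily percentages above boil down to two simple ratios per blacklist: the fraction of spam caught and the fraction of desired mail wrongly flagged. Here is a minimal sketch of that arithmetic, using made-up counts for illustration (these are not my actual daily tallies):

```python
# Hypothetical daily tallies for one blacklist -- illustrative numbers only,
# not the actual measured data from the charts.
spam_received = 100   # spam messages received that day
spam_tagged = 71      # spam messages the blacklist flagged
ham_received = 50     # desired (non-spam) messages received
ham_tagged = 0        # desired messages incorrectly flagged

# Fraction of spam the list caught (higher is better).
catch_rate = spam_tagged / spam_received

# Fraction of desired mail the list would have blocked (lower is better).
false_positive_rate = ham_tagged / ham_received

print(f"Catch rate: {catch_rate:.0%}")
print(f"False positive rate: {false_positive_rate:.0%}")
```

A list with a high catch rate but a nonzero false positive rate (like Fiveten on that day) is the riskier choice, since any false positive means losing mail you wanted.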
Note that not everyone is going to agree with my classification of false positives, and that's fine. In my determination, a false positive is a piece of mail that I signed up for that would've been blocked by a given blacklist. I think that's accurate. You will find, though, that some blacklists list things that I would not consider spam. For example, some lists will block mail from any sender who is not 100% confirmed opt-in (aka double opt-in). Since very few senders are fully confirmed opt-in, lists such as these inherently block mail from many senders. A list operated in this fashion would have a vastly different interpretation of what constitutes a false positive than I would. It would be within their charter to list and facilitate the blocking of mail from such senders, even if those senders haven't sent spam. That wouldn't be considered a false positive by such a list, but it would potentially be considered a false positive by me.
Click here for information on how I determine what is and is not spam.