Even hired its data team a few months after launch. Our mandate was simple: figure out if we were doing the right thing. Everyone at the company wanted to know if the product was optimizing for impact, and to optimize we needed numbers: risks to minimize and objectives to maximize.
“This is the first time we’ve ever done something like this.” I remember Jon being openly uncertain at my final onsite. The company still didn’t know about this whole ‘data science’ thing. It was a risk, both to Even and to me. …
Dr. Shiva Ayyadurai is doubling down on a video I poked holes in a few days ago. In it, he claimed that Joe Biden stole more than 60,000 votes in Michigan. His analysis relied on poor mathematics, and folks like Matt Parker of StandupMaths reached the same conclusion.
Stanford Ph.D. and director of MIT’s Election Lab, Charles Stewart III, happens to agree with us.
In this new video, Ayyadurai dismisses math-based criticisms by saying that detecting election fraud is “not a math problem, but a pattern-recognition problem.”
A few days ago, Dr. Shiva Ayyadurai posted a video that claimed to prove election fraud in Michigan. He is wrong, and I’ll show you how using data from Oakland County, Michigan. My code and data sources are public and replicable — and everything I write is open for comment and discussion.
Previously, I posted a detailed takedown of how his analysis was a mathematical parlor trick — which he uses to generate a “suspicious” result that’s supposed to prove that Biden stole 60,000+ votes from Trump.
NOTE: On Nov. 16th, Ayyadurai doubled down on his misleading analyses.
On November 10th, Dr. Shiva Ayyadurai posted a video claiming that some simple analytics revealed election fraud in Michigan. The video, which has received more than 200,000 views, claims that Joe Biden stole more than 60,000 votes in Michigan.
The main thrust of his analysis is a mathematical parlor trick. In a separate post, I play that parlor trick myself with Oakland County data to “prove” the opposite conclusion — showing that his analysis is bogus at its core.
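To make the parlor trick concrete, here is a minimal simulation (my own construction, not code from either post). The plot in question puts each precinct’s straight-ticket Republican share on the x-axis and, on the y-axis, Trump’s individual-ballot share minus that same x. Even when the individual-ballot share is completely unrelated to x, subtracting x forces a downward slope near −1, so the “suspicious” trend appears with no fraud at all:

```python
import random

# Toy precincts: x = straight-ticket GOP share, t = Trump's share of
# individual ballots. t is drawn independently of x, i.e. no fraud and
# no relationship between the two quantities.
random.seed(0)
precincts = [(random.uniform(0.1, 0.9),   # x
              random.uniform(0.3, 0.7))   # t
             for _ in range(1000)]

xs = [x for x, t in precincts]
ys = [t - x for x, t in precincts]        # the quantity the video plots

# Ordinary least-squares slope of y on x.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

print(round(slope, 2))  # close to -1, purely as an artifact of plotting t - x vs. x
```

Because y = t − x and t is independent of x, the covariance of y with x is just −Var(x), which pins the slope near −1. The downward line is baked into the choice of axes, not into the votes.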
We have officially entered the post-truth era.
For a few months I was deep in a skeptical hole where I had truly lost grip on what I considered “real”, and I had to claw my way out by getting real silly and coming up with a formal definition that we might all agree with. Truth, I propose, is given by this expression:
We’ve been sold a false promise.
Somewhere down the line we tricked ourselves into thinking that truth was a side-effect of volume. “If we collect enough data,” we said, “our overwhelming statistical power will blow a hole in the unknown.”
Instead, we shot ourselves in the foot.
In his article Statistical Paradises & Paradoxes In Big Data, the Harvard statistician (and certifiable genius, as far as I’m concerned) Xiao-li Meng sets down a rigorous proof of just how badly we screw ourselves when we collect data without regard for exactly how it’s collected.
He draws upon mathematics that are elegant…
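One way to feel Meng’s point in your bones is a quick simulation (a sketch of my own, with made-up numbers, not an example from his paper): a huge sample collected with even a slight self-selection bias can lose to a tiny, genuinely random sample.

```python
import random

random.seed(1)

# A population of 200,000 values whose mean we want to estimate.
N = 200_000
population = [random.gauss(0, 1) for _ in range(N)]
true_mean = sum(population) / N

# "Big data": tens of thousands of records, but units with higher values
# are slightly more likely to be recorded (inclusion depends on the value).
biased = [x for x in population if random.random() < 0.4 + 0.05 * x]
big_est = sum(biased) / len(biased)

# Small but honest: 500 records drawn uniformly at random.
srs = random.sample(population, 500)
small_est = sum(srs) / len(srs)

print(len(biased), abs(big_est - true_mean))   # large n, stubborn bias
print(len(srs), abs(small_est - true_mean))    # tiny n, smaller error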
We build software to solve human problems. But human problems can be messy, and sometimes it’s not terribly clear whether or not we’ve actually solved them.
Snapchat might declare success if they see 50% of regular users check out their new dog filter, and Facebook could say they’ve shattered their growth milestones by pointing to more than 2.3 billion monthly active users.
Or, in the case of my company—Even…