Dr. Shiva Ayyadurai is doubling down on a video I poked holes in a few days ago. In it, he claimed that Joe Biden stole more than 60,000 votes in Michigan. His analysis rested on poor mathematics, and folks like Matt Parker of StandupMaths drew similar conclusions.
Stanford Ph.D. and director of MIT’s Election Lab, Charles Stewart III, happens to agree with us.
In this new video, Ayyadurai dismisses math-based criticisms by saying that detecting election fraud is “not a math problem, but a pattern-recognition problem.”
Luckily, pattern recognition is my main discipline and the basis for my professional career — starting at the University of Pennsylvania, where I learned how to detect seizures from ECoG data in BE521, developing during SBIR and STTR grant research for the Navy while at Commonwealth Computer Research, Inc., and finally coming into full form at a fin-tech start-up focused on enhancing employee financial health. …
A few days ago, Dr. Shiva Ayyadurai posted a video that claimed to prove election fraud in Michigan. He is wrong, and I’ll show you how using data from Oakland County, Michigan. My code and data sources are public and replicable — and everything I write is open for comment and discussion.
Previously, I posted a detailed takedown of how his analysis was a mathematical parlor trick — which he uses to generate a “suspicious” result that’s supposed to prove that Biden stole 60,000+ votes from Trump.
NOTE: On Nov. 16th, Ayyadurai doubled down on his misleading analyses.
On November 10th, Dr. Shiva Ayyadurai posted a video claiming that some simple analytics revealed election fraud in Michigan. It has received more than 200,000 views and claims that Joe Biden stole more than 60,000 votes in Michigan.
The main thrust of his analysis is a mathematical parlor trick. In a separate post, I play that parlor trick myself with Oakland County data to “prove” the opposite conclusion — showing that his analysis is bogus at its core.
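To see why the trick proves nothing, here is a minimal simulation of the mechanism as Matt Parker and others have described it: Ayyadurai plots the *difference* between a candidate's individual-ballot share and the straight-ticket share against the straight-ticket share itself. The precinct counts, slope, and noise level below are illustrative assumptions, not Ayyadurai's actual data — the point is that a downward-sloping line appears even when the data contain zero fraud.

```python
import random

random.seed(42)

# Simulate 500 honest precincts. x = Republican straight-ticket share.
# y = the candidate's share among individual (non-straight) ballots.
# In real precinct data y tracks x only loosely, so we model
# y = a + b*x + noise with slope b < 1 (a, b, noise are assumptions).
n = 500
x = [random.uniform(0.2, 0.8) for _ in range(n)]
y = [0.25 + 0.6 * xi + random.gauss(0, 0.04) for xi in x]

# The "suspicious" chart plots (y - x) against x. Algebraically,
# y - x = 0.25 - 0.4*x + noise here, so a downward slope is
# guaranteed by construction, fraud or no fraud.
d = [yi - xi for xi, yi in zip(x, y)]

def pearson(u, v):
    """Pearson correlation, computed from scratch to stay dependency-free."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

print(round(pearson(x, d), 3))  # strongly negative despite no fraud
```

Because `y - x` mechanically contains `-x`, any y that is less than perfectly correlated with x will produce this negative trend — which is why the same plot can be made to "prove" fraud in either direction.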
We have officially entered the post-truth era.
For a few months I was deep in a skeptical hole where I had truly lost grip on what I considered “real”, and I had to claw my way out by getting real silly and coming up with a formal definition that we might all agree with. Truth, I propose, is given by this expression:
We’ve been sold a false promise.
Somewhere down the line we tricked ourselves into thinking that truth was a side-effect of volume. “If we collect enough data,” we said, “our overwhelming statistical power will blow a hole in the unknown.”
Instead, we shot ourselves in the foot.
In his article Statistical Paradises and Paradoxes in Big Data, the Harvard statistician (and certifiable genius, as far as I’m concerned) Xiao-Li Meng sets down a rigorous proof of just how badly we screw ourselves when we collect data without regard for exactly how it’s collected.
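The heart of Meng's argument, as I read it, is an identity that factors the error of a sample mean into three pieces (this is a sketch of the idea rather than his exact notation):

```latex
\bar{Y}_n - \bar{Y}_N \;=\;
\underbrace{\rho_{R,Y}}_{\text{data quality}}
\times
\underbrace{\sqrt{\frac{N-n}{n}}}_{\text{data quantity}}
\times
\underbrace{\sigma_Y}_{\text{problem difficulty}}
```

Here $\bar{Y}_n$ is the sample mean, $\bar{Y}_N$ the population mean, and $\rho_{R,Y}$ the correlation between who ends up in the data and the quantity being measured. The punchline: if recording is even slightly correlated with the outcome, collecting *more* data barely helps — the quality term, not the quantity term, dominates the error.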
Sylvia Plath taught us that if you wait too long to pick the perfect fig, you’ll watch the figs rain down and rot at your feet. The same is true of waiting too long to release something you’ve been working on.
If you constantly retool and re-edit, you’ll lose opportunities to reach your audience. But, if you don’t iterate enough, you risk putting out something sub-par that they wouldn’t enjoy.
The trade-off between perfecting and releasing is a hard one to navigate for any creator. …
We build software to solve human problems. But human problems can be messy, and sometimes it’s not terribly clear whether or not we’ve actually solved them.
Snapchat might say they’re successful if they see 50% of regular users check out their new dog filter, and Facebook could say they’ve shattered their growth milestones by pointing to more than 2.3 billion monthly active users.
Or, in the case of my company—Even — how can we tell if we’re making any impact on the financial health of our members? …