A year in Normland
Art: Woman Writing a Letter, Frans van Mieris, 1680
This time last year, seven months pregnant, but still somehow exhibiting fewer outward signs of physical and mental discomfort than today, I hit publish on the first Normcore post. I had very low expectations and I didn’t have a clear vision for the newsletter, but I knew a couple things.
First, that I had been writing sporadically on both technical and personal topics, because I have a physical need to write for an audience that, if unmet, turns into repressed anxiety and terrible tweets. Second, I was consummately tired of the tech press covering tech issues at a very surface level. I was tired that there were not enough people in the tech press who knew how to run bash, and not enough humanitarians in Hadoop clusters. My work so far has been acting as this bridge, regardless of where I work and what I do. So, to write about this intersection was a natural extension of the way I think.
Initially I said that the newsletter was about making complicated tech less sexy and covered topics like Kafka and Python and neural networks, but if anything I’ve failed, because tech is just as attractive to me personally as ever.
But I’m not interested in talking about, for example, Zoom, or how contact tracing is a privacy violation, or Softbank, or Keybase getting acquired, because everyone else is talking about them, and I’ve already talked about them. I’m always interested in the second layer of the story, the underlying theme, the system, the people behind the system, how the data flows, where the money comes from. The way I come at this stuff is kind of sideways because I need time to read all of it and have it permeate the second level of my stupid, sleep-deprived, quarantine brain.
Right now, there is absolutely some startup profiting in some way from coronavirus that we don’t yet know about, and no one is investigating it (except for newsletters). There is someone making facemasks that they sell at markup on a distributed platform that uses AI to recognize whether you’re wearing a mask or not as you browse Facebook and surface an ad to you. There is absolutely going to be another ClearviewAI. There is some other company that’s going to come out of the “It’s Time to Build” boom that does something shady. There are new Kafkas, new WeWorks, new Vision Funds rising. Somewhere out there is a three-person team working on detecting COVID with drones and the whole thing runs on Kubernetes. Kafka itself is rebranding, which means they’re in trouble, which means it’s time for me to find the next Kafka to write about.
Ultimately, though, at the heart of all of this are people and systems. As long as people are running the technology and not robots, there are going to be misunderstandings, conflicts of interest, different points of view, and layers and layers of things going on. And as long as the media continues to misreport and misunderstand the underlying people and systems, Normcore will never run out of things to write about (as long as the kids continue to go to sleep at 9 and give me an hour and a half to stare at a blank Google doc and think about the larger trends of the past couple months).
For example, what I’m interested in right now is: what is the deal with the woman fired from the Florida Department of Health for making dashboards?
Initially, the media reported that she was a data scientist fired for refusing to manipulate data, based on her resignation letter, which she posted to a listserv of people consuming COVID data in Florida.
Late last Friday, the architect and manager of Florida's COVID-19 dashboard — praised by White House officials for its accessibility — announced that she had been removed from her post, causing outcry from independent researchers now worried about government censorship.
The dashboard has been a one-stop shop for researchers, the media and the public to access and download tables of COVID-19 cases, testing and death data to analyze freely. It had been widely hailed as a shining example of transparency and accessibility.
But over the last few weeks it had "crashed" and gone offline; data has gone missing without explanation and access to the underlying data sheets has become increasingly difficult.
This was on the heels of the White House praising her dashboard. Then, there was the counter-story that she wasn’t actually a data scientist, and she was fired for insubordination.
Gov. DeSantis, asked again about the firing of Rebekah Jones, says she was putting data on the portal that scientists didn't believe was valid.
Immediately, the internet responded claiming that she was either a hero or villain. Political pundits denounced her. Data science managers lined up to hire her.
But, as always, there’s more to the story. And the story to me here is that, anytime you are asked to put data in front of a lot of people, things will go very, very bad.
How do I know? I was a data analyst for a lot of years, in charge of sending data to executives for review, including at one point reporting directly to a CEO. There were a lot of interesting parts of the job, but probably the most PTSD I have is from being asked to put together charts for executive meetings. Because everyone coming into an executive meeting has an opinion and a political agenda about the data. Not a Republican or Democrat kind of political opinion, but an opinion about whether the data should be there, whether it’s being collected correctly, and whether it’s saying what we want it to say.
All numbers are made up, right?
Well, kind of. What I mean is that all the data we trust and believe on a daily basis is only accurate in a specific context, at a specific time, and at a specific level. If you dig deep enough, ultimately all of the data in the world that drives major and minor decisions alike is built on wobbly foundations.
Take, for example, the coronavirus mortality rate. We have no idea what the true number is. I mean, we have some ideas of true numbers. But we’re not taking into account: undercounting minor cases that never get tested and never go to the hospital. Undercounting deaths that haven’t happened yet. Undercounting for political reasons. Undercounting simply because hospitals are overwhelmed with the number of cases. Overcounting or undercounting recoveries. And much, much more.
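To see how wobbly that foundation is, here’s a back-of-the-envelope sketch. All the numbers here are made up for illustration (they are not real COVID figures), but they show how much the “mortality rate” swings depending on a single assumption nobody can verify: how many mild cases never got counted.

```python
# Illustrative only: how a "mortality rate" shifts under different
# undercounting assumptions. All figures below are hypothetical.

confirmed_cases = 50_000
reported_deaths = 2_000

# The naive case fatality rate, computed straight from reported figures.
naive_cfr = reported_deaths / confirmed_cases
print(f"naive rate from reported data: {naive_cfr:.2%}")

# Now suppose mild cases are undercounted, so true infections are some
# unknown multiple of confirmed cases. Nobody knows the multiplier.
for undercount_factor in (2, 5, 10):
    true_infections = confirmed_cases * undercount_factor
    implied_rate = reported_deaths / true_infections
    print(f"if cases undercounted {undercount_factor}x: {implied_rate:.2%}")
```

Same deaths, same dashboard, and the headline number is anywhere from 4% to 0.4% depending on one assumption, before you even touch delayed deaths or miscounted recoveries.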
If any number can tell you something different at any given time, there’s no solid data foundation that data people can push back on and say, look, this is ridiculous, there’s no way we can change it.
For example, if someone came to you in a board meeting and said, “Hey, this chart looks great, but do you mind changing pi to 2.14 so we can get a C round of funding”, you’d look at them like they were insane (although something like that did actually happen, of course.)
But in public-facing dashboards, anything goes, and it’s not necessarily the fault of Rebekah Jones or of her superiors, that’s just the way the data analytics game crumbles.
There are a lot of people looking at data, everyone has their own agenda, and to add on top of that, we’re in the middle of a pandemic where tempers are high and no one knows what they’re doing.
So then you get a situation like this:
According to internal emails reviewed by the Times, Department of Health I.T. Director Craig Curry emailed Rebekah Jones just before 5 p.m. on May 5. He cited Dr. Carina Blackmore, Director for the Division of Disease Control and Health Protection.
“Per Dr. Blackmore, disable the ability to export the data to files from the dashboard immediately. We need to ensure that dates (date fields) in all objects match their counterpart on the PDF line list published,” Curry wrote.
The tables in the PDF documents did not include the column of data showing when symptoms were first reported, only the “Case Date” — the date the state recorded and confirmed the case.
“This is the wrong call,” Jones replied minutes later. A few minutes later, she emailed Curry again. “Case line data is down.”
Then, just after 6 p.m., the I.T. director emailed both Jones and Dr. Blackmore. “Re-enable for now please.”
Jones replied, “10-4.”
I’ve played these games before, where it’s after 5 PM and you just want to go home, but your boss has a big meeting tomorrow, so you keep changing numbers on the PowerPoint deck until it’s the right message for the right person, until you forget what the data even looks like anymore.
All of this is to say that maybe Jones is right, but we don’t know the true operating structure of the department, or all of the underlying small-p political decisions that got made around the dashboard. So, as usual, we are only getting part of the story, and it’s going to take a while, like months, for the rest to come out, and when it does, Normcore will somehow use it again in some other random post about something completely unrelated.
So, anyway, to bring this thing to a close, because I’ve already used up my hour and a half, the house is quiet, and tomorrow is another day under quarantine, what I’m getting at is: thank you so much. For joining me on the Normcore experiment, for reading, commenting, for sending me stories from your workplaces and your lives, for retweeting, participating in discussions and threads, and for being a fun, warm corner of the internet as Normcore readers.
Thank you for joining me in the never-ending struggle (sometimes against The World, sometimes against my own urge to tweet knee-jerk reactions) to bring nuance and reflective thinking back to our insane lives. Here’s to the next year.
What I’m reading lately:
Paul Ford has a newsletter, and it’s very good (at least until he stops writing it next month)
Google Maps as a sanctuary
In appreciation of Allie Brosh
The Newsletter:
This newsletter’s M.O. is takes on tech news that are rooted in humanism, nuance, context, rationality, and a little fun. It goes out once a week to free subscribers, and once more to paid subscribers. If you like it, forward it to friends and tell them to subscribe!
The Author:
I’m a data scientist. Most of my free time is spent wrangling a preschooler and a baby, reading, and writing bad tweets. Find out more here or follow me on Twitter.