
Episode 6: Laura Nolan and Control Pain
About this episode
In the second episode of the VOID podcast, Courtney Wang, an SRE at Reddit, said he was inspired to start writing more in-depth narrative incident reports after reading the write-up of Slack's January 4th, 2021 outage. That incident report, along with many other excellent ones, was penned by Laura Nolan, and I've been trying to get her on this podcast since I started it.
So, this is a very exciting episode for me. And for you all, it's going to be a bit different: instead of discussing a single incident that Laura has written about, we get to lean on and learn from the knowledge she's accumulated doing this work at quite a few organizations. And she's come with opinions.
A fun fact about this episode, I was going to title it "Laura Nolan and Control Plane Incidents," but the automated transcription service that I use, which is typically pretty spot on (thanks, Descript!), kept changing "plane" to "pain" and well, you're about to find out just how ironic that actually is...
We discussed:
- A set of incidents she's been involved with where a control plane or automation was a contributing factor
- What we can learn from fields of study like Resilience Engineering, such as the notion of Joint Cognitive Systems
- Other notable incidents that have similar factors
- Ways that we can better factor human-computer collaboration into tooling to help make our lives easier when it comes to handling incidents
References:
Slack's Outage on Jan 4th 2021
A Terrible, Horrible, No-Good, Very Bad Day at Slack
Google's "satpocalypse"
Meta (Facebook) outage
Reddit Pi-day outage
Ironies of Automation (Lisanne Bainbridge)