- cross-posted to:
- news@hexbear.net
“This is basically what we were all worried about with Y2K, except it’s actually happened this time.”
What people were worried about with Y2K was nuclear weapons being launched and planes falling out of the sky. And it was nonsense, but bad things could have happened.
The good part is that the harm was largely mitigated through the due diligence of IT workers.
Y2K wasn’t nonsense. It was unremarkable, ultimately, because of the efforts taken to avoid it for a decade.
20 Years Later, the Y2K Bug Seems Like a Joke—Because Those Behind the Scenes Took It Seriously
President Clinton had exhorted the government in mid-1998 to “put our own house in order,” and large businesses — spurred by their own testing — responded in kind, racking up an estimated expenditure of $100 billion in the United States alone. Their preparations encompassed extensive coordination on a national and local level, as well as on a global scale, with other digitally reliant nations examining their own systems.
“The Y2K crisis didn’t happen precisely because people started preparing for it over a decade in advance. And the general public who was busy stocking up on supplies and stuff just didn’t have a sense that the programmers were on the job,” says Paul Saffo, a futurist and adjunct professor at Stanford University.

What is worth noting about this event is how public concern grows and reacts out of ignorance. Just because a pending catastrophe results in something ‘less-than’ does not mean best efforts weren’t taken to avoid it. Just because something isn’t as bad as it could have been doesn’t mean it was a hoax (see: COVID-19). Additionally, just because something turns out to be a grave concern doesn’t mean best efforts didn’t mitigate what could have been far worse (see: inflation).
After the collective sigh of relief in the first few days of January 2000, however, Y2K morphed into a punch line, as relief gave way to derision — as is so often the case when warnings appear unnecessary after they are heeded. It was called a big hoax; the effort to fix it a waste of time.
Written in 2019 about an event in 1999, it’s apparent to me that not much has changed. We’re doomed to repeat history, even when the most advanced technology the world has ever known can pull up the full record of that history in the palm of our hands.
The inherent conundrum of the ~~Y2K~~ [insert current event here] debate is that those on both ends of the spectrum — from naysayers to doomsayers — can claim that the outcome proved their predictions correct.

I never said it was nonsense. I said what a lot of people were worried about was nonsense: stuff like it causing nuclear Armageddon or crashing the global economy.
And this event today isn’t even what IT professionals were worried about. This is a big headache for them and a day off for a lot of other people. It’s not going to do the damage Y2K would have done had people not done enough.
Real life Armageddon: Bruce Willis & crew return home and are greeted by boos and protestors with “waste of taxpayer money” signs. Can you imagine…
The United States would never send a crew up to stop an asteroid. If it’s a Dem president, SCOTUS would block it. If it’s Donald, he’d claim the asteroid is fake news and a Dem hoax, then the scoundrels in the House and Senate would obstruct any action via their little bunkers.
Work is borked so I get to paint Warhammer today.
Minis are for painting at unspecified times in the future, not now
My Mountain of Shame must be mined.
I love this phrase and I will use it.
I fully support your sacrifice o7
Be sure to post the results to the corresponding communities.
Meanwhile, friends at my old company run sites with CrowdStrike and my current company doesn’t. I’m kicking back and having a great Friday.
So, hindsight is always 20/20, but were there warning signs or red flags that should have made it obvious this was going to happen, or were you just lucky in hindsight?
Red flags? Yeah: don’t use “security software” that just increases your attack surface. Why the fuck would you want to install a rootkit on your critical infrastructure?
The second one, as far as I can tell. But also, those calls are made above me and I have no insight into the decision-making. It could have been keen foresight by someone else.
Same. Had time for my trainees and used this for an extra learning session. :)
My office sent out this big message about people not being able to log in this morning. And I had absolutely no issues, and all of my tools are working. So I guess I’m stuck actually doing work.
Your work and their work, since they can’t log in.
Look at this “team” player hehe
Bro, why didn’t you lie 😭
Will this change how companies run their IT? Absofuckinglutely not!
Nothing like getting a full work day in before the office opens
So was this Crowdstrike’s fuck up and not Microsoft’s?
Probably, but the issue is in the interface between Windows and the CrowdStrike software, which causes Windows to go into a crashing boot loop.
Closed source is great, I tell you. /s
It has nothing to do with closed source, this is entirely about a privileged application fucking around and not testing shit before pushing it globally. You can have the same issues with Linux too. My org has stated that our AV product is never to be installed on Linux machines because they hosed tons of machines years back doing something similar.
High privilege security applications are always going to carry this risk due to how deeply they hook into the OS to do their job.
That is true. An obvious failure is that the update that broke everything was pushed everywhere simultaneously.
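The usual mitigation is a staged (ring/canary) rollout, where an update reaches a small group first and expands only if nothing breaks. Here is a minimal sketch in Python; the ring names, fractions, and `healthy` check are all made-up illustrations, not any vendor's actual pipeline:

```python
# Hypothetical deployment rings: an update reaches a small canary
# group first and only proceeds outward if health checks pass.
RINGS = [
    ("canary", 0.001),  # ~0.1% of the fleet
    ("early", 0.05),    # 5%
    ("broad", 0.50),    # 50%
    ("global", 1.00),   # everyone
]

def healthy(ring_name: str) -> bool:
    """Stand-in health check; a real one would watch crash telemetry
    coming back from the machines in the ring."""
    return True  # assume the update is good in this sketch

def rollout(update_id: str) -> str:
    exposed = 0.0
    for name, fraction in RINGS:
        exposed = fraction
        if not healthy(name):
            # A bad update stops here instead of reaching everyone at once.
            return f"{update_id}: halted at ring '{name}' ({exposed:.1%} exposed)"
    return f"{update_id}: fully deployed ({exposed:.0%})"

print(rollout("content-update-1"))
```

With a gate like this, a crashing update would be caught while only the canary ring was exposed, instead of hitting every machine simultaneously.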
That’s what has me confused. I haven’t even stepped into an office in 20 years so I have no modern experience there, but I would have thought that a company with such a massive user base would release things at different times based on region. Is it because with security based applications they don’t want to risk someone having time to exploit a new vulnerability?
Is it because with security based applications they don’t want to risk someone having time to exploit a new vulnerability?
Pretty much. Given how fast the malware scene evolves and weaponizes one-day exploits, and how quickly vendors need to react to zero-days, there’s kind of an unwritten assumption (it might even be advertised as a feature) that security software reacts as fast as possible to malicious signatures. And given the state of malware and crypto shit, it’s hard to argue that it isn’t needed, considering how much damage you’ll suffer if they get through your defenses.
That being said, this kind of a fuck up is damned near unacceptable, and these updates should have been put through multiple automated testing layers to catch something like this before this got to the end user devices. I could see the scale of it taking them out of business, but I also wouldn’t be surprised if they managed to scrape by if they handle it correctly (though I don’t see the path forward on this scale, but I’m not a c-suite for many reasons). Like I said above, we had an incident years back that hosed a bunch of our Linux boxes, but the vendor is still around (was a much smaller scale issue) and we even still use them because of how they worked with us to resolve the situation and prevent it from happening again.
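The “multiple automated testing layers” point can be sketched as a simple pre-ship validation gate. The checks below are illustrative sanity checks only, not anyone’s actual release process:

```python
def validate_update(blob: bytes) -> bool:
    """Minimal pre-release gate: reject obviously malformed payloads
    before they can ever reach a privileged driver in production."""
    if not blob:
        return False  # empty file: corrupt
    if blob == b"\x00" * len(blob):
        return False  # all zeros: clearly corrupt
    # A real pipeline would also load the update in sandboxed VMs,
    # reboot them a few times, and watch for crashes before signing off.
    return True

print(validate_update(b"sensor-content-v1"))  # plausible payload passes
print(validate_update(b"\x00" * 1024))        # zeroed-out payload is rejected
```

Even a cheap gate like this runs in milliseconds, so the “we must ship signatures fast” argument doesn’t really excuse skipping it.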
Hard to tell; the news is running both of their names. Looks like both?
Ftfy: ‘Largest ~~IT~~ Windows outage in history’

I learned of the problems from the radio news on my way back home.
CrowdStrike, not Microsoft, is responsible. Let’s put blame where blame is due.
This could happen to any OS that has cybersecurity where permissions are needed at deeper levels to protect systems.
I like how it’s the biggest IT issue and the best solution is to turn it off and on several times
They are saying “up to 15 times” now.
laughs in linux
Work was supposed to be slow today. D’: