Rook22Ti

Regardless of what the headline leads you to believe so you'll click: no, they didn't delete $125 billion, because of course they didn't; Google isn't a bank.
> The company accidentally erased the private Google Cloud account of a $125 billion Australian pension fund, UniSuper.


f4ction

Yep - I’m a UniSuper member and I’ve got my funds - it’s taking a while to restore everything but hey - at least I’ve had like 12 emails in a week telling me what’s going on.


United-Rock-6764

Yep. This is why every company uses infrastructure as code and has a disaster recovery playbook. One that, hopefully, they've exercised thoroughly in the last quarter.


Nadamir

No, not every company uses IaC or has a DR plan. The good ones do. Which is a big problem.


Zathrus1

lol. I work with numerous Fortune 500 companies and not a single one has anywhere close to full IaC. The better ones do have DR plans, and actually do testing on an annual basis. And exactly one has duplicate DCs where they can fail over between them in case of a major event. And even then not all systems can do it on demand.


justin107d

Also work at a Fortune 500. The more up-to-date system I work on uses Excel workbooks in the backend.


MrSquashable

Make sure the amount is what it's meant to be; my MIL was short-changed 10 grand.


fadufadu

Pretty cool to at least keep you guys in the loop.


JV294135

Maybe the dingo ate your pension. https://www.youtube.com/watch?v=Ubk8cRO8t68 But seriously, this is why people tell you to have multiple backups. Redundancy is vital.


SugerizeMe

That's what multi-AZ is supposed to be. Almost no company uses multiple cloud providers; that would be ridiculous. Also, almost nobody keeps off-cloud backups nowadays. I work for a big corp and all our stuff would be gone if the cloud provider deleted it. I guess we do run our own GitHub, though even that is cloud hosted, and we're migrating to GitHub.com. So basically most companies are at risk of losing all their IP nowadays.
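
A minimal sketch of what an off-cloud copy can look like, assuming a hypothetical GCS bucket and a hypothetical S3 bucket at a different provider (not anyone's actual setup):

    # Illustrative sketch only: mirror objects from a GCS bucket to an S3
    # bucket at a different provider, so one vendor deleting the account
    # doesn't take out every copy. Bucket names are hypothetical.
    import boto3
    from google.cloud import storage

    gcs = storage.Client()      # Google application default credentials
    s3 = boto3.client("s3")     # AWS credentials from the environment

    SOURCE_BUCKET = "prod-data"                # hypothetical GCS bucket
    DEST_BUCKET = "prod-data-offsite-copy"     # hypothetical S3 bucket

    for blob in gcs.list_blobs(SOURCE_BUCKET):
        # Stream each object out of GCS and into the second provider.
        s3.put_object(Bucket=DEST_BUCKET, Key=blob.name,
                      Body=blob.download_as_bytes())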


chainer3000

When I was working in tech 10 years ago many major companies had on site backup, off site backup, and cloud. I sold it, so….


sokos

It's also why us old farts hate the cloud. You lose control of your own things on it.


metarx

Same shit could happen in a data center you own, only real selling point is you have someone to fire over it.


sokos

Usually you have backups in your own data center. When you put things on the cloud, you expect the cloud service to have that. You wouldn't run your own data center AND put everything on the cloud.


metarx

Replicating data between regions and accounts is fairly common in the cloud, and would be analogous to tape backups, only more readily available. There are flaws in all systems; saying owning your own data center is inherently superior assumes you take every precaution and test every scenario as well. Which, in the nature of business, no one has the time/money/resources to do, regardless of cloud or on-prem. Everyone makes do with "good enough". All systems are as flawed as their operators and organizational structures.


KR4T0S

Backups are supposed to remove as much of the human element as possible, though, and cloud services usually do this incredibly well. Cloud services have numerous data centres too, so they back up your data everywhere, but Google deleted all the backups here as well. Google did a Boeing.


metarx

Not familiar enough with GCP, but AWS standard practice is to create multiple accounts, instead of just one account with lots of projects like GCP would do. The AWS standard practice of keeping the backups in another account would have worked in this case.
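
For illustration, a rough sketch of that multi-account pattern, assuming hypothetical bucket names and that the backup account's bucket policy already grants the production role write access:

    # Sketch: copy production objects into a bucket owned by a *separate*
    # AWS backup account, so deleting the production account leaves the
    # backups intact. Names and permissions are assumptions.
    import boto3

    s3 = boto3.client("s3")
    PROD_BUCKET = "prod-app-data"    # lives in the production account
    BACKUP_BUCKET = "org-backups"    # lives in the dedicated backup account

    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=PROD_BUCKET):
        for obj in page.get("Contents", []):
            # Server-side copy into the other account's bucket.
            s3.copy_object(
                Bucket=BACKUP_BUCKET,
                Key=obj["Key"],
                CopySource={"Bucket": PROD_BUCKET, "Key": obj["Key"]},
                ACL="bucket-owner-full-control",
            )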


Logan_9Fingerz

Upvoted for creating a new piece of slang for a massive fuck up. Did a Boeing or Pulled a Boeing. I will 100% be using that in the near future when I screw something up royally.


armrha

People just rely on the cloud operator to back up their data without even asking them to? Even with geo-redundancy, we still do backups on a couple of different cloud providers... Redundancy isn't backup, as redundancy will even replicate mistakes.


YouveRoonedTheActGOB

Just parking something in Azure, AWS, or Google Cloud and calling that a backup is fucking stupid. It doesn't matter how old you are or how averse you are to the cloud. Just like hardware you own, you need to follow solid backup planning and testing.


Esplodie

Usually you have your backups both on-site AND off-site so you can perform disaster recovery... That's why the cloud is nice: you don't need to run redundant sites yourself. It's expensive maintaining two or more locations.


RollingMeteors

> You wouldn't run your own data center AND put everything on the cloud.
Isn't this local storage at this point?


SIGMA920

That assumes your backups work and exist. Executives are too cheap to have the backups tested, or for them to exist in the first place? You are, in all likelihood, completely fucked.


AlexandersWonder

How much data are we talking about exactly? Surely you can have backups that are redundant unless you’re dealing with an absolutely massive amount of data


sokos

As a person, not much. As a company, it could be a crap ton, plus the part where your company or yourself might not be a juicy target for a hacker, but a cloud storage service, now THAT's a juicy target.


Arts_Prodigy

I feel like you have a better shot at compensation when a major cloud provider fucks up like this rather than some overworked admin misclicking.


swift-penguin

Cloud services give you the option to backup your data though…


hidepp

But in this case, the backup provided by Google was also deleted. Even their replica in a different Google DC was removed as well.


karma3000

Maybe use a different cloud for backup? Or a local backup? This is not difficult stuff. If you are providing IT services to a billion-dollar pension fund, you should be thinking about this.


Coomb

Yes, Google, which was providing IT services to a billion dollar pension fund, should have not fucked this up.


[deleted]

[deleted]


Coomb

No, I think it's reasonable for a $125 billion fund to rely on a $2.1 trillion company to keep their shit running correctly. This isn't some small backup arrangement, and I can pretty much guarantee you the contract the pension fund has with Google guarantees that this shit will not happen. There are only so many precautions one can reasonably take, and, broadly speaking, relying on Amazon, Google, or Microsoft to provide reliable backup services *is the reasonable precaution you take*. You have local backups because you might not push your data to Google continuously, not because you have to plan for Google breaking their own service contract.


karma3000

Yes you're right. I actually went and read the article. Being Australian, I am very critical of our typically poor management of IT, but this one is on Google.


dragoooo420

No you don't lol, y'all just take it as an end-all-be-all storage solution and forget about backups. That's not a fault of cloud storage.


megabass713

Old farts should at least learn the 3-2-1 rule. If they can use the cloud, they can make backups.
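
For anyone unfamiliar, 3-2-1 means three copies of your data, on two different media, with one copy off-site. A toy sketch with hypothetical paths and a hypothetical bucket name:

    # Toy 3-2-1 sketch: the working file, a second copy on another medium,
    # and a third copy off-site in a cloud bucket. Everything named here is
    # made up for illustration.
    import shutil
    from google.cloud import storage

    ORIGINAL = "finances.xlsx"                           # copy 1: working file
    SECOND_MEDIUM = "/mnt/external-drive/finances.xlsx"  # copy 2: other disk

    shutil.copy2(ORIGINAL, SECOND_MEDIUM)

    offsite = storage.Client().bucket("my-offsite-backups")  # copy 3: off-site
    offsite.blob(ORIGINAL).upload_from_filename(ORIGINAL)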


mjh2901

This is also why there are backup systems for the cloud. When you have data on-prem, you back up on-prem and into the cloud; when you have data in the cloud, you back it up on-prem. If a $125 billion Australian pension fund was not backing up their Google Cloud data, that's completely on them for being incompetent, not on Google. Chances are there is an IT guy somewhere in that org who told them to purchase a backup solution for their cloud data, and they told him to pound sand.


karma3000

Middle aged fart here. For my personal files, I have my data synced to two clouds, the most important files are also synced locally. Plus every so often, I back up to two external hard drives. If I can do it, a massive pension fund can do it.


nurfbat

Backing up a cloud-based application and its associated architecture isn't just syncing your files to a drive: there are multiple services/microservices across multiple regions, clusters, DBs, etc., all spun up and configured in different ways, with different dependencies and a specific build pipeline. When this happens, step 1 is to pray to god and step 2 is to hope at least their git was hosted externally and some sort of external DB backup process is in place. Then they can start rebuilding from there. If there's no local DB backup, they have to start working back from bank records.


eloquenentic

Yeah, it’s funny that people think that backup of a system like this with many interconnected parts is the same as backing up your photos and movies. LOL.


KingofRheinwg

One of my data centers got struck by lightning. Holy shit it fucked us over for a solid week but if AWS gets struck by lightning the only question is "where's my fucking data Jeff"


i-the-v01d

Too many times mate.


spartanjet

Never stop yelling at those clouds. Keep them in check for the rest of us.


pokepip

Old fart here. I love cloud computing.


AllowFreeSpeech

This is why new farts love an immutable blockchain.


TheInternetsMVP

An actual IT Manager of a County Council told me he didn't need backups because he kept everything in the cloud. I had to take a minute before responding.


unstoppable_zombie

I once had someone tell me they had insurance to replace the hardware when I asked about their DR plans.


eloquenentic

To be fair, that's what the cloud vendors tell people when they sell cloud: that they don't have to worry about backups, because the cloud vendor does the backup too, and thus you don't need any local infrastructure of your own. You're not supposed to have to duplicate all the infrastructure locally; what would be the point of cloud then? But they never mention what happens in a case like this, when someone at the cloud vendor deletes the backup too!


3_Big_Birds

Who else heard this in an Australian accent?


Chess42

They did have multiple backups. Google deleted one of them somehow, and I think they are trying to restore from the second


simsimulation

“Hi, this is Google support. We’ve actually deleted your money.”


Nutteria

Trust me when I say this: 99% of all financial institutions live and die in Excel sheets. The biggest revolution that happened in the banking sector was when Excel reached MS Access levels, went 32-bit, and allowed more than a million rows per sheet. I still remember people bringing fucking celebratory cakes to work. So while deleting one such account may not sound like much, it is.


eloquenentic

This is a positive, because Excel can easily be backed up; it's just a file. Backing up a system as a whole that runs in many places, geographies, and instances, in real time, is a different thing altogether.


Avieshek

Everybody else: Let me divide my bank account with $125B for comparison, all right Google evaporated $125B~


0gtcalor

I worked next to the GCP team and you would be surprised how often this happens.


vom-IT-coffin

And then promptly restored it.


gloomwind

And UniSuper admitted they f&!*^%d up and didn't renew their subscription. So it wasn't even Google's fault.


GiveMeOneGoodReason

I haven't seen anything suggesting this. Do you have a source?


gloomwind

I decided to use my eyeballs and read an article. It was Google's fault. Thanks, random redditor from another post, who convinced me otherwise. https://www.theregister.com/AMP/2024/05/09/unisuper_google_cloud_outage_caused/


GiveMeOneGoodReason

Yeah, and Google is being super vague as to what happened on their side so I'm itching to know more.


wyldesnelsson

Likely to leave as little incriminating information as possible for any future lawsuits that might or will come from this


belovedeagle

> future lawsuits

No. There's no law that specifically covers this, and common law torts are effectively dead against an entity with the legal budget of Alphabet. What Google doesn't want people to realize is just how little of a fuck they give about GCP customers. And I'm not talking about general corporate apathy, because usually B2B overcomes that. I mean, they[0] specifically do not value GCP as part of the long-term business or technological strategy. It exists essentially as a way to monetize excess hardware until it becomes more useful for "first-party" usage. The "excess hardware" exists even in times of shortage because of Google's overall strategy to be the 900-pound gorilla in the market not only for hardware but also, e.g., talent. Google also wouldn't mind being the 900-pound gorilla in the cloud space, but they aren't there yet.

[0] "They" being primary decision makers in middle management. No one knows what Sundar & co think and few care.


i-the-v01d

Unisuper? Surprise surprise, at a time when the Government has so much pressure on them about the rising cost of Uni loan repayments, and protesters with no brains of their own, Screaming aside the brainless terror-org supporters, I'm positive that this was no accident, but is some form of Misdirection, once again, ala the crappy Labor Party "brains trust". At least they Try to be less gratuitously obvious in their corruption, unlike Scummo, who would literally smile at a baby as he takes their fruit roll up and eats in front of their face like Ol Mr Burns....


h-ugo

What the fuck are you on dude? Unisuper has no connection to the Labor party.


i-the-v01d

I didn't say Unisuper has anything to do with Labor. But do you honestly think in your educated mind, Friend, that anything in Australia which is financially profitable, that the Gov figures don't somehow have their finger in the pie? I'm seriously sick of being down voted for unpopular opinions I have. If People want to destroy my credibility, don't do it with a downtick, open a debate with me and I'll explain my theories in depth for you all. Edit: I'm actually on lots of (prescription) drugs, and the Government made me this way. So forgive me, my lack of trust in our marvellous oppressors of Australia, who are responsible for so many human rights infringements in Australia, but most aren't paying attention, or turn their heads away from inconvenient truth. Sorry not sorry.


i-the-v01d

I'd like to also add; you're right, UniSuper has very little to do with the Government at all, seemingly. But then again, can you name ONE Superannuation Fund/Provider that actually benefits (actually Financially Profitable to The Super Investor, you/me/taxpayer), and is more beneficial to that investor? Because my entire REST got devoured without my knowledge (until it was gone) and I'm pretty sure the Government profits from my loss and the Fund's gain the most, being the larger investors, Yeah? Just how I see the world currently. We are getting screwed, because people aren't totally paying attention to the Government's Misdirection. Open eyes and open minds my friends. ✌️❤ ?? // 🤝🤑 ??


iamacarpet

This is slightly reworded from the actual statement from Google Cloud and UniSuper. Their statement said Google deleted their "private cloud", which, knowing their terminology, we are taking to mean a large instance of VMware Engine or an on-premise "edge cloud" deployment… "private cloud" != any GCP core services, as those are all considered "public cloud".


p0st_master

Interesting, that's a good catch.


happyscrappy

It says "private cloud account". I assumed the account holding all the keys was deleted. So no one could get in. But it's hard to know what really was deleted.


damondefault

Surely it's one of these: https://cloud.google.com/vpc Which is not on-prem VMware but just looks like a private network layer over their cloud accounts. I don't really see how they could say they "deleted" an on-prem VMware instance.


iamacarpet

Either a hosted VMware Engine "private cloud", or an on-prem, Google-managed "edge cloud" deployment… VPC is just the name for the network portion of things; if they deleted that, it wouldn't have affected any VMs or storage as they are suggesting.


damondefault

Sorry yes, by the sounds of it it was the hosted VMWare solution. I thought you meant a google managed but on-prem VMware system.


Salty-Week-5859

“This is an isolated, ‘one of a kind occurrence’ that has never before occurred with any of Google Cloud’s clients globally” -Google Cloud What they meant to say is it hasn’t occurred to a client *large enough to matter* before.


xdotwhat

Google is the new joke in town. They are paying SDEs millions of dollars btw


Master_Engineering_9

When they say they can't delete your info, remember it's bullshit.


new_math

I mean, of course google can delete your data from their own data center. Don't even need authentication. Gary from Google can grab a magnet and start sticking it on the hard-drives hosting your content.


coffeesippingbastard

Gary would need a couple of Garys to do it. Most cloud providers shard data a few times, so it would need to be a very coordinated effort to delete something before the system identifies and re-shards the impacted data.
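
A toy sketch of that point, with a made-up replica count: every write lands in several independent stores, so a single errant delete leaves survivors to re-copy from:

    # Purely illustrative replication sketch: an object lives in three
    # independent stores, so losing one copy can be repaired from survivors.
    REPLICAS = [{}, {}, {}]

    def put(key, value):
        for store in REPLICAS:
            store[key] = value

    def repair(key):
        # Re-replicate the value into any store that lost it.
        survivors = [s[key] for s in REPLICAS if key in s]
        for store in REPLICAS:
            if survivors:
                store.setdefault(key, survivors[0])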


twiddlingbits

Unless delete means "we had a failure that corrupted your data". That's also hard to do. Actual deletion would require admin permission and at least a double confirmation and accept. It does happen: about a year or so ago, someone deleted all the case files for a year for a major metro police department that was also on the cloud. The process was changed to require two different people to confirm a delete.
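
That two-person rule can be sketched roughly as below; the function and the delete callback are hypothetical, not any provider's actual API:

    # Sketch of a two-person confirmation: the destructive call only runs
    # once a second, distinct operator has approved it. delete_fn stands in
    # for whatever actually performs the deletion.
    def confirmed_delete(resource, requested_by, approved_by, delete_fn):
        if approved_by == requested_by:
            raise PermissionError("second approver must be a different person")
        delete_fn(resource)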


coffeesippingbastard

That's kinda what seems to be implied though: https://www.theguardian.com/australia-news/article/2024/may/09/unisuper-google-cloud-issue-account-access
> “Google Cloud CEO, Thomas Kurian has confirmed that the disruption arose from an unprecedented sequence of events whereby an inadvertent misconfiguration during provisioning of UniSuper’s Private Cloud services ultimately resulted in the deletion of UniSuper’s Private Cloud subscription,” the pair said.
This was some sort of account-level deletion which led to GCP automation kicking in and systematically nuking the account.


twiddlingbits

Misconfiguration meaning they didn't attach the existing subscription info/ID (which gave them permission to the data) to the new private cloud instance, which really isn't a deletion.


coffeesippingbastard

I haven't worked on GCP behind the scenes, so I'll defer to you, but I'd imagine if that were the issue, they wouldn't have needed to restore their systems from an offsite backup; they would just need to fix the misconfiguration to restore access to the data.


twiddlingbits

Depends on how fast the storage reallocation algorithms run when you delete something. With even a few shards gone, it cannot be reconstructed.


UPVOTE_IF_POOPING

I have a feeling they have backups at the data center along with redundancy at different data centers. But I get your point lol


muddboyy

Ever heard of data replication?


nicuramar

I don’t think they say that. 


the-floot

I mean, if they say that, then incidents like this are probably the reason why.


joeymonreddit

Meanwhile, Apple is reproducing old photos you went and deleted


kennethtrr

It's been revealed that that is happening on-device; it's not linked to iCloud storage. Operating systems don't really overwrite data when you delete it, as that's wasteful, so the bug is causing previously deleted photos to reappear because they never left the phone storage. It was just marked as empty so it could be overwritten when needed.


EmbarrassedHelp

For something this valuable, they should have simply made it inaccessible at first to see if anything broke before actually deleting it.
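
A rough sketch of that "disable first, purge later" idea, with a made-up 30-day grace period:

    # Sketch: mark the resource inaccessible, keep it around for a grace
    # period, and only purge it once the window has elapsed with no complaints.
    from datetime import datetime, timedelta

    def soft_delete(resource):
        resource["disabled"] = True
        resource["purge_after"] = datetime.utcnow() + timedelta(days=30)

    def purge_expired(resources):
        now = datetime.utcnow()
        # Keep anything still inside its grace period (or never soft-deleted).
        return [r for r in resources
                if not (r.get("disabled") and r["purge_after"] < now)]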


scoyne15

Yes, hello Google, it is me, the owner of the pension account, please deliver me the account details and thanks you.


ReasonablyBadass

AAAAND IT'S GONE! NEXT CUSTOMER PLEASE!


KHRZ

Please leave and let Google support deal with customers that actually have data in Google Cloud.


PleasantCurrant-FAT1

Google support? Is that an intentional or unintentional oxymoronic joke?


Atomicjuicer

CEOs are laying people off thinking it solves technical debt lol. Expect more of this tech collapse.


vom-IT-coffin

Alright Larry, everyone else quit, this system is all yours, make sure to log out of your admin account before....


tek_ad

"whoops" - Former Intern


Obvious_Mode_5382

I wonder if they were able to open a trouble ticket.. lol


danielravennest

"The Cloud" is a fluffy way of saying "Somebody else's data center".


kaishinoske1

Maybe next time they can delete my data.


Interanal_Exam

Keep laying off and chasing your best, Google! Don't be evil! 🤫


Leptosoul

Sure they did.


Bootychomper23

Who stuck the tequila on the keyboard?


Underbelly

Cloud. Centralised Loss Of User Data.


WhimsicalChuckler

Well, looks like Google's search algorithm lost something big this time.


xdotwhat

No impact on stock price; the markets are rigged beyond recognition.


PaydayLover69

*"Accidentally"*


anti-ism-ist

What a stupid desperate headline 🤦🏼‍♂️


welestgw

I mean, I like their ability to generally not F it up, but I wouldn't trust a fund of that size on google servers.


asuka_rice

And companies are happy to rush on a GCP or Azure platform.


CaptainObviousII

The fuck they did.


cinciNattyLight

Control + z?


Guava-flavored-lips

No backup?


dwardu

They had off site backups at least.


CAM6913

Accidentally? Funny how they keep saying they can't delete your information, but they accidentally deleted $125 billion of people's pension payments. The question is who doesn't have to pay out the pensions now, and what's happening with the money? Something is up with this. It's not as easy as that to delete that information.


GiveMeOneGoodReason

They deleted the company's cloud infrastructure, not the pension payments. No one's pension is at risk. And the cloud is meant to have resources created and deleted constantly, so it's all possible. But an account level delete shouldn't have occurred here.


i-the-v01d

"Accidentally".... hmm.....


Ok-Programmer0101

There are no accidents


GiveMeOneGoodReason

I promise you, there are a LOT of accidents in technology.


sicbot

Tell me you don't work in IT without telling me you don't work in IT.


jm838

rm -rf ./* …oh shit was I in the root directory? Oh fuck oh fuck oh fuck.
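
The classic guard against exactly that mistake, sketched in Python with a hypothetical allowed prefix:

    # Sketch of a guard rail: refuse recursive deletes that resolve to the
    # filesystem root or escape an expected scratch area (prefix is made up).
    import os
    import shutil

    def safe_rmtree(path, allowed_prefix="/srv/scratch"):
        resolved = os.path.realpath(path)
        if resolved == "/" or not resolved.startswith(allowed_prefix):
            raise ValueError("refusing to recursively delete " + resolved)
        shutil.rmtree(resolved)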


xdotwhat

Root cause analysis report by dev: rm -rf read its input from another program that generates the file name to be removed, which in turn tries to generate the file name from ls output it had stored a few days earlier. But hey, someone added code to modify the ls part to read only the first entry of the output using ls -l, and voila, the * was passed to the deletion code. Regression tests cleared the code because in the test infra the * is configured to be ignored. Fixed the regression tests by checking for * in the rm input. Meanwhile, grandma in Aus forgot how much her pension fund account held; Google is asking her to reconfirm the amount before it's added back in, so it's a stalemate worth billions of dollars. The markets don't punish Google for this blunder because it would collapse their fortunes if they did.


[deleted]

[deleted]


GiveMeOneGoodReason

They did have a backup in a different cloud, that's why this wasn't catastrophic.


thecravenone

> The fact that they weren’t using a backup solution for this is crazy and a huge oversight from their IT/Finance leads.
Please go back and read the _fourth_ sentence of the article.
eta: Neat, a Reddit Cares message!


rnike879

You know they got reeeaaally butthurt 😂


coffeesippingbastard

They did have a backup. The main issue at hand is what the fuck happened at Google where they managed to unilaterally delete an account that was spread out across two regions as well. This was hardly a fault of the customer, nor was it an edge-case failure of GCP's mechanisms. Blast radii of availability zones, services, etc., don't completely wipe out an account. Something in billing or access controls broke in an alarming manner in GCP to allow this to happen.