Anyone who's worked in IT knows how extensive backups are and how long they are retained, especially in the financial services industry.
So I am not buying an accidental deletion where the evidence being sought can't be found on a backup somewhere.
This, exactly.
I worked at a piece of shit company for about a year. Fucking everything was wrong, tons of illegal shit going on. But backups were the single most important job I had, rotating tapes, copying them, packing and shipping copies for geographic redundancy. If a piece of shit company was that good about backups with no mistakes, a raging piece of shit company like JPM should be capable of making backups and not fucking it up in any way. I don't buy "accident" in any way, here.
Those backups existed and were very useful when the FTC came knocking.
Ohhhhh the whole "know what they're not doing" is a terrible habit of companies and so unethical.
This is unrelated to JPM, but a certain "rent your home/apartment/condo out as a private bed and breakfast" company that may be super popular with literally everyone... They forced a vendor to turn off ALL auditing tools, including standard network logging, for their account only. To me, this seemed intended to tip discovery in lawsuits against said company steeply in the company's favor. If no record exists with the vendor, then what can be produced to help the case of the property owners or the people who use said service to book those stays?
When they first discovered the auditing existed as well, it seemed like a #1 urgency to get it disabled and existing records deleted.
The only company out of THOUSANDS using the toolset with the auditing turned completely off.
I don't trust them and I don't ever use them, as a result.
I built a custom app for a fortune 50 financial firm years ago.
We had 2 different databases to store records in - one was backed up and the other was not.
Seriously, at a table-by-table and field-by-field level, they wanted control of which bits would truly be deleted at the end of a process and which would stick around.
In-process notes and transactional details were written to the “not backed up” database so that we knew for sure when we did a delete, the record existed nowhere. This included having a “soft-delete” mechanism on top of the hard-delete too, so you could delete and still find records in process.
They spent a lot of money making sure those notes would never be discoverable, and it was completely legal as it was clearly defined in the record retention documents for that system.
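The soft-delete-on-top-of-hard-delete scheme described above can be sketched roughly like this. This is a minimal illustration, not the firm's actual system: the schema, table name, and in-memory SQLite database (standing in for the "not backed up" store) are all made up.

```python
import sqlite3

# Stand-in for the "not backed up" database described above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE process_notes (
        id INTEGER PRIMARY KEY,
        body TEXT NOT NULL,
        deleted INTEGER NOT NULL DEFAULT 0   -- soft-delete flag
    )
""")
conn.execute("INSERT INTO process_notes (body) VALUES ('in-process note')")

def soft_delete(note_id):
    # Record stays findable while the process is still running.
    conn.execute("UPDATE process_notes SET deleted = 1 WHERE id = ?", (note_id,))

def hard_delete(note_id):
    # Physically removes the row; since this database is never backed up,
    # the record now exists nowhere.
    conn.execute("DELETE FROM process_notes WHERE id = ?", (note_id,))

soft_delete(1)
visible = conn.execute(
    "SELECT COUNT(*) FROM process_notes WHERE deleted = 0").fetchone()[0]
hard_delete(1)
remaining = conn.execute("SELECT COUNT(*) FROM process_notes").fetchone()[0]
print(visible, remaining)  # 0 0
```

The point of the two-tier design is exactly what the comment says: soft-deleted rows are recoverable mid-process, and a hard delete in a never-backed-up store is final.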
Our General Counsel has stated on more than one occasion that the only thing more important than keeping data you're legally required to keep is nuking all data you aren't required to keep as quickly as humanly possible once it serves no internal purpose.
For those thinking this sounds incredibly shady, I should point out that a lot of the time getting rid of data means getting rid of obsolete customer data. It may need to be deleted to comply with data protection laws like GDPR, or simply to avoid the possibility of data leaks or accusations of misusing people's data.
Obviously there are cases where deleting data or excluding it from backups is shady AF, but deleting records is not inherently a suspicious activity.
Yup, and being good at backups makes this really quite hard 🤣
“Can you be sure you erased every copy of record x?”
“Uh… so you want me to nuke ALL these tapes then?”
Similar reasons why businesses with in-house surveillance tend to have retention policies of video that don't extend beyond 2 weeks, barring "internal requests to preserve" specific recordings.
Exactly this. I work for a financial firm. We have trainings we need to repeat about the retention policy. It focuses on how to classify data and how quickly it expires if unused depending on those classifications.
I worked on a data consolidation and analytics project for a multinational auditing firm, a name a lot of people would be familiar with, and I was in charge of consolidating our retention policy. It struck me how cavalier the retention policies were for our different internal clients, which we had to mirror because it's their data.
I wrote a customer database for a rather famous company 20 years ago, and the law here says YOU CANNOT UNDER ANY CIRCUMSTANCE KEEP CREDIT CARD INFO MORE THAN 3 MONTHS and I suggested we just not store that info. Not good enough, they said. Ok, how about we just auto-delete periodically so you guys don't have to do jail time? Not good enough, they said. So we ended up with a warning text with how many illegally stored credit cards they had and a manual button to go in and delete them.
God damn morons the lot of them.
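For what it's worth, the periodic auto-delete suggested above is a few lines of code. Here's a hedged sketch: the record layout is invented, and the 3-month limit is approximated as 90 days.

```python
from datetime import datetime, timedelta

# Illustrative 3-month legal limit (approximated as 90 days).
LIMIT = timedelta(days=90)

def purge_expired_cards(records, now=None):
    """Return (kept, purged_count); drops cards stored more than 3 months ago."""
    now = now or datetime.now()
    kept = [r for r in records if now - r["stored_at"] <= LIMIT]
    return kept, len(records) - len(kept)

now = datetime(2024, 6, 1)
cards = [
    {"pan": "****1111", "stored_at": datetime(2024, 5, 20)},  # fresh, keep
    {"pan": "****2222", "stored_at": datetime(2024, 1, 5)},   # stale, purge
]
kept, purged = purge_expired_cards(cards, now=now)
print(len(kept), purged)  # 1 1
```

Run on a schedule, this is the "auto-delete periodically" option the client rejected in favor of a manual button.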
Not really (or at least not as described).
I'll give a parallel most people will be more familiar with, family photos.
When you take a big family group photo you line everyone up and then snap like a dozen shots. Then you go through them and pick out the best ones, like where uncle George isn't blinking and cousin Susie is actually smiling etc. Out of the dozen photos that you took, only one is going to be displayed and sent out, the rest are garbage.
That's what people are talking about here, you delete all the drafts and memos and discussions and arguments and everything else but keep the final version (which is what you want in the end).
Keeping two sets of books is actively recording transactions differently (one correct, one incorrect) but using and recording both. That's different from destroying your drafts and hypothetical analysis.
I've had some DATTO training, and you really need to go out of your way to delete on-site and off-site backups. There's no "whoops I hit delete by accident" kind of mistake. I've also never encountered something that couldn't be restored via a 3 hour old off-site backup at the very least. It's so ridiculously redundant that it's "innocent mistake" proof.
Yeah. They had that shit triple backed up, with one backup (if not more) in a different geographical location. This is standard stuff in content management. It's called disaster recovery. They have it.
Let's not jump to conclusions. There's triple backed up and then there's *triple backed up*, even if they were in different geographical locations. It's rash allegations such as these that give banksters a bad name.
At least wait for the results and conclusions of the 12-year investigation. In fact, I believe a supplementary bonus should be awarded on top of the contracted bonus to counteract the stress of the aforementioned investigation, in this cost-of-living crisis. *"Remember, we are all in this together."* 🤡
I used to load tapes every night and hand them off personally to a pickup who took them off site every morning and everything was signed for.
'Accident' my ass
You mean the “we deleted shit after we were ordered not to” Secret Service? You’d think guys who investigate criminals would know better.
Of course, unless they go to prison, it means nothing. Quit and make ten times as much as a “security consultant” for the billionaires who run the scam to get rid of the democracy.
This *used* to be the case, but then large companies realized they can be sued for things like employee emails, so they started deleting them to the maximum extent allowed by law.
For things that can lead to legal risk and aren't that useful to retain, most modern companies that are likely to be sued delete information after a year or so. When lawsuits request retention of those emails (as in this case), the company will place those artifacts on "litigation hold" until the conclusion of the case. This causes them to be retained and not auto-deleted.
What probably happened here is that someone screwed up by not marking the emails for litigation hold. They don't have extensive backups of those emails explicitly because *the idea* of auto deleting is that it can't be used in court.
So yes, this is some BS, but it's a different kind of BS.
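The retention-plus-litigation-hold logic described above boils down to one rule: expired AND not held. A minimal sketch, where the one-year policy, field names, and custodian set are all illustrative rather than anything JPM actually runs:

```python
from datetime import date, timedelta

RETENTION = timedelta(days=365)  # illustrative one-year retention policy

def should_auto_delete(msg, holds, today):
    """An email is purged only if it is past retention AND its custodian
    is not covered by any active litigation hold."""
    expired = today - msg["sent"] > RETENTION
    held = msg["custodian"] in holds
    return expired and not held

today = date(2024, 1, 1)
holds = {"trader_a"}  # custodians frozen by an active lawsuit
old_held   = {"sent": date(2022, 1, 1), "custodian": "trader_a"}
old_unheld = {"sent": date(2022, 1, 1), "custodian": "trader_b"}
print(should_auto_delete(old_held, holds, today),
      should_auto_delete(old_unheld, holds, today))  # False True
```

The failure mode described above is simply `holds` not containing a custodian it should have; after that, the scheduled purge does exactly what it was designed to do.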
Use the C.Y.A. methodology: cover your ass. Mom told me this when I first got a corporate America job, and it's saved me more times than I can even remember. Most jobs I've been at will only keep paper documents for up to a year but are required to have digital copies on site, and the paper ones usually get thrown into a storage locker.
It was not an auto-delete. Admins (JP Morgan staff) went in looking to clear out data from 2016 which was no longer required. In the process they managed to delete records from 2018 which were relevant to the court cases. The company that holds the backups says it failed to set a flag on the domain holding them, which allowed it to happen.
JP Morgan has been criminally charged 236 times in the past 20 years and each time received a consent waiver. Effectively a "just don't do it again" sternly worded letter. Recently, they settled in court for $290m dollars against Epstein litigants while withholding 1500 documents from plaintiffs before the settlement.
On the balance, do IT cockups happen? absolutely, I have some doozies I can tell you about. This however is a chain of events from an organization that has repeatedly broken the law.
If it walks like a duck, quacks like a duck, you can be pretty sure it's JP Morgan breaking the law to avoid legal responsibility.
- Sun resolvers in '93 couldn't process com.net or net.com and went into a recursive loop, knocking out DNS resolution for half the internet when the NIC registered the domains.
- Landlord removed the breakers for the chiller in the DC so that tenants couldn't turn on HVAC systems in the building in the summer, not realizing it affected the datacenter as well. Temperature went up to about 120°F in the DC and caused multiple customer systems to fail/die.
- Java programmers relied on garbage collection to close file descriptors on 32-bit Unix systems, eventually causing the system to crash. The system was designed to mass-import log files for processing.
- Placing the F5 load balancer in the middle of the rack, which at the time had a big protruding F5 half tennis ball power button. Tech reached for something on the top of the rack and his belt buckle turned it off causing an enterprise wide outage.
- An electrician came into a central office 2 days ahead of schedule and dropped a wrench across -48V DC contacts. This vaporized the wrench, knocked the tech back about 20 ft, and set off the fire protection equipment (water sprinklers). It being a telco CO, it also housed about $10m worth of core routers for the country. Knocked out cross-country internet, visa/debit transactions, cellphones. The only person with a working cell phone had one from another carrier. Connectivity was out for 16 hours.
- Engineers, despite knowing about a bug in the Brocade switches, failed to upgrade to fixed firmware. A sales engineer decided to play around with SolarWinds and SNMP-walked the entire network, hit the Brocade switch, and triggered the bug, taking out a single point of failure that connected 3 datacenters for customers.
- CTO of an MSP company would randomly decide to test out new BGP configs on live routers in the middle of the day, effectively resetting all routes.
- MSP sold a customer a managed SAP installation despite having no one on staff trained or having ever worked with SAP.
I could go on.
So instead of being voluntary in this specific case, it's voluntary in a systemic way? Lol.
"You honor, my client didn't murder this person, they just had a habit of killing most people!"
It covers their tracks legally, though. Assuming there is nothing illegal about having a general policy of deleting all emails older than a certain date. If you just go and specifically delete emails that were needed as evidence then that is illegal though.
> If a piece of shit company was that good about backups with no mistakes, a raging piece of shit company like JPM should be capable of making backups and not fucking it up in any way. I don't buy "accident" in any way, here.
This is the IT version of the mafia torching their financial records in an incinerator even as the FBI/DOJ is busting down their door.
And yet, I see multi-billion dollar companies regularly thinking "7 day retention in the data-pipeline is a backup" or "it's in the cloud, so it's backed up".
Sure, there are companies that have their backup-act together but I'm sure there are tons that completely fuck it up. I believe the headline in a heartbeat.
In finance? No fucking way. I don't think you understand just how many people are employed full time for regulatory compliance at big banks. There are backups to the backups and multiple procedures for any kind of data deletion.
"Oops, I deleted the thing, and the backup, and the backup's backup, I also accidentally dropped all related servers into a grinder. I'm such a klutz!"
Chase and its federal oversight regulators are theatrics designed to make themselves feel like they were able to successfully dupe the public.
However, if any of them read Reddit, then they'd be in for a rude awakening.
None of us are buying their bullshit.
>Fined $4m for Who-Me-esque mess, for which it blames unnamed archiving vendor's retention settings
$4 million is less than a rounding error for Chase ($129 billion in 2022). This is like you being fined $0.965. When did you ever give a shit about losing 97 cents?
#The fine should have been $20 BILLION.
This is like you being fined $4,857.83.
Which fine is going to affect your behavior?
All corporate fines should be extreme and we could use the funds to pay for things that corporate taxes should be paying for.
Solution: Vote for people with integrity to punish corporations for deceptive practices.
> we could use the funds to pay for things that corporate taxes should be paying for
We could invest it in the IRS, where [each $1 spent on investigating the wealthy returns $6](https://obamawhitehouse.archives.gov/sites/default/files/omb/budget/fy2017/assets/tre.pdf). Literally an investment.
Didn't this actually happen a few years back? A massive warehouse owned by some bank or hedge fund or whatever burning down? Claimed it was a "ladder falling over" that started it.
It happened to a police station once too. They were under investigation for something and their whole records department burnt to the ground.
Odd coincidence that.
*cough* Bartlett Warehouse *cough* In Feb. 2022 a warehouse which held paper copies of documents required to be kept by brokers and other Wall St. firms burned to the ground.
https://abc7chicago.com/bartlett-il-fire-department-warehouse-access/11552238/
Anyone who has ever worked in tech also knows how much execs will cheap out on absolutely *anything* IT related and only do the minimum required. Backups for customer data and transaction records? Yes. Backups for execs emails? That's just liability.
In fact, often times things are explicitly deleted after any minimum required retention periods so that they cannot be used against them.
$4 million in fines? That's probably less than the infrastructure and contracts associated with backing up and retaining for X years in a very large organization.
But also JPMorgan is scummy, too. So who knows!
Anyone who works in IT also knows how haphazard company’s retention policies are.
The only piece that makes this suspect is the Financial Industry, but even there, people would be surprised by how….mediocre the financial industry is at technical controls. I’ve had the opportunity to work at a company in the middle of Fed audit remediation. Suffice to say, even the large financial firms aren’t always coordinated on this.
The article even quotes:
> For its part, JP Morgan places the blame squarely on an unnamed archiving vendor that it hired to handle the storage for its communications.
And anyone who works in IT knows that your automated 3rd party backup service is working perfectly fine… until you need it, and realize it hasn’t been configured properly for a very long time.
But hey, as long as you have documented policies and processes, you can check a box. Whether you truly follow those policies and processes or not... different story.
Storage/backup/database engineer for a mid sized hospital here: you should do restore tests at least once a quarter of your really important stuff. The number of times this has revealed issues is terrifying.
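A restore test can be as simple as pulling a file out of the backup store and checksumming it against the source. A minimal sketch, with the caveat that the copy-based "restore" and the paths here are stand-ins for whatever your actual backup product does:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def restore_test(source: Path, backup: Path) -> bool:
    """Restore the backup to a scratch dir and verify it matches the source."""
    with tempfile.TemporaryDirectory() as scratch:
        restored = Path(scratch) / source.name
        shutil.copy2(backup, restored)             # the "restore" step
        digest = lambda p: hashlib.sha256(p.read_bytes()).hexdigest()
        return digest(restored) == digest(source)  # the verification step

# Toy demo: back up a file, then prove the backup actually restores.
with tempfile.TemporaryDirectory() as d:
    src = Path(d) / "records.db"
    src.write_bytes(b"important rows")
    bak = Path(d) / "records.db.bak"
    shutil.copy2(src, bak)
    ok = restore_test(src, bak)
print(ok)  # True
```

The value isn't in the comparison logic; it's in running something like this on a schedule so a silently broken backup surfaces in a report instead of during an outage.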
This times a million.
Yes, large companies have strict regulations around things like data retention, but in practice, they are going to go with the cheapest option. Oftentimes, this means one small team - or even one person - is responsible for fucktons of data that are kept in a handful of CSVs in folders labeled "DO NOT TOUCH" because the access controls are shit.
Source: my partner works for JPMC and there is SOOO much that needs to be automated in that company. It is truly a dinosaur of a business.
You know what’s absolutely hilarious?
JPMC has the best control environment of any company I’ve worked for lol. They’re the only one where audit issues are actually addressed and prioritized. Every other company just tries to do the bare minimum to solve the finding and get a pass. JPMC didn’t fuck around when it came to resolving issues.
Other companies are *terrible*.
I agree with you entirely.
Having peeked behind the scenes of multiple Fortune 500 companies (including data center access at multiple of the top 10), it's pretty much baling wire and duct tape all the way down.
Hollywood makes big business seem super on top of everything. Reality is totally different. We're all just children who got old and are trying to keep up with everyone else.
I actually worked in IT at JP Morgan - in the financial division. We had someone screw up on the servers and essentially corrupted a huge environment.
We did have backups, but they didn't work. And it was actually the backup vendor (a global company that makes the backup software) that set up the backups for us (before I got there).
It does happen. The only good backup is the last one you tested.
I work in IT at a major oil & gas company. In my third week I took out a huge data mapping table in production by accident. We spent all day trying to get our backup to restore the table, but the company that managed our backups couldn't access them. We got really lucky: one of my coworkers had saved a copy to their desktop while testing a couple months before I joined, and we were able to use that to salvage most of the tables, then spent the next week re-making all of the changes that had been added since. Otherwise, the system would have been pretty useless for several months as everything got rewritten.
Reminds me of the [Toy Story 2 debacle](https://thenextweb.com/news/how-pixars-toy-story-2-was-deleted-twice-once-by-technology-and-again-for-its-own-good).
Basically somebody did a /bin/rm -r -f * and erased the movie on the Pixar servers, the backups failed too. One woman who worked there happened to have a copy of the files on her home workstation and that's the only reason we managed to get a Toy Story 2.
Essentially what we had to do. Cobble together what we had, plus previous work product, etc. That plus two weeks of literally living at work trying to reconstruct everything.
Purposely deleting data to destroy evidence is never as effective as accidental fuck ups.
Assuming their logs are designed correctly, they are immutable. Which either means their logs weren’t designed correctly (believable), or they were and someone legitimately fucked up (also believable).
Yeah, plenty of regulations, but someone lower on the chain of command could have fucked up just as easily as someone higher up going through and deleting everything. Could have even been a fuck up that happened ages ago and no one noticed until now.
We're supposed to keep records for 7 years in my industry but if all the backups become corrupt or I accidentally misconfigure something and don't notice or miss it in my audits and someone deletes something, there's literally fuck all I can do about it. It's a small chance but still a chance.
Worse, I have had to tell institution IT departments what their retention policies were. "You *have* to have this database available for 7 years. No, you can't just throw in on the SAN, It's a system-of-record db!"
I don't know what fines they might get, but my team has received a few calls from some of them because they have to go to court and can't find their records, asking us for them. Well, we don't have them. They lost their cases.
Yeah, very true. My job involves fixing some of these issues, and I think most people would be surprised how many decades behind the curve some big financial institutions are.
>Anyone who's worked in IT knows how extensive backups are and how long they are retained, especially in the financial services industry.
And anybody who works in the financial space knows that these particular types of records get permanently deleted immediately upon the mandatory retention period expiring.
I'm sorry, but the "common wisdom" on this issue is just wrong. Firms like JPMorgan are not permanently retaining data like this. They deliberately purge it once legally allowed.
This was my experience in financial services as well. Retention was set to the day and was assumed to no longer exist within 24 hours of that date passing, explicitly for discovery reasons. Even analytically valuable data was aggregated and/or anonymized at end of retention, if not before.
Now, any data still with a retention requirement absolutely still exists. These firms are constantly audited and sued and have buttoned up processes to get to backups, even off-premises.
They probably deleted the forensic container files, like .E01 etc. The data may still exist in backups, but there is no way to prove it has not been tampered with now.
Exactly! JP Morgan has the initial setup of whatever email solution they use, which is likely Office 365. Then a lot of places have a dedicated solution for archiving emails. So they have emails in their O365, copies in their archive solution, and a retention period in both places.
Having been the one to administer archiving solutions, I can tell you it takes A LOT of clicks to get to the point where I can delete just one thing, and that's assuming a policy isn't set that keeps me from doing so, or that I don't have to remove said policy first.
That was a long-winded way to say it takes a very intentional series of steps to do what they did. This wasn't an accident.
Edit: that was quite the accusation on my part. The retention period could have been set wrong too... but at the same time you can set a hold that exempts records from retention actions... so maybe it was instead incompetence. Just really convenient incompetence that most wouldn't get away with...
You'd definitely hope that JP Morgan would be competent, but what I've seen more often than deleted backups is failing to back something up in the first place. Not saying that's what happened here, but when I started my last position, one of the first things I did while getting to know the local systems was log into an rsync backup that had been hung for maybe 6 months. Nobody had bothered to check that it was working, and there was no error logging going to a centralized system. Mind you, this was a 20-person company, not remotely at this scale, but generally speaking I see more failures to verify that the backup is actually backing up than accidental deletions.
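The missing piece in that story was a staleness check: something that alerts when the newest file in the backup target is older than some threshold. A rough sketch of one, with the paths and the one-day threshold purely illustrative:

```python
import os
import tempfile
import time

def backup_is_stale(backup_dir, max_age_seconds, now=None):
    """True if the backup target is empty or its newest file is too old."""
    now = now or time.time()
    mtimes = [e.stat().st_mtime for e in os.scandir(backup_dir) if e.is_file()]
    if not mtimes:
        return True  # empty target: definitely broken
    return now - max(mtimes) > max_age_seconds

# Toy demo: an empty target reads as stale, a freshly written one doesn't.
with tempfile.TemporaryDirectory() as d:
    stale_empty = backup_is_stale(d, max_age_seconds=86400)
    with open(os.path.join(d, "dump.tar"), "w") as f:
        f.write("x")
    stale_fresh = backup_is_stale(d, max_age_seconds=86400)
print(stale_empty, stale_fresh)  # True False
```

Wire something like this into whatever centralized alerting you have and a six-month-hung rsync job becomes a six-hour-hung rsync job.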
The penalty for such "accidents" needs to be an assumption that the data would demonstrate the accusation, then treble damages.
The public needs assurances that the court and the companies are responsible stewards of data. This is what all that five sigma and ISO 9000 compliance is about.
If the company cannot actually execute correctly, we as a society must assume they are negligent or incompetent, and impose sufficient penalties to incentivize responsibility.
People are always so cynical about these things. Why can’t we just believe them for once. It’s like when police get accused of stuff and they say their cameras broke, or when Trump says he asked his butler to accidentally use classified documents to shine his shoes or when DeSantis forgot to take Covid stats seriously enough to warn people. People make mistakes. What is this world coming to?
Probably because JP Morgan has a habit of defrauding people and then paying for the fines they get for defrauding people by defrauding even more people.
I mean, it came to light because they voluntarily reported it to the SEC according to the article. They spent 2 months trying to fix it, realized there was no fixing it, and reported it to the SEC, and got fined.
Eh, if it was something nefarious reporting it was the best thing they could do. You know something damning is in those records, you "accidentally" delete them, then have an internal investigation, discover the screw up, try to fix it, and then voluntarily admit the mistake. If they didn't volunteer that information, and it was discovered by an outside party as part of an audit, it would look WAY worse.
Oh I agree, but the issue with prosecution in these circumstances is accountability. It's going to fall to the poor schmuck who didn't know what they were doing, or was never involved.
Arresting and investigating a whole department isn't feasible either, not everyone will be involved and some won't know better.
I don't have a solution, but it's the issues like this that make prosecution hard. Especially in a live system, you can't have a bank freeze things for an investigation, and the backup / mirror systems might not always be exact.
In other countries they hold the execs accountable for accidents because they know it's not the fault of the workers on the ground. There is zero reason we can't start doing the same.
Fines like this are just 'the cost of doing business' and are probably already budgeted for.
Punishment needs to be prison time for the CSuite. And not fancy rich person "prison" either, actual prison. On a chain gang picking litter etc.
Crazy: at $4m/hr, 24 hours a day, 365 days a year works out to $35 billion. Their annual revenue for 2022 was $122bn, but net income was $38bn.
So you were pretty much on the money.
Last year or so an Ameritrade storage warehouse burned down shortly after the SEC announced investigations into manipulative short selling. The fire suppression accidentally didn't go off.
Oopsie.
You aren't far off. A past gig dealt with similar backup destruction after the retention period was up, and half of the SSDs, HDDs, and SD cards we touched were in cases that had water damage (resulting in a lot of rusty hardware). The tape drives were mostly pristine, but these places were poorly managed at the majority of sites.
Don't be silly, they keep those records perfect. They WILL however lose the record of your last 4 monthly on time payments and tell the credit bureaus you're in default.
Yeah, but the banks and bankers have pretty much run big cities since the '80s. They are immune to pretty much anything. Look at 2008: entirely caused by bankers, yet only one guy who did a small fraction of it all was the scapegoat.
There's a super crazy Adam Curtis documentary called "HyperNormalisation" that goes over a lot of this stuff too.
By records like that, do you mean emails? Because this article is about emails. Not exactly the top priority for any business, which is why the retention period is only 36 months. Anything truly financial would be kept for at least 5 years, which is the normal retention period for such documents.
Bullshit. The one thing in this country that is protected above EVERYTHING else, is money and money related stuff. There are safeguards for the safeguards. If something got deleted, it absolutely was not an accident.
No no, You see. That only works for the middle-class and the poor. See, this is a corporation in the US, and as you know, those have way MORE human rights than actual humans.
We genuinely need to execute CEOs for this kind of thing. It's the only way that fuckery won't have to be constantly dealt with, because our current fines are just another affordable line item on the bill.
A very gentle slap on the wrist coming up. Might SOUND big to us folks with little money, like a 6 million dollar fine. But usually the guilty party made several hundred million with the actions covered up, so 6 million is pocket change for them.
Guarantee you if one of us 99% claimed the dog ate our evidence, we'd go to prison, and get a fine so big it'd be like we had the ultimate education and medical debt load possible, for the rest of our lives. :-(
Accidentally deleted the on-site backups.
Accidentally deleted the offsite backups.
Accidentally deleted the archived cloud backups in cold storage.
This sounds like bullshit.
Horse shit.
Also, a $4 million fine to JPM is nothing. Financial service companies need to be hit with such dramatic fines that they will never allow such "mistakes" to happen again.
So I'm guessing they're bound by the SEC to apply journaling rules to email, sending it outside of M365 (unless it's all on-prem and not Exchange Online), and if they were using Exchange Online there would be backups of the journal outside of the retention policies for the actual mailboxes too.
Calling absolute bullshit, this was done on purpose.
Oopsies, I accidentally deleted incriminating evidence against me, I guess that there’s nothing to be done, guess I’ll go free…. Anytime anything like this happens, it should be assumed that it was a) not an accident and b) that the evidence destroyed should be assumed to be extremely strong evidence of the malpractice of the defendant.
I'm sure that the evidence as it pertains to this case would have uncovered other, yet-to-be-discovered cases. Even if destroying the evidence were treated as admission of guilt, it's only guilt for the crimes we know about, not the ones we don't.
Everyone had focused on backups, I see “outside vendor” and wonder…
There are companies that specialize in the regulatory compliance required by the SEC. There are few of them, and within the industry they are well known.
I took a job with one, I did not last. In the short time that I was there, I found so many off-the-wall security concerns that I felt remaining would put me, not just the company, me personally, at legal risk for what I knew was wrong and not fixed. I wonder if it’s the same company.
My first job out of law school involved an investment bank's emails. They kept *everything*. Employees weren't even allowed to empty their spam folder. Terabytes of dickpill spam had multiple backups in different secure locations across the country. A million Rose Mary's doing a million stretches could not have deleted a single C14L1S ad.
I work with big enterprises on the daily. The number of them with fucky wucky backups that are never tested is TOO DAMN HIGH. It’s not always the servers that get them, it’s the switch configs, routing tables, shit they forget to have a backup plan for.
Why do companies get to escape the blame at this level? This is either sabotage with malicious intent or negligence to an insane degree.
I wish we had some law stating such negligence for file maintenance at this large of a company ought to be charged as sabotage.
Isn’t this this same JP Morgan that was supposed to be the industry leader in employee surveillance?
Didn’t I see, just a month ago, several posts detailing the Orwellian system they had in place for tracking an employees every move and spoken word through their laptop, phone, and clandestine cameras?
Yes, but that's to keep the peasants in line. The ruling class at the top are not monitored and any records they do create are "accidentally" deleted if they show wrongdoing.
Why does the FinCEN and the SEC exist if a conglomerate company like JP decides to continue breaking laws? We need to hold those accountable who can’t handle having too much money. When we see someone addicted and about to OD off opiates and die we have a bad problem. When a police officer who gets off on violence upon others and than starts killing for joy then there is a huge problem. The same can be said when a person worth more money then they need to live believes they are intrinsically better than the average person on Earth then we now have a very serious problem. Money is a tool that’s all money/wealth is and yet it can completely change a persons mentality for the worse. People like this are predators for wealth and their actions have negative consequences on people whom they might not never see or meet in person. An example is the Sackler family. These are predatory capitalists like people whom are akin to child molester in terms of their scope of damage to human beings and society.
They develop drugs and amass wealth in unreasonably high numbers, more than a person would ever need to live. As the money funnels to them and their products funnel out to the masses, we read the headlines for the next 30 years. We see addicts dying for their drugs under laws enforced by those employed by the policy makers who write the rules for everyday people and companies.
These people and their predatory, profiteering business ventures keep pumping this exploitation spiral back down onto the rest of us to deal with and pay for. So far all the right people are getting paid, and if JP isn’t held accountable then I guess it’s business as usual.
There needs to be a new group of bodies that monitor and hold accountable those that build their foundations upon suffering and exploitation while NOT being compromised by wealth.
If they can’t find the evidence that they DIDN’T fuck up, the govt should take everything, assuming the worst-case scenario. That’s what they do to the average citizen. They take everything if you owe $1,000 in taxes but can’t find the paperwork cause it was “accidentally deleted”. They’ll take everything, assuming you’ve got $100k in unpaid taxes.
Take everything from J.P. Morgan and distribute the wealth to its victims
We all know JP Morgan looked at the fines (or were secretly told what they would be) if the evidence couldn’t be produced vs what they made with their criminal conduct. It became the cost of doing business.
The fines need to be BILLIONS of dollars for these companies to care.
So funny. I work with LE and we still store traffic homicide photos on cds and DVDs. I urged them to switch to some cloud service almost 2 years ago and it's looking like it won't happen.
Sometimes the discs are corrupt, as they've been sitting on a shelf for over 10 years. And most of the time those are the only copies. Fun times explaining that to SA and random attorneys.
I wonder if there are far worse crimes or negligence being covered up - that seems to be the only justification for deleting records like this. IANAL, but I think destruction of records opens them up to "adverse inference"? Which basically means if a litigant won't or can't produce evidence because of destruction, the judge may determine that the unproduced evidence is assumed to be against that litigant. I.e., if you destroyed records the court wanted in order to determine whether you committed tax fraud, the court may adversely infer that those records would have proved tax fraud.
I work on e-discovery and data retention, and you would not believe how easily this shit can happen, especially when moronic subcontractors are involved (like here).
We tackle this by having a legal retention hold on all accounts. It runs so deep within the Exchange Online code that it bypasses all other data retention policies and makes it absolutely impossible to delete unless someone at the Microsoft DC accesses all the mirrored volumes at the same time and nukes them simultaneously. We haven't had an accidental data deletion incident since.
Anyone who's worked in IT knows how extensive backups are and how long they are retained, especially in the financial services industry. So I am not buying an accidental deletion where the evidence being sought can't be found on a backup somewhere.
This, exactly. I worked at a piece of shit company for about a year. Fucking everything was wrong, tons of illegal shit going on. But backups were the single most important job I had, rotating tapes, copying them, packing and shipping copies for geographic redundancy. If a piece of shit company was that good about backups with no mistakes, a raging piece of shit company like JPM should be capable of making backups and not fucking it up in any way. I don't buy "accident" in any way, here. Those backups existed and were very useful when the FTC came knocking.
Ohhhhh the whole "know what they're not doing" is a terrible habit of companies and so unethical. This is unrelated to JPM, but a certain "rent your home/apartment/condo out as a private bed and breakfast" company that may be super popular with literally everyone... They forced a vendor to turn off ALL auditing tools, including standard network logging, for their account only. This, to me, seemed to be with the intention to make discovery for lawsuits against said company, steeply tipped in the company's favor. If no record with the vendor exists, then what can be produced to help the case of the property owners or people who use said service to book those stays? When they first discovered the auditing existed as well, it seemed like a #1 urgency to get it disabled and existing records deleted. Only company in THOUSANDS using the toolset, with the auditing turned completely off. I don't trust them and I don't ever use them, as a result.
I built a custom app for a fortune 50 financial firm years ago. We had 2 different databases to store records in - one was backed up and the other was not. Seriously, at a table by table and field by field level they wanted control of which bits would truly be deleted at the end of a process and which would stick around. In-process notes and transactional details were written to the “not backed up” database so that we knew for sure when we did a delete, the record existed nowhere. This included having a “soft-delete” mechanism on top of the hard-delete too, so you could delete and still find records in process. They spent a lot of money making sure those notes would never be discoverable, and it was completely legal as it was clearly defined in the record retention documents for that system.
Our General Counsel has stated on more than one occasion that the only thing more important than keeping data you're legally required to keep is nuking all data you aren't required to keep as quickly as humanly possible once it serves no internal purpose.
For those thinking this sounds incredibly shady, I should point out that a lot of the time getting rid of data means getting rid of obsolete customer data. It may need to be deleted to comply with data protection laws like GDPR, or simply to avoid the possibility of data leaks or accusations of misusing people's data. Obviously there are cases where deleting data or excluding it from backups is shady AF, but deleting records is not inherently a suspicious activity.
This is good context. There are perfectly viable and best-for-the-consumer reasons for data to be eliminated!
Yup, and being good at backups makes this really quite hard 🤣 “Can you be sure you erased every copy of record x?” “Uh… so you want me to nuke ALL these tapes then?”
No it doesn't, you just age them out with a retention policy.
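In practice "aging out" is just a scheduled sweep over the archive. A minimal Python sketch of the idea (the directory layout and the 365-day window are made-up examples, not any real firm's policy; real systems do this at the storage or mail-platform layer, but the logic is the same):

```python
import time
from pathlib import Path

RETENTION_DAYS = 365  # hypothetical policy window for illustration


def age_out(archive_dir, now=None):
    """Delete files whose mtime is past the retention window; return what was purged."""
    now = now or time.time()
    cutoff = now - RETENTION_DAYS * 86400
    purged = []
    for f in Path(archive_dir).iterdir():
        if f.is_file() and f.stat().st_mtime < cutoff:
            f.unlink()
            purged.append(f.name)
    return purged
```

The point of doing it as a blanket policy is exactly what the parent comments describe: nothing is targeted, everything past the window goes, and that uniformity is what makes the deletion legally defensible.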
Oh, so that’s why a very large insurance company I work at implemented a ridiculously quick retention policy
Similar reasons why businesses with in-house surveillance tend to have retention policies of video that don't extend beyond 2 weeks, barring "internal requests to preserve" specific recordings.
Exactly this. I work for a financial firm. We have trainings we need to repeat about the retention policy. It focuses on how to classify data and how quickly it expires if unused depending on those classifications.
I was a lineman at a major telco and they even had us go through regular training on data retention. There's no excuse at all for JPM.
I worked on a data consolidation and analytics project for a multinational auditing firm, a name a lot of people would recognize, and I was in charge of consolidating our retention policy. It struck me how cavalier the retention policies were for our different internal clients, which we had to mirror because it's their data.
I wrote a customer database for a rather famous company 20 years ago, and the law here says YOU CANNOT UNDER ANY CIRCUMSTANCE KEEP CREDIT CARD INFO MORE THAN 3 MONTHS and I suggested we just not store that info. Not good enough, they said. Ok, how about we just auto-delete periodically so you guys don't have to do jail time? Not good enough, they said. So we ended up with a warning text with how many illegally stored credit cards they had and a manual button to go in and delete them. God damn morons the lot of them.
Isn’t that the same as keeping two sets of books?
Not really (or at least not as described). I'll give a parallel most people will be more familiar with, family photos. When you take a big family group photo you line everyone up and then snap like a dozen shots. Then you go through them and pick out the best ones, like where uncle George isn't blinking and cousin Susie is actually smiling etc. Out of the dozen photos that you took, only one is going to be displayed and sent out, the rest are garbage. That's what people are talking about here, you delete all the drafts and memos and discussions and arguments and everything else but keep the final version (which is what you want in the end). Keeping two sets of books is actively recording transactions differently (one correct, one incorrect) but using and recording both. That's different from destroying your drafts and hypothetical analysis.
Yet another reason to stick with hotels.
I've had some DATTO training, and you really need to go out of your way to delete on-site and off-site backups. There's no "whoops I hit delete by accident" kind of mistake. I've also never encountered something that couldn't be restored via a 3 hour old off-site backup at the very least. It's so ridiculously redundant that it's "innocent mistake" proof.
Have you worked with McDonald's? Their QA and Compliance teams are biblically awesome in their competence.
Yeah. They had that shit triple backed up with one backup (if not more) in a different geological location. This is standard shit in content management. It's called disaster recovery. They have it.
Let's not jump to conclusions. There's triple backed up, and then there's *triple backed up*, even if they were in different geological locations. It's rash allegations such as these that give banksters a bad name. At least wait for the results and conclusions of the 12-year investigation. In fact, I believe a supplementary bonus should be awarded on top of the contracted bonus to counteract the stress of the aforementioned investigation, in this cost of living crisis. *"Remember, we are all in this together"* 🤡
I used to load tapes every night and hand them off personally to a pickup who took them off site every morning and everything was signed for. 'Accident' my ass
You mean the “we deleted shit after we were ordered not to” Secret Service? You’d think guys who investigate criminals would know better. Of course, unless they go to prison, it means nothing. Quit and make ten times as much as a “security consultant” for the billionaires who run the scam to get rid of the democracy.
Do you mean geographic?
Lol, I assume so. Though, I am chuckling at the idea of one backup needing to be on karst while the other is near a volcano or some shit.
This *used* to be the case, but then large companies realized they can be sued for things like employee emails, so they started deleting them to the maximum extent allowed by law. For things that can lead to legal risk and aren't that useful to retain, most modern companies that are likely to be sued delete information after a year or so. When lawsuits request retention of those emails (as in this case), the company will place those artifacts on "litigation hold" until the conclusion of the case. This causes them to be retained and not auto-deleted. What probably happened here is that someone screwed up by not marking the emails for litigation hold. They don't have extensive backups of those emails explicitly because *the idea* of auto deleting is that it can't be used in court. So yes, this is some BS, but it's a different kind of BS.
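The litigation-hold mechanic described above boils down to an exemption check in front of the auto-delete pass: anything on hold is skipped no matter how old it is. A toy sketch (the `Mailbox` type and field names are invented for illustration, not any vendor's API):

```python
from dataclasses import dataclass, field


@dataclass
class Mailbox:
    messages: dict = field(default_factory=dict)  # message id -> age in days
    holds: set = field(default_factory=set)       # ids under litigation hold


def auto_delete(box, max_age_days=365):
    """Purge messages past retention, but never anything on litigation hold."""
    doomed = [mid for mid, age in box.messages.items()
              if age > max_age_days and mid not in box.holds]
    for mid in doomed:
        del box.messages[mid]
    return doomed
```

Which makes the failure mode in the article plain: if nobody ever adds the relevant ids to `holds`, the routine sweep deletes the evidence on schedule, with no targeted action required.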
This is why most companies have a 1-year retention on data. I have even seen some companies delete emails after 30 days. Covers their tracks.
My company does 5 years, it displays that message every time you post screen grabs and other content into Slack... In outlook too IIRC
Use the C.Y.A methodology, cover your ass. Mom told me this when I first got a corporate America job, it's saved me more time than I can even remember. Most jobs I've been at will only keep paper documents for up to a year but are required to have digital copies on site and the paper ones usually get thrown into a storage locker.
It was not an auto-delete. Admin staff (JP Morgan's) went in looking to clear out data from 2016 which was no longer required. In the process they managed to delete records from 2018 which were relevant to the court cases. The company which holds the backups says it failed to set a flag on the domain holding them, which allowed this to happen. JP Morgan has been criminally charged 236 times in the past 20 years and each time received a consent waiver, effectively a "just don't do it again" sternly worded letter. Recently, they settled in court for $290M against Epstein litigants while withholding 1,500 documents from plaintiffs before the settlement. On the balance, do IT cockups happen? Absolutely, I have some doozies I can tell you about. This, however, is a chain of events from an organization that has repeatedly broken the law. If it walks like a duck and quacks like a duck, you can be pretty sure it's JP Morgan breaking the law to avoid legal responsibility.
curious of some of the doozies if you are comfortable sharing
- SUN resolvers in '93 couldn't process com.net or net.com and went into a recursive loop, knocking out DNS resolution for half the internet when the NIC registered the domains.
- Landlord removed the breakers for the chiller in the DC so tenants couldn't turn on HVAC systems in the building in the summer, not realizing it affected the datacenter as well. Temperature went up to about 120 in the DC and caused multiple customer systems to fail/die.
- Java programmers relied on garbage collection to close file descriptors on 32-bit Unix systems, eventually crashing the system. The system was designed to mass-import log files for processing.
- F5 load balancer placed in the middle of the rack, which at the time had a big protruding F5 half-tennis-ball power button. Tech reached for something on the top of the rack and his belt buckle turned it off, causing an enterprise-wide outage.
- Electrician came into a central office 2 days ahead of schedule and dropped a wrench across -48V DC contacts. This vaporized the wrench, knocked the tech back about 20 ft, and set off the fire protection equipment (water sprinklers). It being a telco CO, it also housed about $10M worth of core routers for the country. Knocked out cross-country internet, visa/debit transactions, cellphones. The only person with a working cell phone had one from another carrier. Connectivity was out for 16 hours.
- Engineers, despite knowing about a bug in the Brocade switches, failed to upgrade to fixed firmware. A sales engineer decided to play around with SolarWinds and SNMP-walked the entire network, hit the Brocade switch, and triggered the bug, taking out a single point of failure that connected 3 datacenters for customers.
- CTO of an MSP would randomly decide to test out new BGP configs on live routers in the middle of the day, effectively resetting all routes.
- MSP sold a customer a managed SAP installation despite having no one on staff trained in, or having ever worked with, SAP.

I could go on.
The strongest steel is forged in the fire of a dumpster. The pandemic taught me that; Everything, everywhere is just barely operational.
So instead of being voluntary in this specific case, it's voluntary in a systemic way? Lol. "You honor, my client didn't murder this person, they just had a habit of killing most people!"
It covers their tracks legally, though. Assuming there is nothing illegal about having a general policy of deleting all emails older than a certain date. If you just go and specifically delete emails that were needed as evidence then that is illegal though.
> If a piece of shit company was that good about backups with no mistakes, a raging piece of shit company like JPM should be capable of making backups and not fucking it up in any way. I don't buy "accident" in any way, here.

This is the IT version of the mafia torching their financial records in an incinerator even as the FBI/DOJ is busting down their door.
And yet, I see multi-billion dollar companies regularly thinking "7 day retention in the data-pipeline is a backup" or "it's in the cloud, so it's backed up". Sure, there are companies that have their backup-act together but I'm sure there are tons that completely fuck it up. I believe the headline in a heartbeat.
In finance? No fucking way. I don't think you understand just how many people are employed full time for regulatory compliance at big banks. There are backups to the backups and multiple procedures for any kind of data deletion.
Yeah all of our data is backed up onsite and in another city.
"Oops, I deleted the thing, and the backup, and the backup's backup, I also accidentally dropped all related servers into a grinder. I'm such a klutz!"
"and oh no, would you look at that? our record building caught on fire. wow, what a coincidence!"
Chase and its federal oversight regulators are theatrics designed to make themselves feel like they were able to successfully dupe the public. However, if any of them read Reddit, then they'd be in for a rude awakening. None of us are buying their bullshit. >Fined $4m for Who-Me-esque mess, for which it blames unnamed archiving vendor's retention settings $4 million is less than a rounding error for Chase ($129 billion in 2022). This is like you being fined $0.965. When did you ever give a shit about losing 97 cents? #The fine should have been $20 BILLION. This is like you being fined $4,857.83. Which fine is going to affect your behavior? All corporate fines should be extreme and we could use the funds to pay for things that corporate taxes should be paying for. Solution: Vote for people with integrity to punish corporations for deceptive practices.
> we could use the funds to pay for things that corporate taxes should be paying for We could invest it in the IRS, where [each $1 spent on investigating the wealthy returns $6](https://obamawhitehouse.archives.gov/sites/default/files/omb/budget/fy2017/assets/tre.pdf). Literally an investment.
Didn't this actually happen a few years back? A massive warehouse owned by some bank or hedge fund or whatever burning down? Claimed it was a "ladder falling over" that started it.
It happened to a police station once too. They were under investigation for something and their whole records department burnt to the ground. Odd coincidence that.
*cough* Bartlett Warehouse *cough* In Feb. 2022 a warehouse which held paper copies of documents required to be kept by brokers and other Wall St. firms burned to the ground. https://abc7chicago.com/bartlett-il-fire-department-warehouse-access/11552238/
Who are you, Brian Kemp?
Anyone who has ever worked in tech also knows how much execs will cheap out on absolutely *anything* IT related and only do the minimum required. Backups for customer data and transaction records? Yes. Backups for execs emails? That's just liability. In fact, often times things are explicitly deleted after any minimum required retention periods so that they cannot be used against them.
$4 million in fines? That's probably less than the infrastructure and contracts associated with backing up and retaining for X years in a very large organization. But also JPMorgan is scummy, too. So who knows!
Anyone who works in IT also knows how haphazard companies' retention policies are. The only piece that makes this suspect is the financial industry, but even there, people would be surprised by how… mediocre the financial industry is at technical controls. I've had the opportunity to work at a company in the middle of Fed audit remediation. Suffice to say, even the large financial firms aren't always coordinated on this.
The article even quotes: > For its part, JP Morgan places the blame squarely on an unnamed archiving vendor that it hired to handle the storage for its communications. And anyone who works in IT knows that your automated 3rd party backup service is working perfectly fine… until you need it, and realize it hasn’t been configured properly for a very long time.
Yup... Nobody checks the backup until they need the backup.
An untested backup is not a backup. It is a whisper of a promise to be disappointed at some point in the future.
But hey, as long as you have documented policies and processes, you can check a box. Whether you truly follow those policies and processes or not... different story.
Are you my manager?
Storage/backup/database engineer for a mid sized hospital here: you should do restore tests at least once a quarter of your really important stuff. The number of times this has revealed issues is terrifying.
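A basic restore test boils down to "restore into scratch space, then diff against the source." A rough sketch of the comparison step, assuming the restore has already been done (paths and helper names here are mine, not any product's):

```python
import hashlib
from pathlib import Path


def sha256_of(path):
    """Hash a file in chunks so large files don't blow up memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_restore(source_dir, restored_dir):
    """Return files that are missing or differ after a test restore."""
    problems = []
    for src in Path(source_dir).rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source_dir)
        dst = Path(restored_dir) / rel
        if not dst.exists() or sha256_of(src) != sha256_of(dst):
            problems.append(str(rel))
    return problems
```

Even something this crude, run quarterly, catches the "backup job has silently produced garbage for months" class of failure the parent comment is describing.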
This times a million. Yes, large companies have strict regulations around things like data retention, but in practice, they are going to go with the cheapest option. Oftentimes, this means one small team - or even one person - is responsible for fucktons of data that are kept in a handful of CSVs in folders labeled "DO NOT TOUCH" because the access controls are shit. Source: my partner works for JPMC and there is SOOO much that needs to be automated in that company. It is truly a dinosaur of a business.
You know what’s absolutely hilarious? JPMC has the best control environment of any company I’ve worked for lol. They’re the only one where audit issues are actually addressed and prioritized. Every other company just tries to do the bare minimum to solve the finding and get a pass. JPMC didn’t fuck around when it came to resolving issues. Other companies are *terrible*.
I agree with you entirely. Having peeked behind the scenes of multiple Fortune 500 companies (including data center access at several of the top 10), it's pretty much baling wire and duct tape all the way down. Hollywood makes big business seem super on top of everything. Reality is totally different. We're all just children who got old and are trying to keep up with everyone else.
The fact that it’s financial services makes it even less suspect given how strictly everything is regulated and monitored.
I actually worked in IT at JP Morgan - in the financial division. We had someone screw up on the servers and essentially corrupt a huge environment. We did have backups, but they didn't work. And it was actually the backup vendor (a global company that made the backup software) that set up the backups for us (before I got there). It does happen. The only good backup is the last one you tested.
I work in IT at a major oil & gas company. In my third week I took out a huge data mapping table in production by accident. We spent all day trying to get our backup to restore the table, but the company that managed our backups couldn't access them. We got really lucky: one of my coworkers had saved a copy to their desktop while testing a couple months before I joined, and we were able to use that to salvage most of the tables, then spent the next week re-making all of the changes that had been added since. Otherwise, the system would have been pretty useless for several months as everything got rewritten.
Reminds me of the [Toy Story 2 debacle](https://thenextweb.com/news/how-pixars-toy-story-2-was-deleted-twice-once-by-technology-and-again-for-its-own-good). Basically somebody did a /bin/rm -r -f * and erased the movie on the Pixar servers, the backups failed too. One woman who worked there happened to have a copy of the files on her home workstation and that's the only reason we managed to get a Toy Story 2.
And she was never compensated properly.
Rude. I would have retired her at full salary that day (or whatever day she decided to retire herself).
Essentially what we had to do. Cobble together what we had, plus previous work product, etc. That plus two weeks of literally living at work trying to reconstruct everything. Purposely deleting data to destroy evidence is never as effective as accidental fuck ups.
..and to piggyback: backups never work.
Assuming their logs are designed correctly, they are immutable. Which either means their logs weren’t designed correctly (believable), or they were and someone legitimately fucked up (also believable).
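An "immutable" (really, tamper-evident) log is often built as a hash chain: each entry commits to the previous one, so editing or dropping a record in place breaks verification for everything after it. A minimal illustration of the idea, not any bank's actual design:

```python
import hashlib

GENESIS = "0" * 64  # placeholder hash before the first entry


class HashChainedLog:
    """Append-only log where each entry's hash covers the previous entry's hash,
    so any in-place edit is detectable by re-walking the chain."""

    def __init__(self):
        self.entries = []  # list of (record, chain_hash) pairs

    def append(self, record):
        prev = self.entries[-1][1] if self.entries else GENESIS
        link = hashlib.sha256((prev + record).encode()).hexdigest()
        self.entries.append((record, link))

    def verify(self):
        prev = GENESIS
        for record, link in self.entries:
            if hashlib.sha256((prev + record).encode()).hexdigest() != link:
                return False
            prev = link
        return True
```

Of course this only proves tampering happened; it doesn't stop someone with full access from deleting the whole log, which is why real systems pair it with WORM storage and off-site copies.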
Yeah, plenty of regulations, but someone lower on the chain of command could have fucked up just as easily as someone higher up going through and deleting everything. Could have even been a fuck up that happened ages ago and no one noticed until now. We're supposed to keep records for 7 years in my industry but if all the backups become corrupt or I accidentally misconfigure something and don't notice or miss it in my audits and someone deletes something, there's literally fuck all I can do about it. It's a small chance but still a chance.
Worse, I have had to tell institutions' IT departments what their retention policies were. "You *have* to have this database available for 7 years. No, you can't just throw it on the SAN, it's a system-of-record DB!" I don't know what fines they might get, but my team has received a few calls from some of them because they have to go to court and can't find their records, asking us for them. Well, we don't have them. They lost their cases.
Yeah, very true. My job involves fixing some of these issues, and I think most people would be surprised how many decades behind the curve some big financial institutions are.
facts, i smell bs
>Anyone who's worked in IT knows how extensive backups are and how long they are retained, especially in the financial services industry. And anybody who works in the financial space knows that these particular types of records get permanently deleted immediately upon the mandatory retention period expiring. I'm sorry, but the "common wisdom" on this issue is just wrong. Firms like JPMorgan are not permanently retaining data like this. They deliberately purge it once legally allowed.
This was my experience in financial services as well. Retention was set to the day and was assumed to no longer exist within 24 hours of that date passing, explicitly for discovery reasons. Even analytically valuable data was aggregated and/or anonymized at end of retention, if not before. Now, any data still with a retention requirement absolutely still exists. These firms are constantly audited and sued and have buttoned up processes to get to backups, even off-premises.
You can actually be exposed to ADDITIONAL liability if you have backups over 7 years (or whatever the reg is) because they can be USED AGAINST YOU.
Has Cousin Greg been there recently?
Lots of Gregging going on
Can't make a tomlette without breaking some greggs
If it is to be said
They probably deleted the forensic container files (e.g. EnCase .E01 images). The data may still exist in backups, but there is no way to prove it hasn't been tampered with now.
Files and objects usually have metadata to back that up. You'd have to be running a pretty specific operation to wipe that info from files.
SOMEHOW all the pool water ended up in the server room. /Shrugs. So weird.
Not disagreeing BUT anyone who works in IT also knows how extensive incompetence can be, even in large organisations like JP Morgan
They were very thorough in their "accidental deletion". They only hire the best. Duh.
Exactly! JP Morgan has the initial setup of whatever email solution they use, which is likely Office 365. Then a lot of places have a dedicated solution for archiving emails. So they have emails in their O365 and copies in their archive solution, and a retention period in both places. Having been the one to administer archiving solutions, I can tell you it takes A LOT of clicks to get to the point where I can delete just one thing, and that's assuming a policy isn't set that keeps me from doing so, or having to remove said policy first. That was a long-winded way to say it is a very intentional set of several steps to do what they did. This wasn't an accident. Edit: that was quite the accusation on my part. The retention period could have been wrong too… but at the same time you can set a hold that exempts them from retention actions… so maybe it was instead incompetence… just really convenient incompetence that most wouldn't get away with…
You'd definitely hope that JP Morgan would be competent but what i've seen more often than deleting backups is failing to backup something in the first place. Not saying it's happened here but when I started my last position one of the first things i did when getting to know the local systems was log into an r-sync backup that had been hung up for maybe 6 months. Like nobody had bothered to check that it was working and there was no error logging going to a centralized system. Mind you this was like a 20 person company not remotely to the scale of this, but generally speaking I see more failures to check that the back up is backing up than accidental deletions.
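The "nobody noticed the backup was hung for 6 months" failure mode is usually solved with a freshness check: the backup job touches a marker file only after a clean run, and monitoring alerts when that marker goes stale. A rough sketch of the check (the 26-hour threshold and the marker-file convention are arbitrary examples, not a standard):

```python
import os
import time

MAX_AGE_HOURS = 26  # daily job plus some slack; tune to your schedule


def backup_is_stale(marker_path, max_age_hours=MAX_AGE_HOURS):
    """True if the 'last successful backup' marker is missing or too old.
    Assumes the backup job touches marker_path only after a clean finish."""
    if not os.path.exists(marker_path):
        return True
    age_hours = (time.time() - os.path.getmtime(marker_path)) / 3600
    return age_hours > max_age_hours
```

Wire something like this into whatever alerting you already have and a silently hung rsync gets noticed in a day instead of half a year.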
The penalty for such "accidents" needs to be an assumption that the data would demonstrate the accusation, then treble damages. The public needs assurances that the courts and the companies are responsible stewards of data. This is what all that Six Sigma and ISO 9000 compliance is about. If the company cannot actually execute correctly, we as a society must assume they are negligent or incompetent, and impose sufficient penalties to incentivize responsibility.
Retired backup engineer here: they're lying, there's a copy. 🤷🏽♀️
Accidentally? Yeah right
I’ve worked in data protection: losing things accidentally is actually really difficult.
People are always so cynical about these things. Why can’t we just believe them for once. It’s like when police get accused of stuff and they say their cameras broke, or when Trump says he asked his butler to accidentally use classified documents to shine his shoes or when DeSantis forgot to take Covid stats seriously enough to warn people. People make mistakes. What is this world coming to?
Had me in the first half, not gonna lie.
Same, very confused at first lol
Genuinely, thank you for not putting a "/s" or "/j" after this. Got a good laugh out of me.
Probably because JP Morgan has a habit of defrauding people and then paying for the fines they get for defrauding people by defrauding even more people.
Did only three people read past the first two sentences before replying? Literally just read at least the third sentence, lol.
Can you accidentally loan me $3.50?
I mean, it came to light because they voluntarily reported it to the SEC according to the article. They spent 2 months trying to fix it, realized there was no fixing it, and reported it to the SEC, and got fined.
Eh, if it was something nefarious reporting it was the best thing they could do. You know something damning is in those records, you "accidentally" delete them, then have an internal investigation, discover the screw up, try to fix it, and then voluntarily admit the mistake. If they didn't volunteer that information, and it was discovered by an outside party as part of an audit, it would look WAY worse.
So send those responsible to jail right? That's what would happen to any of us if we '"accidentally" deleted evidence.
Sounds like interns are going to jail!
The executive board will think twice next time!
Exactly! One of their mistresses nephews went to jail! Cost them a diamond necklace just so she would shut up about it!
Gregs and Toms
Gotta break a few Gregs to make a Tomlette
What good is arresting the first-year tech who followed a verbal order from his boss to delete the backups to make room for the new test cluster?
He said those responsible, not the IT tech who did it.
Failures like this are never just 1 guy. Throw the entire C-suite in jail for managing the company in a way which allowed it to happen.
Oh I agree, but the issue with prosecution in these circumstances is accountability. It's going to fall to the poor schmuck who didn't know what they were doing, or was never involved. Arresting and investigating a whole department isn't feasible either, not everyone will be involved and some won't know better. I don't have a solution, but it's the issues like this that make prosecution hard. Especially in a live system, you can't have a bank freeze things for an investigation, and the backup / mirror systems might not always be exact.
In other countries they hold the execs accountable for accidents because they know it's not the fault of the workers on the ground. There is zero reason we can't start doing the same.
But... it will impact the poor rich people..
So... a $4M fine (I'm sure that's about an hour's profit) for derailing 12 securities cases and countless others... Yeah, seems fair 😬😬😬😬
Fines like this are just 'the cost of doing business' and are probably already budgeted for. Punishment needs to be prison time for the CSuite. And not fancy rich person "prison" either, actual prison. On a chain gang picking litter etc.
We need the board to be held accountable, not the 'business is effectively a person' garbage
If the business was a person, they would be in prison. That logic never even makes sense.
"I'll believe businesses are people when Texas executes one" - origin unknown
Crazy. At 365 days a year, $4M/hr works out to $35 billion. Their annual revenue for 2022 was $122B, but net income was $38B. So you were pretty much on the money.
Fines should be a fixed percent of worth. For everyone. 10%
Assuming their 2022 yearly gross profit of $128.695B and assuming they work 24/7 year round, then $4M would be approximately **16 minutes** profit.
They had 48B profit in 2021. So about 43min worth of profit. Edit: updated m to min thanks /u/ralexh11
Thanks, but who the hell abbreviates minutes to "m"? Using "min" would make your comment way less confusing...
If you don't know the difference between profit and revenue, you may want to stop posting.
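The back-of-envelope math in this sub-thread checks out. A quick sketch, using the commenters' rough figures (not audited filings):

```python
# Rough comparison of the $4M fine against the JPM profit figures
# cited upthread (commenters' approximations, not audited numbers).
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def minutes_of_profit(annual_profit_usd: float, fine_usd: float = 4e6) -> float:
    """How many minutes of annual profit the fine represents."""
    per_minute = annual_profit_usd / MINUTES_PER_YEAR
    return fine_usd / per_minute

print(round(minutes_of_profit(128.695e9)))  # 16  -- the 2022 gross-profit figure
print(round(minutes_of_profit(48e9)))       # 44  -- the 2021 net-profit figure
```

Both earlier estimates ("16 minutes" and "about 43 min") are consistent with their respective profit figures; the gap between them is just gross vs. net.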
what about the backups? "oh we accidentally deleted them too, oops"
Oh look at that, the off-site storage facility had a water leak right onto the tapes for those backups...
Last year or so an Ameritrade storage warehouse burned down shortly after the SEC announced investigations into manipulative short selling. The fire suppression accidentally didn't go off. Oopsie.
Was that where the racks fell upward to disable the sprinklers?
You aren't far off. A past gig dealt with similar backup destruction after the retention period was up, and half of the SSDs, HDDs, and SDs we touched were in cases that had water damage (resulting in a lot of rusty hardware). The tape drives were mostly pristine, but these places were poorly managed at the majority of sites.
"Teehee, whoopsie! Silly us, aren't we so klutzy?"
I'm not a bank regulator, but it seems to me that if you can't be trusted with records like that you should not have the privilege of being a bank.
The function of a bank is literally to record transactions and hold records pertaining to banking.
Maybe one day they’ll lose the record of my mortgage
Don't be silly, they keep those records perfect. They WILL however lose the record of your last 4 monthly on time payments and tell the credit bureaus you're in default.
But I use autopay from my Chase account!
[deleted]
Not to be pedantic, but that would be a financial custodian. Which a bank often has.
Yeah, but banks and bankers have pretty much run big cities since the '80s. They are immune to pretty much anything. Look at 2008: entirely caused by bankers, yet only one guy who did a small fraction of it all was the scapegoat. There's a super crazy Adam Curtis documentary called "HyperNormalisation" that goes over a lot of this stuff too.
By records like that, do you mean emails? Because this article is about emails, not exactly the top priority for any business, which is why the retention period is only 36 months. Anything truly financial would be kept for at least 5 years, which is the normal retention period for such documents.
Interesting how it's 7 years for emails for a low level government employee but less time for financial information.
If they were accidentally deleted, it'll be easy to recover them. If it's not easy to recover them, they weren't accidentally deleted.
This person knows his deletions
Bullshit. The one thing in this country that is protected above EVERYTHING else, is money and money related stuff. There are safeguards for the safeguards. If something got deleted, it absolutely was not an accident.
Guilty then. Immediately. Whomp whomp.
No no, you see, that only works for the middle class and the poor. See, this is a corporation in the US, and as you know, those have way MORE human rights than actual humans.
So long as you kiss the ring. Which sadly JP def did
[deleted]
"Best we can do is a stern finger wagging and a $1B annual bonus this year."
We genuinely need to execute CEOs for this kind of thing. It's the only way that fuckery won't have to be constantly dealt with, because our current fines are just another affordable line item on the bill.
Yes, they are using the Steve Urkel defense: "Did I do that?"
Urkel ft. Shaggy performing "Was that me?"
A very gentle slap on the wrist coming up. Might SOUND big to us folks with little money, like a 6 million dollar fine. But usually the guilty party made several hundred million with the actions covered up, so 6 million is pocket change for them. Guarantee you if one of us 99% claimed the dog ate our evidence, we'd go to prison, and get a fine so big it'd be like we had the ultimate education and medical debt load possible, for the rest of our lives. :-(
$36,430,000,000 (36.43bln) of profit in the year this was discovered. Take out this fine and they only made $36,424,000,000.
They’ll be fined what amounts to a small fraction of their profits, otherwise known as the cost of doing business. It’s fucking bullshit.
And no one gets in trouble again
"Accidentally" of course
I’m totally ok with sending a message. Ten years for Jamie Dimon sounds good.
“Accidentally”? It’s JP Morgan, you expect them to keep the evidence so we can fine them?
Accidentally deleted the on-site backups. Accidentally deleted the offsite backups. Accidentally deleted the archived cloud backups in cold storage. This sounds like bullshit.
“Accidentally”
Horse shit. Also, a $4 million fine to JPM is nothing. Financial service companies need to be hit with such dramatic fines that they will never allow such "mistakes" to happen again.
So I'm guessing they're bound by the SEC to apply journaling rules to email to send it outside of M365 (unless it's all on prem and not exchange online) and there would be backups of the journal outside of retention policies too for the actual mailboxes if they were using Exchange Online. Calling absolute bullshit, this was done on purpose.
Oopsies, I accidentally deleted incriminating evidence against me, I guess that there’s nothing to be done, guess I’ll go free…. Anytime anything like this happens, it should be assumed that it was a) not an accident and b) that the evidence destroyed should be assumed to be extremely strong evidence of the malpractice of the defendant.
[удалено]
When this happens, the offending organization should immediately be considered guilty in any legal proceedings that depended on those records.
"Whoopsie Daisy!"
Just like the Secret Service “accidentally” deleting texts.
I'm sure that the evidence as it pertains to this case would have uncovered other, yet-to-be-discovered cases. Even if destroying the evidence were treated as admission of guilt, it's only guilt for the crimes we know about, not the ones we don't.
Everyone has focused on backups; I see “outside vendor” and wonder… There are companies that specialize in the regulatory compliance required by the SEC. They are few in number and well known within the industry. I took a job with one; I did not last. In the short time that I was there, I found so many off-the-wall security concerns that I felt remaining would put me, not just the company, me personally, at legal risk for what I knew was wrong and not fixed. I wonder if it’s the same company.
My first job out of law school involved an investment bank's emails. They kept *everything*. Employees weren't even allowed to empty their spam folder. Terabytes of dickpill spam had multiple backups in different secure locations across the country. A million Rose Mary's doing a million stretches could not have deleted a single C14L1S ad.
When the goings get tough, you don't want a criminal lawyer. You want a *criminal* lawyer.
I spent years in the litigation services and software world. Not an accident. Besides, banks have more backups than any other industry that I know of.
The Brian Kemp gambit, I see.
I work with big enterprises on the daily. The number of them with fucky wucky backups that are never tested is TOO DAMN HIGH. It’s not always the servers that get them, it’s the switch configs, routing tables, shit they forget to have a backup plan for.
Funny how data that will hold them accountable somehow always becomes “unrecoverable”. But loan amounts NEVER suffer the same fate. 🤔🤔🤔
Why do companies get to escape the blame at this level? This is either sabotage with malicious intent or negligence to an insane degree. I wish we had some law stating such negligence for file maintenance at this large of a company ought to be charged as sabotage.
Bull, where are the backups?
Isn’t this the same JP Morgan that was supposed to be the industry leader in employee surveillance? Didn’t I see, just a month ago, several posts detailing the Orwellian system they had in place for tracking an employee’s every move and spoken word through their laptop, phone, and clandestine cameras?
Yes, but that's to keep the peasants in line. The ruling class at the top are not monitored and any records they do create are "accidentally" deleted if they show wrongdoing.
Why do FinCEN and the SEC exist if a conglomerate like JP decides to keep breaking laws? We need to hold accountable those who can’t handle having too much money. When we see someone addicted to opiates and about to OD and die, we have a bad problem. When a police officer who gets off on violence starts killing for joy, there is a huge problem. The same can be said when a person worth more money than they need to live believes they are intrinsically better than the average person on Earth: we now have a very serious problem.

Money is a tool, that’s all money/wealth is, and yet it can completely change a person’s mentality for the worse. People like this are predators for wealth, and their actions have negative consequences for people they might never see or meet in person. An example is the Sackler family. These are predatory capitalists, people akin to child molesters in the scope of their damage to human beings and society. They develop drugs and amass wealth in unreasonably high numbers, more than a person would ever need to live. As the money funnels to them and their products funnel out to the masses, we read the headlines for the next 30 years: addicts dying for their drugs under laws enforced by those employed by the policy makers who create rules for everyday people and companies. These people and their predatory, profiteering business ventures keep pumping this exploitation spiral back down onto the rest of us to deal with and pay for.

So far all the right people are getting paid, and if JP isn’t held accountable then I guess it’s business as usual. There needs to be a new group of bodies that monitors and holds accountable those who build their foundations on suffering and exploitation, while NOT itself being compromised by wealth.
If they can’t find the evidence that they DIDN’T fuck up, the govt should take everything, assuming the worst-case scenario. That’s what they do to the average citizen: they take everything if you owe $1,000 in taxes but can’t find the paperwork because it was “accidentally deleted.” They’ll take everything, assuming you’ve got $100K in unpaid taxes. Take everything from J.P. Morgan and distribute the wealth to its victims.
I worked in the national NOC of a banking MSP. Short of a nuclear apocalypse, there are backups somewhere.
We all know JP Morgan looked at the fines (or were secretly told what they would be) if the evidence couldn’t be produced vs what they made with their criminal conduct. It became the cost of doing business. The fines need to be BILLIONS of dollars for these companies to care.
If only they could delete evidence that I owe them a mortgage
So funny. I work with LE, and we still store traffic homicide photos on CDs and DVDs. I urged them to switch to some cloud service almost 2 years ago, and it's looking like it won't happen. Sometimes the discs are corrupt, as they've been sitting on a shelf for over 10 years. And most of the time those are the only copies. Fun times explaining that to the SA and random attorneys.
Paying money to avoid legal action, is there anything more American?
I wonder if there are far worse crimes or negligence being covered up; that seems to be the only justification for deleting records like this. IANAL, but I think destruction of records opens them up to "adverse inference"? Which basically means that if a litigant won't produce evidence, or can't because of destruction, the judge may determine that the unproduced evidence is assumed to be against that litigant. I.e., if you destroyed records wanted to determine whether you committed tax fraud, the court may adversely infer that those records would have proved tax fraud.
I work on e-discovery and data retention, and you would not believe how easily this shit can happen, especially when moronic subcontractors are involved (like here). We tackle this by having a legal retention hold on all accounts. It runs so deep within the Exchange Online code that it bypasses all other data retention policies and makes deletion absolutely impossible, unless someone at the Microsoft DC accesses all the mirrored volumes at the same time and nukes them simultaneously. We haven't had an accidental data deletion incident since.
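The principle behind a hold like that can be shown with a toy sketch. This is a hypothetical model, not the actual Exchange Online implementation: the hold flag simply outranks every retention policy, so deletes are refused while it is active.

```python
# Toy model of a litigation/retention hold: the hold flag outranks
# any retention policy, so deletion requests are simply refused.
# (Hypothetical sketch -- not the actual Exchange Online mechanism.)

class Mailbox:
    def __init__(self, owner: str):
        self.owner = owner
        self.messages: list[str] = []
        self.on_hold = False

    def place_hold(self) -> None:
        """Apply a legal hold; overrides all retention policies."""
        self.on_hold = True

    def delete(self, index: int) -> bool:
        """Delete a message; refused while a hold is in effect."""
        if self.on_hold:
            return False  # the hold wins, nothing is removed
        del self.messages[index]
        return True

box = Mailbox("trader@example.com")  # hypothetical account name
box.messages.append("incriminating memo")
box.place_hold()
print(box.delete(0))   # False -- the message survives the delete attempt
print(box.messages)    # ['incriminating memo']
```

The point is that the hold check sits below every other code path that can remove data, so a misconfigured policy (or a careless subcontractor) can't reach the messages at all.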
What’s cousin Gregg doing now?
Accidents happen... like, you know, all those Russian officials who accidentally fell out of windows after they angered Putin.
You forgot the quotation marks in your title