In the unlikely event that there are any negative consequences for this breach, they deserve every bit of them and more.
So they're pretty much taking the existing terrible nursing environment in healthcare and weaponizing it. Nurses already have too many patients and not enough CNAs, on top of 12-hour shifts and the charting that has to be done after those 12 hours. Healthcare squeezes nurses to the breaking point. Data point: my wife is a nurse.
Garbage company, garbage culture, garbage business model.
And I’ve heard “it used to be so much worse”.
The American healthcare system is fairly well broken from virtually every angle.
However, as it applies to my parent comment, the companies mentioned were Shiftkey, Shiftmed, and Carerev. I do not see ESHYFT mentioned, so I stand corrected.
Thus, I'm led to believe that nurses using this app have to have some sort of difficulty finding jobs for other reasons, or they're just not informed about their options.
Some may just want to pick up casual shifts without any obligation on top of their full-time work. This is kind of double dipping: your full-time job is already paying your benefits, so why work overtime at time and a half for them when you can get 2x+ somewhere else, with extra pay in lieu of benefits?
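As a rough illustration, with a hypothetical $40/hr base rate: overtime at your employer pays 1.5 × $40 = $60/hr, while a gig shift at 2× with, say, 15% added in lieu of benefits comes to 2 × $40 × 1.15 = $92/hr.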
Big orgs don't want to deal with 1,000 different individual contractors (especially if it means taking on the risk of potentially misclassifying employees as contractors).
I think the bigger issue is the myth of nurse fungibility. A rando nurse unfamiliar with your setup/org is unlikely to be very productive.
But when it comes to private markets and semi-private negotiations, that same sentiment doesn't easily transfer. Does society benefit in some unique way from allowing asymmetries in labor negotiations, private markets like Uber, or B2C relations like Robinhood (1, 2)?
1. https://www.sec.gov/newsroom/press-releases/2020-321
2. Note, Robinhood was fined not for front-running customers, just for falsely claiming customers received quality orders. I suspect they've only stopped the latter behavior.
I don't think that's true at all. Companies and individuals negotiate all the time with information the other party doesn't have. Insider trading law is about fairness on public markets, so that every negotiating party of the same type has the same information, and it is quite specific to that.
Insider trading is not about fairness. It’s about theft. If you overhear someone in a public place talking about an upcoming merger, you can trade on it.
They do? I’m quite happy when I have more information than the party I am negotiating with.
Do you tell your customers all of the input costs of the product or service you sell? I doubt it.
Also, certain parties that trade in public markets have way more information than any retail investor could ever hope to have: hedge funds buy satellite imagery of parking lots, track oil tankers at sea, etc., to gain an edge.
Insider trading rules are meant to prevent the public from bagholding stocks while the management team trades on insider information that no other market participant could or should have. There are no rules against legally gathering or purchasing information on your own to gain an edge over other market participants.
I don't disagree with you, but wow that requires a bleak outlook.
One example is biscuit manufacturing, where it’s a fairly open secret that supermarket own brand biscuits are the same product as name brand, because it’s better to capture that segment at a lower margin than to lose it to competition.
Tech now makes it possible to target individuals rather than demographics, but there’s nothing inherently against the status quo in doing so.
I’ll gladly take all the free alcohol an airline will give me, but other people don’t at all!
I sell some stuff on eBay. If you appear untrustworthy, I’ll spend more for tracking/better tracking on your order so you’ll actually get your stuff faster/more reliably.
Price segmentation was more palatable a few decades ago, but technology has enabled us to push it to this absurd (to me) extreme where individuals get different prices at different times of the day.
It feels wrong to size up your customer and pick the highest price you think they'll pay.
.... which, in the day and age of facial recognition, gives me an idea for a startup.
When you're using that fake email, be sure to use a burner phone or public internet so they can't link it to your IP. Also don't use your computer, or any computer you've logged in on, so browser fingerprinting doesn't tag you. And turn off your GPS so they can't geo-correlate you.
Of course the rich person is in the same boat: their geolocation will log that they went to Burger King, or their credit card company will snitch on them. Okay, fine, pay with cash and cover your phone in a tinfoil Faraday cage. Now you also need to drive a 30-year-old car to said establishment, since the manufacturer put a cellular modem and GPS in your car and sells the fact that you went to Burger King to the highest bidder.
That's all I can think of off the top of my head; I'm sure there are dozens of other ways people are tagged. At some level you may as well either use the app or just not go.
These predators aren't scared of name and shame. Any publicity is good publicity (And if it actually gets bad, they'll sue the pants off you.). They are scared shitless of laws censuring their behavior. It's why they fight like mad to ensure that they aren't subject to them.
There are exceptions. See the ongoing kerfuffle over "DOGE" employee lists.
People have spouses.
People’s parents pay credit cards.
People with bad credit sometimes don’t care.
People have family money.
People with low debt can be desperate for work.
Does it even work?
I can't find it now, but I believe LexisNexis or another large, similar reporting/data agency had a catalog of dozens of products that spit out values for ability to pay, monthly disposable income, annual income, etc.
It makes you feel awful thinking about the direction things are headed: corporations approaching omniscience regarding every facet of our lives that is reasonably of value to them.
Most people I know with bad credit aren’t desperate for money. At least not educated, highly paid ones like nurses.
Most just ignore their financial problems in the hope they go away.
Not to mention nurse demand outstrips supply, so they have options and can certainly turn down bad offers.
People happily give away a lot of info voluntarily, for example by paying with a card instead of cash.
The idea that you'd offer less seems... counterproductive to say the least.
> We use certain physical, managerial, and technical safeguards that are designed to improve the integrity and security of information that we collect and maintain. Please be aware that no security measures are perfect or impenetrable. We cannot and do not guarantee that information about you will not be accessed, viewed, disclosed, altered, or destroyed by breach of any of our physical, technical, or managerial safeguards. In particular, the Service is NOT designed to store or secure information that could be deemed to be Protected Health Information as defined by the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”).
IANAL and all that, but I’m not sure you can use the excuse “We didn’t design our system to be HIPAA compliant, sorry,” and hope your liability disappears. Does anyone know?
0: https://eshyft.com/wp-content/uploads/2019/06/ESHYFT-Privacy...
> I also saw what appeared to be medical documents uploaded to the app. These files were potentially uploaded as proof for why individual nurses missed shifts or took sick leave. These medical documents included medical reports containing information of diagnosis, prescriptions, or treatments that could potentially fall under the ambit of HIPAA regulations.
It looks like providers accidentally uploaded some PHI.
IANAL so may be wrong, but I worked for a healthcare company. Whether HIPAA applies to them depends on if they are considered a covered entity or a business associate [0].
IMO they aren't bound to HIPAA requirements as a covered entity.
Business associate is a little tricky to determine. But business associates have to sign a BAA (Business Associate Agreement), and I doubt they would have signed one given that language in their privacy policy.
Also, just as a side note, HIPAA is not an ideal security standard to begin with. Many large companies exchange bulk PHI via Gmail, since it is HIPAA compliant.
0: https://www.hhs.gov/hipaa/for-professionals/covered-entities...
You seem to imply using Gmail is a bad thing? I think Gmail, when appropriately configured to handle PHI, is probably a million times more secure than some crappy bespoke "enterprise" app.
The issue with Gmail is sending to the wrong email address, sending to a broad email list, or having people download PHI to their local machines. And the amount of PHI being transmitted in these files is larger than what was in this S3 bucket.
When you've got a trickle of information coming and going from hundreds or thousands of other individuals working at tens or hundreds of other entities it is.
You'd eventually wind up developing the kind of ridiculous "secure messaging and file drop" type service that every megabank builds on top of their SFTP and ticketing systems for that purpose. That stuff ain't cheap to run and keep running.
Better to just start with a solution that's 99% there.
That being said, HIPAA isn't even relevant here, because "ESHYFT" is just a provider of labor. No different than a big consultancy providing staff augmentation services.
Again, HIPAA continues to be the most colloquially misunderstood law out there.
The rule that makes providers "covered entities" isn't really about insurance; it's about whether they transmit specific HIPAA "transactions" electronically. Now, yes, most of the transactions that have to do with providers are things like claim submissions or pre-authorizations to insurance. But there are other reasons a provider may need or want to send a HIPAA transaction electronically.
My point is that there isn't some sort of "loophole" where providers that don't accept insurance are somehow being sneaky. The whole point of the HIPAA security rule is to protect PHI when it is transferred around to different entities in the healthcare system. If the information is going just between you and your doctor, HIPAA isn't relevant, and that is by design.
That's correct, but if you don't accept insurance then you will not transmit anything that meets the criteria to be covered by HIPAA. At least, in terms of being a provider. Things are different if you're a health plan or clearing house.
I spent a lot of time and money questioning this with lawyers at a health tech startup I previously worked at. The underlying reality is nearly the entire US healthcare system falls under HIPAA because nearly everyone wants to accept insurance. However, if you're a doctor running a cash-only business you will not be a covered entity, even if you send PHI electronically.
That said, it's both less broad and more toothless than I'd like. If FB convinces you to install a tracking pixel (like button) that steals your private medical data, they likely haven't violated any laws. At most you'd be able to file a claim against the person who created the leak.
Not a lawyer and all that, but for TFA I don't think HIPAA would be a valid way to try to limit your losses. It's a bit closer to what would happen if you (a doctor) uploaded patient data to Google Drive and then somehow leaked that information (one of Google's contractors disclosing it, a hack, whatever). Nothing about ESHYFT's offerings requires or would be benefited by the data HIPAA protects, and (ignoring incompetence and other factors) I'd be as surprised to see my health data leaked there as I would to see a YT video going over my last lab reports because of some hospital's actions.
They could still be liable for all sorts of other damages (and maybe somebody can convince a court of a HIPAA violation), but it's not an easy HIPAA win.
>With persons or organizations (e.g., janitorial service or electrician) whose functions or services do not involve the use or disclosure of protected health information, and where any access to protected health information by such persons would be incidental, if at all.
Based on the context from the article of the PHI uploaded being incidental, it would probably fall under this exception. It sounds like ESHYFT isn't meant to be storing any PHI based on the privacy policy above.
0: https://www.hhs.gov/hipaa/for-professionals/privacy/guidance...
If you replaced nurses with generic gig workers and "Uber for nurses" with something like WeWork, this would just be like every other leak we talk about on HN.
>I also saw what appeared to be medical documents uploaded to the app. These files were potentially uploaded as proof for why individual nurses missed shifts or took sick leave. These medical documents included medical reports containing information of diagnosis, prescriptions, or treatments that could potentially fall under the ambit of HIPAA regulations.
The title is exaggerating what the article says, and the article is making a big stretch about this possibly being HIPAA-covered. I stand corrected: this has nothing to do with HIPAA.
What was leaked were nurses' own doctor's notes, submitted to justify calling out of work. Still a serious leak, but nowhere near what is being suggested.
If 'Uber for nurses' is acting on behalf of nurses, it probably doesn't apply? If it's acting on behalf of the hospitals (who are indisputably covered entities), then the situation is much less clear.
I encountered a similar situation with my startup many years ago and decided "better safe than sorry" after consulting the lawyer.
In general, I've found that people tend to think HIPAA applies much, much more than it actually does. Like people thinking if you're in a meeting at work with clients and say "Sorry, Bob couldn't be here today, he's got the flu" that that's a HIPAA violation. No, it's not.
This is just an employee data leak, just like a bajillion other employee data leaks. The fact that the employees happen to be nurses still doesn't mean it has anything to do with HIPAA.
Really, "Uber for Nurses" is a title to drum up interest. "Large Staffing Service" would be factually accurate.
Maybe you think the startup maintains patient records?
The article lays out that the nurses, i.e., the providers, uploaded them. This is a temp booking system. The health records were uploaded by nurses to communicate reasons for absences to their employer, and weren't required or requested.
They have as much responsibility as Dropbox does. Nurses shouldn't have uploaded them.
The Hippocratic model isn't awesome.
For it to be effective, the money can't come from the provider, meaning it's either from the payer or the patient. The payer doesn't really care; as far as they're concerned, costs are contained via the various Quality Initiatives. That leaves the patient to sign up for a subscription model.
I explored that as a business 12 years ago, and sadly there is still a need. The worst part is that most clinicians actually want to do the right thing but it's the admins in their organization who set up processes that result in terrible outcomes.
HIPAA is wildly misunderstood by the public as a strong safeguard, meanwhile medical offices just get any patient (a captive audience) to sign a release waiver as part of patient intake ...
The reality is that for a small doctor/dental/whatever office, there is essentially 0 risk. HIPAA violations that carry significant penalties go to huge hospitals and healthcare companies.
Your neighborhood doctor has to screw up in a major way for an extended period of time to have a minute risk of any consequence.
If they provably expose your data, and you report them, they will get fined. Or they would have last year, who knows if those people still have jobs.
Last year the total of HIPAA violation fines was less than $9.2 million.
A figure I could find for hospital revenue in the same year, which is a good enough proxy for comparing fines to revenue, is about $1.2 trillion.
Rounding up (because who cares at this scale), about 0.001% of medical revenue ends up being paid in HIPAA violation fines.
For a typical person, the equivalent ratio is about a cup of coffee per year.
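Spelling out the arithmetic (the $60k salary is just an illustrative assumption): $9.2M / $1.2T ≈ 0.0008%, which rounds up to 0.001%. Applied to a $60,000 income, that ratio comes to roughly $0.46 a year, about one cheap cup of coffee.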
HIPAA needs teeth, what it says you're supposed to do is quite strong, the enforcement of it is pathetic.
(for SSN, never tried to prevent scanning of my ID)
I also always ask for a paper copy of the disclosures to sign, saying "I don't sign blank checks" when asked to sign the electronic pad. I've never had an issue with them printing it out, letting me sign, and scanning it in.
Healthcare "security"/"authentication" is just "protected" by your name and date of birth which is easily discovered for anyone online.
Which means it's either old, or they recklessly opened it up because they couldn't get files uploaded/downloaded to the bucket from their mobile app/services.
The security procedures I take while hacking out something for my friends at 3am should not extend to products hosting PII. It's up to YOU to implement basic data security.
You definitely need to do this, but a platform should help where possible, and try to have users fall into a 'pit of success' where if a dev just goes with the defaults everything is fine. In this case, S3 buckets should be private and encrypted by default and devs should need to actively choose to switch those things off (which I think may be the case now, but it wasn't in the past.)
Yeah, that's the case right now. There are multiple screens you have to go through that all but scream that you're making EVERYTHING PUBLIC. Also, in the overview, it distinctly says "!! PUBLIC".
Cloud technology allows us to build fantastic software very fast. But if you're too lazy to implement a basic API to serve S3 data on a need-to-know basis, that's on you.
AWS makes this very easy. You can’t blame anyone else.
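For what it's worth, a minimal sketch of locking a bucket down with boto3 (the bucket name here is made up):

    import boto3

    s3 = boto3.client("s3")

    # Block every form of public access on the bucket
    # (this is also the default for newly created buckets).
    s3.put_public_access_block(
        Bucket="example-nurse-uploads",  # hypothetical bucket name
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )

    # Enforce server-side encryption for objects at rest.
    s3.put_bucket_encryption(
        Bucket="example-nurse-uploads",
        ServerSideEncryptionConfiguration={
            "Rules": [
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            ]
        },
    )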
- the Uber-fication of nursing, because cheap, corporate-owned hospitals won't just hire nurses as W-2 employees
- cheapness probably led hospitals to this crappy app, which probably gave kickbacks to the admins who approved it
- this should totally bankrupt ESHYFT, but more likely nothing will happen
A big bad health system will have its own “float” pool of W2 nurses or internal offer system.
Heck, that’s part of the “sell” when selling out: to get access to a robust vacation/vacancy handling system.
Most importantly, there's a large number of highly incentivized people probing constantly at mass scale. These days it's very easy to scan the internet (GitHub, IPs, domains, etc.) for information, and "bad S3 configuration" detection is just a script anyone can use. No advanced programming skills required.
Not saying it is right, it's just what happens.
Is there a different source for the "open S3 bucket" in the HN title?
Source: I run Wyndly (YC W21, https://www.wyndly.com), which is most easily understood as an online telehealth allergist.
https://www.hhs.gov/hipaa/for-professionals/covered-entities...
Covered Entity has a narrow meaning. Notably, if you don't accept insurance, it's very unlikely you're a covered entity.
At my old job we didn't even allow PII to pass through our API, so we couldn't accidentally log it, and we kept all of it in its own VPS, totally isolated from the rest of our system. When we needed a record, we'd put it into an S3 bucket and hand back a temp link that only the caller could access (and that expired within a short period of time). A total pain, but you could sleep at night.
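A minimal sketch of that hand-off pattern with boto3 (bucket and key names are hypothetical):

    import boto3

    s3 = boto3.client("s3")

    def make_temp_link(record_key: str) -> str:
        # Short-lived download link for a single record;
        # the object itself stays private.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": "isolated-pii-records", "Key": record_key},
            ExpiresIn=300,  # link expires after 5 minutes
        )

    # Hand the caller a link instead of the raw record.
    url = make_temp_link("records/12345.pdf")

One caveat: anyone holding a presigned URL can use it until it expires, so the link itself still has to be delivered over an authenticated channel.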
The fine for one person's information from this site should be equivalent to their entire revenue for the year; it should not be dischargeable in bankruptcy, and it should be required to transfer to any company purchasing their assets.
Their entire executive team should be jailed for a minimum of 3 years per individual offense.
Only then will there be any modicum of an opportunity for us to see some real change.
This is so over the top reactionary and stupid, I can't help but write off your entire comment.
You want the Chief Accounting Officer to go to jail for 3 decades because of a data breach?
Now you've taken it too far.
As far as I know, it never really took off, at least while I was maintaining it, but the gig economy wasn't in full swing yet.
All that said, the sheer number of forms and the amount of paperwork the site required you to fill out just to sign up had to have been a limiting factor. Real high friction getting people in the door.
I wonder if Eshyft was able to somehow simplify the process.
Covid led to a wave of quitting, retirements, and sick calls, plus increased healthcare demand, so nurses could flex their spot in the marketplace and get the new market rate through contracting.
(vs current: "'Uber for nurses' exposes 86K+ medical records, PII via open S3 bucket")
Did the submitter intentionally change the post title to get more clicks?
The lack of respect that some companies have for their customers is appalling.
In the USA that is not so easy to achieve: historically it is not a single country but a union of "states", i.e., countries, so the main boss is not supposed to interfere too much with the local bosses and force particular "federal" laws on them.
Yet every month I see a story here about a huge data leak from an unrestricted bucket.
Here we are. I guess we can blame the users and not any shitty security architecture slapped on AWS.
Clearly what matters most is that legal culpability be avoided, not that users will be secure. The former is 'shite security' while the latter is good security
It's literally, and I do mean this literally, 1 click to block all public traffic to an S3 bucket. It can be enabled at the account level, and is on _by default_ for any new bucket. What exactly more do you want?
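For completeness, the account-level switch via boto3 (the account ID is a placeholder):

    import boto3

    s3control = boto3.client("s3control")

    # One call to block public access for every bucket in the account.
    s3control.put_public_access_block(
        AccountId="123456789012",  # placeholder account id
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )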
I'm reasonably certain that for quite a while blocking all public access has been the default, and it is multiple clicks through scary warnings (through the console; CLI or IaC are simpler) to enable public access.
The city decided to remove the life guards and replace them with signs saying "swim here at your own risk, people die here."
A simple classification system like "public" vs. "non-public", with automation that ensures non-public data never gets published, might prevent data leaks like this (see the sketch below).
A system that lets you publish non-public data "with warnings" is just a sign saying "swimmers die here." It's not safe; it just excuses the city from culpability.
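A minimal sketch of that kind of automated check with boto3 (the "classification" tag convention is my own assumption, not an AWS standard):

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    def bucket_is_public(bucket: str) -> bool:
        # Rough check: any disabled public-access block counts as public.
        try:
            cfg = s3.get_public_access_block(Bucket=bucket)
            return not all(cfg["PublicAccessBlockConfiguration"].values())
        except ClientError:
            return True  # no configuration at all: treat as public

    def classification(bucket: str) -> str:
        # Read a "classification" bucket tag; default to non-public.
        try:
            tags = s3.get_bucket_tagging(Bucket=bucket)["TagSet"]
        except ClientError:
            return "non-public"
        return next(
            (t["Value"] for t in tags if t["Key"] == "classification"),
            "non-public",
        )

    # Alert on every bucket tagged non-public that is publicly reachable.
    for b in s3.list_buckets()["Buckets"]:
        if classification(b["Name"]) != "public" and bucket_is_public(b["Name"]):
            print(f'ALERT: non-public bucket {b["Name"]} allows public access')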