See https://www.hhs.gov/hipaa/for-professionals/privacy/laws-reg... as a starting point. We might be able to recommend a lawyer to you if you tell us which state you're located in.
It appears that anonymized medical data are being sold en masse by providers (*) because money. But it's also obvious to us tech folk how trivial it is to combine anonymized patient encounters with location and credit card purchase data etc. to de-anonymize it and resell it as enriched.
So the only people who are effectively bound by HIPAA are the well-intentioned ones who have to protect themselves and comply; the rest are laughing at them on the way to the bank.
* https://www.theverge.com/2021/6/23/22547397/medical-records-...
* https://www.scientificamerican.com/article/how-data-brokers-...
* https://www.medicaleconomics.com/view/who-profits-our-medica...
My understanding is that HIPAA is intended to stop providers from colluding against the patient, not to stop providers or middlemen from enriching themselves with our data.
From your link:
> The Privacy Rule, as well as all the Administrative Simplification rules, apply to health plans, health care clearinghouses, and to any health care provider who transmits health information in electronic form in connection with transactions for which the Secretary of HHS has adopted standards under HIPAA (the "covered entities")
Kate's App would almost certainly fall under the definition of a business associate, and no health care provider should be entering protected information into the app without entering into an official agreement that the data will be protected according to HIPAA rules.
So technically Kate's App isn't doing anything illegal, but any health care provider entering info into this app would be. To fix the situation, Kate's App needs to certify that their app is compliant and provide an official agreement for providers. Otherwise healthcare providers should stay away, and this app would only be useful for friends and family members.
(I am not a lawyer, but I have analyzed health care data and it's cumbersome to deal with, especially if you are transmitting over a network). https://www.hhs.gov/hipaa/for-professionals/covered-entities...
The OP mentions this is for family members and specifically excludes medical providers from the intended audience.
> This is not a clinic portal, and is not associated with any insurance or medical providers.
It seems the target audience is multiple family members involved in coordinating a loved one's care.
My understanding is you're an actual attorney, yes?
Can you shed any light on this area...? My understanding is HIPAA and similar laws aren't applied as a result of a user disclosing their own information for their own purposes. For example, you can freely put your own personal medical information into Google Docs, Apple Notes, Facebook post, X tweet, Excel spreadsheet, etc.
I ask because Kate's App is similar in ways to my app BoldContacts, which helps people care for their parents and disabled loved ones. I strongly believe that these kinds of apps need some kind of privacy protections that are lighter-weight than HIPAA. I haven't yet found a perfect answer.
Anybody who is a healthcare provider, or who gets paid to do anything that smells even a little bit like health care, shouldn't touch this with a ten-foot pole. They shouldn't look at it or touch it or think about it very intensely.
If you don't want to be in violation, don't receive medical information, don't store it, don't advertise that you handle it in any way.
Good advice:
- don't do anything at all that suggests that you will handle anything that even slightly hints it is storing, transmitting, or in any way touching healthcare information without being HIPAA compliant.
- especially don't do this as a side project, have a corporate structure with a very solid liability shield and don't do anything to pierce the veil
- do you want to avoid a 5-, 6-, or 7-digit liability? Do everything you can to appear to be trying in good faith to follow the law and comply with regulations. Do things. Keep records of doing those things.
- even if you're _not_ required to, look up and follow the regulations, better yet, actually be HIPAA compliant even if it's not required. Many of these things you should be doing anyway even in very different fields.
- for God's sake get a lawyer and don't ask for advice on the Internet. Pay for the time for someone to sign off on what you do and whether or not you're inside the law
Let the data stay client side. Facilitate secure client-to-client communication instead of relying on the gravity well of cloud servers.
It becomes a lot more lightweight, and if done right the rules and red tape do too, as you reduce your regulatory footprint by verifiably preventing access to user data by yourself (and by service providers, their partners, hackers, and three-letter agencies).
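To make the client-side idea concrete, here is a minimal sketch (Ruby for illustration, all names and parameters hypothetical) of deriving a key from a passphrase only the family holds and encrypting before anything leaves the device, so the server only ever stores ciphertext:

```ruby
require "openssl"
require "securerandom"
require "base64"

# Hedged sketch of the client-side idea: derive a key from something only the
# family holds (a passphrase), encrypt on the device, and ship only ciphertext
# to the server. Every name and parameter here is illustrative, not from the app.
def encrypt_note(plaintext, passphrase)
  salt = SecureRandom.random_bytes(16)
  key  = OpenSSL::KDF.pbkdf2_hmac(passphrase, salt: salt, iterations: 200_000,
                                  length: 32, hash: "sha256")

  cipher = OpenSSL::Cipher.new("aes-256-gcm").encrypt
  cipher.key = key
  iv = cipher.random_iv
  ciphertext = cipher.update(plaintext) + cipher.final

  # The server stores only these fields; without the passphrase it cannot read the note.
  { salt: Base64.strict_encode64(salt),
    iv:   Base64.strict_encode64(iv),
    tag:  Base64.strict_encode64(cipher.auth_tag),
    data: Base64.strict_encode64(ciphertext) }
end

p encrypt_note("Prescription: 10mg daily", "family-shared-passphrase")
```

The hard parts this glosses over are key recovery (a forgotten passphrase means lost data) and distributing keys among family members, which is where most end-to-end designs spend their effort.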
And perhaps look at Stripe Atlas for getting your corporate ducks in a row to start with. https://stripe.com/atlas
Wading into that to get oriented would leave you better equipped, with at least a baseline. A corporate attorney would be the next step to verify what you're doing.
Minnestar.org hosts networking events that may be useful for finding people at the intersection of tech, healthcare, and law. Attend and get some face time to find people who may want to help. Lots of corporate centers in Minneapolis (assuming you're in or near the Twin Cities), including healthcare. Depending on financial considerations, you may be able to find on-ramps to grants, investors, or donors to fund compliance. Not sure on that, but it's possible.
Good luck!
Beyond HIPAA and similar regulations, there's the broader challenge that part of the intended audience probably would not want to use it for the same reasons. Any health care professionals that handle information like this are subject to the same rules and would only use tools that comply to minimize liability.
And there's the related problem of those people probably already having a lot of tools that they use and prefer. Another tool adds to their work load.
But I don't want to completely discourage you. If you are serious about turning this into a business, I'd look into how to connect to other tools. Maybe add some IoT integrations to the mix, etc. Most GPs would love a good tool like that. Many of the tools in this space are more than a bit crap. The key to success is understanding who experiences the most pain here and taking that away (which in this context is also a nice metaphor).
Some feedback:
- who or what is Kate? Not really clear what this name is about.
- what's the business model here? Who pays for what and why? How is that going to evolve.
- get a designer or level up your own design skills. I'm not one but I can see you didn't use one.
- work on your pitch; it raises a lot of questions, like how you are storing information, what the pricing is, how you deal with privacy issues, etc. Vaguely hinting at that being important in a hand-wavy way doesn't make it better. Taking topics like that seriously requires a more structured approach to addressing them. This communicates the opposite of what you probably intend here.
Further, a lot of providers are very strict about what tools their organization is allowed to use. In the past I’ve tried to get providers to look at a personal web page where I had my medical history and links to imaging data, and they weren’t allowed to access it by policy.
(I then brought 10 disks of imaging on a thumb drive - but they wouldn’t take that either. So I re-burned them onto physical media, and they were ok with importing that.)
I do understand why those policies are necessary, and in the end I learned their systems and limitations. It’s actually been an ok experience.
And even so, nothing precludes people from pursuing civil damages if there's a data breach - this is far more likely with sensitive data coming from a medical provider to a third party.
And as has been hinted at, the lack of professional presentation is going to hurt a lot, and people will immediately ask "can I trust this platform with any of my information?"
Why? What's in it for them?
I'm not saying this can't happen, I'm just not sure I understand why you think it's so likely to happen.
Law department visits the box and finds nobody. Shell company changes name (indeed, perhaps they have a different name for every victim) and resumes operation immediately. Hell, they never stop selling for a millisecond.
Follow the money? Ha. The modern ideas of currency make such schemes bulletproof.
Seems like it is intended to be used by covered entities. But it does depend a bit on what "medical caregiver" is intended to mean.
It sounds like a useful app, but only if it can be safe. Which I don't believe it can be with a classic web 2.x startup approach.
If your goal is to "find a learning project," I suggest finding a very different "learning project." Otherwise, keep "Kate's app" private, word-of-mouth, invite-only for under 20 people.
The 1980s and 1990s are long gone; you can no longer "learn as you go" when the consequences of your application malfunctioning have real-world implications.
---
A few years ago, my employer used an HR app that appeared to be built by a novice. In that time period, they sent me a PDF with tax information for half the people in the company, and then they royally screwed up the tax information sent to the IRS for me.
It sucks that you've been burnt by that before, but it sounds like your employer was the one who screwed you there, not the author of the application.
The issue with my employer is an example of the real-world consequences when a novice builds a product without understanding the rules they need to follow.
Unfortunately, there is a cohort of people in the startup scene, and who also participate in Hacker News, who don't like to hear negative feedback even when there are very clear consequences that that feedback is trying to address. Don't be one of those people, especially around issues of legal compliance.
Startup names are so stupid
The sparse documentation makes claims about privacy and security, but there is no evidence to back those claims.
Assuming the last 5% is going to just take a few weeks is naive from a development point of view. Everyone learns this the hard way, so I don’t mean it as a dig.
Yeah, that smells amateurish. Maybe OP can code well, maybe they have domain knowledge in healthcare, but damn, definitely utterly clueless in the legal area.
Just thought I'd share what I think about the substance of the idea (not the implementation). I think a big untold story in the US healthcare system is how it shifts the burden of coordinating care to patients and/or their loved ones.
To be sure, there are a lot of decisions that the individual (or their NoK) should be making, but the amount of paperwork that flies around and the lack of coordination between, say, an insurance company and the provider is astounding. This becomes very pronounced for every corner case, and the entire machinery is wired to record things in myriad systems but somehow not make things better when it comes to the core outcome -- providing healthcare. Every entity in the food chain is out to (and does!) make a buck. Meanwhile, there is a wait time of > 30 days to meet one's primary care physician over a video chat!
So, I absolutely LOVE your idea. The implementation probably requires a lot of iteration. One suspects that there are ways in which a consumer-facing app could make some real money to level the playing field in favor of the patient while being a sustainable business.
Sadly, it is also very HN-like in the not-so-good sense. Unlike the software world, the real world is not ours to program as we see fit. In the real world, laws matter. And I am concerned that you haven't really read up on the consequences of doing an app like yours without any due diligence. You can't just use people's health data like that.
Anyone using this app could potentially sue you as you are likely breaking the law of the country you live in (I am going to guess it is an Anglo-Saxon country).
You should ASAP bring the app down, contact all users, send them their info, delete their data from your servers, notify them of that, and get a lawyer specialising in health-related law. With their assistance, you can set up an organisation to build the app. This should also limit your liability.
For example, I believe Brooke Shields told the world she had post-partum depression and was prescribed some anti-depressant and felt it helped her.
https://www.webmd.com/depression/postpartum-depression/featu...
That's "medical information" about "a prescription". She could have, instead, shuffled it into some rando app, and shared it with her family. I don't think any HIPAA laws were broken.
Of course, US laws https://www.hhs.gov/hipaa/for-professionals/faq/190/who-must...
The above doesn't describe anything about private parties. If this "Kate" is some rando app developer, they can do whatever they like. Anyone who is willing to trust a random developer with their information can do so afaict.
IANAL and YMMV etc.
The problem is that OP literally mentions "medical caregiver" as distinct from "families", which can be interpreted to mean someone that operates as a covered entity. That alone puts OP at risk of being sued and punished with a very large fine. All a user needs to do is put their data there and share the info with their care assistant who works for a health company. Once that happens, OP is breaking the law.
"Comments on HIPAA: I'm 99% sure this does not apply, since the site is for patients and their families, and no doctors, clinics, hospitals, or insurance companies are involved. All information comes from the family, and stays in the family."
Insofar as no providers or non-family use this, developer may have a point: my comment's covered-entity reasoning can be disregarded.
---
> Anyone who is willing to trust a random developer with their information can do so afaict.
No, not "anyone" in a multi-party app when "someone" is regulated.
This reasoning (a patient can choose to disclose) doesn't apply here, as the app expects providers to share new info on an ongoing basis.
The providers are regulated, they have to keep records, and the provider-facing side of any tool they use has to be covered.
That said, even some U.S. national insurance companies bury a clause in their agreement where, to your point, the patient agrees to sort of declassify their info such that (the insurer's theory goes) it's no longer covered by HIPAA and the insurance company can go bananas with it (e.g., sell it to drug companies).
I had lawyers look into this on behalf of our firm benefits, and we challenged that clause. The national insurance company everyone has heard of instantly gave us a new employee insurance agreement without that clause, which suggests to me they knew it was dicey. (Imagine pinging Google and them dropping a clause from their TOS "just for you". That would only happen if they knew it didn't have legs.)
But, dicey or not, it suggests a path to try if you want to attempt this!
You, Brooke Shields, can share your information with your boyfriend, Tom Cruise, about who you see for your anti-depressants: the amount, name of the doctor, dosage. You can even use a random app developed by some Joe Dev installed through f-droid as an APK with data stored in North Korean data centers (does North Korea have data centers?). The world is yours.
Shame this is such a legal minefield. I do not think you should put this on GA.
High on my list. Or youtube, or something like that.
"Comments on HIPAA: I'm 99% sure this does not apply, since the site is for patients and their families, and no doctors, clinics, hospitals, or insurance companies are involved. All information comes from the family, and stays in the family."
Insofar as no providers or non-family use this, developer may have a point: my comment's covered-entity reasoning can be disregarded.
---
Not saying don't do YouTube, there's a persona who wants to learn from being talked to and shown.
But there's a less online (socially noisy) persona who prefers to read, see, and take in information far faster than a video. So don't skip the screenshots!
PS. I participated in building the first patient-centered groupware app 15 years ago, sold to provider networks, so all the providers a patient is ping-ponged to could interact with the patient as a virtual team.
Your idea is viable, and giant hospital networks will buy it. But the top comment on this thread is likely dead right. You likely need to be HIPAA compliant for the providers to participate, regardless whether you sold the app to the patient or to the providers. Because unlike a personal notes app, your entire premise is info sharing among parties.
There is possibly a model for this that is technically outside HIPAA, but what you're showing / saying doesn't sound like it's navigated that.
Even if you use that potentially compliant model, it's then highly unlikely the providers will play ball, as then they'd have to be running as many apps as they have patients and they are too busy and already have to know too many systems. Even if they felt like setting a precedent of installing whatever apps patients ask them to use (they don't), the last thing they want is yet another place to redundantly key in information/communications. (They are required to have a record.) To get around that, you'd have to integrate with what they have, and boom, HIPAA again.
This seems like a good experiment in building a CRUD app, but I'd recommend doing that with something with less liability.
It's not a place where I'm going to store contact information for all my doctors, or appointments for doctors that aren't at that clinic, or all my prescriptions and all the pharmacies.
When your daughter is reacting badly to her new chemotherapy, and running fevers and throwing up, and somebody needs to call her palliative care specialist and it needs to be you, not her, then where will you find the specialist's phone number?
I hope you'll never be there, but if you are, I think you'll understand.
As someone who has never heard of either MyChart or Epic, I'm guessing it could be useful for people like me who don't have those things.
If you can’t answer that question you really need to listen to the people telling you to take it down until you can work it out.
Does the app/company fall under HIPAA regulation? If it does, what security & privacy measures are in place to guarantee compliance? If it does not, what security & privacy measures are in place to prevent government fishing expeditions?
Finally, what security & privacy measures are in place to prevent app developer having a change of heart about selling the data? What if, say, United Healthcare offers to buy the app and the data for $1B?
Yes. Two features high on my list of todos: 1) download all your data; 2) delete all data from the site.
The second is a bit more complicated, since multiple family members may have access to the same data, and may have different opinions on deleting it. I'll work it out.
Otherwise, you have only my integrity. I'm not looking to sell it, but I would love to hand this over to someone with more resources and bigger pockets. If I ever do, I would want those reassurances from them first, and I would definitely give all users fair warning, so they can pull out if they don't have the same confidence I do.
I know it's been said elsewhere, but you need a lawyer. This isn't something for you to work out, it's something for you to clearly understand your legal obligations, and what your exposure is based on which jurisdictions a user might log in from.
> This isn't something for you to work out, it's something for you to clearly understand your legal obligations
Like, is it really impossible to "understand your legal obligations" without help from a lawyer? Is it supposed to be like that? Why? Are the laws explicitly written to be impossible to understand if you're not a lawyer?
I might have lucked out, but in the few instances where I had doubts, just reading the relevant code gave me all the advice I needed. They are written to be clear and unambiguous as much as possible - in effect, they're tedious and wordy but perfectly understandable. It's easy to recognize the complex or unclear parts because they really stand out from the rest - and that's when you ask a lawyer.
Of course, if there's a significant penalty or otherwise stakes are high, consulting with a lawyer is a good idea. But the notion of "the people" only ever interacting with "the law" through intermediaries is... strange? Then again, you don't generally risk being shot in the head for arguing with a policeman here, which might or might not be a separate issue.
The app is designed to allow sharing of personally identifiable information, and apparently doesn't distinguish regions, age, etc.
Assuming OP is American, and hosting the service in the US, and given the target audience and proposed use case, I can think of a couple of regulations that apply:
FTC Act
COPPA
CCPA
All of the privacy laws documented here: https://iapp.org/resources/article/us-state-privacy-legislation-tracker/
In addition, if a Canadian user signs up, then PIPEDA and various other regulations come into play. If an EU user signs up, then obligations must be met under the EU-U.S. Data Privacy Framework, and compliance with various EU and national regulations comes into play.
It's not impossible for someone to adhere to all of the laws, it's just a full-time job to do it. It's probably not reasonable for a single person to build and operate a service with the privacy and security claims that the author of Kate's App makes and still meet the compliance requirements. It is abundantly clear to anyone who works in privacy or security that the website doesn't meet the bare minimum requirements, and has very little standing to defend itself.
For reference for anyone who hasn't signed up for it, there is no terms of service, and no privacy policy.
The service includes features to allow uploading of data related to:
Prescriptions - medication, dosage, instructions, prescriber, and pharmacy
Medical Appointments - who (presumably the medical professional), date/time, location, and reason for medical appointment
Doctors - a list of doctors, clinic, contact info
Upload files, with this helpful list of suggestions of medical records to upload:
- insurance information
- advanced directives or DNR/DNI
- a copy of your vaccination card
- lab test results, doctors' reports, x-ray, MRI, and CT scans, or other images
- voice recordings of visits with the doctor or other providers
- self-monitoring logs (sleep, diet, exercise, etc.)
There are logs to show who created a data element under each of those types of records, but I didn't test the site deeply enough to determine whether there are any audit controls or logs visible to users showing who accessed what. The privilege system implemented is rudimentary, and fundamentally weak because user accounts are unverified.
Anyone can sign up and create and share files and resources using this service. From the main public page, the author requires a signup code, but signing up from the HN link on the post bypasses this. There is no validation of who the user is, no confirmation that the person who signed up owns the account, and no option to delete my test account or data. There are no controls that appear to limit what might be uploaded other than file size.
As of right now, this site is in violation of Canadian and EU laws and regulations. I assume it is also in violation of American laws and regulations.
I understand what the author is attempting to do, and why they are doing it, and they are deserving of empathy (and in my other comment I provided them a road map to improve some of the security issues on the site), but launching a website into production that gathers this data in the United States is not only unwise, it is probably negligent, and it's reasonable to expect that someone could sue the owner of the application.
From a user privacy and security perspective, a user of this service would quite literally have more protections and controls using a google spreadsheet or shared folder to store and share these documents.
My question was more about whether you need a lawyer to know you need a privacy policy... It was tangential, admittedly; sorry about that.
To make the direction of the tangent clearer (and please ignore it if it distracts from the main discussion too much): I'm in the EU, and I know that I'd need to read GDPR[1] before letting people see such an app. I haven't read it - I quite possibly would give up at Act 4 and decide I do need a lawyer. But my first instinct would be to go read the Regulation itself.
[1] Actually, RODO (official translation): https://gdpr.pl/baza-wiedzy/akty-prawne/interaktywny-tekst-g...
The bottom line is that the regulation is not a technical specification, it is a legal document, and parsing a legal document requires both the ability to read the regulation and the ability to reason by applying the jurisprudence specific to the jurisdiction for that regulation. Essentially, interpreting the law and translating it into requirements means being able both to outline the technical requirements and to understand what is required to make the implementation legally defensible.
A good example of this is data deletion under GDPR. The expectation of the law is that when you get a deletion request, you will delete the data. In practice, deleting data is hard, unless you build your backup mechanisms to allow deletion of individual fields. With that in mind, companies meet this requirement by implementing a deletion scheme for production systems, and a mechanism such that datasets marked for deletion are logged, and when a restore from backup is performed, the restoration process references those deletion logs to ensure that deleted records are not restored. This, technically speaking, does not result in proper deletion of the data, but it has passed audits under data deletion regulations (Disclaimer: this is based on public documents detailing data deletion requirements, not my work directly. Consult your lawyer, I am not a lawyer, and I am not on your compliance or security team and this is not a recommendation).
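As a hedged illustration of that restore-time tombstone check (plain Ruby, all names hypothetical, not from any specific product):

```ruby
require "set"

# Hypothetical sketch: a backup restore that honors erasure requests.
# In a real system `deletion_log` would be a durable table written whenever
# a deletion request is fulfilled; here it is just an in-memory array.
def restore(backup_rows, deletion_log)
  tombstoned = deletion_log.map { |entry| entry[:record_id] }.to_set

  # Re-insert only rows that have not been marked for deletion since the
  # backup was taken; deleted records stay deleted even after a restore.
  backup_rows.reject { |row| tombstoned.include?(row[:id]) }
end

backup = [{ id: "a1", name: "Alice" }, { id: "b2", name: "Bob" }]
log    = [{ record_id: "b2", reason: "erasure request" }]

p restore(backup, log)
# => [{:id=>"a1", :name=>"Alice"}]  (Bob's record is not restored)
```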
Legal advice is part of working it out.
Comments on legal issues: I absolutely agree and 100% plan to get legal advice. In the meantime, if you have personal experience, I would love to learn from you.
Comments on HIPAA: I'm 99% sure this does not apply, since the site is for patients and their families, and no doctors, clinics, hospitals, or insurance companies are involved. All information comes from the family, and stays in the family.
Comments on security: This is a huge issue for me. I've followed best practices as nearly as I can, but I've also been asking around to find out who could do a comprehensive security audit, but haven't yet found anybody I trust. Does anybody have any recommendations on how to find someone?
Comments on terms of use, etc: Yes, this needs to be done, but I figured the terms of use are of no use until there's something to use.
Comments on "novice" and "learning projects": Yes this was absolutely built with love and grand intentions, and no, I'm not a novice. I wrote this because my adult daughter died of cancer recently, and we really could have used this. If I can help others deal with the pain of diseases like this, then I'm going to try. I'll work through the problems as they come up.
Aside from the security audit, I'm also looking for someone who'll do a much more professional design and L&F for the site.
Another issue I can really use advice on is how to show this to the people who need it. People who aren't dealing with the problem right now, aren't interested. How do I reach the maybe 5% to 10% of people who have the need right now?
The best first step is to conduct a review yourself; you may want to hire or recruit a volunteer to do a security review, but you can kick it off yourself by using free, open source tools to scan your application, your code, and your environment.
Your first stop should be https://developer.mozilla.org/en-US/observatory because there are some simple, prescriptive improvements you can make (a minimal sketch of the kind of header check it runs appears after these steps).
Your second stop should be using a container or cloud security scanning tool to check for vulnerable configurations and packages. There are a myriad of tools available, like Trivy for container scanning, Prowler https://github.com/prowler-cloud/prowler or ScoutSuite https://github.com/nccgroup/ScoutSuite for scanning your cloud environments, etc
Your third stop should be https://www.zaproxy.org/, which is a free download you can use, and https://www.zaproxy.org/getting-started/ is a great way to get started. This will help you quickly identify low hanging fruit that can be found through automated scanning.
Your fourth stop should be running language appropriate static analysis tools against your application. There are too many to mention, but here is a good starting list: https://owasp.org/www-community/Source_Code_Analysis_Tools
All of these will give you quick, tactical things you can address. Once you get through any critical findings (which frequently, but not always means they are directly exploitable without additional effort) you should threat model your application, and build a plan for security - https://owasp.org/www-community/Threat_Modeling
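As mentioned above, here's a minimal sketch of the kind of check the Observatory step automates: fetch your site's response headers and flag common security headers that are missing. This is a hedged illustration, not a substitute for the tools above; the URL is a placeholder.

```ruby
require "net/http"
require "uri"

# Common security headers to look for; Observatory checks these and more.
EXPECTED_HEADERS = [
  "strict-transport-security",
  "content-security-policy",
  "x-content-type-options",
  "x-frame-options",
  "referrer-policy"
].freeze

uri = URI("https://example.org/")          # placeholder; point at your own deployment
response = Net::HTTP.get_response(uri)

EXPECTED_HEADERS.each do |header|
  status = response[header] ? "present" : "MISSING"
  puts format("%-28s %s", header, status)
end
```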
EDIT: In any case, you could take a look at https://github.com/YousefED/Matrix-CRDT. Matrix takes care of e2ee. CRDTs give you local-first super powers.
Then again, doing things this way might make data deletion and other privacy related issues quite difficult to achieve, especially if said Matrix servers are federated.
The primary goal of your site is to store medical data. For this, you'd need a dedicated data protection officer (DPO). Article 37(1)(c) applies to your case: https://gdpr.eu/article-37-designation-of-the-data-protectio...
2. What happens if I at some point give access to this app to my care assistant who works for the state health department or a health company? Surely those people are covered entities, and you would then be under HIPAA. There is nothing you can do to stop that, and if your app becomes popular enough, given enough time it will happen.
3. For countries in the EU, you are subject to the GDPR legislation. Who is the data processor, data protection officer and the supervising authority for the data handled by the app?
* More screenshots/use cases.
* Information about who you are/why it's called Kate's App. I think that especially for single/small dev teams, this can really help build trust and interest.
* Said elsewhere, but a publicly available privacy policy. Also not seeing any after signing up. Big red flag.
* IMO, don't have usernames AND emails at sign up. Choose one.
* Needs padding on either side. Other formatting issues too, but that was the most glaring one.
"Organize your support team for your health care."
"Kate's App is a tool created to support medical caregivers"
HIPAA rules apply to covered entities, and the developer of this app does not appear to be a covered entity. If a covered entity used this service, THEY would be required to enter into a Business Associate Agreement (BAA) with the developer, at which point the developer is on the hook and HIPAA applies.
If a covered entity engages with a platform like this, without a BAA, the liability under HIPAA is borne by the Covered Entity whom the rules apply to.
That said - if you want to engage with covered entities (and I think that should be a goal) you'll need to have all your ducks in a row before they'll be interested. It's all doable though, don't let the gatekeepers push you out.
One thing I've got my eye on right now is Palantir's HealthStart initiative that seeks to streamline the compliance requirements needed to operate in this space legally. Might be worth following if you plan to take this anywhere beyond a hobby.
Last note - my statement here is only about HIPAA. There are any number of state and federal level privacy rules where liability may or may not come into play here. Have a privacy policy, follow it, protect other people's data. If you're not confident you know how to do that, find someone who is. We do have a responsibility to our users that goes well beyond our desire to learn and experiment.
Good luck!
https://guides.rubyonrails.org/active_record_encryption.html
Basically, all the data in the app would be hidden from everyone except the users. I'm assuming this would be the case, and I'm assuming that you, with prod db access, wouldn't be able to directly read the text that is being written.
If that were the case, I'd say your ethical obligation is fulfilled, more or less. (obv implementing application-level 'everything is encrypted' is not trivial, but it makes it so that you couldn't ever see what was being said)
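For what that looks like in practice, here's a minimal sketch assuming a Rails 7+ app (per the linked guide); the model and attribute names are made up for illustration:

```ruby
# Sketch of Rails Active Record encryption (Rails 7+); model and attribute
# names are illustrative assumptions, not from Kate's App.
class CareNote < ApplicationRecord
  # Values are encrypted by the app before they hit the database, so someone
  # reading the production database directly sees only ciphertext.
  encrypts :body

  # deterministic: true permits equality queries (WHERE author_name = ?)
  # at the cost of revealing which rows share the same value.
  encrypts :author_name, deterministic: true
end
```

One caveat: the keys (config.active_record.encryption.primary_key and friends) still live on the server, so this protects against direct database reads and leaked backups, but the operator can still decrypt through the application. The stronger "even the developer can never read it" guarantee requires keys held only by the users, i.e. client-side/end-to-end encryption.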
I don't believe in political authority, so when people say "But hipaa!" I hear "but I believe in the institution of authority" and I sorta tune out everything else that they say.
There's a LOT of people in the world who believe in authority/political authority, and it is tiring. sorry for us all.
This app is cool! Well done to you. Hope you don't have to spend thousands on lawyers and don't have to deal with coercive institutions based on the fantasy of political authority.
Tomorrow I will take down the app's HN front page. Your accounts will remain in case you still want to check things out. You can also delete your accounts yourself, or ask to have them deleted.
If you want, you will still be able to create a new account from the default main page https://katesapp.org. You will need to ask for an access code, but I'm happy to provide one.
Edit: Sorry, the option to delete your account has not been uploaded to the server yet. However, email me at the address in my HN account, or contact me through the app, and I'm still happy to delete your account for you if you want. Again, thank you everyone.
It is hosted by: HOSTINGER US
Organization name: Hostinger International Ltd.
IP address: redacted
AS (autonomous system) number and organization: AS47583 Hostinger International Limited
AS name: AS-HOSTINGER
Reverse DNS of the IP: katesapp.org
City: Phoenix
Country: United States
I don't want to discourage you because it's always good to have multiple options, but I would look at what Cariloop (https://cariloop.com) is doing and try to focus it like that, but with the unique aspects you have that they do not. This is only the second caregiving app/service that I have seen.
Additionally, we do offer medication tracking and other digital caregiving tools. I will also mention like others here, it is important to have things like HIPAA best practices in place for services like this. At Cariloop for example we follow HIPAA best practices, GDPR compliance, and are SOC 2 certified.
edit: not a relative link, but a 404 regardless
Many health care providers offer export of health records to FHIR format now. You can also retrieve those records on iOS via the HealthKit API.
Apple lets you log into your health care provider in the Health app and download all your records from supported providers. You can request access to those records from another app installed on your phone.
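For the programmatic side, here is a hedged sketch of pulling a single FHIR resource over its REST interface; the base URL, patient ID, and token are placeholders, and real provider endpoints require app registration plus an OAuth/SMART on FHIR authorization flow first.

```ruby
require "net/http"
require "uri"
require "json"

# Hedged sketch of retrieving one FHIR R4 resource over REST.
# Base URL, patient ID, and token are placeholders, not real endpoints.
base  = "https://fhir.example-hospital.org/r4"
token = "REPLACE_WITH_ACCESS_TOKEN"
uri   = URI("#{base}/Patient/12345")

request = Net::HTTP::Get.new(uri)
request["Accept"]        = "application/fhir+json"
request["Authorization"] = "Bearer #{token}"

response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
patient  = JSON.parse(response.body)

# A FHIR Patient resource carries demographics; names are a list of HumanName objects.
puts patient["resourceType"]            # => "Patient"
puts patient.dig("name", 0, "family")
```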
Would be nice to have a calendar (weekly? monthly?) on the landing page after login to see what to prepare for the week.
Is this true? Is the data stored encrypted (or not stored at all in servers)? Or can a sysadmin see it?
If you click the "Kate's App" button at the very top of the page, it takes you to a 404.
Just FYI if you want to fix that.
Unfortunately the bad actors have destroyed trust so much that I don't trust anyone no matter the words nor how authentic you sound.
Access to information is strictly limited only to specific individuals who must be explicitly granted access.
In this hallowed religion one of the most fundamental rules was that every domain object had to have both an integer identifier ("ID") and a UUID ("GUID", because Windows). When I asked why we didn't simply use one or the other I was told that we had to have an ID because we "need a primary key" and a GUID because "we can't put an ID in the URL because then you can go to another record by changing it!" It didn't matter that we performed permissions checks on these routes because _security_. As I learned more about data modeling and relational databases I periodically questioned this (in retrospect we should have just used UUIDs as the PK because we had no good performance or design reasons to have both) but never got a good answer. This religion didn't tolerate heretical nonsense - GUIDs are for URLs because of hackers.
No idea if that's what's going on here but it reminded me of it.
It's not, by itself, deadly, but it does lower the safeguards against ACL slip-ups, which could easily let someone exfiltrate the entire customer base.
The other very common pattern is https://example.com/profiles/852c1a9a-29ae-4638-9d82-50e0d40... or its base36 encoding, which is shitty for reading over the phone but otherwise definitely safe from enumeration.
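For what it's worth, the usual Rails-flavored version of that pattern looks roughly like this sketch (model, column, and controller names are all assumptions); the random public ID prevents enumeration, while the scoped lookup is what actually enforces access:

```ruby
# Illustrative Rails-style sketch: expose a random public identifier in URLs
# while the integer primary key stays internal.
class CareNote < ApplicationRecord
  before_create { self.public_id ||= SecureRandom.uuid }

  # Rails builds URLs from to_param, so routes become /care_notes/<uuid>.
  def to_param
    public_id
  end
end

class CareNotesController < ApplicationController
  def show
    # Scoping the lookup to the signed-in user's notes is the real access
    # control; the unguessable UUID only prevents enumerating other records.
    @care_note = current_user.care_notes.find_by!(public_id: params[:id])
  end
end
```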
Second, HN usernames are 100% enumerable. 'asdfgf' is an example of an account which has never posted.
So you're hack proof and idiot employee proof?
I couldn't find a privacy policy so it's likely to be criminal to supply this software to EU citizens.
- What country/ies do you accept users from and which jurisdiction do you store their data in?
- Get a HIPAA/GDPR/PHIPA audit by a legal professional ASAP!
While you might not fall directly under HIPAA laws (as I don't think you are a covered entity or a Business Associate), you definitely are aware that you will have PHI, and thus you have to protect it - especially if you're saying that it's "Private" and "Secure."
I'd focus on making sure that all data is encrypted in transit and at rest and that all systems on your side are locked down. You and anybody else who might have access to your database shouldn't have free access to this data. I'd read through some of the HIPAA guidelines, especially from the business associate side, and conform to those.
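As a hedged sketch of where those two knobs live in a Rails app (assuming Rails; the environment variable names are placeholders, and real keys belong in encrypted credentials or a secrets manager):

```ruby
# config/environments/production.rb (sketch; assumes a Rails app)
Rails.application.configure do
  # "In transit": redirect HTTP to HTTPS, emit HSTS, and mark cookies secure.
  config.force_ssl = true

  # "At rest" (application level): keys for Active Record encryption.
  # Placeholder ENV names; never commit real keys to source.
  config.active_record.encryption.primary_key         = ENV.fetch("AR_ENC_PRIMARY_KEY")
  config.active_record.encryption.deterministic_key   = ENV.fetch("AR_ENC_DETERMINISTIC_KEY")
  config.active_record.encryption.key_derivation_salt = ENV.fetch("AR_ENC_KEY_DERIVATION_SALT")
end
```

force_ssl covers the redirect to HTTPS, HSTS, and secure cookies; disk- and database-level encryption, and who holds the OS and database credentials, still have to be handled at the hosting layer.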
Don't be scared by everyone here. Read up on the HIPAA guidelines, check out HITRUST, never take your eye off security. Keep getting better.
If you're worried, you can always consult a lawyer or even an auditor for some advice (I'm neither).
"Go talk to a lawyer" is not an attempt to scare or some impossible abstract advice. It's a very concrete, and very reasonable step that really ought to be taken early on in this effort.
Maybe everyone here is off base. How might the app developer determine this? By talking to a lawyer.
Speaking to a lawyer is not the first step when building something in this domain (unless you already have someone bankrolling you).
In this case there's an app that this guy built for families to use. It's obviously in its infancy. The helpful advice here would be about posting that this is in beta, or maybe reading the HIPAA guidelines and ensuring that he's adhering to those guidelines where applicable. Focus on tightening up security. What's his plan to ensure that data is encrypted in transit and at rest? What kind of monitoring will the app have? Does he need to be thinking about intrusion detection? Will he need to enforce 2FA?
Does he need to stop everything and start speaking to lawyers? Probably not.
First - the TOP comment in this post: >>" I would advise you to temporarily close your site and hire a lawyer straight away."
And other top level comments: >> You should asap bring the app down, contact all users, send them their info, delete them from your servers, notifying them of that and get a lawyer specialising in health related law.
>>If you can’t answer that question you really need to listen to the people telling you to take it down until you can work it out.
>>Speaking as someone who works in IT in healthcare - you need to close your site down immediately, do not pass Go, etc., and hire a lawyer.
No idea who this person is. Could be some 15 year old scammer in Florida. Could be a billionaire heir in London. No contact information. The domain registration is hidden.
What problem is this "product" solving that a shared Google Doc doesn't?
As another commenter commented, this would be a good candidate for a local-first app. I'd love to do that at some point.
sure, there are risks, but take them. make a thing for people who take care of other people. this is for a woman who takes care of her husband with alzheimers, or a man who takes care of his wife with parkinsons. fuck the system. make something someone wants.
good luck.
What a claim to make.
This app is the system, with a "trust me bro" approach to privacy and security.
Its creator is probably well intentioned, but this is likely to result in bad things for its users.
No privacy policy, no real information about the owner behind it. Seems all "trust me, it's private, I pinky swear".
I don't blame you for not using them though since evidently you never looked at your page on mobile ;)
The landing page doesn't make it clear whether providers are expected to use it or not.
HIPAA very much applies to this type of app or any other type of app that may deal in personally identifying information (PII) related to healthcare.
Edit: If no healthcare provider has access then maybe it could skate by. I interpreted "any user making notes to your account" to mean healthcare providers would have access. Even if not, they should still seek legal counsel. And this app is literally promising safety and security of healthcare information.