That mental health app might share your data without telling you ‘Do I trust the person who made the app, and do I understand where this data is going?’ April 20, 2019 https://www.theverge.com/2019/4/20/...a-sharing-privacy-facebook-google-advertising
Anyone who would share "health diary entries" and "self reports about substance use" with an app is a bloody idiot
I disagree. I have been a substance abuse counselor for the past 25 years. One of the most important steps for individuals with a substance abuse diagnosis is accountability. That's largely the purpose of a sponsor. It's no different than logging what foods you eat in an app if you are trying to lose weight. It's about accountability. Now, you're probably going to bring up the issue of privacy and/or the issue of someone tracking or hacking into the app. Who cares? What's more important: privacy, or getting clean and staying sober? I would argue that getting clean and staying sober far outweighs the risks of self-reporting substance abuse.
Not everybody has good friends or family who could help. For most people with, say, depression, it is better to seek help, even if it is just via an app, than to let depression take over. Remember that some people commit suicide because of these illnesses.
You can be plenty accountable without using some stupid nosy app. How sad that people have lost the concept: giving up your privacy opens a can of worms. "Who cares?" Well, I care. If you lose your privacy, you have more to worry about than just staying sober, which I'd argue would be a whole lot harder once you've had a privacy breach.
You are making a judgement without even downloading or looking at the app. You call it stupid with no knowledge other than that it MIGHT share some data. How sad that people value privacy over almost everything else, especially their health and the well-being of themselves and their family. You obviously have not experienced the devastation that a serious mental illness or a serious substance abuse problem can cause a person and their family.
We'd assume that readers of a privacy forum care. In addition, and from the paper we're talking about: I have a feeling that many other patient-advocacy groups, as well as patients themselves, place some importance on confidentiality/privacy too.
OK, to be fair, I ought to acknowledge the argument that part of recovery from substance dependency is admitting that you're powerless to stop by yourself, without submitting to something outside you. Maybe some deity. Or a group of supporters. I don't buy that. But many do, and claim that it's worked for them.
Last I looked, this subforum is for those who have privacy in mind. It seems that instead of defending it, you're working against it. No need to download any app or pass judgement, since the thread title speaks for itself. There's a huge difference between "MIGHT share some data" and DON'T share any data. One indicator of sanity is how well a person regards privacy. Health would be at or near the top of what people want to keep private, and that is far from unreasonable. And no, I don't mean never tell anybody anything, nor do I condone people who use privacy as a blind to hide some evil deed. I'll tell you what IS obvious: you don't know what I've experienced, so what you said is, at best, presumption. Even with people we know in real life, it's not wise to presume things about them. How much more so with someone you've never met, who writes a few comments in a forum.
I don't think the study addresses the issue of informing Google/Apple (and any partners that may learn of app store activity) of your health condition when you search for, install, or update a condition-specific app on Android/iOS (especially cellular devices). Wouldn't virtually all users be personally identifiable when they do that? Edit: clarification.
Some people can't afford to pay for that. And waiting weeks or months for free help can be too much; for somebody in an episode of major depression, even that may seem like a lot of effort. For me it is not a black-and-white scenario.
Indeed. Unless they at least use a VPN service, and/or Tor. And better, compartmentalized in a dedicated VM.
Well, I (like the article) was talking about Android and iOS apps. I was also thinking about common (or reasonably achievable) scenarios. For example, start with:

- a smartphone with cellular service that *is* linked to the user's name/identity;
- a health-condition-specific app (its presence reveals that you have said health condition);
- said app *doesn't* use the network for anything (just eliminating that exposure to make this easier).

From a privacy POV, you would not want the OS developer (Google/Apple) to learn that you installed/have this app on your device, for that would reveal to them that you have said condition, and they have no need to know your health conditions. I get the impression that identity-linked/linkable unique identifiers (hardware, OS, app store account ID, ...) would be passed to Google/Apple when the user installs the app from the app store and/or when the OS checks for an update to said app. If the user were using a VPN service and/or Tor during those times, it would only obscure their IP address. It wouldn't have any impact on the passing of the identity-linked/linkable identifiers I'm talking about.
One can run Android as a VM in VirtualBox etc. Or install it on a computer. See http://www.android-x86.org/
@mirimir: Interesting, bookmarked. If I may just ask: does the Google Play Store require a Google Account, which in turn requires a mobile number for SMS verification? IOW, if going that route and not wanting to deal with virtual SMS, could the user safely download the APK and then sideload the app on android-x86?
I doubt that stable Google accounts are possible without realistic mobile numbers. But hosted SIMs don't cost very much.
Are Mental Health Apps Used by Colleges Risking Students’ Privacy? April 18, 2020 https://sites.suffolk.edu/jhtl/2020...ps-used-by-colleges-risking-students-privacy/