
Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy

Published online by Cambridge University Press:  10 January 2022

Abstract

Surveillance capitalism companies, such as Google and Facebook, have substantially increased the amount of information collected, analyzed, and monetized, including health information increasingly used in precision medicine research, thereby presenting great challenges for health privacy.

Type
Columns: Currents in Contemporary Bioethics
Copyright
© 2021 The Author(s)


Footnotes

About This Column

Mark A. Rothstein serves as the section editor for Currents in Contemporary Bioethics. Professor Rothstein is the Herbert F. Boehl Chair of Law and Medicine and the Director of the Institute for Bioethics, Health Policy and Law at the University of Louisville School of Medicine in Kentucky. ([email protected])

References

Halamka, J.D., “Early Experiences with Big Data at an Academic Medical Center,” Health Affairs 33, no. 7 (2014): 1132-1138, at 1132.
Pub. L. 111-5 (February 17, 2009), 42 U.S.C. § 300jj et seq.
Office of the National Coordinator for Health Information Technology, Health IT Dashboard, “Non-federal Acute Care Hospital Health IT Adoption and Use: State Rates of Non-federal Acute Care Hospital EHR Adoption, Health Information Exchange and Interoperability, and Patient Engagement (2015),” available at <https://www.healthit.gov/data/apps/non-federal-acute-care-hospital-health-it-adoption-and-use> (last visited July 21, 2021).
Office of the National Coordinator for Health Information Technology, Health IT Dashboard, “Office-based Physician Health IT Adoption: State Rates of Physician EHR Adoption, Health Information Exchange and Interoperability, and Patient Engagement (2015),” available at <https://dashboard.healthit.gov/apps/physician-health-it-adoption.php> (last visited July 21, 2021).
See Centers for Medicare and Medicaid Services, Department of Health and Human Services, Final Rule, 85 Fed. Reg. 25510-25640 (May 1, 2020). See also Rothstein, M.A. and Tovino, S.A., “Privacy Risks of Interoperable Electronic Health Records: Segmentation of Sensitive Information Will Help,” Journal of Law, Medicine & Ethics 47, no. 4 (2019): 771-777.
See, e.g., Topol, E.J., Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again (New York: Basic Books, 2019); J. Couzin-Frankel, “Medicine Contends with How to Use Artificial Intelligence,” Science 364, no. 6446 (2019): 1119-1120; E.J. Emanuel and R.M. Wachter, “Artificial Intelligence in Health Care: Will the Value Match the Hype?” Journal of the American Medical Association 321, no. 23 (2019): 2281-2282; E.J. Topol, “High-Performance Medicine: The Convergence of Human and Artificial Intelligence,” Nature Medicine 25, no. 1 (2019): 44-56, doi: 10.1038/s41591-018-0300-7.
See Hoffman, S., Electronic Health Records and Medical Big Data (New York: Cambridge University Press, 2016); M.A. Rothstein, “Ethical Issues in Big Data Health Research,” Journal of Law, Medicine & Ethics 43, no. 2 (2015): 425-429; E. Vayena and A. Blasimme, “Health Research with Big Data: Time for Systematic Oversight,” Journal of Law, Medicine & Ethics 46, no. 1 (2018): 119-129.
See generally Lane, J. et al., eds., Privacy, Big Data, and the Public Good: Frameworks for Engagement (New York: Cambridge University Press, 2014); S. Lohr, Data-ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else (New York: HarperCollins, 2015); V. Mayer-Schönberger and K. Cukier, Big Data: A Revolution That Will Transform How We Live, Work, and Think (Boston: First Mariner Books, 2014); B. Schneier, Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World (New York: Norton, 2015).
Zuboff, S., The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019): at 8.
Id. at 98.
Id. at 498.
For example, according to Facebook, at the end of 2020, it had 2.8 billion monthly active users and 1.8 billion daily active users. Facebook Revenue and Usage Statistics (2021), available at <www.businessofapps.com/data/facebook-statistics> (last visited July 17, 2021).
Zuboff, supra note 9, at 383.
The right to be forgotten emerged from Europe as the right of a private person to have personal information removed from internet searches and other directories. It was adopted in the European Union’s General Data Protection Regulation (GDPR). See Post, R.C., “Data Privacy and Dignitary Privacy: Google Spain, the Right to Be Forgotten, and the Construction of the Public Sphere,” Duke Law Journal 67, no. 5 (2017-2018): 981-1072.
See generally Pasquale, F., The Black Box Society: The Secret Algorithms that Control Money and Information (Cambridge: Harvard University Press, 2015).
In a controversial study involving 689,003 Facebook users, one group of users was exposed to mostly positive messages in their news feeds and the other was exposed to mostly negative messages. There was a statistically significant but small effect on the tone of the users’ own postings. See Kramer, A.D.I. et al., “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” Proceedings of the National Academy of Sciences 111, no. 24 (2014): 8788-8790, available at <https://doi.org/10.1073/pnas.1320040111> (last visited October 27, 2021). The study was criticized because there was no external IRB review, no informed consent other than the general Facebook user agreement, and the study involved manipulation. See D. Hunter and N. Evans, “Facebook Emotional Contagion Experiment Controversy,” Research Ethics 12, no. 1 (2016): 2-3, doi: 10.1177/1747016115626341.
In 2013, Edward Snowden, a National Security Agency contractor, revealed the details of a massive surveillance program using commercially developed spyware that, once loaded on a device, can harvest data from emails, text messages, GPS, and other sources and transmit the information to the attacker. See generally Greenwald, G., No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State (New York: Henry Holt, 2016).
J. Rutenberg, “Data You Can Believe In: The Obama Campaign’s Digital Masterminds Cash In,” New York Times, June 20, 2013, available at <https://www.nytimes.com/2013/06/23/magazine/the-obama-campaigns-digital-masterminds-cash-in.html> (last visited July 18, 2021), quoted in Zuboff, supra note 9, at 123-124.
“The turmoil associated with the 2016 US and UK political disinformation campaigns on Facebook was a well-known problem that had disfigured elections and social discourse in Indonesia, the Philippines, Colombia, Germany, Spain, Italy, Chad, Uganda, Finland, Sweden, Holland, Estonia, and the Ukraine.” Id. at 508.
See Gostin, L.O. et al., “Health and Privacy in the Digital Age,” Journal of the American Medical Association 320, no. 3 (2018): 233-234; J. Isaak and M.J. Hanna, “User Data Privacy: Facebook, Cambridge Analytica, and Privacy Protection,” Computer 51, no. 8 (2018): 56-59.
See Benkler, Y., Faris, R., and Roberts, H., Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics (Oxford, UK: Oxford University Press, 2018).
See Mack, D., Mac, R., and Bensinger, K., “‘If They Won’t Hear Us, They Will Fear Us’: How the Capitol Assault Was Planned on Facebook,” BuzzFeedNews, January 21, 2021, available at <https://www.buzzfeednews.com/article/davidmack/how-us-capitol-insurrection-organized-facebook> (last visited July 29, 2021).
See Romer, D. and Jamieson, K.H., “Conspiracy Theories as Barriers to Controlling the Spread of COVID-19 in the U.S.,” Social Science and Medicine 263 (2020): 113356. See also M. Fisher, “Disinformation for Hire, a Shadowy Industry, Is Booming Around the World,” New York Times, July 26, 2021, at A8; S. Frenkel, “Disinformation Is Big Business for One Doctor,” New York Times, July 25, 2021, at 1.
Zuboff, supra note 9, at 128.
Id. at 164.
Id. at 170.
See generally Maple, C., “Security and Privacy in the Internet of Things,” Journal of Cyber Policy 2, no. 2 (2017): 155-184, available at <https://doi.org/10.1080/23738871.2017.1366536> (last visited August 27, 2021).
Zuboff, supra note 9, at 480.
See Adams, M., “Big Data and Individual Privacy in the Age of the Internet of Things,” Technology Innovation Management Review 7, no. 6 (2017): 12-24.
Maple, supra note 27, at 74.
A. Elise, “Toy Company Settles Lawsuit after Kids’ Information Hacked,” WCVB Boston, January 10, 2018, available at <https://www.wcvb.com/article/toy-company-settles-lawsuit-after-kids-information-hacked/15049212> (last visited July 29, 2021).
See S. Gibbs, “Hackers Can Hijack Wi-Fi Hello Barbie to Spy on Your Children,” The Guardian, November 25, 2015, available at <https://www.theguardian.com/technology/2015/nov/26/hackers-can-hijack-wi-fi-hello-barbie-to-spy-on-your-children> (last visited August 1, 2021).
See R. Copeland, D. Mattioli, and M. Evans, “Inside Google’s Quest for Millions of Medical Records,” Wall Street Journal, January 11, 2020, available at <https://www.wsj.com/articles/paging-dr-google-how-the-tech-giant-is-laying-claim-to-health-data-11578719700?reflink=desktopwebshare_permalink> (last visited July 20, 2021). It is debatable whether the arrangement was ethical. For example, it is questionable whether the records needed to be accessible in identifiable form; a technology company of Google’s sophistication could have deidentified the records without sacrificing the research significance of the data. Furthermore, patients should have been informed of the goals, methods, and parties involved in Project Nightingale and given the opportunity to opt out of the program. With 50 million records, the loss of a small percentage would not have been detrimental, and if a substantial number of patients had elected to opt out, that might have convinced Ascension that the promised ends of the research did not justify the means.
45 C.F.R. § 164.502(a)(1)(ii).
45 C.F.R. § 164.501 (the term “health care operations” includes “conducting quality assessment and improvement activities, including outcomes evaluation and development of clinical guidelines…”).
45 C.F.R. § 164.504(e).
Copeland, Mattioli, and Evans, supra note 34. See Dinerstein v. Google, LLC, 484 F. Supp. 3d 561 (N.D. Ill. 2020) (dismissing class action for invasion of privacy and other causes of action arising from the University of Chicago Medical Center’s providing Google with access to all patient health records for analysis).
See E.L. King, “Top 15 Causes of Car Accidents and How You Can Prevent Them,” HuffPost, December 6, 2017, available at <https://www.huffingtonpost.com/laiza-king-/top-15-causes-of-car-accidents_b_11722196.html?ncid=engmodushpmg00000004> (last visited July 29, 2021).
The lack of tickets, accidents, or damage claims would seem to be the best evidence of safe driving.
On the other hand, sensors damaged in a car accident make cars much more expensive to repair. See A. Davies, “New Safety Gizmos Are Making Car Insurance More Expensive,” Wired, January 26, 2020, available at <https://www.wired.com/story/safety-gizmos-making-car-insurance-more-expensive/> (last visited August 1, 2021).
Ignition interlocks connected to breathalyzers have long been proposed to prevent drunk driving. See National Highway Traffic Safety Administration, Ignition Interlocks — What You Need to Know (2019), available at <https://www.nhtsa.gov/sites/nhtsa.gov/files/documents/ignitioninterlocks_811883_112619.pdf#:~:text=An%20ignition%20interlock%20is%20an%20after-market%20device%20installed,above%20a%20pre-set%20limit%20or%20set%20point%2C%202> (last visited July 17, 2021).
See, e.g., Abraham, J.M., “Employer Wellness Programs – A Work in Progress,” Journal of the American Medical Association 321, no. 15 (2019): 1462-1463.
National Cancer Institute, National Institutes of Health, Cancer Moonshot, available at <https://www.cancer.gov/research/key-initiatives/moonshot-cancer-initiative> (last visited July 18, 2021).
National Institutes of Health, What Is the Brain Initiative? available at <https://braininitiative.nih.gov/> (last visited July 18, 2021).
National Institutes of Health, All of Us Research Program, The Future of Health Begins with Us, available at <https://allofus.nih.gov/> (last visited July 18, 2021).
Centers for Disease Control and Prevention, Precision Health: Improving Health for Each and Every One of Us, available at <https://www.cdc.gov/genomics/about/precision_med.htm> (last visited July 18, 2021).
See Collins, F.S. and Varmus, H., “A New Initiative on Precision Medicine,” New England Journal of Medicine 372, no. 9 (2015): 793-795.
National Institutes of Health, All of Us Research Program Overview, available at <https://allofus.nih.gov/about/all-us-research-program-overview> (last visited July 18, 2021).
The All of Us Research Program Investigators, “The ‘All of Us’ Research Program,” New England Journal of Medicine 381, no. 1 (2019): 668-676.
See Price, W.N. II and Cohen, I.G., “Privacy in the Age of Medical Big Data,” Nature Medicine 25, no. 1 (2019): 37-43; C.O. Schneble, B.S. Elger, and D.M. Shaw, “All Our Data Will Be Health Data One Day: The Need for Universal Data Protection and Comprehensive Consent,” Journal of Medical Internet Research 22, no. 5 (2020): 1-8, available at <http://www.jmir.org/2020/5/e16879> (last visited October 27, 2021) (mass linkage of non-health data could transform it into health data); E. Vayena and A. Blasimme, “Biomedical Big Data: New Models of Control on Access, Use and Governance,” Journal of Bioethical Inquiry 14, no. 5 (2017): 501-513 (biomedical Big Data now includes environmental, lifestyle, and other data).
See, e.g., Chowkwanyun, M., Bayer, R., and Galea, S., “‘Precision’ Public Health — Between Novelty and Hype,” New England Journal of Medicine 379, no. 15 (2018): 1398-1400; J.P. Evans et al., “Deflating the Genome Bubble,” Science 331, no. 6019 (2011): 861-862; H. ten Have and B. Gordijn, “Precision in Health Care,” Medicine, Health Care and Philosophy 21 (2018): 441-442.
See M.A. Rothstein, “Structural Challenges of Precision Medicine,” Journal of Law, Medicine & Ethics 45, no. 1 (2017): 274-279; M.A. Rothstein, “Some Lingering Concerns about the Precision Medicine Initiative,” Journal of Law, Medicine & Ethics 44, no. 2 (2016): 520-525.
See Jain, J.H. et al., “The Digital Phenotype,” Nature Biotechnology 33, no. 5 (2015): 462-463 (discussing composite picture of digital data).
See All of Us Research Program, National Institutes of Health, Protecting Data and Privacy, available at <https://allofus.nih.gov/protecting-data-and-privacy> (last visited July 18, 2021).
See All of Us Research Program, National Institutes of Health, Core Values, available at <https://allofus.nih.gov/about/core-values> (last visited July 18, 2021) (“participants have access to their information”).
Reportedly, prospective and current participants in All of Us are not informed about the risk to privacy posed by compelled disclosure of their “enhanced” health records. The same process threatens the privacy of individuals who use direct-to-consumer genetic testing and then have the results added to their health records.
45 C.F.R. pts. 160, 162, 164.
45 C.F.R. § 160.102.
A covered entity is merely required to mention the disclosures in its Notice of Privacy Practices. 45 C.F.R. § 164.520.
45 C.F.R. § 164.512.
45 C.F.R. § 164.524.
Rothstein, M.A. and Talbott, M.K., “Compelled Disclosures of Health Records: Updated Estimates,” Journal of Law, Medicine & Ethics 45, no. 1 (2017): 149-155.
For a discussion of early congressional proposals, see Miller, A.R., The Assault on Privacy: Computers, Data Banks, and Dossiers (Ann Arbor: University of Michigan Press, 1971): at 220-238. The Privacy Act of 1974, 5 U.S.C. § 552a, was enacted in response to the Watergate scandal, but it was limited to protections for information maintained by the federal government.
By contrast, under the European Union’s General Data Protection Regulation (GDPR), access or use of personal data is illegal unless there is an express provision permitting it. European General Data Protection Regulation, available at <https://gdpr.eu/> (last visited July 22, 2021).
Id. Recital 32. On surveillance capitalism and the GDPR, see Aho, B. and Duffield, R., “Beyond Surveillance Capitalism: Privacy, Regulation and Big Data in Europe and China,” Economy and Society 49, no. 2 (2020): 187-212.
42 U.S.C. §§ 12101-12213.
42 U.S.C. § 12112(d)(3).
42 U.S.C. § 12112(b)(6).
See Rothstein, M.A., “Predictive Health Information and Employment Discrimination under the ADA and GINA,” Journal of Law, Medicine & Ethics 48, no. 3 (2020): 595-602.
See, e.g., Rothstein, M.A., “Time to End the Use of Genetic Test Results in Life Insurance Underwriting,” Journal of Law, Medicine & Ethics 46, no. 3 (2018): 794-801.
See, e.g., Geiger, B.B. et al., “Assessing Work Disability for Social Security Benefits: International Models for the Direct Assessment of Work Capacity,” Disability and Rehabilitation 40, no. 24 (2018): 2962-2970.
See Cortez, E.K., ed., Data Protection Around the World: Privacy Laws in Action (The Hague: Springer, 2021).
A few states provide a cause of action. See, e.g., Cal. Civ. Code § 56.36(b).
Cal. Civ. Code §§ 1798.100-1798.198 (2018). The law applies to for-profit entities that do business in California, that collect consumers’ personal information, and that meet certain financial thresholds. The law does not apply to, among other exempt entities, covered entities and business associates regulated by the HIPAA Privacy Rule. The law provides for civil damages, civil penalties, injunctive or declaratory relief, and other relief that a court may deem appropriate. See Rothstein, M.A. and Tovino, S.A., “California Takes the Lead on Data Privacy Law,” Hastings Center Report 49, no. 5 (2019): 4-5.
Virginia Consumer Data Protection Act, H.B. 2307 (2021), applies to entities that conduct business in Virginia or produce products or services targeted to Virginia residents and that either control or process personal data of at least 100,000 consumers in a calendar year or control or process personal data of at least 25,000 consumers and derive at least 50% of gross revenues from the sale of personal data. Among the exemptions from the statute are entities subject to HIPAA.
Colorado Privacy Act, S.B. 21-190 (2021), applies coverage standards identical to Virginia’s. Although the law exempts certain controllers of health data, it does not exempt them completely, as California and Virginia do.
740 Ill. Comp. Stat. Ann. 14/1 et seq. (2008). The law prohibits private entities from obtaining, using, or selling a person’s biometric identifier or information without first obtaining the individual’s written, informed consent. A “biometric identifier” means “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” Any person aggrieved by a violation of the act may recover, from an entity that negligently violates any provision of the law, liquidated damages of $1,000 or actual damages, whichever is greater. If the violation is intentional or reckless, the liquidated damages are $5,000. Reasonable attorney fees and costs, and injunctive relief, also are recoverable. See Patel v. Facebook, Inc., 932 F.3d 1264 (9th Cir. 2019), cert. denied, 140 S. Ct. 937 (2020) (holding that class action status was proper in action challenging Facebook’s “tag suggestions” photo feature); Vance v. Microsoft Corp., 2021 WL 963485 (W.D. Wash. 2021) (action brought by Illinois residents alleging Microsoft downloaded and conducted facial scans of plaintiffs’ photos without consent to improve its facial recognition technology).
Vernon’s Tex. Bus. & Com. Code Ann. § 503.001 (violators subject to $25,000 civil penalty in action brought by state attorney general).
West’s Wash. Rev. Code Ann. §§ 19.375.010 et seq. (act does not provide for a private right of action).
California’s Consumer Privacy Act, supra note 77, includes “biometric information” within the definition of “personal information” protected by the statute, but damages are limited to $100 to $750 per violation if there is unauthorized access, theft, or disclosure because of a business’ violation.
See New State Ice Co. v. Liebmann, 285 U.S. 262, 311 (1932) (Brandeis, J., dissenting) (“It is one of the happy incidents of the federal system that a single, courageous state, may, if its citizens choose, serve as a laboratory, and try novel social and economic experiments without risk to the rest of the country.”).
“Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.” GDPR art. 9, § 1 (emphasis added).
See text accompanying notes 51-52 supra.
See Terry, N.P., “Big Data Proxies and Health Privacy Exceptionalism,” Health Matrix 24, no. 1 (2014): 65-108.
See Litman-Navarro, K., “We Read 150 Privacy Policies. They Were an Incomprehensible Disaster,” New York Times, June 12, 2019, available at <https://www.nytimes.com/interactive/2019/06/12/opinion/facebook-google-privacy-policies.html> (last visited August 1, 2021); see also A. Bruvere and V. Lovic, “Rethinking Informed Consent in the Context of Big Data,” Cambridge Journal of Science and Policy 2, no. 2 (2021), doi: 10.17863/CAM.68396.
In an assessment of the 36 top-ranked apps for depression and smoking cessation, 29 transmitted data for advertising and marketing purposes to Google and Facebook, but only 12 of the 28 transmitting data to Google and 6 of the 12 transmitting to Facebook disclosed this fact. Huckvale, K., Torous, J., and Larsen, M.E., “Assessment of Data Sharing and Privacy Practices of Smartphone Apps for Depression and Smoking Cessation,” JAMA Network Open 2, no. 4 (2019): e192542. Also, in a study of 211 Android diabetes apps, the permissions required to download the apps authorized collection of tracking information (17.5%), activating the camera (11.4%), activating the microphone (3.8%), and modifying or deleting information (64.0%). S.R. Blenner et al., “Privacy Policies of Android Diabetes Apps and Sharing of Health Information,” Journal of the American Medical Association 315, no. 10 (2016): 1051-1052.
42 U.S.C. § 2000ff.
42 U.S.C. § 2000ff-1(b).
42 U.S.C. § 2000ff-1(a).
For example, Title VII of the Civil Rights Act of 1964, 42 U.S.C. §§ 2000e-2000e-17, does not prohibit employers from asking about the race of applicants and employees, but virtually no employers do so because inquiries about race might be offered as evidence of discrimination if a lawsuit were brought. See U.S. Equal Employment Opportunity Commission, Prohibited Employment Policies/Practices, available at <eeoc.gov/prohibited-employment-policiespractices> (last visited August 29, 2021).
See note 73 supra.
The possible regulation of surveillance technology companies by application of antitrust, consumer protection, or other laws is beyond the scope of this article.