Should privacy be a tradeable right? This is an issue for urgent consideration, given how much personal data collated from different sources can reveal about our personal lives. The rise of digital technologies and of the digital economy on the one hand, and of data mining capabilities on the other, presents economic opportunities that are being harnessed, often at the cost of our privacy. Some see this as a case of “missing markets”, where appropriate markets with adequate rules and regulations should be put in place. In this paper, I argue that creating a market for personal data amounts to making the right to privacy a tradeable right. Further, a market for personal data/privacy has all the characteristics of what Debra Satz characterises as “noxious markets”. Other scholars, notably Bowles, Hausman and McPherson, and Sandel, have sought to delineate the moral limits of markets in cases such as child labour and the organ trade. I argue that the market for personal data should be treated similarly.
Keywords: privacy, personal data, limits to markets, ethics
In 2008, we met a National Rural Employment Guarantee Act (NREGA) worker in Sonebhadra district, Uttar Pradesh, who was owed nearly Rs 20,000 in NREGA wages for work done over two years. A few months later, we learnt that his arrears had been cleared. When, out of curiosity, we asked him what he had done with the money, he told us that he had “invested” it. When we examined his papers, we realised that he had been duped of the entire amount: information about his payment had been leaked to an unscrupulous operator. This was a salutary lesson on the importance of privacy of seemingly innocuous financial information, and on the vulnerability of people to fraud when their personal (financial) data is compromised.
In 2017, a nine-judge Constitutional bench of the Supreme Court of India declared the right to privacy a fundamental right [1]. The case arose in the context of the Aadhaar (India’s biometric/digital ID programme) matter and marked an important moment in the debate on privacy in the digital age. In the course of these hearings, there were attempts, by the government as well as by influential media voices, to trivialise privacy concerns [2]. Partly as a result of that judgment, debates in India today do not always need to start with a discussion of why the right to privacy is not an elitist concern but one of consequence to everyone. What is not as well understood is how our personal data can compromise our right to privacy.
The scope of our digital lives has expanded exponentially, from economic activities (prescriptions, food, work) to social media and personal activities (entertainment through OTT platforms). What I eat, read and watch, who I meet or talk to and how often, where, when and how I travel, and so on, is information readily available to others. Some products of companies such as Facebook and Google, pioneers in these intrusive technologies, could access private information [3, 4]. Easier and cheaper storage means more data can be stored for longer. Acquisti, Taylor and Wagman provide an overview of the thinking in economics on privacy and personal data, including the varied use of personal information by firms to influence consumer behaviour [5]. They suggest that privacy is a hard problem, that protection of privacy can “enhance and detract from individual and society welfare”, and that consumers are at a particular disadvantage due to asymmetric and imperfect information.
This, in turn, has led to the use of our personal data to profile us, with the intention of predicting and even manipulating our behaviour [6]. The data broking industry is thriving and facilitates data mining [7]. Data analytics has proven profitable for businesses: for instance, in recent years, estimates suggest that around 80% of Google’s total revenues come from advertising [8]. Much of this is driven by making our profiles available to advertisers to help them target ads better: our searches (on Google and related products), our characteristics (eg, location and language), even our “personally identifiable information” (PII) such as address, date of birth and mobile phone number [9].
One response to such worries is that we should treat this as a “missing market” [10] or incomplete markets problem. We need to put in place a regulatory framework, the argument goes, clearly specifying the rules for such transactions [11]. The authors of that argument state: “The ideal public policy setting is one in which individuals have property rights over personal information and can control and monetize their own data.” Right now, individuals are not really reaping the fruits of being “data rich” because companies are able to mine their data without (adequately) compensating them. Thus arises an argument for creating “data rights” or “property rights” over data, which would allow people to monetise their data, should they so wish [12].
For Matthan, a data rights-based approach will help overcome the limitations of the current consent-based model [13]. By allowing the rules of the game to be specified in law, we can “empower the individual” [14]. An extension of this view is that privacy is a “luxury good” for which those who value it will have to pay a suitable price [15]. Incidentally, this view is diametrically opposed to the Supreme Court ruling in India declaring the right to privacy a fundamental right [1].
Besides the private sector, the government, too, is a vast repository of personal data ― vehicle and property ownership, financial information, access to welfare, education, birth, death and health records, and much more. Though such data is held in a fiduciary capacity, it is also beginning to be seen as a business opportunity. For instance, “Over the last few years, the government and its agencies have acquired a goldmine of data with roll-out of initiatives like GST, Aadhaar, FASTag, e-bills, banking data with regulators, etc. Management and monetization of this large pool of data can be more effective and revenue-generating with private sector partnership with all relevant data protection covenants. Some of this data may already be getting used by the private sector for expanding their business and reach. An adequate framework on right mining and monetization of this can be invaluable for the government. Given the focus on digitization, the data can be used more productively and generate value for everyone.” [16]
This view was exemplified in the Economic Survey 2018-19 in a chapter titled “Data ‘Of the People, By the People, For the People’ ” [17]. In 2019, the Indian government monetised motor vehicle registration data under the “Bulk Data Sharing” policy [18]. Following privacy concerns over such sales, the policy was scrapped, and no similar sale has been reported since. Karnataka has already moved to put a “consent” framework (e-Sahmati) in place to allow private companies to access and verify education records held by the state government, again in a fiduciary capacity [19]. What one observes is a worrying marriage of convenience between “Big Data” and “Big Brother” [18, 20].
Some of this thinking was reflected in a draft Data Protection Bill proposed in 2018. That Bill and the Digital Personal Data Protection (DPDP) Act 2023 create a legal framework for facilitating the use of personal data for commercial purposes and for protecting privacy when this happens. As the objectives of the 2023 Act put it, the law is intended to “provide for the processing of digital personal data in a manner that recognises both the right of individuals to protect their personal data and the need to process such personal data for lawful purposes”. In the earlier Bill, the objective was worded slightly differently, as being “necessary to create a collective culture that fosters a free and fair digital economy, respecting the informational privacy of individuals, and ensuring empowerment, progress and innovation through digital governance and inclusion…”. “Lawful purpose” is defined in the law as “any purpose which is not expressly forbidden in the law” (Section 4(2)). Section 36 grants the government similarly blanket powers: it may require data fiduciaries, the Data Protection Board or others to “furnish such information as it may call for”. I argue here that a framing in which we face a trade-off between the right to privacy and reaping the economic opportunities from digitalisation and data mining is problematic. We need to step back from the data rights line of thinking and critically view a market for personal data as one that makes the right to privacy a tradeable right.
In many situations, third-party sale and/or mining of our personal data amounts to a violation of our privacy. However, even if individuals were to sell their own data, it would amount to surrendering their privacy. Narayanan and Shmatikov explain how our personal data and personally identifiable information are related to privacy violations [21]. A few examples help to think through these concerns. For instance, a food aggregator might sell information about how often a customer orders food, and what kind, to a private health insurance company, which in turn classifies the orders as healthy or greasy to determine whether that customer gets medical insurance; such a trade would harm the customer’s interests.
“Flo”, a period tracking app available globally, lied to its users about its privacy policy and shared information about menstruation dates and pregnancy plans with Facebook and Google [22]. Following the United States Supreme Court’s decision overturning the constitutional right to abortion, there were worries about reproductive health apps becoming turncoats and handing over sensitive data to government agencies during criminal investigations [23, 24].
Similar worries arise in other contexts as well. Mental health apps are one such context, where users share intimate information that can be stigmatising. The Mozilla Foundation has flagged this issue: “The vast majority of mental health and prayer apps are exceptionally creepy. They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state and biometric data”, cited in Goswami [25]. Further, the report says, “they routinely share data, allow weak passwords, target vulnerable users with personalized ads, and feature vague and poorly written privacy policies.” Singh reports on a Politico investigation by Alexandra Levine, which found that Crisis Text Line (CTL), a suicide helpline, shared data from conversations with the people it had supported with its for-profit subsidiary, to develop a related software product [26]. CTL’s defence was that the data were anonymised. If mental health apps have data broking subsidiaries with whom data is shared (as part of their privacy policies), users of the app may end up being de-anonymised and then red-flagged in all sorts of situations, eg, by Curriculum Vitae (CV) sorting software used to select job applicants.
When you place your first order on an app or website, you may be offered a first-use free coupon or top-up. By downloading and using the app, you create a data footprint. In addition, you might be offered another reward worth, say, Rs 100 if you “refer a friend”. This example helps distinguish between data footprints and “data shadows” [27] or “data externalities” [28]. A data shadow is information about me on the internet generated not by my own activity but by the activities of others: eg, when a person trades someone else’s data, or when I download a taxi hailing app and it accesses my contact list and sends my contacts notifications and promotions. Inadvertently, my digital footprint creates a data shadow of all my contacts, and compromises their privacy. Acemoglu provides a helpful overview of the harms that arise from such data practices [28]. For instance, platforms end up capturing most of the consumer surplus (the difference between what a consumer is notionally willing to pay and what they actually pay), with AI-intensive firms benefitting more than others. Tirole also describes some of the analytical challenges for economists that arise with rapid digitisation [29].
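A minimal sketch may help make the data shadow mechanism concrete. The code below is hypothetical (invented data and names, not any real app’s logic); it simply shows how granting an app access to one’s contact list creates records about people who never signed up:

```python
# Hypothetical sketch: one user's data footprint creates "data shadows"
# for non-users. All numbers and names are invented.
signed_up_users = {"+91-98000-11111"}  # people who actually installed the app

# Contacts uploaded when a new user grants the app access to their phone book.
contacts_of_new_user = [
    {"name": "Friend A", "phone": "+91-98000-22222"},
    {"name": "Friend B", "phone": "+91-98000-33333"},
]

shadow_profiles = {}
for contact in contacts_of_new_user:
    if contact["phone"] not in signed_up_users:
        # A record now exists about someone who never consented.
        shadow_profiles[contact["phone"]] = {
            "name": contact["name"],
            "source": "contact list upload",
        }

print(shadow_profiles)  # profiles of two people who never used the app
```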
Thus far, such private sector practices have hidden behind a facade of consent-based data mining (the implication being that people provided consent when they signed up for an app or used a website) and of anonymisation (ie, the claim that “personally identifiable information” is removed when data is sold). Personal data is being monetised supposedly with the individual’s consent, whereas in fact, consent is only theoretical. A summary of a similar view advanced by economist Hal Varian (at Google) is available in Zuboff [30]. The limits of consent-based data sharing (for example, people blindly accepting the terms and conditions of privacy policies) are now well understood. Privacy policies essentially ask for permission to access data on our devices. The other defence, that data are anonymised when sold, is also inadequate, as de-anonymisation has been shown to be easy to do [31]. A market for personal data, then, amounts to trading our privacy. Further, given that even governments succumb to the temptation to buy personal data for profiling, there is little hope of ethical practices surviving, even in businesses that hope to maintain them [32].
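To see why anonymisation is a weak defence, consider a stylised “linkage attack” of the kind Narayanan and Shmatikov describe. The sketch below uses invented records; the point is only that a dataset stripped of names can be re-identified by joining it with a public auxiliary dataset on shared quasi-identifiers:

```python
# Stylised linkage attack on an "anonymised" dataset (all records invented).
import pandas as pd

# "Anonymised" data: names removed, quasi-identifiers retained.
anonymised = pd.DataFrame({
    "birth_year": [1984, 1991, 1987],
    "pin_code":   ["110016", "110067", "110016"],
    "gender":     ["F", "M", "M"],
    "diagnosis":  ["diabetes", "anxiety", "hypertension"],
})

# Public auxiliary data (eg, a voter roll or social media profiles).
auxiliary = pd.DataFrame({
    "name":       ["A. Gupta", "B. Singh", "C. Rao"],
    "birth_year": [1984, 1991, 1987],
    "pin_code":   ["110016", "110067", "110016"],
    "gender":     ["F", "M", "M"],
})

# Joining on quasi-identifiers re-attaches names to diagnoses.
reidentified = anonymised.merge(
    auxiliary, on=["birth_year", "pin_code", "gender"]
)
print(reidentified[["name", "diagnosis"]])
```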
Should privacy be a tradeable right? Several economic philosophers have cautioned about the limits of markets. For instance, Hausman and McPherson discuss the work of Richard Titmuss and Kenneth Arrow on blood donation [33, 34, 35]. Would blood donations be more efficient if they were commercialised, or if they remained voluntary? Both suggest that voluntary blood donations may be more efficient, because in a commercial system the donor has an incentive to lie about whether their blood is safe. In addition, Titmuss felt that in certain situations a commercial system can discourage giving, since a potential blood donor may feel less moral compulsion to go out and donate if she knows that blood is available for a price [34].
Satz discusses “noxious markets” that are “toxic to important human values” [36]. Some markets shape our politics, culture and identities, and may evoke discomfort, even revulsion. She argues that efficiency cannot be the only metric by which to evaluate markets. From markets for weapons and pollution on the one hand, to votes and mercenaries on the other, bans may be justified even if they lead to inefficiencies. There may be reasons to worry about certain markets even when egalitarian concerns are absent (say, that child labour is wrong even if it is not forced by destitution). Moreover, she endorses TH Marshall’s view against markets for education, employment, healthcare and votes, because these are essential for a functioning democracy, and such goods should be guaranteed as rights. If markets undermine the equal status of people in what she calls the “social democratic” sense, ie, “… certain goods need to be provided outside the market if citizens are to be equals. Equality in these goods is necessary for democratic citizenship …”, then such markets are questionable [36: p 208]. I believe that privacy (and therefore, personal data) should be treated in a similar way ― as a social good and a right that is necessary for democratic citizenship.
Satz lists four parameters to gauge markets for noxiousness [36]. In the next section, I discuss to what extent personal data markets have these features:
1. Vulnerability: When people accept the terms of an exchange due to desperation (eg, the sale of child labour due to poverty).
2. Weak agency: When participants have poor information (information asymmetry), or when they are not direct participants (eg, when elected leaders decide on behalf of voters).
3. Extremely harmful outcomes for individuals: When participation makes them destitute or harms their basic interests.
4. Extremely harmful outcomes for society: When they undermine an equal society or support relations of humiliating subordination or unaccountable power.
Sandel raises similar concerns [37]. Commodifying a good can alter its character. Market transactions in spheres of life previously governed by nonmarket norms crowd out those moral and civic norms. Apart from a “fairness” argument, he makes a “corruption” argument: such transactions have a degrading effect (eg, if justice can be bought or sold), which applies even in the absence of inequality. I argue that personal data markets also create this sense of unease.
Privacy falls in the same category of social goods/ rights as education, healthcare, justice, etc. In this section, I show that the market for data/ privacy has all the troubling characteristics discussed by Hausman and McPherson, Satz and Sandel [33, 36, 37]. In particular, it satisfies the test for a “noxious market” described by Satz, summarised above.
Efficiency
One argument for sharing personal data is that it can enhance “efficiency” in our online lives. However, that is not always the case. Often, the argument in favour of such markets highlights the monetary value of small bits of data and how we all benefit: say, personalised auto-suggestions while shopping online can reduce search time if information on our preferences is made available through sharing personal data. This seemingly “win-win” situation (sellers gain buyers, and buyers’ search costs are reduced) in fact hides an inherent asymmetry. The same data mining algorithms that can target ads also enable companies to extract all consumer surplus and to manipulate consumers. Cathy O’Neil gives the example of how algorithms suss out our vulnerabilities in order to exploit them. For instance, when a person goes looking for shoes online, the algorithm may suggest socks or other complementary goods [38]. If you want to educate yourself about “how to invest”, you may be targeted with ads for investment options.
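A stylised calculation, with invented numbers, shows what “extracting all consumer surplus” means. Under a uniform price, consumers who value a good above the price keep the difference; with perfect profiling, each consumer can be charged exactly their willingness to pay, and the surplus vanishes:

```python
# Stylised sketch of surplus extraction via personalised pricing
# (invented numbers; not any firm's actual method).
willingness_to_pay = [500, 350, 300, 220, 150]  # Rs, one entry per consumer

def consumer_surplus_uniform(wtp, price):
    """Total consumer surplus when everyone faces the same price."""
    return sum(v - price for v in wtp if v >= price)

def consumer_surplus_personalised(wtp):
    """Each consumer is charged exactly their willingness to pay."""
    return sum(v - v for v in wtp)  # identically zero

print(consumer_surplus_uniform(willingness_to_pay, price=220))  # 490
print(consumer_surplus_personalised(willingness_to_pay))        # 0
```

Under profiling, the Rs 490 that consumers would have kept accrues to the seller instead.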
Some products that are purportedly created to enhance consumer utility, like suggestions and targeted advertising on Netflix, Amazon, YouTube, etc, are, in fact, designed to promote addictive behaviour. Are such practices morally acceptable? One could argue, for instance, that the algorithms help save search time and that people who browse are doing so of their own choice. On the other hand, promotion of other addictive products (eg, alcohol, tobacco products, etc) is restricted precisely on the grounds that it undermines a person’s ability to make a choice. Moreover, there is a question about how prices for our personal data are to be determined in such markets.
Vulnerability/egalitarian concerns
One illustration of unfair, or at least questionable, practices is that of Byju’s, an online tutoring firm and at one time the world’s highest-valued edtech start-up. During the school closures due to Covid in 2020-21, Byju’s pushy and aggressive agents and managers preyed on parents’ anxieties about school closures and their children’s futures. Byju’s offered a free download for the first 15 days and monitored usage in that period; those who used the app frequently were called by Byju’s agents. Agents asked children questions they knew the children were likely to fail, heightening the anxiety parents felt for their child’s future; they then got parents to sign up and offered loans with hidden terms and conditions, only to disappear or under-deliver subsequently [39, 40].
There are similar reports of people entitled to welfare benefits (wages under the National Rural Employment Guarantee Act 2005, social security pensions, maternity benefits and so on) who were forced to first get Aadhaar, then sign a “consent” on data sharing. The “option” they had was to walk away from these welfare benefits or consent to data sharing ― the consequences of which they did not understand.
Weak agency due to information asymmetry and lack of meaningful consent
Even in the case of morally acceptable data trades, how does one determine whether people are getting fair compensation? After all, the company is likely to benefit from a stream of purchases from the person you refer. When someone shares her data because a Rs 100 coupon is of some consequence to her (eg, students recommending friends), both parties are better off and the user has been compensated for sharing her data; yet Satz’s and Sandel’s fairness argument is very much at play. The app developers are targeting a market segment, eg students, that is willing to share data at an undervalued price because that segment is likely to be cash-strapped. In this example, a friend who receives a Rs 100 coupon may not have got the true value of the information she shares (phone numbers from her contact list). Further, she may not have comprehended what she has consented to: whether the app will share such information to monetise it further, or whether the app will signal to other apps that this is a customer willing to sell information for a Rs 100 coupon. According to one website, a single sign-up for an SMS service is worth USD 8 to a firm (see: https://tasil.com/data-value-estimator/).
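A rough calculation suggests the scale of the undervaluation. Assuming an exchange rate of about Rs 80 per USD (an illustrative figure, not from the source), the USD 8 value of a single sign-up is:

\[
8\ \text{USD} \times 80\ \tfrac{\text{Rs}}{\text{USD}} \approx \text{Rs } 640,
\]

so a Rs 100 coupon compensates the referrer for roughly a sixth of the value of one sign-up, before counting any stream of future purchases.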
More importantly, if she were fully aware of the uses to which such data are put, would she still have shared the information? Data mining is used to extract consumer surplus (eg, the allegation that Uber used surge pricing when mobile battery charge was low [41]), to nudge us to buy more (“frequently bought with this product” suggestions), to put us in silos (based on what we read and who we follow), to polarise society [42] and to manipulate our economic and political decisions [43]. When customers download or sign up for apps, are they aware of these wide-ranging consequences?
Incidentally, in online markets, sale/purchase is increasingly possible only if you part with your data (basic personal information), denying consumers choice and freedom (for instance, the food aggregator Zomato only allows orders through its app, not through the web). The lack of meaningful and informed consent is another aspect of the vulnerability in such markets.
Weak agency where data shadows are involved
If a friend refers me, she creates a data shadow of my personal information (name and mobile number), while my own consent is bypassed (a routine occurrence). This is another aspect of the weak agency problem in the data market.
Extremely harmful outcomes for individuals
When we use mobile payment apps for digital transactions, we are putting some information about our financial transactions in the hands of the apps we use. Are people really aware of whether information about those transactions is protected or shared further by the payment app? If all these transactions are routed through one server (National Automated Clearing House/National Payments Corporation of India), are they aggregating these data about us? If so, are they allowed to share the data with anyone? With whom, and under what conditions?
What sorts of identity fraud do data mining opportunities open up, to the detriment of the consumer? Data breaches have jeopardised people’s economic and personal lives. Phishing scams are often the result of such loss of control over one’s data (eg, I share some personal information with a platform for verification and it is later shared with a third party)1, and often the loss of control happens surreptitiously. In India, reported cybercrimes, especially those related to identity theft and online fraud (credit and debit cards, online banking, etc), increased more than five-fold, from just under 10,000 cases in 2014 to almost 53,000 in 2021, according to the National Crime Records Bureau [44, 45].
Extremely harmful outcomes for society due to negative externalities
The fact that data trades essentially create a market for privacy, and that the right to privacy is important for human growth, is the main reason that personal data markets are contentious. One could also characterise a compromised right to privacy as the negative externality associated with, or a toxic by-product of, data markets. Another harm is to public discussion and democracy, when the practices of social media platforms lead to polarisation. The testimony of Frances Haugen on how Facebook amplifies and monetises hate speech, and on the effects of this on society, is a recent example [46]. The practices of Cambridge Analytica, profiling voters with a view to influencing electoral outcomes, undermine democratic practice.
Degradation of social norms
Some social norms are endangered by our digital lives. I have in mind the degrading of personal relationships (birthdays, friendships and holidays are played out on social media) and the phenomenon of “viral videos” that make lynchings and accidents a spectacle to be captured on camera. Recently, in India, an app prompted users to “find your sulli deal of the day” (“sulli” is a derogatory term for Muslim women). The app used photographs of Muslim women, harvested from the internet (without their consent or even knowledge), and offered them to users. The app is both Islamophobic and misogynistic.
What do we do if a market is potentially noxious? Economists have proposed various remedies for dealing with market failures (such as the ones seen in the data market). However, Satz argues that certain market failures “shape our relationships with others in ways that goes [sic] beyond the idea of unabsorbed economic costs” [36: p 93]. These failures affect our standing before, during, and after the exchange.
Satz suggests a differentiated response depending on context, ranging from creating an alternative distribution mechanism alongside the market, through redistribution and regulation, to restraining such markets. Bowles warns that “fines, rewards and other material incentives often do not work very well” [47]. In a famous experiment by Gneezy and Rustichini, the introduction of a fine for parents who did not punctually pick up their children from a day care centre increased the number of such parents; the number did not go down even after the fine was removed [48].
Two prominent options for data/privacy markets have been regulation and better control over data through the creation of “data rights”. The General Data Protection Regulation (GDPR), guided by the EU Charter of Fundamental Rights, which guarantees the right to privacy and data protection, has been held up as a fine example. Yet, as I suggest below, this route has not been particularly effective.
Further, it has become accepted practice that apps are uploaded directly to the mobile phone platforms offered by Apple and Google. While Apple conducts a human review before approval is granted, in the case of Google, human review is undertaken only in case of complaints. The Academy of Medical Sciences provides a helpful illustration of the challenges of regulating apps [49]. Developers of apps are not required to get any government clearance. This has created a free-for-all and, given the scope of the “services” provided through apps (health advice, loans, payments, education, etc), an alarming situation. Yet one hears very little about bringing in government oversight or accreditation before apps are uploaded. It was only in 2021 that the Reserve Bank of India (RBI) ordered Google to remove 30 loan apps [50]. RBI’s actions were triggered by reports of intimidation, cyberbullying and even suicides on account of defaults by borrowers. According to news reports, some loan apps require the user to give access to their phone gallery, and photographs taken from the gallery are later used to bully borrowers [51].
In fact, in the Indian case, legal safeguards are even less likely to work, given that the debate was steered away from the right to privacy and data protection towards the creation of suitable regulation for exploiting this economic opportunity. This is evident from the Economic Survey 2018-19, the 2019 draft Bill on data protection, and government committees, including the Srikrishna Committee and the Joint Parliamentary Committee [52]. For instance, the Srikrishna Committee report was titled A Free and Fair Digital Economy – Protecting Privacy, Empowering Citizens2. The opening paragraph of the notification constituting the committee states, “The need to ensure growth of the digital economy while keeping personal data of citizens secure and protected is of utmost importance.”
When they work, market regulation strategies can undermine markets
Bowles argued that incentives (or fines) alone cannot provide good governance [47]. One can see a glimpse of that in a widely reported case in which French authorities imposed a fine of USD 57 million on Google for violating GDPR norms on transparency [53]. This seemingly large fine was in fact less than 0.1% of Google’s annual turnover (USD 161 billion) and gross profit (USD 90 billion) in 2019. The GDPR, among the strongest data protection laws in force, allows a fine of up to 4% of annual global turnover. But such cases are rare, and corporations are more likely to treat fines as operational costs than as warnings to alter their practices substantively.
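A back-of-the-envelope calculation, using the figures above, makes the point starker:

\[
\frac{\text{fine}}{\text{turnover}} = \frac{57 \times 10^{6}}{161 \times 10^{9}} \approx 0.035\%,
\qquad
\text{GDPR ceiling: } 4\% \times \text{USD } 161\,\text{bn} \approx \text{USD } 6.4\,\text{bn}.
\]

The fine actually imposed was thus under 1% of what the GDPR permits.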
Market regulation strategies are a smokescreen
Zuboff warns that individuals “wrestling with the myriad complexities of their own data protection will be no match for surveillance capitalism’s staggering asymmetries of power and knowledge” [30: p 482]. Can we truly expect that data markets will develop into ideal-type perfectly competitive markets, and that when people are allowed to trade their data if they like, they themselves will be the beneficiaries? In a society with low literacy among certain segments, and poor financial and digital literacy more generally, can we be so naïve? In fact, even those with digital and financial literacy easily and regularly fall prey to these techniques.
Another weakness in the argument for a regulatory framework with penalties for violations is that the field is not level: it pits poorly empowered citizens against powerful and invisible entities. If I am fortunate enough to understand how a data fraud was inflicted on me, and I seek redressal, it is not evident (as Zuboff anticipates) that pinning responsibility on the private entity will be possible, easy or costless for me. Individual or collective attempts to hold big corporations to account have been rare.
Like Zuboff, Milanovic argues that while globalisation and technological revolutions have created markets for new things (personal data, rentals of one’s own car and home, etc), the commodification of ever-increasing spheres of human life ― politics, leisure, data ― is likely to lead to a crisis if its “field of action” is not controlled and reduced to more traditional areas [54]. For Satz, “some markets are noxious and need to be blocked or severely constrained if the parties are to be equals in a particular sense, as citizens in a democracy” [36].
A reader might ask: where does this leave us? Perhaps greater digital literacy, privacy-respecting software and open standards can make a difference. One way to move ahead is to ask whether tweaks to the data protection framework can show the way forward. Can the principles (purpose limitation, data minimisation, etc), grounds (consent, medical emergency, legal obligation) and rights (access, correction, erasure, etc) provide remedies such that our equal standing (as Satz puts it) in such transactions is either promoted, or at least not undermined?
To suggest that such markets should be controlled, reduced or blocked is seen as impractical and idealistic. Yet, as Zuboff says, “the message of inevitability [that] is power’s velvet-gloved right hand” must be rejected [30: p 522]. She ends her account of surveillance capitalism with a call to action, to “be the friction”, and invokes Orwell, who likens “the instinct to bow down before the conqueror of the moment” to a bad habit and a mental disease [30: p 523]. In The age of surveillance capitalism, Zuboff is open-minded about the effects of regulations such as the GDPR, and stops short of suggesting a ban.
In India, we have allowed the debate on data markets and data protection to be framed in a way that accords primacy to harnessing the economic opportunities arising from data mining, or puts these opportunities on an equal footing with the right to privacy. This comes out clearly in the objectives of the DPDP Act 2023, as stated above. Perhaps this was because the link between data markets, data mining and the right to privacy was not apparent at the beginning. Perhaps this was because the scope of data mining has expanded beyond what one could have imagined. Or perhaps this is because it serves the interests of the powerful public and private entities, who in turn control the narratives. It is time to re-examine these issues by putting markets in their place: placing democracy and human rights above markets.
Notes:
1. Say, my data (mobile number) is shared with an online platform for verification of a purchase on that platform. The platform is allowed to trade it further. A few such trades later, it falls into the hands of phishing scamsters, who are also buying similar personal data (eg, birth date, mother’s maiden surname, etc) from other sources or from advertisers of various goods (property deals, weight loss programmes, medical tests).
2. https://www.thequint.com/news/india/key-highlights-from-srikrishna-committee-report-on-data-protection#read-more
Author: Reetika Khera (reetika@hss.iitd.ac.in, https://orcid.org/0000-0001-9955-8087), Professor of Economics, Indian Institute of Technology Delhi, MS603 Department of Humanities and Social Sciences, New Delhi, INDIA
Conflict of Interest: None declared Funding: None
Acknowledgments: The author thanks Kritika Bhardwaj, Arudra Burra, Jean Drèze, Rajesh Jha, Anja Kovacs and G. Sampath for helpful discussions and comments.
To cite: Khera R. Data and privacy: Putting markets in (their) place. Indian J Med Ethics. Published online first on March 07, 2026. DOI: 10.20529/IJME.2026.014
Submission received: April 18, 2025
Submission accepted: February 11, 2026
Manuscript Editor: Sunita Sheel Bandewar
Peer Reviewer: Akshay S Dinesh
Copyright and license
©Indian Journal of Medical Ethics 2026: Open Access and Distributed under the Creative Commons license (CC BY-NC-ND 4.0), which permits only noncommercial and non-modified sharing in any medium, provided the original author(s) and source are credited.
References