©2020 Canadian Journal of Communication 45(3) 473–490  doi: 10.22230/cjc.2020v45n3a3901


Policy Portal

Platforms and Power: A Panel Discussion

Sara Bannerman & Christina Baade, McMaster University

Rena Bivens, Carleton University

Leslie Regan Shade, University of Toronto

Tamara Shepherd, University of Calgary

Andrea Zeffiro, McMaster University

Sara Bannerman is Canada Research Chair in Communication Policy and Governance and Associate Professor in the Department of Communication Studies and Multimedia at McMaster University. Email: banners@mcmaster.ca

Christina Baade is Chair and Professor in the Department of Communication Studies and Multimedia at McMaster University. Email: baadec@mcmaster.ca

Rena Bivens is Associate Professor in the School of Journalism and Communication at Carleton University. Email: RenaBivens@cunet.carleton.ca

Leslie Regan Shade is Professor in the Faculty of Information at the University of Toronto. Email: leslie.shade@utoronto.ca. 

Tamara Shepherd is Associate Professor at the Department of Communication, Media and Film at the University of Calgary. Email: tamara.shepherd@ucalgary.ca

Andrea Zeffiro is Academic Director for the Lewis and Ruth Sherman Centre for Digital Scholarship and an Assistant Professor in the Department of Communication Studies and Multimedia at McMaster University. Email: zeffiroa@mcmaster.ca.


ABSTRACT

Background This article is based on a panel discussion at McMaster University in 2019.

Analysis  Five questions are posed: 1) What is pressing about research on platforms and power right now? 2) What is the most powerful example of a research design that could disrupt or transform platform power? 3) Can platforms and algorithms be liberating? 4) How can researchers and policymakers work together for change? 5) What regulatory futures should researchers attend to, and how can research contribute to platform regulation? 

Conclusions and implications  Panel participants provide insights into the questions posed.

Keywords Regulation; Platform imperialism; Data discrimination; Digital labour; Speculative design 

RÉSUMÉ 

Contexte  Cet article se base sur une table ronde qui a eu lieu à l’Université McMaster en 2019.

Analyse  Cinq questions furent posées : 1) Qu’y a-t-il de plus pressant actuellement dans la recherche sur les plateformes et le pouvoir? 2) Quel est l’exemple le plus puissant d’un plan de recherche qui pourrait ébranler ou transformer le pouvoir des plateformes? 3) Les plateformes et les algorithmes peuvent-ils être libérateurs? 4) Comment les chercheurs et les décideurs peuvent-ils collaborer pour encourager le changement? 5) Pour quels avenirs réglementaires les chercheurs devraient-ils se préparer, et comment la recherche peut-elle contribuer à améliorer la réglementation des plateformes?

Conclusion et implications  Les participants à la table ronde proposent diverses réponses aux cinq questions.

Mots clés  Réglementation; Impérialisme des plateformes; Discrimination relative aux données; Travail numérique; Design spéculatif


Introduction

Platforms and algorithms increasingly mediate everything, from the micro level to the macro level, including our social lives (Cotter, 2019); applying for jobs (Kircher, 2020); healthcare, where algorithms are used in both the tracking and treatment of COVID-19 (Parry, 2020; Suciu, n.d.); culture and entertainment, where platforms and algorithms replicate or disrupt the hierarchies of the star system (Van Dijck, 2009); and politics, from political and activist communications to international relations (Bucher, 2018; Roose, n.d.; Velkova & Kaun, 2019). The increasing platformization of everyday life (Helmond, 2015) raises questions of privacy, transparency, and fairness, and with them questions about platform regulation. This interest in regulation follows a long period in which governments largely refrained from regulatory intervention (Gillespie, 2010).

This article, which is based on a panel discussion held at McMaster University on May 3, 2019, entitled “Platforms and Power,” asks: How should platforms and algorithms be held to account? What role can researchers play in holding platforms to account and in transforming platform power? This discussion is structured around the following key questions: 1) What is pressing or timely about research on platforms and power right now? 2) What is the most powerful example of a research design that has or could disrupt or transform platform power? 3) Can platforms and algorithms be liberating? 4) How can researchers and policymakers work together to effect change? 5) What regulatory futures should researchers attend to, and how can research contribute to establishing appropriate forms of platform regulation?

Panelists included Sara Bannerman (McMaster University), whose work focuses on platform regulation in relation to privacy and intellectual property; Christina Baade (McMaster University), whose work focuses on the intersection between popular music, sound media, and power, with particular attention to questions of labour, gender, race, and national belonging; Rena Bivens (Carleton University), whose work focuses on the software underlying platforms and identity-based programming practices, or how identity categories are shaped by software; Leslie Regan Shade (University of Toronto), whose work examines the social and policy aspects of information and communication technologies (ICTs), with particular focus on issues of gender, youth, and political economy; Tamara Shepherd (University of Calgary), whose work focuses on the feminist political economy of digital culture, online privacy, intellectual property, and infrastructure regulation; and Andrea Zeffiro (McMaster University), whose work is concerned with how social, political, and economic processes normalize behaviours and attitudes about the value of datafication, and how systems of power, in/justice, and in/equity are sustained through the quantification of collective life.

The context for the discussion reflects the Canadian regulatory landscape, oriented primarily around the main media and communications regulator at the federal level, the Canadian Radio-television and Telecommunications Commission (CRTC), and other government agencies also implicated in the regulation of platforms—for example, the Office of the Privacy Commissioner of Canada; provincial information and privacy commissioners; Innovation, Science and Economic Development Canada; the Department of Canadian Heritage; and the Competition Bureau Canada. At the time of this panel discussion, both the Broadcasting Act (1991) and the Telecommunications Act (1993) were undergoing a legislative review process, largely due to the changes wrought on the media and communications landscape by digital platforms (Government of Canada, 2018). While the Broadcasting and Telecommunications Legislative Review Panel (2020) released its recommendations in January 2020, whether the review will result in any actual revision to the legislation remains speculative. It is with this context in mind that the panel reflected on the diverse considerations at play when suggesting regulatory strategies to deal with the potential social consequences of platformization.

What is pressing or timely about research on platforms and power right now?

Leslie Shade

In the 1990s, we were talking about digital divides, digital inclusion, and communication rights; these are the policy concerns we are still talking about today. However, there are some major differences between the 1990s and today; in 1999, the CRTC (Canadian Radio-television and Telecommunications Commission, 1999, 2009) decided during its new media hearings not to regulate the internet. Today, there is much more interest in regulation, whether through reining in the power of the digital monopolies via anti-trust mechanisms or holding social media companies accountable for their cavalier approach toward users’ privacy. In Canada, this was highlighted recently when federal Privacy Commissioner Daniel Therrien and Information and Privacy Commissioner of British Columbia Michael McEvoy announced in April 2019 that they would take Facebook to court because there was no other recourse available under privacy laws (Office of the Privacy Commissioner of Canada, 2019, 2020). Indeed, Sara Bannerman’s (2019) fantastic Policy Options article details Canada’s failure to regulate Facebook because of continued weak privacy legislation. The issues go beyond social media; they also concern ethics around artificial intelligence (AI) and issues of data discrimination. It is a volatile time politically, with authoritarian regimes around the globe exploiting social media and concerns about a potential flood of misinformation in the 2019 federal election, so there is much to keep an eye on, including efforts around the world to explore digital regulation and to create national digital strategies.

Tamara Shepherd

A pressing issue for regulators considering how best to approach social media platforms is platforms’ global reach. The national-level jurisdiction for traditional media regulation in broadcasting and telecommunications has given way to a system dominated by a few massive operators such as Facebook, Google, and Netflix. Due to their size and reach into diverse international markets, it becomes difficult to address the regulation of such platforms using existing national approaches, particularly in Canada.

Key here is the way that many of the most dominant platforms in our media landscape hail from the U.S., which has been described as a kind of “platform imperialism” by Dal Yong Jin (2013, p. 146). As he notes, “the current state of platform development implies technological domination of U.S.-based companies that have greatly influenced the majority of people and countries” (Jin, 2013, p. 154).

I think we can see this in Canada, where our federal regulators have struggled with how to hold platforms to account. For instance, in April 2019, the Office of the Privacy Commissioner found Facebook in contravention of Canadian privacy legislation for its role in the Cambridge Analytica scandal, and yet Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg ignored a subpoena to testify before an international grand committee in Ottawa the following month (Office of the Privacy Commissioner of Canada, 2019; Tunney, 2019). This set an important precedent, one indicative of the regulatory shift precipitated by platforms, whereby the traditional processes used by individual governments to regulate media companies are no longer tenable.

Zuckerberg and Sandberg’s refusal to appear before the Grand Committee on Big Data, Privacy and Democracy in May 2019 made it apparent that Zuckerberg himself serves as a totemic figurehead for the hubris behind large platforms such as Facebook. His appearance (or non-appearance) before various regulators offers insightful moments of friction between the Silicon Valley ethos, the Californian ideology that combines technological determinism with libertarian politics (Barbrook & Cameron, 1996), and the regulatory conceit that the public has a vested stake in the way media systems shape information in a democracy. Because Zuckerberg is, in his way, a ruler over a domain far larger than any single country (at the time of writing, Facebook has over 2 billion monthly active users), his actions might be read through the lens of political leadership. His 2017 tour of the U.S., in fact, performed the quintessential version of traditional campaigning.

As an interesting counterpoint to U.S. platforms, and perhaps a glimpse into the future of Western platforms, the Chinese tech context offers some further considerations. In many parts of China, super apps such as WeChat are essentially mandatory for accessing any service. Yujie Chen, Zhifei Mao, and Jack Linchuan Qiu (2018) use the term “super-sticky” to describe WeChat since “it includes so many functions and it keeps growing to the extent that its average Chinese users are glued to the meta-platform whenever they use their smartphones” (pp. 5–6). Since users never leave the app, there are major concerns not only for market competition but for the relationship between WeChat’s parent company Tencent and the Chinese surveillance state. For instance, the “digital totalitarianism” at play in China harnesses the massive dataveillance infrastructure of super apps in apparatuses of social control such as the social credit system (Qiang, 2019). Given that Chinese apps are now inspiring the U.S. platforms that were initially barred from operating in China, users’ sense that they have little choice to opt out may only intensify in the coming years.

Rena Bivens

It is important, today, to attend to the different layers of software, and to explore how each layer structures identity differently and with shifting levels of visibility. As well, it is important to consider the range of platforms that exist today, including those that are not digital.

One of my articles looks at Facebook, for instance, to investigate how gender has been coded over ten years (Bivens, 2017). There is a fascinating history to it. In the beginning, gender was not part of the sign-up sheet, nor was it a required field in your profile. Gender plays a large role in the advertising world, however, and so over time it became much more important to Facebook. It became a mandatory, binary field on the sign-up sheet and did not change, even when users were eventually able to identify in a much broader way within their profile page.1 What is also interesting is how they [Facebook] code gender in the database. They effectively return everyone to a binary based on the forced pronoun selection that accompanies the more expansive gender identity options on your profile (Bivens, 2017). I talked to them about this in particular, and they said they did not want to break the product for their advertisers and third-party clients.
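To make the layering concrete, here is a schematic Python sketch of the dynamic Bivens (2017) describes: an expansive profile-level gender coexisting with a coarser, advertiser-facing category derived from the mandatory pronoun choice. The function, field names, and the handling of “they” are hypothetical illustrations for this discussion, not Facebook’s actual code.

def ad_facing_gender(profile_gender: str, pronoun: str) -> str:
    """Collapse an expansive profile-level gender into the coarse category
    an ad-targeting layer might store (schematic, after Bivens, 2017)."""
    if pronoun == "she":
        return "female"
    if pronoun == "he":
        return "male"
    # How "they" users are binned is an assumption made for this sketch;
    # the point is that the ad-facing layer keeps coarse categories intact
    # for advertisers and third-party clients.
    return "unspecified"

# A user may self-identify expansively on their profile page...
profile_layer = {"profile_gender": "genderqueer", "pronoun": "she"}

# ...while the advertising layer sees only:
print(ad_facing_gender(**profile_layer))  # -> "female"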

At times, the targeting options available to advertisers also do not match up with the privacy policies. When I looked into the categories available to advertisers on Twitter, it just blew my mind; I had to stare at the wall for a while afterward. Having read their privacy policies, I was curious about the lack of accountability in terms of using user data on the site, alongside other databases external to Twitter, to create extremely specific groups of people.

When it comes to software and the problems that we are addressing today, I am interested in how we can better sense software’s role in our lives and how we can be involved more meaningfully in software development. A recent piece by Rianka Singh and Sarah Sharma (2019) reminds us that platforms can be many different things. Platforms do not have to be specifically digital and rooted in software. So what kinds of platforms—including platforms that are not mainstream or necessarily digital or even well-known—are operating as actors in, for instance, the radicalization of White supremacists?

Christina Baade

Platforms, especially streaming platforms such as Spotify and Netflix, have a tremendous influence on popular music and entertainment economies. In popular discourse, streaming “saved” the music industry (see Wolfson, 2018) after over a decade of struggle with digital piracy, crashing CD sales, and declining terrestrial radio listening. Following the difficult transformation from a goods-based to a service-based industry model (Anderson, 2014), streaming platforms are now established as a source of reliable income for record labels. But, while streaming platforms are credited with saving the music industry, they are riddled with problems of accountability, user agency, and privacy, as my fellow panelists and scholars such as Jeremy Morris (2016) have discussed. My interest is in how listeners and artists negotiate the affordances and limitations of music streaming platforms.

As I have written elsewhere (Baade, 2018), the logics of music streaming build on those of earlier technologies for domestic, ambient music, while much of the critical literature about music streaming echoes traditions suspicious of “passive” listening and musical “bad taste,” often framed in feminized terms. I believe a crucial strategy for critiquing streaming platforms involves being mindful about questions of labour, both for listeners (for whom affordable, effortless listening represents a “solution” to tedium, stress, and the yearning for a better life) and for artists, whose labour has historically been recast as avocation while being exploited by record labels and other industry actors. Insisting on the importance of labour offers a counterweight to the language of streaming, which casts music as a free-flowing, natural resource, rather than as the result of musicians’ creative, intellectual, and physical work.

Focusing on labour means that we take seriously the impact that streaming platforms have on the livelihoods—and lives—of musicians. Few artists make a significant income from streaming: based on the logic of radio play and licensing, the rate per stream is a fraction of a cent. The average payout per play on Spotify is U.S. $0.00318 (Pastukhov, 2019)—a figure worth pausing on, as the sketch following this paragraph shows—and streaming services have resisted increases, such as when Spotify, Google, Pandora, and Amazon recently appealed the U.S. Copyright Royalty Board’s decision to raise payments to songwriters and publishers over the next four years (Aswad, 2019). Nonetheless, having one’s music on streaming platforms is an important form of promotion and marketing. Independent musicians must devote considerable effort to optimizing their discoverability on platforms that seem fair on the surface but perpetuate long-standing music industry biases regarding race, genre, and gender (Pelly, 2018). For me, the pressing question is how musicians negotiate this environment to make a livable life—and how we develop policies to support them. (Postscript: in this time of COVID-19, with its devastating impact on live performance and touring, this question is all the more urgent.)
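To put that fraction of a cent in perspective, here is a back-of-envelope calculation in Python; the US$30,000 income target is an arbitrary illustration, not a figure from the sources cited.

# At the average Spotify payout of US$0.00318 per stream (Pastukhov, 2019),
# the plays needed to gross a hypothetical US$30,000 a year—before any
# label, distributor, or publishing splits:
per_stream = 0.00318
target_income = 30_000
print(f"{target_income / per_stream:,.0f} streams")  # ≈ 9,433,962 streams per year

Even a modest income target on this scale implies millions of plays annually, which is why streaming income remains out of reach for most independent artists.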

Andrea Zeffiro

Now more than ever we need to pay close attention to how Big Tech leverages ethics as a strategic trend. The rise of ethics in 2019 was swift. In a matter of months, ethics superseded transparency as a buzzword, becoming a focal point for tech journalists and business leaders (Boyle, 2019; Gartner, 2018; Howard, 2018; West, 2018), while companies embarked on public ethics initiatives. We observed Microsoft and Google publish ethical principles to guide and govern their respective research and product development in AI (Microsoft, 2019; Pichai, 2018). We witnessed Axon (2018), the company that manufactures TASERs, launch its AI and Policing Technology Ethics Board, and Google establish its now defunct Advanced Technology External Advisory Council (Walker, 2019). Ethics in these contexts conveyed openness and accountability across organizational cultures and technical processes. Yet uncertainty persists as to what exactly constitutes ethics in Big Tech.

Perhaps, like transparency, ethics is used to signal a responsiveness to public scrutiny. Transparency as a keyword used by tech companies reached its apex shortly after revelations were made public about the spread of propaganda and misinformation on social media platforms. In 2018, Facebook released its ad transparency tools, disclosing a limited range of backend operations (Leathern, 2018). The tools revealed the amount of advertising activity carried out on the platform but did not make transparent precisely how ads operated on the platform. Moreover, while the company moved to disclose its ad monitoring and enforcement processes, consumers were still expected to submit to the company’s opaque terms of service. In this respect, Facebook’s investment in transparency amounts to a kind of transparency theatre (see also Schneier, 2003). The company made the tools available and cultivated the sense of improved transparency (Sonderby, 2018), while doing little or nothing at all to implement transparency across the platform (Merrill & Tobin, 2019).

As the tech industry continues to co-opt ethics, we will likely observe something similar: an ethics theatre, or ethics washing (Hao, 2019; Wagner, 2018). In fact, we encountered this with Google’s Advanced Technology External Advisory Council (ATEAC). When Google launched the initiative in March 2019, the council was charged with helping implement the company’s AI principles, what Google described as “an ethical charter to guide the responsible development and use of AI in our research and products” (Walker, 2019, n.p.). Those asked to sit on ATEAC included the CEO of a drone company and the president of a right-wing think tank with a history of anti-immigrant, anti-transgender, and anti-LGBTQ advocacy (Levin, 2019). Shortly after announcing the formation of the council and naming its members, Google received significant pushback from within the company and from the public, and the council was dissolved (Googlers Against Transphobia, 2019; Wakefield, 2019).

Google instrumentalized ethics through a public-facing initiative to signal the company’s commitment to establishing ethical benchmarks for AI. ATEAC presented the image of engaging with ethics while doing very little or nothing at all. Furthermore, given the expertise and worldviews Google condoned in the steering of its ethical charter, ethics in this context feels counterfeit.

What is the most powerful example of research design that has or could disrupt or transform platform power?

Rena Bivens

We need to think about what kinds of platforms we are talking about, what platform accountability means, and what kind of power we are discussing. Research designs that make power visible are crucial in the sense of demonstrating how power is operating, what sustains it, and how it circulates. Research designs that create and make accessible new tools and strategies that are fun, playful, experimental, and sometimes practical are also quite interesting.

One example was discussed in Benjamin Bratton’s (2015) book The Stack: a piece of performance art by the Japanese artist Nobutaka Aozaki called Value_Added #240950 (Del Monte whole kernel corn no salt added), 2012. In the piece, the artist visits 105 different supermarkets, carrying the same can of corn into each one, and has it scanned and pays for it without ever having taken a new can off the shelf. It is fascinating because, as Bratton explains, the can is registered anew each time by these systems, affecting the supply and demand chain. Perhaps an order for another can of Del Monte corn will be made from each of these supermarkets. This is an example of understanding the operating logics of platforms and undermining them by making noise. Broadly speaking, I think it is interesting as a strategy, especially given my interest in recasting the work of software in a way that helps us sense how it operates and incites a greater desire to understand and even undermine or co-opt it for our purposes.

One more example is the TrackMeNot browser extension from Helen Nissenbaum, Daniel C. Howe, and Vincent Toubiana (Howe & Nissenbaum, 2009; Howe, Nissenbaum, & Toubiana, n.d.). TrackMeNot runs in your browser and, as they explain, it “periodically issues randomized search-queries to popular search engines” (Howe et al., n.d., para. 2). It hides what you are actually searching for amid the density of decoy queries, and it is particularly interesting in terms of undermining state attempts to surveil particular sets of search strings, including, for example, those that appear to indicate that someone is being radicalized in one way or another.
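As a rough illustration of the obfuscation principle, the following minimal Python sketch periodically issues randomized decoy queries. It is not TrackMeNot itself—the actual extension is browser-based JavaScript with adaptive query lists—and the decoy vocabulary, endpoint, and timing here are placeholder assumptions.

import random
import time

import requests

# Placeholder decoy vocabulary; TrackMeNot draws on evolving seed terms.
DECOY_QUERIES = ["weather radar", "banana bread recipe", "used bikes",
                 "municipal election results", "jazz standards"]

def issue_decoy_query() -> None:
    """Send one randomized query to a popular search engine."""
    query = random.choice(DECOY_QUERIES)
    requests.get(
        "https://www.google.com/search",  # assumed endpoint, for illustration
        params={"q": query},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=10,
    )

if __name__ == "__main__":
    while True:
        issue_decoy_query()
        # Randomized intervals make the noise harder to filter out.
        time.sleep(random.uniform(30, 300))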

Sara Bannerman

Some powerful research designs or tools have helped us to see something in a new way. This can be true for algorithm and platform audits. There are many types of audits: security audits, privacy audits, audits that check for discrimination, and more.

For example, Motahhare Eslami and colleagues (2015) created a Facebook app that shows, alongside a user’s normal Facebook newsfeed, everything their friends have posted. This new view allows the user to see what the Facebook newsfeed is missing. The authors then asked participants in the study to respond to what they saw: How does the fact that certain things are missing affect them? For example, some users were sorry they had missed important news from certain friends. Others, who had not had a reply to their own posts, realized their posts might not have been seen by other users. The study is excellent not only because it provides users with new knowledge about how Facebook and the newsfeed algorithm work but also because it is oriented to transforming how users use and interact with Facebook and their Facebook friends. Further research could be geared toward regulatory change.
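At its core, such an audit rests on a simple comparison between the algorithmically curated feed and the full set of friends’ posts. The toy Python sketch below, using hypothetical post records, shows the kind of set difference involved; the study’s actual app (FeedVis) worked against Facebook’s own data rather than structures like these.

# Hypothetical post records: (post_id, author).
all_friend_posts = [("p1", "Aisha"), ("p2", "Ben"), ("p3", "Aisha"),
                    ("p4", "Carol"), ("p5", "Ben")]
shown_feed = [("p1", "Aisha"), ("p5", "Ben")]  # what the algorithm surfaced

# Posts the newsfeed algorithm filtered out.
shown_ids = {post_id for post_id, _ in shown_feed}
hidden = [(post_id, author) for post_id, author in all_friend_posts
          if post_id not in shown_ids]

# Friends whose posts never appeared at all—the kind of absence that
# surprised participants in the study.
hidden_authors = {author for _, author in hidden}
visible_authors = {author for _, author in shown_feed}

print("Hidden posts:", hidden)
print("Friends never shown:", sorted(hidden_authors - visible_authors))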

Andrea Zeffiro

In addition to looking inward at our scholarly communities for methods and tools to disrupt platform power, we should look to civic tech and community and activist initiatives for approaches to transform platform power.

DocNow, which is short for Documenting the Now (n.d.), is a set of free and open-source tools designed to support the ethical assessment, collection, use, and preservation of social media content. The motivation for the project arose from an effort to collect social media content in the aftermath of the killing of Michael Brown on August 9, 2014, by Ferguson, Missouri, police officer Darren Wilson (Jules, Summers, & Mitchell, 2019). Social media, and Twitter in particular, became the space for disseminating information and for mobilizing and organizing. For those on the ground, Twitter was used to share images, video, and audio of the protests, while individuals not in Ferguson participated by directing attention to what was happening, retweeting and commenting on the protests from their perspectives (Jules et al., 2019). Thus, DocNow stems from a commitment to preserve digital content depicting events of historical significance through personal and cultural experiences (Jules et al., 2019).

Tamara Shepherd

There seems to be a tendency, particularly in studies of platforms that use the framework of surveillance capitalism (Zuboff, 2015, 2019), to point toward innovative kinds of research designs and artistic practices that subvert platforms’ data collection processes (Birchall, 2014; Howe & Nissenbaum, 2009). This kind of work is really important alongside a classic critical political economy approach that uncovers the deep links between the tech industry and governments. For example, Pawel Popiel (2018) has traced the way that tech industry lobbyists have pressured the U.S. government on legislation pertaining to privacy and copyright. He also points out the “revolving door” (Popiel, 2020, p. 568) nature of corporate lobbying where, in tech as in other industries, the same people move between governmental and lobbyist roles. These sorts of structural problems are endemic to the way capitalism functions in contemporary social democracies, and they have resulted in a particularly lax regulatory environment when it comes to Big Tech. What critical political economy suggests as a research method is to learn about the kinds of corruption at play and then challenge them through democratic channels.

Leslie Shade

It is important to think about historical infrastructures and policy histories. An exciting project I am engaged in looks at previous policy discourses of informational privacy. For instance, I am thinking about the Department of Communications’ 1971 Telecommission report, Instant World, which talked about a right to communicate and a right to privacy, and described “wired cities,” essentially the precursor to early internet communities, such as the community nets that were popular in the mid-1990s (Government of Canada Telecommission, 1971). By looking at Globe and Mail coverage from the 1970s, we can see how eloquently these policy initiatives were addressed. We can also look back to the mid-1990s and the information highway policy initiatives. We were also talking back then about a right to communicate and a right to privacy. What happened? Neoliberalism exploded. Currently, we are having conversations similar to those of the 1970s—and the 1990s. Looking back to earlier policy conversations is super important, because we can not only learn a lot about how we articulated the nexus of communication technologies and socio-economic possibilities, but we can also look at lost opportunities to value and promote the public interest through robust and actionable policy.

Can platforms and algorithms be liberating?

Sara Bannerman

Antoinette Rouvroy and Thomas Berns (2013) argue that algorithmic governance is liberating, but not emancipatory. Algorithms are “liberating” from one (neoliberal) point of view, in the sense that they appear to move away from governmental policy or regulation in the traditional sense. However, they do not afford the kinds of public transparency and decision-making that are required for emancipation.

Andrea Zeffiro

We are constantly persuaded to invest in the promise that platforms and algorithms are inherently liberating. This cajoling happens most often through vision statements and manifestos that present future visions in which technologies change things for the better. Better than what? Better than the present. In this sense, the liberatory potential of platforms and algorithms is predicated on how the future possibilities of these technologies can liberate us from the current state of things (Zeffiro, 2019b). However, there is a cruel optimism (Berlant, 2011) to this form of liberation because even though it often feels like we are moving toward that promise, the future remains perpetually out of reach. And because this future can never be fully realized, we reach toward it repeatedly, expecting that this time things will change in the right way (Berlant, 2011). This is precisely why we are able to continue to invest in celebratory future visions, even as the present is marred by failures, disappointments, and shortcomings.

Rena Bivens

In terms of the identity space, where I work, there is always the issue of strategically claiming specific identities to request resources and rights from the state. At the same time, some anxieties and dangers go along with identification, such as increased surveillance capabilities.

There are also problems of misidentification given the biases built into systems. Sasha Costanza-Chock (2018) explains what it is like to move through airports knowing that the millimetre wave scanner, and the employees operating it, will apply a binary lens and thus, given Costanza-Chock’s gender expression and identity, will fail to correctly account for their gender.

Joy Buolamwini and Timnit Gebru (2018) clearly show that as soon as you take an intersectional lens to the issue of AI you see the massive disparities in terms of identification capabilities; White guys are identified with high levels of accuracy, whereas Black women in particular are frequently misidentified. As we strive to perfect these systems, we should also think through what this means for identity and identification in the future.

Tamara Shepherd

Two things currently stand in the way of the liberatory potential of platforms: algorithms’ predictive capacity and their pervasiveness. Algorithmic prediction is an inherently conservative activity, since past behaviours become entrenched as future possibilities. Shoshana Zuboff (2015) talks about platforms’ predictive capacity as “anticipatory conformity” (p. 82) in that people’s paths are already shaped by the requirements of surveillance capitalism. This sort of predetermination has its roots in the predictive endeavours of the insurance industry, where statistics were introduced in the late nineteenth century to quantify risk (Bouk, 2015). The financial imperative of prediction as risk management was paired with socially discriminatory practices—for example, in insurance redlining of Black communities (Heen, 2009)—that have carried over from statistical models into the big data of contemporary platforms. While an historical trajectory of prediction might thus be traced from insurance companies to social media platforms (e.g., Leurs & Shepherd, 2017), there are also other key periodizing contexts for contemporary prediction that limit the possibilities for liberation, much less emancipation. For example, Tiziana Terranova’s (2004) characterization of digital subjects uses the Deleuzian notion of “dividuals,” or sub-individual units, to describe how people get rendered through the prism of their micro tastes and preferences on social media platforms (Deleuze, 1992). The usefulness of “dividuals” to capital is to facilitate “the real-time observation, communication, analysis, prediction, and modification of actual behaviour now and soon” (Zuboff, 2015, p. 84). This is the dominant model for prediction in major platforms; perhaps there could be some alternative version of algorithmic governance built on the posthuman potentials of AI. As AI systems grow more sophisticated, and in some ways unpredictable, a version of Donna Haraway’s (1991) liberatory cyborg could come into view, although given the deeply commercial imperatives of the key developers of such technology right now, it is difficult to imagine that happening any time soon.

Christina Baade

In the early 2000s, there was a great deal of optimism that digital and online music promotion and distribution would liberate musicians from exploitation, especially by record labels. As I discussed above, streaming platforms have not brought liberation for musicians: not only have they reinforced the corporate power of the major labels but their rise has been accompanied by the downloading of risk and labour to musicians (Haynes & Marshall, 2018). The freedom that independent musicians have gained is a cruel fulfilment of the internet optimism of the early 2000s.

The promise of liberation that music streaming platforms and recommendation algorithms make to listeners is more complex. Essentially, they promise safe passage through the dizzying quantity of music that is available to us digitally. Easily and quickly, they offer us mood enhancement, a pleasant environment, even pleasure—all hallmarks of aspirational consumerism. But at a deeper level, they also offer us a sensation of care, participating in what Arlie Hochschild (2012) calls the “outsourced self,” in which the market serves our needs for care work and emotional labour, not least because sustaining life is so exhausting for so many people. Obviously, the thing we get from music platforms is not real liberation, but I also think it is important to acknowledge that they serve real needs—needs that will only be met by liberation extending beyond (while also including) our platforms.

How can researchers and policymakers work together to effect change?

Leslie Shade

There are opportune policy moments for researchers and policymakers to work together, but context is important. I am relatively optimistic (at times!) that there is now an interest by policymakers in academic scholarship to inform the development of federal government strategies. When Tamara talks about the revolving door, the sort of question I always ask is: What are the structures and processes of participation in policymaking? There are a lot of initiatives, but how is meaningful public engagement happening on the ground?

I know individual policymakers who are keen to do policy in the public interest, and to integrate academic research into their work. The Social Sciences and Humanities Research Council (SSHRC) partnership grants, which rely on a network of community, government, and industry partners, are a useful example for building scholarly-policy capacities. There are good people who are working in these contexts. The federal public service is hungry for really smart, energetic, engaged young people, and I am very pleased that some of our recent graduates go into government policy jobs with a genuine public interest and social justice ethos.

What regulatory futures should researchers attend to, and how can research contribute to establishing appropriate forms of platform regulation?

Sara Bannerman

We should attend to, or try to imagine, a future where the relationship between platforms and politicians, political parties, or government is less problematic. One problem has been pointed to already: the “revolving door” that can operate between government and tech companies. The fact that political parties use platforms for political advertising—whether through the platforms’ political services units or, in some cases, even through platform employees who are themselves embedded in political parties’ campaigns—creates a huge problem.

Contrast this with broadcasting: an arm’s-length regulator prevents, to some extent, the development of problematic relationships between politicians and communications companies, because politicians do not play as direct a role in regulating media companies. Of course, there can be a revolving door there too, and regulatory capture, but there is an extra distance built in. Communication scholars should attend to the relationship between politicians and platform companies.

Tamara Shepherd

The problems suggested here are important ones to parse out as they are particularly poignant in liberal democracies. Again, I raise the example of Chinese super apps to illustrate how internet companies’ complicity with a surveillance state might be mobilized in large-scale social control experiments such as the social credit system. Of course, Western platforms have been implicated in state surveillance, as seen in the Snowden revelations about the United States surveillance program PRISM, as well as in state-level censorship, as in Facebook’s content moderation practices in countries such as Turkey (Klonick, 2018). I think future critical political economy research would do well to engage with a comparative approach that considers what is going on not only in North America but in other places in the world, where regulatory considerations around platforms often come into even starker relief.

Rena Bivens

I would draw out conversations around Whiteness. We have spoken about how we are cushioned from certain things and how our lives are structured by social infrastructure, but I am interested in how Whiteness draws on a system of privilege to sustain and confirm power within these systems. How does this operate in conjunction with the calculated invisibility of software processes? Robin DiAngelo (2018) talks about White racial insulation and how White folks have the capacity to organize their lives in segregated ways and as a result do not have to think about race. In fact, White people are encouraged to avoid learning how to talk and think about race in complex ways. The lack of tolerance for race-based stress among White people also bolsters White supremacy. DiAngelo illustrates the reactions that follow, which include anger, victimhood, defensiveness, dominance, and avoiding Whiteness or White cultural identification altogether.

Similarly, there is often no recognition that there is value in learning from and with people of colour. So, I am wondering about these White cultural practices and how they link to colonial logic, as well as how they are all embedded in these platforms and thus also in how our lives are structured and shaped. Whiteness is central to the social construction of the default and of the norm, and so for me, there are a lot of parallels to the study of the defaults and norms programmed into software and platforms.

Christina Baade

Going back to the underlying structures of radio and music, it is important to recognize that regulators have historically defined “the public interest” in problematic and limited ways. “The public interest” has variously been aligned with “high culture,” with a diversity of commercial radio formats (which does not necessarily equate with a diversity of ownership, labels, artists, or songs), and with CanCon [Canadian content] quotas. Meanwhile, there have always been pressures limiting the diversity of music we hear: racial segregation in the construction of music industry genres (Miller, 2010); payola and other forms of bribery to manipulate radio playlists; and the ongoing marginalization of women’s voices in radio formats (as shown, for example, in the excellent work of Jada Watson [2019]). It is critical that we uncover these deep-seated biases in the music industry, attending to both their histories and how they persist on internet music platforms.

With scholars such as Charles Fairchild (2012) and Kate Lacey (2013), I also worry about the deeper effects of customized playlists and recommendation algorithms that prevent us from encountering music that we may not like. I think there is something valuable, even democratic, about encountering sounds, people, and ideas that we may find unfamiliar—or that may challenge our personal aesthetic or our assumptions about how the world should be. How do we sustain values that have been nurtured in community media, such as participation, engaging with those different from us, and even having difficult conversations? What does genuine inclusiveness look like, and how do we make structures to sustain it? 

Andrea Zeffiro

Some of my research over the last few years has focused on disentangling the ethical conundrums researchers face when working with social media platforms (Zeffiro, 2019a). This work advocates for researchers and research communities to develop ethical approaches to social media research, rather than permit platforms or service providers to establish the ethical benchmarks for research. Given what we know about the tech industry’s co-optation of ethics, perhaps we need to envision new frameworks that reorient ethical concerns about platform power away from exclusively moral purviews.

I am intrigued by the concept of “ethical infrastructure” as a possible framework for evaluating platform politics. The term originates from legal discourse and describes formal systems, procedures, and policies that organizations put in place to establish ethical practices and conduct (Schneyer, 1991, 2011). Again, I am not advocating for an assessment of moral values or behaviours, but rather, I am interested in how the concept might be adapted into an interrogative framework to assess how principles such as justice, fairness, equity, and inclusivity are sustained throughout a platform. How do these standards register across resources and objects (e.g., humans, hardware and software, raw materials) that are arranged to produce a platform? And what values in turn are reproduced by the artefacts, techniques, organizations, and systems that constitute a platform?

If we return to the ATEAC example and consider the contradiction between what Google suggested the council was supposed to do and the troubling expertise and worldviews it deemed crucial to steering an ethical charter, then we are compelled to question not only the values these competencies and experiences reflect, but also for whom they are of value.

Thus, we must continue to observe how Big Tech performs ethics. If the tech industry can demonstrate an aptitude for self-regulation through codes, policies, boards, and councils, then external regulation appears redundant. In turn, ethics becomes a means to resist industry regulation (Wagner, 2018), rather than a sustained project aimed at transforming organizational cultures and technical practices.

Acknowledgements

The authors would like to thank Emmanuel Appiah and Kim Payne for their research assistance on this article. They would also like to thank David Ogborn and the Centre for New Media and Performance for organizing the May 9, 2019, event. Finally, Sara Bannerman would like to thank McMaster University, the Canada Research Chairs program, and the Social Sciences and Humanities Research Council for making this work possible.

Note

  1. Only very recently, Facebook quietly changed the sign-up sheet from a mandatory, binary-only field to three options, one of which is custom gender. This change happened more than five years after the company had already made a similar change within individual profile pages. This means that today, new users are no longer forced to select a binary gender for themselves when signing up to use Facebook. In other words, new users who do not identify as male or female no longer have to lie in order to gain access to the site.

Websites

Canadian Heritage, https://www.canada.ca/en/canadian-heritage.html

Canadian Radio-television and Telecommunications Commission, https://crtc.gc.ca/eng/home-accueil.htm

Competition Bureau Canada, https://www.competitionbureau.gc.ca/eic/site/cb-bc.nsf/eng/home

Innovation, Science and Economic Development Canada, https://www.ic.gc.ca/eic/site/icgc.nsf/eng/home

Office of the Privacy Commissioner of Canada, https://www.priv.gc.ca/en/

References

Anderson, Tim J. (2014). Popular music in a digital music economy: Problems and practices for an emerging service industry. New York, NY: Routledge.

Aswad, Jem. (2019, April 9). Hit songwriters slam Spotify’s attempt to lower royalties: “You have used us.” URL: https://variety.com/2019/biz/news/spotify-secret-genius-songwriters-lower-royalties-1203184870/ [May 28, 2020].

Axon. (2018). Axon AI and policing technology ethics board. URL: https://www.axon.com/axon-ai-and-policing-technology-ethics [May 28, 2020].

Baade, Christina. (2018). Lean back: Songza, ubiquitous listening and Internet music radio for the masses. Radio Journal: International Studies in Broadcast & Audio Media, 16(1), 9–27.

Bannerman, Sara. (2019, May 1). Canada’s glaring failure to regulate Facebook. URL: https://policyoptions.irpp.org/magazines/may-2019/canadas-glaring-failure-regulate-facebook/ [May 28, 2020].

Barbrook, Richard, & Cameron, Andy. (1996). The Californian ideology. Science as Culture, 6(1), 44–72. doi: 10.1080/09505439609526455

Berlant, Lauren Gail. (2011). Cruel optimism. Durham, NC: Duke University Press.

Birchall, Clare. (2014). Aesthetics of the secret. New Formations, 83(83), 25–46.

Bivens, Rena. (2017). The gender binary will not be deprogrammed: Ten years of coding gender on Facebook. New Media & Society, 19(6), 880–898.

Bouk, Dan. (2015). How our days became numbered: Risk and the rise of the statistical individual. Chicago, IL: University of Chicago Press.

Boyle, Katherine. (2019, February 5). Goodbye, trolley problem. This is Silicon Valley’s new ethics test. URL: https://www.washingtonpost.com/opinions/welcome-to-your-new-ethics-test-silicon-valley/2019/02/05/eb9f6e10-2969-11e9-b2fc-721718903bfc_story.html [May 28, 2020].

Bratton, Benjamin H. (2015). The stack: On software and sovereignty. Cambridge, MA: MIT Press.

Bucher, Taina. (2018). If … then: Algorithmic power and politics. New York, NY: Oxford University Press.

Buolamwini, Joy, & Gebru, Timnit. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of Machine Learning Research, 81, 77–91.

Canada. Broadcasting Act, SC 1991, c11.  URL: https://laws-lois.justice.gc.ca/eng/acts/B-9.01/ [August 25, 2020].

Canada. Broadcasting and Telecommunications Legislative Review Panel. (2020). Canada’s communications future: Time to act. Ottawa, ON: Department of Innovation, Science and Economic Development Canada. URL: http://www.ic.gc.ca/eic/site/110.nsf/eng/00012.html [August 25, 2020].

Canada. Telecommunications Act, SC 1993, c38. URL: https://laws.justice.gc.ca/eng/acts/T-3.4/ [August 25, 2020].

Canadian Radio-television and Telecommunications Commission. (1999, December 17). Public Notice CRTC 1999-197: Exemption order for new media broadcasting undertakings [Orders]. URL: https://crtc.gc.ca/eng/archive/1999/PB99-197.htm [November 3, 2018].

Canadian Radio-television and Telecommunications Commission. (2009, October 22). Broadcasting Order CRTC 2009-660: Amendments to the exemption order for new media broadcasting undertakings (Appendix A to Public Notice CRTC 1999-197); Revocation of the Exemption order for mobile television broadcasting undertakings [Orders]. URL: https://crtc.gc.ca/eng/archive/2009/2009-660.htm [November 3, 2018].

Chen, Yujie, Mao, Zhifei, & Qiu, Jack Linchuan. (2018). Super-sticky WeChat and Chinese society. Bingley, UK: Emerald Publishing Limited.

Costanza-Chock, Sasha. (2018). Design justice, AI, and escape from the matrix of domination. Journal of Design and Science, 3(5). doi: 10.21428/96c8d426

Cotter, Kelley. (2019). Playing the visibility game: How digital influencers and algorithms negotiate influence on Instagram. New Media & Society, 21(4), 895–913. doi: 10.1177/1461444818815684

Deleuze, Gilles. (1992). Postscript on the societies of control. October, 59, 3–7.

DiAngelo, Robin. (2018). White fragility: Why it’s so hard for white people to talk about racism. Boston, MA: Beacon Press.

Documenting the Now. (n.d.). URL: https://www.docnow.io/ [May 8, 2020].

Eslami, Motahhare, Rickman, Aimee, Vaccaro, Kristen, Aleyasen, Amirhossein, Vuong, Andy, Karahalios, Karrie, Hamilton, Kevin, & Sandvig, Christian. (2015). I always assumed that I wasn’t really that close to [her]: Reasoning about invisible algorithms in news feeds. In Proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 153–162). ACM.

Fairchild, Charles. (2012). Music, radio and the public sphere: The aesthetics of democracy. New York, NY: Palgrave Macmillan.

Gartner. (2018, October 15). Gartner identifies the top 10 strategic technology trends for 2019. URL: https://www.gartner.com/en/newsroom/press-releases/2018-10-15-gartner-identifies-the-top-10-strategic-technology-trends-for-2019 [June 2, 2020].

Gillespie, Tarleton. (2010). The politics of “platforms.” New Media & Society, 12(3), 347–364.

Googlers Against Transphobia. (2019, April 8). Googlers against transphobia and hate. URL: https://medium.com/@against.transphobia/googlers-against-transphobia-and-hate-b1b0a5dbf76 [May 28, 2020].

Government of Canada. (2018, June 5). Broadcasting and Telecommunications Legislative Review [Home page]. URL: https://www.ic.gc.ca/eic/site/110.nsf/eng/home [May 21, 2020].

Government of Canada Telecommission. (1971). Instant world: A report on telecommunications in Canada. Ottawa, ON: Department of Communications.

Hao, Karen. (2019, December 27). In 2020, let’s stop AI ethics-washing and actually do something. URL: https://www.technologyreview.com/2019/12/27/57/ai-ethics-washing-time-to-act/ [May 28, 2020].

Haraway, Donna J. (1991). A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature (pp. 149–182). New York, NY: Routledge.

Haynes, Jo, & Marshall, Lee. (2018). Beats and tweets: Social media in the careers of independent musicians. New Media & Society, 20(5), 1973–1993.

Heen, Mary L. (2009). Ending Jim Crow life insurance rates. Northwestern Journal of Law and Social Policy, 4(2), 360–399.

Helmond, Anne. (2015). The platformization of the web: Making web data platform ready. Social Media + Society, 1(2), 1–11. doi:10.1177/2056305115603080

Hochschild, Arlie Russell. (2012). The outsourced self: What happens when we pay others to live our lives for us. New York, NY: Metropolitan Books.

Howard, Matthew. (2018, June 21). The future of AI relies on a code of ethics. URL: https://social.techcrunch.com/2018/06/21/the-future-of-ai-relies-on-a-code-of-ethics/ [May 28, 2020].

Howe, Daniel C., & Nissenbaum, Helen. (2009). TrackMeNot: Resisting surveillance in web search. In I. Kerr, V. Steeves, & C. Lucock (Eds.), Lessons from the identity trail: Anonymity, privacy, and identity in a networked society (pp. 417–436). New York, NY: Oxford University Press.

Howe, Daniel C., Nissenbaum, Helen, & Toubiana, Vincent. (n.d.). TrackMeNot. URL: http://trackmenot.io/ [May 8, 2020].

Jin, Dal Yong. (2013). The construction of platform imperialism in the globalization era. TripleC: Communication, Capitalism & Critique, 11(1), 145–172. doi: 10.31269/triplec.v11i1.458

Jules, Bergis, Summers, Ed, & Mitchell, Vernon. (2019). Ethical considerations for archiving social media content generated by contemporary social movements: Challenges, opportunities, and recommendations. Documenting the Now. URL: https://www.docnow.io/docs/docnow-whitepaper-2018.pdf [August 25, 2020]

Kircher, Philipp. (2020). Search design and online job search—new avenues for applied and experimental research. Labour Economics, 64, 1–8. doi: 10.1016/j.labeco.2020.101820

Klonick, Kate. (2018). The new governors: The people, rules and processes governing online speech. Harvard Law Review, 131, 1598–1670. 

Lacey, Kate. (2013). Listening in the digital age. In J. Loviglio & M. Hilmes (Eds.), Radio’s new wave: Global sound in the digital era (pp. 9–23). New York, NY: Routledge.

Leathern, Rob. (2018, June 28). A new level of transparency for ads and pages. URL: https://about.fb.com/news/2018/06/transparency-for-ads-and-pages/ [May 28, 2020].

Leurs, Koen, & Shepherd, Tamara. (2017). Datafication & discrimination. In M.T. Schäfer & K. van Es (Eds.), The datafied society: Studying culture through data (pp. 211–231). Amsterdam, NL: Amsterdam University Press.

Levin, Sam. (2019, March 29). “Bias deep inside the code”: The problem with AI “ethics” in Silicon Valley. URL: http://www.theguardian.com/technology/2019/mar/28/big-tech-ai-ethics-boards-prejudice [May 28, 2020].

Merrill, Jeremy B., & Tobin, Ariana. (2019, January 28). Facebook moves to block ad transparency tools—including ours. URL: https://www.propublica.org/article/facebook-blocks-ad-transparency-tools [May 28, 2020].

Microsoft. (2019). Responsible AI principles from Microsoft. URL: https://www.microsoft.com/en-ca/ai/responsible-ai [May 28, 2020].

Miller, Karl Hagstrom. (2010). Segregating sound: Inventing folk and pop music in the age of Jim Crow. Durham, NC: Duke University Press.

Morris, Jeremy Wade. (2016). Selling digital music, formatting culture.Berkeley, CA: University of California Press.

Office of the Privacy Commissioner of Canada. (2019, April 25). PIPEDA Report of Findings #2019-002: Joint investigation of Facebook, Inc. by the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia. URL: https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2019/pipeda-2019-002/ [May 8, 2020].

Office of the Privacy Commissioner of Canada. (2020, February 6). Notice of application with the Federal Court against Facebook, Inc. URL: https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/pipeda-complaints-and-enforcement-process/court_p/na_fb_20200206/ [May 8, 2020].

Parry, Julie. (2020, April 1). Yale releases COVID-19 treatment algorithm. URL: https://medicine.yale.edu/ysm/news-article/23611/ [May 7, 2020].

Pastukhov, Dmitry. (2019, June 16). What do music streaming services pay per stream (and why it actually doesn’t matter). Soundcharts Blog. URL: https://soundcharts.com/blog/music-streaming-rates-payouts [May 28, 2020].

Pelly, Liz. (2018, June 4). Discover weakly: Sexism on Spotify. URL: https://thebaffler.com/latest/discover-weakly-pelly [May 28, 2020].

Pichai, Sundar. (2018, June 7). AI at Google: Our principles. URL: https://www.blog.google/technology/ai/ai-principles [May 28, 2020].

Popiel, Pawel. (2018). The tech lobby: Tracing the contours of new media elite lobbying power. Communication Culture & Critique, 11(4), 566–585.

Popiel, Pawel. (2020). Let’s talk about regulation: The influence of the revolving door and partisanship on FCC regulatory discourses. Journal of Broadcasting & Electronic Media, 1–24. doi: 10.1080/08838151.2020.1757367

Qiang, Xiao. (2019). The road to digital unfreedom: President Xi’s surveillance state. Journal of Democracy, 30(1), 53–67.

Roose, Kevin. (n.d.). Rabbit hole [Podcast]. URL: https://www.nytimes.com/column/rabbit-hole [May 21, 2020].

Rouvroy, Antoinette, & Berns, Thomas. (2013). Algorithmic governmentality and prospects of emancipation. Réseaux, 1(177), 163–196.

Schneier, Bruce. (2003). Beyond fear: Thinking sensibly about security in an uncertain world. New York, NY: Copernicus Books.

Schneyer, Ted. (1991). Professional discipline for law firms. Cornell Law Review, 77(1), 1–46.

Schneyer, Ted. (2011). On further reflection: How professional self-regulation should promote compliance with broad ethical duties of law firm management. Arizona Law Review, 53, 577–628.

Shade, Leslie Regan. (2019, July 19–21). “Nosy people have always been a nuisance in the society that values privacy” [Presentation]. Canadian perspectives on privacy and trust: Then and now (panel). Social Media and Society Conference, Toronto, ON: Ryerson University.

Singh, Rianka, & Sharma, Sarah. (2019). Platform uncommons. Feminist Media Studies, 19(2), 302–303.

Sonderby, Chris. (2018, May 15). Reinforcing our commitment to transparency. URL: https://about.fb.com/news/2018/05/transparency-report-h2-2017/ [May 28, 2020].

Suciu, Peter. (n.d.). Data algorithms are being used on social media to track COVID-19’s impact. URL: https://www.forbes.com/sites/petersuciu/2020/03/12/data-algorithms-are-being-used-on-social-media-to-track-covid-19s-impact/ [May 7, 2020].

Terranova, Tiziana. (2004). Network culture: Cultural politics for the information age. London, UK: Pluto Press.

Tunney, Catharine. (2019, May 27). MPs warn Facebook’s Zuckerberg and Sandberg could be found in contempt of Parliament for no-show. URL: https://www.cbc.ca/news/politics/facebook-contempt-parliament-1.5145347 [May 8, 2020].

Van Dijck, José. (2009). Users like you? Theorizing agency in user-generated content. Media, Culture & Society, 31(1), 41–58.

Velkova, Julia, & Kaun, Anne. (2019). Algorithmic resistance: Media practices and the politics of repair. Information, Communication & Society, 1–18. doi: 10.1080/1369118X.2019.1657162

Wagner, Ben. (2018). Ethics as an escape from regulation: From ethics-washing to ethics-shopping. In E. Bayamlioğlu, I. Baraliuc, L. Janssens, & M. Hildebrandt (Eds.), Being profiled: Cogitas ergo sum: 10 years of profiling the European citizen (pp. 84–90). Amsterdam, NL: Amsterdam University Press.

Wakefield, Jane. (2019, April 5). Google’s ethics board shut down. BBC News. URL: https://www.bbc.com/news/technology-47825833 [August 25, 2020]. 

Walker, Kent. (2019, March 26). An external advisory council to help advance the responsible development of AI. URL: https://blog.google/technology/ai/external-advisory-council-help-advance-responsible-development-ai/ [May 8, 2020].

Watson, Jada. (2019). Gender on the Billboard Hot Country Songs chart, 1996–2016. Popular Music and Society, 42(5), 538–560.

West, Dave. (2018, April 19). Why tech companies need a code of ethics for software development. URL: https://www.entrepreneur.com/article/311410 [May 28, 2020].

Wolfson, Sam. (2018, April 24). “We’ve got more money swirling around”: How streaming saved the music industry. The Guardian. URL: https://www.theguardian.com/music/2018/apr/24/weve-got-more-money-swirling-around-how-streaming-saved-the-music-industry [August 25, 2020].

Zeffiro, Andrea. (2019a). Provocations for social media research ethics (Working Paper No. 19/2) (pp. 82–86). Hamilton, ON: Institute on Globalization & the Human Condition. URL: https://socialsciences.mcmaster.ca/globalization/research/publications/working-papers/working-papers-series [August 25, 2020].

Zeffiro, Andrea. (2019b). Towards a queer futurity of data. Journal of Cultural Analytics. URL: https://culturalanalytics.org/2019/05/towards-a-queer-futurity-of-data [August 25, 2020].

Zuboff, Shoshana. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89.

Zuboff, Shoshana. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York, NY: PublicAffairs.