
A feminist framework for urban AI governance: addressing challenges for public–private partnerships

Published online by Cambridge University Press:  20 December 2024

Laine McCrory*
Affiliation:
Graduate Program in Communication and Culture, Toronto Metropolitan University, Toronto, ON, Canada Department of Communication and Media Studies, York University, Toronto, ON, Canada

Abstract

This analysis provides a critical account of AI governance in the modern “smart city” through a feminist lens. Evaluating the case of Sidewalk Labs’ Quayside project—a smart city development that was to be implemented in Toronto, Canada—it argues that public–private partnerships can create harmful impacts when corporate actors seek to establish new “rules of the game” regarding data regulation. While the Quayside project was abandoned in 2020, it offers key lessons for the state of urban algorithmic governance both within Canada and internationally. Articulating the need for a revitalized and participatory smart city governance programme, this analysis prioritizes meaningful engagement in the form of transparency and accountability measures. Taking a feminist lens, it argues for a two-pronged approach to governance: integrating collective engagement from the outset of the design process and ensuring civilian data protection through a robust yet localized rights-based privacy regulation strategy. Engaging with feminist theories of intersectionality in relation to technology and data collection, this framework articulates the need to understand the broader histories of social marginalization when implementing governance strategies regarding artificial intelligence in cities.

Type
Data for Policy Proceedings Paper
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press

Policy Significance Statement

Artificial intelligence is rapidly being integrated into cities, often leaving policymakers unable to keep up. When corporations are given control of the integration of AI into urban centres, they can dismiss current regulations in favour of their own governance mechanisms. This paper proposes an integration of a feminist policy praxis into smart city development. A feminist approach addresses the risks of rapid datafication while integrating an intersectional, impact-based community engagement process in smart city governance and design. This ensures that civilians can be protected under a rights-based legal structure that understands the multifaceted risks posed by AI development, while also giving citizens the tools to hold corporate actors accountable.

1. Introduction

As urban infrastructure becomes interwoven with algorithmic processes, questions must be asked about who these technologies serve. While corporate partnerships regarding urban AI development promote “smart” technology as beneficial for all inhabitants, these technologies reflect histories of social marginalization, exploitation, and surveillance (Benjamin, 2019; Eubanks, 2018; O’Neil, 2016; Scheuerman et al., 2019). This research analyses the case of Sidewalk Labs’ Quayside project as demonstrative of greater issues facing public–private partnerships in urban AI governance. While the Quayside project was officially abandoned due to “financial constraints” (Doctoroff, 2020) resulting from the COVID-19 pandemic, an in-depth analysis demonstrates that the state of algorithmic governance in Toronto, and Canada more broadly, reflects a multifaceted story—one that highlights the importance of citizen participation, a rejection of corporate governance, and an urban environment that reflects the needs of its residents. Most importantly, it raises critical considerations for policymakers, as this analysis asks how justice and power are wielded in the age of smart cities. In proposing a feminist approach to smart city governance, this research demonstrates the need to address the potential risks of data universalism with a policy strategy that accounts for the multifaceted and contextual makeup of the urban environment.

2. Data in the smart city

In Andrejevic and Burdon’s (2015) depiction of the “sensor society”, infrastructures become infused with systems designed to facilitate the mass capture of data. The sensor society departs from the traditional mode of data collection as targeted and discrete, instead favouring continuous, constant data capture processes. Subjects in this case are reduced to mere points within a greater data system, stripping context from these interactions. Rob Kitchin (2017) further articulates this desire for more data in the city in his depiction of data-driven urbanism, in which the desire for Big Data systems influences and controls urban performance and responses. In this condition, a city’s success is not measured through local and situated analyses, but through the collection, amelioration, and mobilization of data according to a framework that prioritizes pragmatic and efficient solutions. As more data are gathered and evaluated for progress, additional data simultaneously need to be captured to enable that search for progress. Yet, those “producing” the data—often through simply existing within the environment itself—are unable to fully understand their contributions, as the means of control often lie outside of the user. As such, the sensor society is a phenomenon in which the passive, constant collection of data is implicated within a broader system of control.

In the smart community, the sensor society comes to life. Initiatives to build “smart cities” seek to integrate large-scale data collection into urban environments. While fully functional smart communities currently remain theoretical aspirations, they rely on the simultaneous integration of networked infrastructure, Big Data, machine learning processes, and data analytics to create solutions to urban problems (Kitchin, 2017; Botero Arcila, 2022). Proposals for smart communities may differ in the type of infrastructure they aim to transform, but are united in their desire to integrate ICTs into everyday environments to produce an optimized city experience (Powell, 2021). This idea of the optimized city, however, did not begin with the advent of ICTs. Rather, this practice reflects a larger view of the city as a business venture, made up of services and interests to be purchased, controlled, and sold (Frug, 1999). Following the Second World War, urban scientists began experimenting with the city as a logistical system of control, demonstrating their presence as “clear precursors to contemporary calls for cities to be built from the internet up” (Mattern, 2021, 64). In the 1980s and 1990s, machine learning developments and ICT infrastructure expansion aimed to integrate data into the very foundation of urban life (Kitchin, 2017). With this combination of infrastructure and a decades-long research programme into data collection, it is no surprise that corporate actors embraced calls for a “smart” city through large-scale investment in data-driven urbanism (Kitchin, 2017), as it unites the desire to sell services in the city with a push to find optimized solutions to urban issues.

Among those intrigued with the notion of the smart city as an optimization problem was Daniel Doctoroff, the Chief Executive Officer of Sidewalk Labs. Doctoroff’s work implementing the LinkNYC programme in 2015 resulted in international praise for the corporation, despite numerous concerns surrounding privacy and corporate surveillance.Footnote 1 From then on, Sidewalk Labs sought to increase its reach through the implementation of a large-scale “smart” community located in Toronto, Ontario, built upon a series of sensors that would create the infrastructure necessary for self-driving cars, environmentally safe and temperature-controlled buildings, smart trash removal, AI-moderated health records, and other digital interventions (Sidewalk Labs, 2019a). Each of these interventions created the ability to gain meaningful (and profitable) data about residents’ whereabouts, habits, and practices. Together, they produce the “smart” sensor society, where constant, asymmetrical, and opaque data collection enables predictive technology that can be marketed and sold. The reduction of civilian activity to data within this “smart community” was presented as objectively beneficial—not only for the economy but for citizens themselves (Waterfront Toronto, 2017).

A feminist study of smart urbanism is necessary to challenge these data cultures and assert the need for a localized form of data governance. Smart infrastructures collect data to be used in evaluation schemas that are presented as neutral, yet these schemas themselves are built upon histories of isolation, control, and unequal categorisation. Kate Crawford (2021) highlights how the mugshots used to train facial recognition algorithms within the NIST dataset are promoted as neutral and devoid of context. However, behind these images are consequences, as “neither the people depicted in the photographs nor their families have any say about how these images are used and likely have no idea that they are part of the test beds of AI” (Crawford, 2021, 92). Despite being “publicly available”, these photographs are not neutral but reflect broader histories within the criminal justice system, where communities of colour are often subjected to increased surveillance and police presence (Maynard, 2017; Browne, 2015; Benjamin, 2019). While this is just one example, this depoliticisation of data is part of “a shift from image to infrastructure, where the meaning or care that might be given to the image of a person, or the context behind a scene, is presumed to be erased at the moment it becomes part of an aggregate mass that will drive a broader system” (Crawford, 2021, 94). Within the smart city, the mass capture of data enabled by an always-on surveillance regime results in a collection process where individual context is obscured. A focus on the Quayside project demonstrates the inherent locality of smart cities, a locality that underscores the need to re-contextualize complex urban issues. Only through this process can we develop the context necessary to build “smart-er” solutions that do not rely on narratives of objective and universal data. Applying a feminist lens to both this case and its future policy implications is an integral step in building these solutions, as it works directly to challenge universalism and expose invisible power relations. Developing stronger algorithmic governance procedures requires this rejection of universalism in order to centre values of meaningful participation and accountability, so that these structures of black-boxed power (Pasquale, 2015) can continue to be exposed and challenged.

3. Sidewalk Labs and the pitfalls of corporate algorithmic governance

In the mid-2000s, Waterfront Toronto, an organization created in collaboration with the federal, provincial, and municipal governments, purchased the plot of land that would later house the Quayside project for $68 million (Auditor General of Ontario (AG), 2019). This land was purchased to “build affordable housing, provide public access to the water’s edge, enable streetcar track extensions, locate an energy plant and enable other development opportunities” (AG, 2019, 662). Nearly a decade later, Waterfront Toronto published a Request for Proposals (RFP) that aimed to innovate and transform the space. On October 17, 2017, 6 months after the initial RFP, Sidewalk Labs was announced as the successful bidder (Waterfront Toronto, 2017). This announcement came incredibly quickly, as the initial finalists were selected a mere 6 weeks after the RFP was issued (AG, 2019, 44). The discovery of communications between the Chief Planning and Design Officer of Waterfront Toronto and the CEO of Sidewalk Labs was presented as potential evidence of unfair competition, as the two were reportedly engaged in frequent communication between 2016 and the issuance of the RFP. Even before the development of a proposal for smart infrastructure, questions about the ethics of public–private partnerships in Toronto were arising.

Throughout the next two and a half years, Sidewalk Labs would host five public roundtables, establish a Residents Panel, and publish the long-awaited Master Innovation and Development Plan (MIDP) (Sidewalk Labs, 2019a). Yet, with each ‘public engagement’ activity, Sidewalk Labs faced numerous questions surrounding the logistics and monetisation of data. Residents and academics questioned the ambiguous proposal of sensors within the neighbourhood that would enable smart decision-making, asking about control, the management of personal information, and anonymity (Wylie, 2018; Public Consultation Question List, n.d.; Zarum, 2019). In answer to these concerns, Sidewalk Labs developed a data protection structure: the data collected in the community would be known as “urban data”—information gathered in the public realm, publicly accessible spaces, and certain private buildings (Sidewalk Labs, 2019a)—and held in a trust. Yet, the category of “urban data” is non-existent within Canadian legislation, leaving no precedent for an Urban Data Trust to work from, other than Sidewalk’s own Responsible Data Use Guidelines (Sidewalk Labs, 2019b).Footnote 2 While data trusts have been presented as potential solutions to ambiguous data governance (Delacroix and Lawrence, 2019), Sidewalk Labs’ proposal highlights how accountability becomes difficult when corporate actors ignore diverse needs and define their own data governance methods.

3.1. First pitfall: Failure to create informed consent

When implementing any new urban innovation, informed consent is an important factor in guaranteeing the protection of citizens. As a practice, informed consent prioritises the digital rights of citizens, making them an active part of the process. The first pitfall I will examine is Sidewalk Labs’ failure to adhere to a proper informed consent model: within the Quayside project, Sidewalk Labs treated citizen feedback as an additive feature, rather than a central aspect of the project’s development. This failure resulted in minimal citizen engagement, making it difficult to achieve public trust.

Within Toronto, three pieces of legislation are especially important in terms of public–private partnerships and data collection: the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA, 1990), the Freedom of Information and Protection of Privacy Act (FIPPA, 1990), and the Personal Information Protection and Electronic Documents Act (PIPEDA, 2000). These three acts work in conjunction to provide strict privacy protections regarding the collection of personal information, with PIPEDA operating at the federal level, FIPPA at the provincial level, and MFIPPA at the municipal level. While the three acts differ in scope—PIPEDA covering private entities and FIPPA/MFIPPA covering public institutions—all three highlight the importance of consent and the ability to “opt out” as essential to privacy protection. PIPEDA creates the conditions in which corporations are deterred from using unethical data practices through ten principles to safeguard the collection of personal data: accountability; identifying purposes; consent; limiting collection; limiting use, disclosure, and retention; accuracy; safeguards; openness; individual access; and challenging compliance (PIPEDA Fair Information Principles, 2019). FIPPA and MFIPPA complement this practice, as they apply to public institutions and aim to protect an individual’s right to privacy while ensuring that data is collected for specific purposes such as administration or law enforcement (Beamish, 2019, 2). In proposing “urban data” as a new category, Sidewalk Labs was able to sidestep these regulations, which only apply to personal data, subsequently neglecting the need for consent and participation.

In failing to adhere to the requirements of existing legislation, Sidewalk Labs violated Canadians’ personal and collective privacy rights, as informed consent was impossible to achieve within smart city infrastructure (CCLA, 2019, 4). Smart neighbourhoods present an unacknowledged risk of coercion for civilians, as individuals living, working, or passing through these areas often have no alternative to the “smart” services existing within the neighbourhood (Beamish, 2019). As citizens would be unable to meaningfully opt out of data collection within the space, the potential for surveillance and commoditization contradicted the precedent established by federal privacy legislation, in addition to the rights to Freedom of Assembly and Association, Life, Liberty and Security of the Person, and freedom from Unreasonable Search or Seizure as outlined in the Canadian Charter of Rights and Freedoms (CCLA, 2019, 9). As Ellen Goodman makes clear in her affidavit regarding informed consent in the Quayside neighbourhood, “consent is insufficiently ‘knowing’ when the user does not understand the technology being agreed to or the practical consequences of agreeing. Consent is insufficiently ‘free’ when a person must choose between consent and the loss of an important function or asset. The project poses both of these dangers to a significant degree” (Goodman, 2020, 13). Sidewalk Labs aimed to impose infrastructure that would have strong impacts on citizens’ daily lives, and as such, should have treated informed consent with care and caution. If people do not understand the implications of mass data capture, they will be unable to hold corporate actors and governments accountable. This becomes especially essential when technology is integrated into urban spaces, as simply telling people not to enter a neighbourhood is not always a possibility.

Yet, these issues are not solely representative of the Quayside project. Rather, the current global structure of data regulation itself relies on an interlocking value system in which the ideas of notice and consent are at the forefront. Canada’s PIPEDA, the UK’s Data Protection Act (2018), and both the EU and UK General Data Protection Regulation (GDPR, 2016; Information Commissioner of the UK, 2021) share a commitment to a regulatory data framework in which individuals can both know when their information is being processed and consent to its collection. Similar to PIPEDA’s Fair Information Principles, the GDPR outlines seven key principles with which personal information is to be governed: lawfulness, fairness, and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability (GDPR, 2016, 5.1). These sets of principles outline the values behind these policy decisions, encouraging practices of transparency and acceptance.

On deeper reflection, these principles are far from effective in promoting safe data practices, as they place an overwhelming onus of responsibility on the individual to find, understand, and then challenge or accept the collection of their data. Here, the desire for pragmatic solutions to complex problems treats transparency as a direct path to accountability. However, these legislative principles fail to account for the full problem at hand—a lack of digital literacy and a feeling of powerlessness when face-to-face with Big Data (Obar, 2015). Data protection in its current form privileges seeing over understanding, where an increase in transparency is equated with an increase in knowledge, and subsequently an increase in control (Ananny and Crawford, 2018). Yet, being given information does not mean that meaningful change will occur. Nor does it mean that those who are supposed to be holding a system accountable will be able to achieve those goals. Rather, a sole focus on transparency fails to reach a critical stage—one that recognizes the deployment of smart infrastructure as a source of collective harms and demands meaningful engagement with the construction of community-based systems. The implementation of intelligent decision-making and data collection cannot be regulated with a sole focus on individuality. Instead, data governance needs to develop clear policies aligned with feminist protocols of challenging power, providing residents and citizens with the tools to critically evaluate this information (Kern, 2019). Informed (and participatory) consent within digital policy is an incredibly important practice for ensuring ethical technological intervention. Introducing a new type of data with no clear legislative protections fails to ensure that legitimate consent is possible. As the Quayside project highlights, developing new standards built solely on the perception of transparency fails to address the need for collective support and engagement.

3.2. Second pitfall: Promotion of corporate governance

To regulate this new category of data, Sidewalk Labs created the Urban Data Trust—an external regulatory body responsible for making decisions surrounding data in Quayside. The roles of the Urban Data Trust involved the “approval and management of data collection devices placed in the public realm, as well as addressing the challenges and opportunities arising from data use, particularly those involving algorithmic decision-making” (Sidewalk Labs, 2019a, 420). The data-sharing agreements were to be reminiscent of data licence agreements and enforceable in court (Sidewalk Labs, 2019a). However, Sidewalk Labs also stipulated that this trust was not a legal trust, and failed to identify what these court proceedings would look like or how data breaches could be legally enforced. Further, the cases themselves would be evaluated not according to Canadian legislation, but to Sidewalk Labs’ own Responsible Data Use Guidelines (Sidewalk Labs, 2019b). The establishment of its own criteria for responsible data use, combined with the creation of an entirely new form of data, essentially made Sidewalk Labs the new expert on the ethical use of data in smart cities. Rather than turning to the government for guidance, a corporate actor sought to create the new rules of the game.

The issue at hand is the notion of corporate governance as essential to the implementation of the Quayside project. The increasing prevalence of corporate governance can be seen in the Trust’s ability to turn residents’ data into assets. The process of creating an asset-based understanding of data is outlined by Artyushina (2022) as constitutive of five interconnected processes: “introducing new legal definitions for the data, retrofitting city infrastructure with data-tracking devices, creating a self-certification regime for data collectors, accumulating the data collected in the smart city in one physical location, and establishing IP-intensive data sharing agreements” (Artyushina, 2022, 8). In turning residents’ data into assets for evaluation, the data trust aimed to establish data as an infrastructural element of urban life, rather than as part of a transactional relationship, redefining the rules of data governance according to Sidewalk Labs’ own procedure.

Further, there was a distinct lack of government consultation, ultimately resulting in harsh consequences and a failure to protect the rights of residents. Unlike in its previous operating practices, Waterfront Toronto did not adequately consult with any level of government regarding the Sidewalk Labs project (Auditor General, 2019, 650). The scope of the project, from self-driving vehicles to data collection, fell under multiple provincial and federal ministries and city departments, but Waterfront Toronto did not adequately consult with any of them before signing an initial agreement (AG, 2019, 650). In 2002, Waterfront Toronto established an Intergovernmental Steering Committee made up of representatives from the municipal, federal, and provincial governments (Eidelman, 2013). Yet, Sidewalk Labs and Waterfront Toronto failed to adequately use the committee’s expertise and knowledge to their advantage. The steering committee was consistently kept in the dark regarding decisions surrounding Quayside; the Auditor General of Ontario found that the committee was only made aware of the name of the successful bidder 5 days before the public announcement (Auditor General, 2019, 649). Upon further investigation, the Auditor General found that the steering committee itself was not provided with a framework or guide to support its decision-making process (Auditor General, 2019, 681).

Following the proposal of the Urban Data Trust, Waterfront Toronto’s Digital Strategy Advisory Panel (DSAP) rejected the notion of urban data entirely (DSAP, 2020). Instead, it asserted that data collected through Quayside would be understood within the existing Canadian legal structure. Yet, given the limited control and input the Intergovernmental Steering Committee was given in regard to the Quayside project, the ways in which these laws would shape governance processes remained vague. These concerns about government participation and the establishment of new trusts and categories of data governance present an important consideration moving forward: there needs to be a rigorous process to protect the rights of citizens, so that it is not corporations establishing the rules of the game.

Public–private partnerships present complexities in governance, especially with regard to data. When viewed through a feminist lens that aims to see the “not-seen” (Smith, 2015), these complexities represent deeper systemic issues regarding mass surveillance and human rights. While these proposals are marketed under the guise of public benefit, enabling corporate governance carries drastic consequences for civilians. As Sidewalk Labs aimed to promote an environment where technology solves all issues, the Quayside project ignored the risks that a hyper-surveilled space poses to certain populations. Creating seemingly innovative solutions without considering their full socio-political consequences creates the potential for further marginalization of people who are already experiencing direct harm from these technologies at multiple intersecting levels. Where Sidewalk Labs may promote increased surveillance as an efficiency or safety mechanism, it poses risks to people of colour, women, and gender minorities, who are misrecognised by this technology more frequently than others (Buolamwini and Gebru, 2018; Marshall, 2019). When data is mobilized as a source of power, collecting more of it risks upholding hierarchies of discrimination. Technology does not impact everyone equally, and the assumption that employing algorithms would “solve” the problems of a city vastly ignores the issues that technology cannot fix. In proposing to integrate an entire neighbourhood with surveillance technology, Sidewalk Labs subsequently isolated marginalized residents from safely accessing their own city.

If technology is to be integrated into cities, it needs to occur through a process in which civilians are able to meaningfully engage with the development procedures, while knowing that their rights are thoroughly protected under the law. Within Toronto, the proposal to build a city “from the internet up” (Sidewalk Labs, 2019a) came alongside a series of different infrastructural changes. Yet, Sidewalk Labs is not the only corporation, nor the first, to attempt to integrate corporate governance into the inner workings of a city.Footnote 3 Rather, the push for corporate governance within the Quayside project reflects a larger practice of corporate capture in the city. As corporate actors are on the cutting edge of innovation, research, and development, their technology becomes integrated into urban environments through an increasing reliance on public–private partnerships. As ICTs make data collection easier than ever, data are emerging as a key resource that—if not properly protected—can be exploited in search of profit (Sadowski, 2019). The challenge of public–private partnerships in urban “smart” environments highlights that corporate actors have an increased incentive to collect data en masse within the city. Here, the initial investment is not the sole source of profit. Rather, the ability to constantly collect and distribute key informational data about a community promises a stronghold on one of the most valuable resources of the 21st century (Hollands, 2015). As corporate actors retain their position as the experts in the field of smart infrastructure, they can promote their own forms of universalist application and self-regulation without acknowledging the economic benefits this data collection provides them. In favouring universalism over a localized understanding of urban issues, smart cities ignore context and systemic issues in search of increased profit.

4. Combatting corporate governance: establishing feminist protocols in urban technology development

In May 2020, the Quayside project was abandoned. While this is only one case of a public–private partnership created to develop a “smart” neighbourhood, these initiatives exist within the broader scope of corporate tech projects worldwide that aim to present the collection and usage of data as universally beneficial. Yet urban life cannot, and should not, be reduced to mere “objective” data points. Data is not a universal, uncomplicated answer. In promoting an intersectional feminist praxis within data policy, I build from a feminist critique of universalism (Haraway, Reference Haraway1988) to argue for a viewing of data—and cities themselves—as local and contextualised, representing the multiple interactions that take place alongside the data itself. This approach challenges dominant systems of power while acknowledging that “we must make connections between all of these practices and begin addressing inequities in material…that take seriously the knowledge, experiences, and contributions of people of colour” (Cahaus, Reference Cahaus2022, p.6). To view data as contextualized, policymakers must look at data as reminiscent of greater sociotechnical contexts related to interpretation and production (Loukissas, Reference Loukissas2019). This involves a direct rejection of data collection as an autonomous process, but rather as one with disparate impacts that are embedded in the data cultures that govern and shape these sociotechnical relations (Roberge and Seyfert, Reference Roberge and Seyfert2016; Bates, Reference Bates, Kitchin, Lauriault and McArdle2017). Even when created from seemingly good intentions, data-driven urbanism, as articulated through a corporate drive for profit and endless collection, has its shortcomings. 
The current approach to public–private partnerships that centres on data-driven urbanism is predicated on values of efficiency and technological solutionism that are often far removed from urban plans, meaningful accountability measures, and participation initiatives (Lauriault et al., 2018). In exposing data cultures as reflective of systemic biases, I bring to light how the Quayside project’s emphasis on a one-size-fits-all solution failed to address the multifaceted ways people experience daily life within Toronto, and establish the need to move away from self-regulation and towards collective data governance structures.

Despite prominent solutionist rhetoric, there are alternative ways to address urban problems that do not promise universal solutions, but instead highlight the strength of a localised, contextual approach. I argue that, when guided by the community, technology can be an important tool for recognising the multifaceted ways people experience urban life, shifting power from corporations to those living in the city itself. To demonstrate this, I turn towards members of the community who are working to change the ways both cities and algorithms are designed. Learning from these interventions, policymakers can understand the strength of an intersectional and localized approach to algorithmic governance within the city. In the following section, I outline a critical approach to the construction of a feminist smart city, in which policy and participation play key roles. Taking a feminist approach to this development involves understanding the relationship between policy and participation as one of mutual influence, where the experiences of marginalized and at-risk communities—those who are often “not-seen”—are valued. Here I articulate two primary processes through which a feminist smart city can be developed: participation in design and localized policy.

4.1 Participation in design

While urban bureaucracy often draws the common complaint of slowness, I argue for leaning into slowness. From a feminist perspective, moving slowly involves a critical analysis of who is “not-seen” and subject to hyper in/visibility in decision-making (Smith, 2015). This process integrates collaboration from the start, developing technological solutions in tandem with the communities they are meant to serve rather than imposing corporate technology as a “universal” benefit. It also involves concrete accountability measures, whereby the public can understand what happens to their data. The slowness experienced by urban bureaucracy is often accompanied by confusion and frustration about being “out of the loop” with regard to the process. My articulation here emphasises the need for residents to be viewed as direct collaborators in the process of urban development, rather than as an additive feature later in the process. Taking this approach to designing a smart city means putting front and centre the experiences of, and potential consequences for, those impacted. Drawing from the key principles of an open smart city as articulated by Open North’s open smart cities project (Lauriault et al., 2018), I emphasise that a feminist perspective sees slowness not as a hindrance, but as an opportunity to mobilize technology and data when warranted, according to ethical, accessible, and transparent standards (see footnote 4). This approach therefore builds upon calls to recognise that data and technology are not the solution to many of the systemic issues cities face, nor are there always quick fixes (Mattern, 2021; Powell, 2021).

Where Sidewalk Labs sought to integrate public opinion into their project as an additive feature, an open smart city would build itself with public interest and contribution as initial stepping stones. A design justice framework would help organise citizen involvement. In this approach, designers are required to look to the community for guidance. Sasha Costanza-Chock (2020) describes designers engaging in this process as practitioners who aim to incorporate the ways in which members of the community are already working together to face a challenge. Taking a design justice framework involves not only considering the perspectives of community members but genuinely involving them in the design process itself. This important work is already being done by the groups behind the Critical Engineering Manifesto (Oliver et al., 2021), the Feminist Data Manifest-No (Cifor et al., 2019), and the More than Code report (Costanza-Chock et al., 2018), each of which highlights the importance of design as a process that involves meaningful engagement with those most impacted by technology. To embed a city with just design practices and open smart city frameworks, there is no need to reinvent the wheel; there is, however, a need to reflect on these critical interventions in current design theory.

4.2 Localized governance

However, it must be understood that design cannot solve everything. Design as a process goes hand-in-hand with developing clear policy initiatives to uphold the rights of residents. Currently, there is a call to redefine Canadian privacy legislation to reflect the necessary conditions of the digital age. As of early 2024, Bill C-27—the Digital Charter Implementation Act—has yet to be passed. Bill C-27 proposes comprehensive AI legislation based on a risk-assessment framework, while also replacing PIPEDA with an act that addresses the need for Canadians to be able to access, erase, correct, and transfer data about themselves (Bill C-27, 2022). While this new approach to data governance addresses the need to develop strategies for the digital age, Bill C-27 has already received criticism for its inability to address collective rights (Duncan and Wong, 2022), its lack of focus on shared prosperity and community engagement (Brandusescu and Sieber, 2023), and its failure to integrate an intersectional, human rights-based framework (Kim and Thomasen, 2023). Governance in the age of artificial intelligence requires meaningful citizen engagement to address the multifaceted ways in which technology used in urban spaces has intersectional, far-reaching impacts. A critical policy intervention therefore involves moving away from the self-regulatory precedent and ensuring that the human rights of residents are not sacrificed in the development of these policies.

This articulation builds from the current push within critical AI studies to approach artificial intelligence development from a localized, context-based perspective. It draws from Katz and Nowak’s (2017) articulation of new localism, in which “local jurisdictions are increasingly taking it upon themselves to address a broad range of environmental, economic, and social challenges, and the domain of technology is no exception” (Verhulst, 2022, p. 80). Within artificial intelligence communities, there has been an increase in specifically localized approaches to governance as a way to enable public participation in artificial intelligence development. Recent examples of AI localism include San Francisco’s ban on AI-powered facial recognition technology (Conger et al., 2019), New York City’s rules on automated decision-making in hiring (Francis, 2023), Helsinki and Amsterdam’s public registries of AI systems used in local government (Team AI Regulation, 2020), Montreal’s City Council motion against facial recognition (Serebin, 2020), and Barcelona’s citizen watch initiatives (March and Ribera-Fumaz, 2018). A key element of this appeal to localized governance is that it works as a decentralized, rather than fragmented, approach. Here, a decentralized approach aims to learn from other local regulatory strategies to build a specific and contextual regulatory response, rather than applying them universally. While there are certain inalienable rights—particularly to privacy—this approach builds upon those rights to encourage collaborative policymaking, drawing on the skills other municipalities have developed to find what fits best with a community’s goals and needs.

Beyond ensuring governance principles in the abstract, the notion of accountability in artificial intelligence policy involves creating a culture of trust. To this extent, Brudvig highlights the importance of consent as an ongoing project involving “a) revoking tools or systems that are not democratically agreed upon; b) opening up spaces for local creation of new tools and systems; c) decentralised governance or custodianship of data and knowledge systems; d) inclusion of local leaders in decision-making; and e) interdisciplinary collaboration in spaces of technology creation, policy and decision-making” (Brudvig, 2022, p. 29). Each of these acts reflects a commitment to creating a transparent and accountable city that prioritises the needs of its residents. Gaining meaningful transparency is itself a process that engages not only the public but also stakeholders, governing bodies, and corporations. Ensuring that transparency is interpretable is therefore necessary to develop this culture of trust, as accountability can only be enacted when civilians can understand how and why a system needs to change (Brauneis and Goodman, 2018, p. 132). Here I second Robert Brauneis and Ellen Goodman’s call for a shift away from the pull model of transparency in algorithmic governance—where individuals must pull the information they wish to know—towards a push model, in which decision-makers make their information publicly accessible and interpretable. Implementing accessible policy therefore goes hand-in-hand with participatory methods that engage the public at each stage of development, ensuring that all interests are recognised. In prioritising a localized approach to policy, smart cities can be developed with the interests of residents at the forefront.

5. Conclusion: rethinking the current approach to AI governance

The consequences of Sidewalk Labs’ Quayside project are reflective of a broader issue in the governance of smart infrastructure. The project raised deeper concerns about the consequences of public–private partnerships in the development of these programmes, as such partnerships often fail to address the disproportionate risk that algorithmic decision-making poses to marginalised groups. As I have detailed, there is a need to restructure the ways privacy is governed in relation to smart city development. When corporate actors are able to develop and set their own precedent for the storage, collection, and usage of data, citizens have no assurance that their rights will be protected by legislation. Now more than ever, there is a need for comprehensive and strong governance mechanisms to ensure that the multifaceted risks of surveillance technology within smart cities are addressed. While new attempts to govern technological development in Canada have their particular shortcomings, there are clear benefits to developing civilian-centred governance regimes. Throughout this analysis, I have highlighted the need to develop systems of algorithmic governance that meaningfully engage those who will be put in harm’s way by the mass integration of smart technology in urban infrastructure. This approach is best conceived through a critical feminist praxis, which approaches smart city development through the simultaneous promotion of participation in design and localized policymaking. In this way, smart cities can be designed from the outset with an understanding of the ways marginalized populations are put at risk by surveillance technology in urban spaces, while also incorporating a governance programme that upholds the rights of citizens and gives them meaningful and accessible accountability tools.
Enacting both of these approaches concurrently provides the potential for a revitalized approach to navigating governance in the smart city, in which technology is implemented as a method to support communities, rather than as a way to extract data from them.

Data availability statement

None.

Acknowledgments

The author is grateful for the administrative support and feedback provided by Dr. Jonathan Sterne and Dr. Alex Ketchum of McGill University, Montreal, as well as to the anonymous reviewers who provided essential feedback.

Author contribution

Conceptualization—L.M; Data curation—L.M; Formal analysis—L.M; Funding acquisition—L.M; Investigation—L.M; Methodology—L.M; Project administration—L.M; Writing—original draft—L.M; Writing—review and editing—L.M.

Provenance

This article is part of the Data for Policy 2024 Proceedings and was accepted in Data & Policy on the strength of the Conference’s review process.

Funding statement

None.

Competing interest

None.

Footnotes

1 Regarding concerns about LinkNYC: the NYCLU wrote a letter to Mayor de Blasio about the information being retained on the kiosks’ servers. They recommended that the privacy policy be rewritten so that it expressly states whether the Links’ environmental sensors or cameras are being used by the NYPD for surveillance or by other city systems (Hirose and Miller, 2017). Charles Myers found that LinkNYC had published folders on GitHub titled “LinkNYC Mobile Observation” and “RxLocation” that contained identifiable user data (Koffman, 2018).

2 Sidewalk published a “Responsible Data Use” guide that would be used to determine the decisions of the Urban Data Trust. However, they faced growing concern from legislative actors and academics around the ethics of corporations (especially ones profiting off data collection and distribution) determining “how” to use data ethically (see Sidewalk Labs, 2019b).

3 Investment in the “smart infrastructure” industry is not limited to Alphabet—IBM’s “Smarter Planet” initiative was developed to facilitate the integration of ICTs and infrastructure to address urban issues of healthcare, traffic congestion, and security (Palmisano, 2010). The initiative was accompanied by a large-scale marketing campaign aimed at introducing consumers to the benefits of intelligent infrastructure (Schoultz, 2019).

4 The Open Smart Cities Guide (Lauriault et al., 2018) sets out five principles that make up an open smart city: (1) ethical, accountable, and transparent governance; (2) meaningful engagement through just and inclusive approaches; (3) data and technologies that are fit for purpose, open, interoperable, and local; (4) public interest-based governance; and (5) rejection of techno-solutionism.

References

Ananny, M and Crawford, K (2018) Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989. https://doi.org/10.1177/1461444816676645
Andrejevic, M and Burdon, M (2015) Defining the sensor society. Television & New Media, 16(1), 19–36. https://doi.org/10.1177/1527476414541552
Artyushina, A (2022) Alphabet is here to “fix” Toronto: Algorithmic governance in Sidewalk Labs’ smart city. In Smart Cities. CRC Press, pp. 99–112.
Auditor General of Ontario (2019) 2019 Annual Report. Available at https://www.auditor.on.ca/en/content/annualreports/arbyyear/ar2019.html
Bates, J (2017) Data cultures, power and the city. In Kitchin, R, Lauriault, TP and McArdle, G (eds), Data and the City. Routledge, pp. 189–200.
Beamish, B (2019, September 24) Re: Sidewalk Labs’ proposal. Information and Privacy Commissioner of Ontario. Available at https://www.ipc.on.ca/wp-content/uploads/2019/09/2019-09-24-ltr-stephen-diamond-waterfront_toronto-residewalk-proposal.pdf
Benjamin, R (2019) Race After Technology: Abolitionist Tools for the New Jim Code. Polity.
Bill C-27 (2022) An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, 44th Parliament, 1st session, 2022.
Botero Arcila, B (2022) Smart city technologies: A political economy introduction to their governance challenges. In Bullock, J, Chen, Y, Himmelreich, J, Hudson, V, Korinek, A, Young, M and Zhang, B (eds), The Oxford Handbook of AI Governance. https://doi.org/10.1093/oxfordhb/9780197579329.013.48
Brandusescu, A and Sieber, R (2023) Canada’s Artificial Intelligence and Data Act: A missed opportunity for shared prosperity. Available at https://www.ourcommons.ca/Content/Committee/441/INDU/Brief/BR12636987/br-external/Jointly4-e.pdf
Brauneis, R and Goodman, EP (2018) Algorithmic transparency for the smart city. Yale Journal of Law and Technology, 20, 103. Available at http://hdl.handle.net/20.500.13051/7826
Browne, S (2015) Dark Matters: On the Surveillance of Blackness. Duke University Press.
Brudvig, I (2022) Feminist cities and AI. In Brandusescu, A and Reia, J (eds), Artificial Intelligence in the City: Building Civic Engagement and Public Trust. Centre for Interdisciplinary Research on Montréal, McGill University, pp. 28–29.
Buolamwini, J and Gebru, T (2018) Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on Fairness, Accountability and Transparency, pp. 77–91.
Cahaus, MC (2022) Building a holistic intersectional feminist praxis in geography: Lessons from community. The Professional Geographer. https://doi.org/10.1080/00330124.2022.2061536
CCLA (2019) Amended Notice of Application to Divisional Court for Judicial Review. Canadian Civil Liberties Association. Available at https://ccla.org/wp-content/uploads/2021/06/Amended-Notice-of-Application.pdf
Cifor, M, Garcia, P, Cowan, TL, Rault, J, Sutherland, T, Chan, A, Rode, J, Hoffmann, AL, Salehi, N and Nakamura, L (2019) Feminist Data Manifest-No. Available at https://www.manifestno.com/home
Conger, K, Fausset, R and Kovaleski, SF (2019, May 14) San Francisco bans facial recognition technology. The New York Times. Available at https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html
Costanza-Chock, S (2020) Design Justice. MIT Press.
Costanza-Chock, S, Wagoner, M, Taye, B, Rivas, C, Schweidler, C, Bullen, G and The T4SJ Project (2018) #MoreThanCode: Practitioners reimagine the landscape of technology for justice and equity. Available at https://morethancode.cc
Crawford, K (2021) The Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, 1st edn. Yale University Press.
Delacroix, S and Lawrence, ND (2019) Bottom-up data trusts: Disturbing the ‘one size fits all’ approach to data governance. International Data Privacy Law, 9(4), 236–252. https://doi.org/10.1093/idpl/ipz014
Doctoroff, D (2020, May 7) Why we’re no longer pursuing the Quayside project — and what’s next for Sidewalk Labs. Sidewalk Talk. Available at https://medium.com/sidewalk-talk/why-were-no-longer-pursuing-the-quayside-project-and-what-s-next-for-sidewalk-labs-9a61de3fee3a
Duncan, J and Wong, WH (2022) Data rights will not save democracy. Schwartz Reisman Institute for Technology and Society. Available at https://srinstitute.utoronto.ca/news/data-rights-will-not-save-democracy
Eidelman, G (2013) Three’s company: A review of Waterfront Toronto’s tri-government approach to revitalization. The Mowat Centre for Policy Innovation, 79, 1–32. Available at https://tspace.library.utoronto.ca/handle/1807/99253
Eubanks, V (2018) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin’s Press.
EU General Data Protection Regulation (GDPR) (2016) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). OJ L 119/1.
Francis, S (2023, April 7) New York City adopts final rules on automated decision-making tools, AI in hiring. Ogletree Deakins. Available at https://ogletree.com/insights/new-york-city-adopts-final-rules-on-automated-decision-making-tools-ai-in-hiring/
Freedom of Information and Protection of Privacy Act, RSO 1990, c F.31. Available at https://canlii.ca/t/562wq
Frug, GE (1999) City Making: Building Communities Without Building Walls. Princeton University Press.
Goodman, E (2020, April 30) Affidavit of Ellen Goodman. Canadian Civil Liberties Association. Available at https://ccla.org/wp-content/uploads/2021/06/Affidavit-of-Ellen-Goodman-Sworn_April-20-2020.pdf
Haraway, D (1988) Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Studies, 14(3), 575–599. https://doi.org/10.2307/3178066
Hirose, M and Miller, J (2017, March 15) Re: LinkNYC privacy policy. Internet Archive. Available at https://web.archive.org/web/20160910102801/http://www.nyclu.org/files/releases/city%20wifi%20letter.pdf
Hollands, RG (2015) Critical interventions into the corporate smart city. Cambridge Journal of Regions, Economy and Society, 8(1), 61–77. https://doi.org/10.1093/cjres/rsu011
Information Commissioner’s Office of the United Kingdom (2021, June 28) The UK GDPR. Information Commissioner’s Office. Available at https://ico.org.uk/for-organisations/data-protection-and-the-eu/data-protection-and-the-eu-in-detail/the-uk-gdpr/
Katz, B and Nowak, J (2017) The New Localism: How Cities Can Thrive in the Age of Populism. Brookings Institution Press.
Kern, L (2019) Feminist City: A Field Guide. Between the Lines.
Kim, R and Thomasen, K (2023) Submission to the Standing Committee on Industry and Technology on Bill C-27. https://doi.org/10.2139/ssrn.4571389
Kitchin, R (2017) Data-driven urbanism. In Kitchin, R, Lauriault, TP and McArdle, G (eds), Data and the City. Routledge, pp. 44–56.
Koffman, A (2018, September 8) Are New York’s free internet kiosks tracking your movements? The Intercept. Available at https://web.archive.org/web/20181113174549/https://theintercept.com/2018/09/08/linknyc-free-wifi-kiosks/
Lauriault, T, Bloom, R and Landry, JN (2018) Open Smart Cities Guide v1.0. Open North. Available at https://opennorth.ca/wp-content/uploads/legacy/OpenNorth_Open_Smart_Cities_Guide_v1.0.pdf
Loukissas, Y (2019) Local origins. In All Data Are Local. MIT Press, pp. 13–24. https://doi.org/10.7551/mitpress/11543.001.0001
March, H and Ribera-Fumaz, R (2018) Barcelona. In Karvonen, A, Cugurullo, F and Caprotti, F (eds), Inside Smart Cities: Place, Politics and Urban Innovation, 1st edn. Routledge, pp. 229–242. https://doi.org/10.4324/9781351166201
Marshall, L (2019, October 8) Facial recognition software has a gender problem. CU Boulder Today.
Mattern, S (2021) A city is not a computer. In A City Is Not a Computer: Other Urban Intelligences. Princeton: Princeton University Press, pp. 51–72.
Maynard, R (2017) Policing Black Lives: State Violence in Canada from Slavery to the Present. Fernwood Publishing.
Municipal Freedom of Information and Protection of Privacy Act, RSO 1990, c M.56. Available at https://www.ontario.ca/laws/statute/90m56
Obar, JA (2015) Big data and The Phantom Public: Walter Lippmann and the fallacy of data privacy self-management. Big Data & Society, 2(2), 2–16. https://doi.org/10.1177/2053951715608876
Office of the Privacy Commissioner of Canada (2019, May 31) PIPEDA fair information principles. Available at https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/p_principle/
Oliver, J, Savičić, G and Vasiliev, D (2011–2021) The Critical Engineering Manifesto. Available at https://criticalengineering.org/
O’Neil, C (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.
Pasquale, F (2015) The Black Box Society: The Secret Algorithms that Control Money and Information. Harvard University Press. Available at http://www.jstor.org/stable/j.ctt13x0hch
Personal Information Protection and Electronic Documents Act, SC 2000, c 5. Available at https://canlii.ca/t/541b8
Public Draft Sidewalk Toronto Public Consultation Question List. Available at https://docs.google.com/document/d/1mD-jG5j3XWNoxiC1ZW6W7pcI5Pl71HVbqzfTg2H67eQ/edit
Powell, AB (2021) Undoing Optimization: Civic Action in Smart Cities. Yale University Press.
Roberge, J and Seyfert, R (2016) What are algorithmic cultures? In Algorithmic Cultures. Routledge, pp. 13–37.
Sadowski, J (2019) When data is capital: Datafication, accumulation, and extraction. Big Data & Society, 6(1), 1–12. https://doi.org/10.1177/2053951714528481
Scheuerman, MS, Paul, J and Brubaker, J (2019) How computers see gender: An evaluation of gender classification in commercial facial analysis and image labeling services. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), Article 144.
Schoultz, M (2019, August 19) 6 lessons learned from IBM’s Smart Planet marketing campaign. Medium. Available at https://mikeschoultz.medium.com/6-lessons-learned-from-ibms-smart-planet-marketing-campaign-2e29eac4aaba
Serebin, J (2020, September 18) Montreal should restrict police use of facial recognition technology. Global News. Available at https://globalnews.ca/news/7345106/montreal-police-facial-recognition-technology/
Sidewalk Labs (2019a) Sidewalk Labs Master Innovation and Development Plan. Available at https://www.sidewalklabs.com/toronto
Smith, A (2015) Not-seeing: State surveillance, settler colonialism, and gender violence. In Dubrofsky, RE and Magnet, S (eds), Feminist Surveillance Studies. Duke University Press. https://doi.org/10.2307/j.ctv1198x2b.6
Team AI Regulation (2020, October 13) Amsterdam and Helsinki launch algorithm and AI register. MIAI. Available at https://ai-regulation.com/amsterdam-and-helsinki-launch-algorithm-and-ai-register/
Verhulst, SG (2022) AI localism: Governance of artificial intelligence at the city and local level. In Brandusescu, A and Reia, J (eds), Artificial Intelligence in the City: Building Civic Engagement and Public Trust. Centre for Interdisciplinary Research on Montréal, McGill University, pp. 80–82.
Waterfront Toronto’s Digital Strategy Advisory Panel (DSAP) (2020, February 17) DSAP Supplemental Report on the Sidewalk Labs Digital Innovation Appendix (DIA). Available at https://www.waterfrontoronto.ca/sites/default/files/connect/waterfront/521b1d08-3499-4a49-9d2c-b5fc34990ce5/dsap-report---appendices.pdf
Wylie, B (2018, March 28) Debrief on Sidewalk Toronto public meeting #1 — evasive on data products, no answer on data residency. Medium. Available at https://biancawylie.medium.com/debrief-on-sidewalk-toronto-public-meeting-1-evasive-on-data-products-no-answer-on-data-a9f551535dcd
Zarum, L (2019) #BlockSidewalk’s war against Google in Canada. The Nation. Available at https://web.archive.org/web/20200116191022/https://www.thenation.com/article/google-toronto-sidewalk-gentrification/