THE CONVERSATION — As reported on Aug. 6, Zoom's updated terms of service appeared to permit the company to use customer content to train its artificial intelligence models.
However, after public backlash, Zoom revised its terms, fully committing to a "no AI training" set of policies by Aug. 11.
Even though Zoom backpedalled this time, its drive to gather data highlights the possibility of future hidden data extraction by Zoom and other big tech companies.
More specifically, as a researcher working with and looking at Indigenous communities and their data, I am concerned about the privacy of these valuable data sets from Indigenous communities on Turtle Island.
Vulnerable Indigenous Knowledge
Over the past three years, Zoom calls have become a tool for organization and activism for many Indigenous communities.
For my own work, I use video and voice chat, which lets us bridge geographical distances to collaborate and share, and to reach communities that are otherwise hard to access. Discussions with queer community members of different Indigenous Nations are often private and perhaps even sacred.
These conversations have elements that are public facing, but they also contain wisdom from Elders or Knowledge Keepers specifically trained to know what they can and cannot share in specific spaces. Some of this knowledge is sacred and is part of promoting and preserving Indigenous (and sometimes queer) ways of being.
A valuable commodity
This private information is constantly at risk of extraction from companies seeking to monetize or otherwise gain from our data.
Indigenous Knowledge represents a large gap in current big data. AI systems depend on large data sets; it is these data sets that enable predictive technology to operate.
With knowledges that are primarily oral, it is difficult to assemble the kind of data sets that typically come from writing. The ability of big companies to gather audio and visual data could render this oral information legible to machines.
Protecting communities
Informed consent has been an important concept for protecting marginalized communities from the extractive practices of researchers aiming to obtain data.
However, if platforms are extracting data without our knowledge, or demand our consent as a condition of using a service, a conflict emerges.
The conflict becomes one of access: if we do not consent, we simply do not get to use the infrastructure. Access to voice and video sharing infrastructure has been a fundamental component of activism and community research, especially since COVID-19.
Can we ‘opt-out?’
Can we accept or refuse to be turned into research data?
Even though there is a permissions element, organizations often gather our data in exchange for the use of their services. Fitbit, for example, collects users' health data in exchange for its fitness-tracking services.
Anyone who opts into nearly any major service is being tracked to some degree. We therefore need to think critically about what is considered private.
Likewise, Zoom retains the ability to gather this data, whether or not it uses the data for AI with consent. There is an anxiety that next time the ambiguity will go unnoticed, or that consent will be forced as the price of access to a seemingly necessary service.
As someone who looks at ethical data collection and mobilization, I believe we all need to be critical of requests for access to our private data when using these services.
Crucial access to data
The relationship between Indigenous communities, their data and the Canadian government has always been fraught. However, after the work of the Truth and Reconciliation Commission in Canada (which concluded in 2015), it became even clearer that access to data and information is crucial to achieving justice and truth in relation to our histories.
For Indigenous peoples whose history has been systematically erased, demanding that organizations return records and data has become an important element of uncovering the truth behind the experiences of Indian Residential School survivors. Communities have both the desire and the need to have their data returned so that they can maintain access to their information.
Ease of Zoom for communication
In-person collaboration between Indigenous communities can be difficult because of things like geographical differences, the lack of public transportation, and interruptions in Indigenous sovereignty. These issues continue the social and political fragmentation caused by settler colonialism to isolate these communities from one another.
Many of these challenges have been alleviated by information technologies like Zoom. A platform like Zoom can be unifying by bridging space. However, it could also become a tool that recreates the problem of data extraction in a new form.
We need to be attentive to these kinds of possibilities for gathering and extracting data from users.
These technological infrastructures may disproportionately harm Indigenous communities by making their private and sacred knowledges legible to AI. Data collection for AI could lead to the commodification of this sacred knowledge for profit.
Protecting this kind of data is not just the responsibility of Indigenous communities but a shared commitment that has a present and future urgency.
Wiebe is a PhD student in Information at the University of Toronto. Wiebe does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.