Carefully consider the ethical implications before entering partnerships with tech companies. Evaluating the parties involved, their data protection measures, and potential reputational impacts beforehand helps mitigate risks such as the further marginalization of research subjects and partners.
Vulnerable groups, such as teens and youth, and marginalized groups, such as trans individuals, racialized communities, and Indigenous populations, require careful consideration, particularly in terms of data privacy and security.
Be cautious about becoming a data conduit for corporations, especially where specific populations are concerned.
Corporations may have a strong interest in gathering data about certain populations, and you risk becoming a conduit for data they intend to exploit for purposes other than your research.
Protect qualitative data repositories, especially for marginalized groups.
Consider the potential for harm and further marginalization if data is misused; it can be exploited to identify, target, and manipulate vulnerable populations.
Make sure you aren’t simply taking data from human beings and handing it over to corporations.
With good reason, many people are highly skeptical of research that involves corporate funding and partnerships.
For example, corporate-funded research often faces skepticism due to real and perceived conflicts of interest. Accepting corporate funds may undermine trust in research outputs, regardless of their quality.
Peers may hesitate to collaborate due to concerns about real and perceived conflicts of interest.
There is always the risk that your work could become the basis of ethics washing campaigns, in which your participation is used to lend an appearance of ethical rigor while critical results of your work are downplayed.
Your participation could be exploited for corporate messaging.
While this toolkit can help mitigate risks, be aware that the possibility of ethics washing remains.