Invisible Gaze in the Algorithmic Age: A Mixed-method Exploration of Legal Boundaries and Social Justice in Non-consensual Surveillance in China
Keywords:
Non-consensual surveillance, algorithmic bias, social justice, privacy, China, digital rights, PIPL, marginalized communities

Abstract
This study explores non-consensual surveillance in China, focusing on the intersection of digital privacy, algorithmic bias, and social justice. Using a mixed-method approach, it examines participants' awareness of digital surveillance, their experiences with data collection, and their perceptions of how surveillance affects marginalized groups. The results indicate high awareness of surveillance on social media and e-commerce platforms, with algorithmic discrimination and the disproportionate targeting of women, LGBTQ+ people, and racial minorities emerging as significant concerns. Although China has enacted the Personal Information Protection Law (PIPL), the paper finds substantial gaps in both legal awareness and enforcement. The study recommends stronger regulatory oversight, greater transparency in data collection, and broader public education on digital privacy rights. The research underscores the need for a comprehensive approach to digital surveillance that safeguards both individual privacy and social justice.