1. Can regulations promote good vibes?
The International Association of Privacy Professionals (IAPP) posted last week about a new draft regulation from the Cyberspace Administration of China charging service providers who use recommendation algorithms to “disseminate positive energy.”
The idea of privacy regulations creating positive energy, or “good vibes,” might seem absurd, or at the very least aspirational. However impractical, there may be something to learn from the regulation and from similar ideas in other privacy concepts.
1.1 What’s a recommendation algorithm?
Recommendation algorithms are systems that predict or suggest items a customer may want next by analyzing that customer’s past preferences or comparing them to those of similar customers. You might have seen this in action in YouTube’s “Up Next” feature or Spotify’s “Discover Weekly” playlist made just for you.
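To make that concrete, here is a minimal sketch of one common approach, user-based collaborative filtering. The user names, items, and ratings are hypothetical, invented purely for illustration; real services like YouTube and Spotify use far more sophisticated models.

```python
import math

# Hypothetical ratings data: user -> {item: rating}.
# Names and numbers are illustrative only.
ratings = {
    "alice": {"cooking": 5, "travel": 3, "news": 1},
    "bob":   {"cooking": 4, "travel": 4, "music": 5},
    "carol": {"news": 5, "music": 2},
}

def cosine_similarity(a, b):
    """Similarity between two users' rating dicts (0 if nothing overlaps)."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

def recommend(user, ratings, n=2):
    """Suggest up to n items the user hasn't rated, scored by how
    similar the other users who rated them are to this user."""
    scores = {}
    for other, other_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(ratings[user], other_ratings)
        for item, rating in other_ratings.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:n]

print(recommend("alice", ratings))  # alice hasn't rated "music" yet
```

Here “alice” rates items much like “bob,” who loved “music,” so “music” is recommended to her. The same comparison of past preferences, scaled up to millions of users and items, is what powers the features described above.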
1.2 What exactly does the regulation say?
The Chinese draft regulation, titled the Internet Information Service Algorithmic Recommendation Management Provisions in English, says in Article 6, “Algorithmic recommendation service providers shall uphold mainstream value orientations, optimize algorithmic recommendation service mechanisms, vigorously disseminate positive energy, and advance the use of algorithms upwards and in the direction of good.”
The provision not only imposes a duty to disseminate positive energy; it also includes a commitment to using algorithms for good.
The regulation goes on to list ways that algorithmic recommendation services must not be used: threatening national security, disrupting social or economic order, and infringing upon other individuals’ rights.
2. How can regulations create positive energy around privacy?
Cultivating positive energy or positive thinking around privacy is not a new concept. In fact, it’s an integral part of creating a culture of privacy.
2.1 Incorporating a positive view of privacy
A culture with a positive view of privacy will protect privacy by default, just as systems and technologies with privacy embedded into them do. When employees are properly trained, supported, and invited to contribute to the privacy framework, they’ll be motivated to keep up with their individual and corporate privacy responsibilities.
I’m reading “disseminate positive energy” and “advance the use of algorithms upwards and in the direction of good” similarly. Algorithmic recommendation service providers can promote privacy internally and externally by promising, and proving, that they’re doing so only for good.
2.2 Using algorithms for good
Compliance with the requirement to use algorithms for good may simply look like not using them for illegal purposes or for purposes outside those specified in a privacy notice. Consumers are willing to share data in exchange for a more personalized experience, according to a 2020 survey by Formation.ai.
Knowing that their data is being used only to promote learning or positive technological growth will increase customers’ willingness to share data and participate in recommendation algorithms. Transparency also builds customer trust. With the assurance that algorithms are being used honestly, customers will share more data and enable more good to be done with it.
3. Areas of concern
3.1 Feeling of restriction
A known issue with adding safety measures is a growing feeling of restriction. Applying these articles not only makes service providers feel more restricted; it also increases the resources, effort, and time required to keep up with them. Compliance may feel like an immense undertaking that some service providers are unwilling to take on.
3.2 Business separation
Last week, LinkedIn announced that, because of greater compliance requirements, it will shut down its professional networking service in China later this year in favor of a separate application operating only in China. This approach will allow the Microsoft-owned company to continue serving China under the stricter regulations without bringing the entirety of its business into compliance.
With more regulations coming out of the Cyberspace Administration of China, there may be other companies that separate their China services instead of choosing a uniform approach for everything.
However, privacy experts have said that they could see elements of this algorithm regulation adopted more widely should it prove effective.
4. References
Adams, Samuel, J.D. 2021. “China’s draft algorithm regulations: A first for consumer privacy”. IAPP. https://iapp.org/news/a/chinas-draft-algorithm-regulations-a-first-for-consumer-privacy.
Anderson, Monica; Auxier, Brooke; Kumar, Madhu; Perrin, Andrew; Rainie, Lee; and Turner, Erica. 2019. “Americans and Privacy: Concerned, Confused and Feeling a Lack of Control Over Their Personal Information”. Pew Research Center. https://www.pewresearch.org/internet/2019/11/15/public-knowledge-and-experiences-with-data-driven-ads/pi_2019-11-14_privacy_3-03-2/.
Brown, Eileen. 2020. “Most consumers will trade their data for personalization”. ZDNet. https://www.zdnet.com/article/four-out-of-five-consumers-will-trade-data-for-personalisation/.
Leach, Emily and Weller, Aaron. 2020. “How to build a ‘culture of privacy’”. IAPP. https://iapp.org/news/a/how-to-build-a-culture-of-privacy.
Mozur, Paul and Weise, Karen. 2021. “LinkedIn to Shut Down Service in China, Citing ‘Challenging’ Environment”. New York Times. https://www.nytimes.com/2021/10/14/technology/linkedin-china-microsoft.html.
DigiChina. 2021. “Translation: Internet Information Service Algorithmic Recommendation Management Provisions (Opinion-Seeking Draft)”. Stanford University. https://digichina.stanford.edu/work/translation-internet-information-service-algorithmic-recommendation-management-provisions-opinon-seeking-draft.