Beware of 'Algorithmic Regulation'
36 Pages · Posted: 12 Feb 2019 · Last revised: 12 Mar 2019
Date Written: February 1, 2019
Among the structural features that enable social media platforms to durably influence our moods and behaviour, their catering to a widespread desire to be liked and accepted (a desire that is seldom transparent to us) greatly increases their manipulative power. So does their ability to harvest fine-grained information about their users (and their users' acquaintances). This data puts such platforms in a position where they can not only covertly influence our thoughts, moods and behaviour: they can do so in a way that is maximally effective given our respective traits, vulnerabilities and so on. This paper argues that if we are to stand any chance of taming these platforms' considerable manipulative potential, some conceptual spring-cleaning is needed.
First, we need to stop confusing regulatory power with regulation. Whereas in mechanics or cybernetics there is no need to distinguish between the two (since there are no agents whose autonomy is infringed by regulation), when the concept of regulation is applied to human behaviour this distinction becomes crucial, and it hinges upon the concept of authority. Since social media platforms do not (yet) claim authority, they do not regulate us: they have regulatory power over us. Our task is to regulate that power.
To succeed in the latter task, we also need to acknowledge that the type of influence exercised by social media platforms is not merely non-deliberative: it is also covert. While the former is often unproblematic from an ethical and legal perspective, the latter not only threatens our right to freedom of thought but also compromises our commitment to moral equality.
Keywords: regulatory power, regulation, autonomy, normative agency, social media, cybernetics, algorithmic regulation, manipulation, social cruelty