Glukhin v. Russia: The European Court of Human Rights’ First Step into the Age of AI Surveillance
by Jomart Joldoshev, LL.M.
The rise of artificial intelligence has transformed many aspects of daily life, including the way governments monitor public spaces. In Glukhin v. Russia, decided in July 2023, the European Court of Human Rights (ECtHR) addressed for the first time the compatibility of live facial recognition technology with the European Convention on Human Rights. The case highlights how new technologies are testing traditional rights to privacy, expression, and assembly, and it illustrates the challenges faced by international courts in adapting human rights law to the digital era.
In 2019, journalist and activist Nikolay Glukhin staged a one-person protest in the Moscow subway. He carried a cardboard cutout of another activist, Konstantin Kotov, to draw attention to Kotov’s imprisonment under Russian assembly laws. The demonstration was peaceful, symbolic, and did not interfere with public order. Shortly after the protest, authorities identified Glukhin through Moscow’s live facial recognition system, which is integrated with thousands of surveillance cameras across the city. Although solo pickets are generally exempt from prior notification under Russian law, the domestic courts treated the cardboard figure as an object that triggered the notification requirement, and on the basis of this identification Glukhin was convicted of an administrative offence and fined for failing to notify the authorities of his protest.
Although Russia was expelled from the Council of Europe in March 2022, the Court retained jurisdiction over conduct occurring before September 16, 2022, the date on which Russia ceased to be a party to the Convention. Because the events in Glukhin’s case took place in 2019, the application remained within the Court’s temporal jurisdiction. This allowed the case to proceed and underscored the ECtHR’s continuing authority to adjudicate claims arising from acts committed before Russia’s withdrawal from the Council of Europe.
Before the Court, the Russian government argued that facial recognition technology was a lawful and effective method of maintaining public order, presenting it as a neutral enforcement tool authorized under domestic legislation. Glukhin responded that the use of such an intrusive method to identify a peaceful protester was inconsistent with Convention protections. The ECtHR was asked to determine whether Russia’s actions violated Articles 8 and 10, which protect the rights to respect for private life and to freedom of expression, respectively; because the protest was also a form of assembly, the Court examined the Article 10 complaint in the light of Article 11 principles.
In its judgment, the Court characterized facial recognition as a “highly intrusive” form of surveillance. Unlike traditional policing methods, it involves the systematic collection and processing of sensitive biometric data, creating the possibility of continuous monitoring of individuals in public spaces. This capacity, the Court explained, requires a clear legal basis, robust safeguards, and a pressing justification. Applying this framework, the Court found that Russia had not demonstrated why the use of live facial recognition was necessary in response to a peaceful protest that posed no risk to public order. The interference with Glukhin’s rights was therefore disproportionate. The Court emphasized that efficiency alone cannot justify intrusions on fundamental rights when less restrictive alternatives exist.
The Court held that Russia had violated Article 8 because Glukhin’s biometric data was processed without sufficient legal safeguards, and Article 10 because the administrative-offence conviction of a peaceful solo demonstrator failed the “necessary in a democratic society” test for freedom of expression. In its reasoning, the Court stressed that Convention guarantees must have a practical effect and not be reduced to a formality. It underlined that surveillance powers cannot be justified solely on efficiency or domestic authorization but must be exercised within the Convention framework. The judgment reinforced the principle that all surveillance measures must comply with the standards of legality, proportionality, and necessity.
The ruling also built on earlier Strasbourg case law, particularly Roman Zakharov v. Russia (2015), where the Court held that Russia’s system for intercepting mobile communications lacked proper safeguards and created a risk of arbitrary interference. In Glukhin, the Court extended these principles to artificial intelligence, confirming that the requirements of legality, proportionality, and necessity govern new technologies as much as traditional forms of surveillance. The judgment also highlighted the danger of function creep, where systems developed for limited purposes are gradually expanded into broader use without sufficient oversight.
The significance of Glukhin v. Russia lies in its clarification of how Convention rights apply to AI-driven technologies. The judgment demonstrates that existing legal doctrines can address the challenges posed by artificial intelligence, and that courts can adapt established proportionality analysis to new contexts. For practitioners, the case offers guidance on framing claims involving biometric surveillance under Articles 8 and 10. For policymakers, it highlights the need to enact legislation that clearly regulates the use of AI-based systems, establishes robust safeguards, and ensures effective judicial oversight. For other courts, it illustrates the adaptability of human rights instruments to evolving technological realities.
Ultimately, Glukhin v. Russia establishes an important precedent for the governance of artificial intelligence under international human rights law. By holding that live facial recognition was disproportionate in the context of a peaceful protest, the ECtHR reinforced that technological innovation cannot justify encroachments on fundamental freedoms. The judgment ensures that artificial intelligence will be assessed within the framework of legality and proportionality, thereby strengthening the role of human rights law in guiding the use of new technologies in public life.