Electronic Workplace Monitoring and Human Rights: The Limits to Ontario’s New Algorithmic Monitoring Legislation

by Valerio De Stefano | Nov 3, 2022


About Valerio De Stefano

Prof Valerio De Stefano is the Canada Research Chair in Innovation, Law and Society at Osgoode Hall Law School, York University, Toronto. His research focuses on artificial intelligence at work, algorithmic management, and labour and technology. Before joining Osgoode he was a law professor at the KU Leuven, in Belgium, and an officer of the International Labour Office in Geneva. He holds a PhD from Bocconi University in Milan.

Image Description: A robot looking at a coding/programming algorithm.

In early October 2022, Ontario became the first Canadian province to expressly regulate algorithmic monitoring at work. Under recent amendments to the Employment Standards Act, 2000, businesses with 25 or more employees must now have a written policy stating whether they monitor employees electronically. If they do, the policy must explain how, when, and for what purposes the employer engages in this monitoring.

Ontario’s initiative is an important first step. It is crucial to address digital and algorithmic monitoring at work. Particularly after the surge in remote work linked to the COVID-19 pandemic, many businesses installed programs and tools to digitally track work performance in extremely invasive ways. Software can track mouse movements, count keystrokes in a given timeframe, log workers’ browsing activity, and scan employees’ texts and messages in internal chats in search of non-work-related talk.

Regulating tech-enabled monitoring to avoid abuses is, therefore, positive. Ontario’s legislation, however, is far too limited for the following reasons.

Firstly, algorithmic management is not only a matter of transparency and privacy; many other rights are at stake. Algorithmic monitoring increases stress and creates occupational health hazards. Crucially, algorithmic management carries serious risks of discrimination against women and minorities, as algorithms often incorporate longstanding biases in society and the tech community. Tracking technology has also been reported to “bust” trade unions. Because algorithmic management poses risks to such a vast range of human rights at work, it does not make sense to regulate only businesses with 25 or more employees, as Ontario has done. Workers in small businesses are also entitled to human rights protections.

Secondly, the risks extend well beyond office workers and employees. Platform workers, warehouse workers, and many other blue-collar occupations are also managed by algorithms. Any regulation of algorithmic management should therefore offer universal protection irrespective of occupation type and employment status, including self-employed workers, who are not protected under this law.

Moreover, the law merely imposes information duties towards individual workers, which is drastically insufficient. Enhancing transparency towards individuals is not enough. Some forms of monitoring should be banned outright, for example, any form of monitoring that processes biometric data or data on the emotional and mental states of workers. This is no futuristic fantasy but the reality of management software that collects data about heartbeats or facial expressions through wearable devices or video scanning.

The draft EU Directive on platform work, for instance, bans any forms of algorithmic management that “process any personal data on the emotional or psychological state of the platform worker”. The draft Directive also introduces rights to explanation and contestation of decisions based on algorithmic processes and a “human in command” approach that allows workers to demand that any decision significantly affecting them be reviewed by human beings. Some transparency rights under the draft Directive also extend to self-employed people. Notably, a right to an explanation and a right to contest the output of algorithmic decision-making were also recently recommended by the Office of the Privacy Commissioner of Canada, something that the Ontario legislature evidently overlooked.

Most importantly, individuals often do not have the capacity or expertise to react to abusive algorithmic management and digital surveillance practices. Trade unions and regulators must be involved in protecting and enforcing human rights at work, including when technological tools threaten them. This is something that the draft Directive on platform work seems to take into account (albeit, arguably, much in the draft can be improved). Ontario’s regulation – limited as it is to enhancing transparency towards individual workers – is far too vapid and essentially procedural. Nothing expressly prevents even the most blatant invasions of privacy or the most severe abuses once employers have fulfilled their duty to prepare a policy and inform their workers about it.

It takes more than an employer’s unilateral policy to minimize the risks connected to electronic monitoring. Algorithms at work must be negotiated, not just disclosed. Ontario’s workers, like any other workers subject to algorithmic management, deserve more than transparency – they need agency, particularly through collective oversight and action.
