| Historical Information |
|---|
| This post is about a previous Session of Parliament. Any legislation here that did not receive Royal Assent has been terminated. |
C-292 – The Online Algorithm Transparency Act – makes it illegal to allow an online algorithm to use personal information in a way that causes discrimination prohibited under the Canadian Human Rights Act, and requires platforms to be transparent about how their algorithms work.
Discrimination Prevention
Disclosure
First up, online platforms will need to make information about their algorithms easily accessible to the public. This information includes:
- A description of the personal information it collects, uses, and discloses and how it collects/uses/discloses that information
- A description of how that personal information is used
- How the algorithm weighs and prioritizes the personal information collected
- How that information is used to recommend, withhold, or promote content to the user
- A statement on whether or not any of the personal information gathered is transferred between provinces or internationally
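To make the list above concrete, here's a minimal sketch of what such a disclosure could look like as structured data, along with a completeness check. The field names and format are entirely hypothetical; C-292 doesn't prescribe how the information must be presented.

```python
# Hypothetical machine-readable algorithm disclosure covering the
# categories listed above. Field names are illustrative only.
disclosure = {
    "personal_info_collected": {
        "location": "collected from device GPS, used for local recommendations",
        "watch_history": "collected in-app, used to rank suggested videos",
    },
    "weighting_method": "watch history is weighted more heavily than location",
    "content_effects": "low-weighted topics are demoted in the user's feed",
    "cross_border_transfer": True,  # transferred between provinces or internationally?
}

def is_complete(d):
    """Check that the disclosure covers every required category."""
    required = {"personal_info_collected", "weighting_method",
                "content_effects", "cross_border_transfer"}
    return required <= d.keys()

print(is_complete(disclosure))  # True
```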
Prohibition
Next up, platforms will need to make sure their algorithms don’t discriminate against people based on the protected grounds in the Canadian Human Rights Act. In practice, this means the algorithm won’t be allowed to deny goods, services, facilities, etc. to anyone. An example here would be pushing housing ads that won’t be shown to people of a particular race. Nothing should be hidden from people based on anything that would count as discrimination under the Human Rights Act.
Failure to comply with this part of the Act could result in a fine up to $1 million.
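The housing-ad example above can be thought of as a simple audit: given the impressions an ad actually received, check whether any protected group was excluded outright. This is a hypothetical check, not a procedure the bill specifies.

```python
# Hypothetical audit: did an ad reach members of every protected group,
# or was some group shut out entirely?
def excluded_groups(impressions, all_groups):
    """Return the protected groups that saw zero impressions of the ad."""
    reached = {imp["group"] for imp in impressions}
    return all_groups - reached

# Illustrative data: group names are placeholders.
groups = {"group_a", "group_b", "group_c"}
impressions = [
    {"user": 1, "group": "group_a"},
    {"user": 2, "group": "group_b"},
    {"user": 3, "group": "group_a"},
]

print(excluded_groups(impressions, groups))  # {'group_c'}
```

A real compliance check would be far more involved (exclusion is rarely this absolute), but the idea is the same: look at outcomes by protected ground, not just at the code.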
De-Identification of Personal Information
Online platforms will need to make sure none of the personal information they track can be used to identify someone. They’ll be expected to take measures appropriate for how sensitive the information they track is, so something like your address would be under much stricter requirements than what brands you like to buy from. It will also be illegal to use any combination of this information to identify someone.
Failure to comply with this can result in a fine up to $50,000 on the first offence and $100,000 for each subsequent offence.
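One common way to meet requirements like those above is to pseudonymize direct identifiers with a salted hash and generalize highly sensitive fields (like an address) while leaving low-sensitivity fields alone. This is a sketch of one possible approach; the Act itself doesn't mandate any particular technique, and the field names here are hypothetical.

```python
import hashlib
import secrets

# Salt kept secret by the platform so hashes can't be reversed via lookup tables.
SALT = secrets.token_hex(16)

def de_identify(record):
    """Replace direct identifiers and coarsen sensitive fields."""
    out = dict(record)
    # Pseudonymize the user ID with a salted hash.
    out["user_id"] = hashlib.sha256(
        (SALT + record["user_id"]).encode()
    ).hexdigest()[:16]
    # A full address/postal code is highly sensitive: keep only the first
    # three characters (the forward sortation area of a Canadian postal code).
    out["postal_code"] = record["postal_code"][:3]
    # Preferred brands are low sensitivity, so they pass through unchanged.
    return out

record = {"user_id": "u-12345", "postal_code": "K1A0B1", "brands": ["acme"]}
clean = de_identify(record)
print(clean["postal_code"])  # K1A
```

Note the sensitivity-based tiering matches the Act's expectation: the address gets aggressive coarsening, while brand preferences are untouched.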
Advisory Committee
An advisory committee will be formed to monitor online platforms and make recommendations to prevent discrimination caused by their algorithms. The committee will consist of seven members, all of whom need to have experience with online algorithms or human rights. The committee will meet at least twice a year, and will need to report back to the Minister of Industry with their recommendations annually.
Enforcement
The CRTC will be able to appoint inspectors responsible for checking compliance with this Act. Inspectors will be allowed to enter any place they consider necessary to ensure compliance with this Act. The exception is private homes, where the inspector will need to get a warrant first. These warrants can only be granted if the inspector has already been denied entry to the home or if there’s reason to believe they’ll be denied.
To do their jobs inspectors will be allowed to:
- Examine and copy any documents
- Access a computer to examine any data on it or a connected network, as well as print out any information needed from that computer
- Order any individual at the location under inspection to identify themselves
- Order anyone that performs activities regulated under this Act to start or stop those activities
- Limit access to any part of the workplace being inspected
- Remove anything for further examination
As usual, everyone in the workplace being inspected will be expected to assist the inspector.
The CRTC will need to present an annual report that outlines any inspections that were performed and covers any recommendations it has regarding this Act.
Punishment
There are two important notes here in terms of who’s liable when these regulations are broken. First, proving that an employee of an online service provider broke these regulations is sufficient proof that the company broke them; the employee doesn’t need to be identified. So just finding offending code in a file only an employee could access would be enough to prove the company broke these regulations.
Second, any director or officer who influenced the company’s policies leading to any charges will also be charged with the same offence. This applies even if the company is acquitted; they could still be found guilty themselves.
Regulations
After consulting the CRTC the government can make regulations around this Act, including:
- Defining what counts as an online communication service, and as such what’s affected by C-292
- What counts as a “private communication” within those services
- How companies are expected to inform users about how private information is used
- How records about data collection are to be kept and how long they need to be kept to show that a company is compliant with this Act
- Setting any penalties to be applied to companies that fail to comply with this Act
Progress
C-292 is currently outside of the Order of Precedence.