C-216 – Promotion of Safety in the Digital Age Act – creates new regulations around online safety.
This is a bit of a long one so I’m breaking it down into parts, especially since the first part doesn’t quite match the rest and should probably be its own piece of legislation. In the first part of C-216 Michelle wants to add a list of new requirements for websites, mostly social media, that are used by minors. This part will be called the Protection of Minors in the Digital Age Act.
Protection of Minors in the Digital Age Act
So right off the bat the operators of an online platform or service will be expected to provide their service in a way that protects the best interests of minors. This will involve designing the service in a way that prevents:
- Physical harm or incitement to harm
- Online sexual violence against minors
- Creating or sharing sexually exploitative images of a minor, altered or not
- Marketing products and services that are illegal for minors, like alcohol, weed, and gambling
- Mental health disorders, including eating disorders, substance use, and self-harm/suicide
- Patterns of use that encourage addiction
- Allowing a minor to use the platform without first verifying the contact information of their parents, such as through their internet service provider
- Predatory or deceptive marketing practices
Note that none of this requires the operator to prevent a minor from actively seeking out anything on this list. If a minor decides they want to search for an online casino, that won’t be held against the operator.
Safeguards
This next bit applies to users that are children (16 or younger) and their parents. Parental controls will need to be made accessible to allow parents to:
- Control who can communicate with the child
- Prevent others from accessing personal data collected about the child
- Reduce features that encourage using or spending more time on the platform, such as notifications and time-based rewards
- Control personalized recommendation systems by letting the parent choose what categories of content will be recommended or opting out entirely
- Restrict sharing the child’s geolocation info and be notified when that information is being tracked
By default, operators will need to have all these settings at their most restrictive level. Only the parents will be able to reduce them.
Operators will be expected to use reliable algorithms for age verification to block children from inappropriate content while preserving privacy. They’ll also need to protect the privacy and well-being of children on the platform, and take steps to correct any issues they’re alerted to. They’ll also need to provide parents the ability to delete a child’s account, remove all data collected on that child, or limit the amount of time a child spends on the platform.
The requirements for parental controls for minors (18 or younger) are a bit looser. Operators will need to provide the parents of minors the ability to manage privacy and account settings, view metrics of time spent on the platform, and prevent the minor from making online purchases. (These of course also apply to children.) The default settings for these will also be the most restrictive level, and minors will need to be notified what parental controls are active. If the operator has reason to suspect that these settings have been changed by the minor they’ll need to notify the parent.
Reporting
Operators will need to create a dedicated, readily accessible channel for people to report potential harm directed at minors. They’ll also need to establish internal procedures for handling these reports in a timely fashion.
Prohibitions
Under C-216 there are a few things operators won’t be allowed to do. First, no advertising products to minors that are illegal to sell to minors, such as alcohol, tobacco, or gambling. This includes through personalized advertising.
They aren’t allowed to design their service in a way that manipulates someone’s behaviour to weaken the effectiveness of parental controls. (For example, putting the controls behind 3 “Accept” buttons, a password, and an email is going to result in fewer people bothering to use them.)
Operators won’t be allowed to collect or request a digital identifier from a person that identifies their right to access information or services online. (I’m not entirely sure what this means. They can’t collect IP addresses? Does this affect how they handle age verification? If anyone knows what they’re talking about here let me know!)
Note that none of this is to be taken as limits on an operator’s ability to prevent illegal activities.
Disclosure
Every operator will need to put the following information about their platform in a prominent, accessible location:
- Its policies, practices, and safety settings
- Access to safety settings and parental controls
- The type of personal information the platform collects and discloses and how it does so
- The use of any personalized recommendation system, and options to modify or opt-out of these systems
- The use of any labels that mark advertisements, products, or services that are directed at minors
Speaking of advertising, operators will need to include clear labels and information about:
- The name of the product, service, or brand and the subject matter of each ad
- If the platform conducts targeted advertising, the reason any minors are targeted by that ad and the way their personal information is used to engage such advertising
- If applicable, the fact that content displayed to a minor is an ad, including endorsements from other users
Transparency
There are a few things in C-216 to ensure operators are compliant. First, they’ll need to keep auditable logs on the collection and processing of personal data. Every two years they’ll need to have an independent review of their platform and its impact on minors. This includes any risks or harms from the platform’s use by minors, as well as any cumulative effects from continued use. Operators will need to make this report publicly available.
Operators will need to produce an annual report outlining the steps taken to address issues found in the last review. This report will also need to include a risk and impact assessment regarding:
- The extent the platform is likely to be accessed by minors
- Data on the number of minors on the platform and their daily, weekly, and monthly usage time
- The platform’s safety settings and parental controls, their effectiveness, and any breaches that have occurred around them
- The extent the platform’s design features, such as personalized or automated content or rewards for using the platform over time, pose risk to a minor’s privacy, health, and well-being
- The collection, use, and disclosure of personal information such as geolocation and health data and the way this information is used
- The number and nature of reports received through the reporting channel
- The internal process to receive these reports and the timeliness and effectiveness of the response to them
This report also needs to be made public in a prominent place on the platform.
Market Research Guidelines
Super short bit here but the CRTC will be responsible for creating guidelines on how operators are to conduct market research when the target demographic includes minors.
Offences and Punishment
Anyone who violates the parts about publishing reports, notices of how data’s being used, or notices about ads can be fined up to $10 million. Anyone who violates any other part of this Act can be fined up to $25 million. Note that nobody will be charged with an offence here if they can prove they performed due diligence in attempting to follow this Act.
Private Right of Action
Anyone, including a minor, who thinks they’ve suffered “serious harm” caused by an online platform will have the right to sue the operator. “Serious harm” includes significant physical and economic damage. People will have 3 years to file a lawsuit for damages.
Regulations
The Minister of Industry will have the ability to exempt any platform from C-216 if they believe the platform’s policies are at least as strong as this legislation.
The government will also have the ability to create regulations around how a platform’s policies are to be displayed and what information will need to be shown, as well as what data needs to be kept to prove compliance with this Act.
Author’s Note
I have some questions about how this legislation would be applied. For starters, how do the advertising sections interact with third-party advertisers? Does Google AdSense count as a platform? Would you be able to claim that you performed due diligence by asking Google not to show gambling ads on your platform even if Google does it anyway? And I’m sure we’ve all seen those fake ads with politicians and celebrities, what happens if an advertiser just lies to AdSense about what’s in their ad?
I’ve also got questions about how platforms are expected to verify the age of adults. It talks about using algorithms to determine the age of the user, but what happens if minors just make a new account to get around the algorithm? I’m not entirely sure how online platforms are expected to comply with this.
Progress of C-216
C-216 is currently outside of the Order of Precedence.