
Bill C-11 is worse than Bill C-10 because it allows the Trudeau government to download its responsibilities to the CRTC

By Mark Mancini
PhD student
University of British Columbia

In the spring of 2021, the federal government faced considerable backlash for its clumsy attempt at regulating Canadian content on the Internet. Bill C-10 was concerned with compelling companies like Netflix Inc. and TikTok Inc. to finance and promote Canadian content. It was controversial, not least because the law could be read to target content produced on user-driven sites (such as TikTok), thereby ensnaring individual content creators rather than the tech giants and subjecting them to discoverability requirements and penalties.

One of the biggest concerns was free expression. Bill C-10 could be read to grant Canada's telecom and broadcast regulator (the Canadian Radio-television and Telecommunications Commission, or CRTC) power to regulate the content of individual expressions, something that – to many of us – presented constitutional and regulatory concerns. As the University of Ottawa's Professor Michael Geist stated upon the tabling of the Bill, it "hand[ed] massive new powers … [to the CRTC] to regulate online streaming services, opening the door to mandated Cancon payments, discoverability requirements, and confidential information disclosures, all backed by new fining powers."


Mercifully, Bill C-10 died because of the election, and some of us thought that would be the end of this story. Not so. The Trudeau government recently re-introduced the same pig with different lipstick: Bill C-11. It's enough to say this Bill is generally not an improvement on its predecessor, at least from the perspective of the power it vests in the CRTC. Its central defect is that it fails to solve the problem with Bill C-10, and indeed makes it worse by simply allowing the CRTC to apply the law to users in certain cases. This should be, if not constitutionally problematic, then politically so. If the CRTC can apply the law to a large class of individual users, the government can evade its responsibility for this controversial choice in Parliament. In other words, the CRTC will still have the power to regulate user-generated content, subject that content to discoverability regulations, and expose users to potential penalties. It has this power despite the Bill being designed to obscure it.

In Bill C-11's backgrounder, the government says this new legislation solves two problems with its predecessor, Bill C-10. First, "it captures commercial programs regardless of how they are distributed, including on social media services." Second, "the proposed bill is also clear that the regulator does not have the power to regulate Canadians' everyday use of social media, including when they post amateur content to these services."

It seems, then, that the proposed Bill does not apply to Canadian users or individual creators. And the opening part of the actual text of the Bill sounds promising. It says it must be construed and applied in a manner that is consistent with "(a) the freedom of expression and journalistic, creative and programming independence enjoyed by broadcasting undertakings." Section 4.1(1) of the Bill sounds even better: "This Act does not apply in respect of a program that is uploaded to an online undertaking that provides a social media service by a user of the service for transmission over the Internet and reception by other users of the service." This seems to deal with the problem so many critics had with Bill C-10, which removed key exemptions and thereby extended its scope to include the average TikTok user.

So far, this sounds like a real improvement. But the promise fades when we consider the CRTC's new regulation-making power. A regulation is a form of law – the power to make regulations is given to an agency by the elected legislature. This isn't itself inherently problematic, and, of course, regulation-making is widespread today. But this goes further. Section 4.1(2) of the Bill basically "takes back" s. 4.1(1) by giving the CRTC power to make regulations governing "programs" despite the seeming exclusion of user content. If not constitutionally problematic, it is politically so, since it allows the government to evade responsibility for the potentially vast scope of this law.

This clause is the central problem with the new Bill. It is cabined by a few factors – namely s. 4.2(2)(a), which directs the CRTC to consider "the extent to which a program, uploaded to an online undertaking that provides a social media service, directly or indirectly generates revenues" as it makes regulations. Based on comments made by the minister, the target here appears to be YouTube music. But many other types of user-generated content could conceivably fall under the scope of the law, including user-generated TikTok videos or podcasts that indirectly generate revenue and have other features that fall within the scope of the regulation-making power.

The end result is that this technical change has the potential to ensnare countless users on various platforms. Professor Geist has summarized the wide berth of power granted to the CRTC in Bill C-11 as follows:

Views on the scope of this regulatory approach may vary, but it is undeniable that:

  1. regulating content uploaded to social media services through the discoverability requirement is still very much alive for some user-generated content;

  2. the regulations extend far beyond just music on YouTube;

  3. some of the safeguards in Bill C-10 have been removed; and

  4. the CRTC is left more powerful than ever with respect to Internet regulation.

The Bill basically downloads real decision-making into regulations. Rather than taking responsibility for regulating user content in this fashion, the government instead intends to grant that power to the "independent" CRTC. If there is controversy about any future regulation, the government can shift responsibility to the CRTC. The regulation-making power just reinforces this, by allowing the CRTC to expand the scope of the law and make decisions that should fall to Parliament.


A critic could say that the statute constrains the regulation-making power and that revenue generation is only one non-exhaustive factor. Perhaps. But I could grant all of this and still maintain that the Bill purports to grant significant power to the CRTC to apply the law to users, something the backgrounder suggests it does not. This disparity between substance and rhetoric is disturbing.

It is important here to address another possible response. Much is made in administrative law about the need to empower regulatory experts to make decisions in the public interest. So far as this goes, the device of delegation can be useful. But it is not always and everywhere so, and there are differences in degree. A delegation to the CRTC here may be justifiable, but the government should take responsibility for the choice to regulate user content. If the issue needs to be addressed at all, it should presumably be addressed in the primary law rather than left to the CRTC's own wide, relatively unconstrained discretion. In other words, if YouTube music is the problem, the law should be appropriately tailored.

The basic problem here might be more fundamental. What is the need for this Bill? It seems the regulatory goal may be, among other things, to subject users who generate a certain income to the Act's requirements. If that is the regulatory goal, why is the CRTC regulatory mechanism desirable? If the government wants to make this policy choice, why can't it do so in plain view in Parliament?

There are real democratic trade-offs to using this sort of regulation-making power and, more specifically, to downloading responsibility to the CRTC. This is a controversial application of a regulatory law – with penalties – to a potentially huge class of users. Not only does the government permit by stealth what it says it has amended the Bill to prevent, but it does so by delegating to the CRTC. I do not see this legal device, or this Bill, as any better than Bill C-10.

Mark Mancini is a PhD student at the Allard School of Law (University of British Columbia), where he studies the law of judicial review, particularly as it applies to the carceral state. He is a graduate of the University of New Brunswick Faculty of Law (JD) and the University of Chicago Law School (LLM).

© Troy Media