Our Digital Selves: Privacy Issues in Online Behavioural Advertising

By Christopher Scott*
CITED: (2012) 17 Appeal 63-82
Canadians are spending more of their lives online than ever before.1 This trend has
profound ramifications across Canadian society, including within the field of privacy
law. This paper will examine the privacy implications of two related technologies within
the emerging field of online behavioural advertising. The first is the use of tracking
cookies to track users’ activity across websites, and the second is deep packet inspection
(“DPI”). The use of these technologies in the field of targeted advertising has not yet been
subject to a finding under the Personal Information Protection and Electronic Documents
Act (“PIPEDA” or the “Act”),2 the federal private-sector privacy statute.
The goal of this paper is to survey the application of PIPEDA to this yet-nascent field and
describe the shape that a PIPEDA-compliant use of these technologies is likely to take.
For context, I will make reference to two prominent corporations at the forefront of this
field: Google and Phorm. These corporations are intended to be viewed as case studies.
The goal of this paper is not to catalogue the apparent failures of either organization in
the style of a complaint to the Privacy Commissioner, but rather to illustrate the delicate
interplay of – and tensions between – privacy rights and legitimate commercial interests.
Before exploring the legal issues arising from these technologies, it is necessary to have
some familiarity with the technical manner in which they operate and an understanding
of the kinds of personal information they enable organizations to obtain. Understanding
the present and potential use of these technologies is essential to framing the privacy
issues they raise.
* Christopher Scott is a J.D. candidate at the University of Victoria. He wrote this paper for the
course “Information and Privacy Law”, taught by David Loukidelis and Murray Rankin, Q.C. He is
grateful for David and Murray’s encouragement and depth of insight on this and related topics.
Christopher is actually quite fond of Google, and finds some of this paper’s conclusions to be
1. Statistics Canada, Canadian Internet Use Survey (Business Special Surveys and Technology
Statistics Division, 2009), online: The Daily
2. SC 2000, c 5.
A. Tracking Cookies
i. How Tracking Cookies Work
When a web browser visits a website, that site may instruct the browser to store a “cookie”.
A cookie is a small text file containing information provided by the website. If a browser
has been given a cookie by a website, it will send the cookie back to the website on
every subsequent visit. By placing a unique identifier in each cookie, the website can use
cookies to keep track of a particular web browser’s comings and goings.3 This interaction
is invisible to the user operating the browser, and typically occurs without his or her
explicit consent.4
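The exchange described above can be sketched in a few lines of Python. This is a simplified illustration only; the class and method names are invented for this example and do not come from any real advertising system.

```python
import uuid

# A minimal sketch (hypothetical, not any vendor's implementation) of how
# a website assigns a tracking cookie and recognizes it on later visits.
class TrackingSite:
    def __init__(self):
        self.visits = {}  # unique identifier -> list of pages viewed

    def handle_request(self, page, cookie=None):
        """Simulates one request/response cycle.

        If the browser presents no cookie, the site issues a fresh unique
        identifier (the "Set-Cookie" step); otherwise the visit is logged
        under the identifier the browser sent back.
        """
        if cookie is None:
            cookie = uuid.uuid4().hex  # new unique identifier
        self.visits.setdefault(cookie, []).append(page)
        return cookie  # the browser stores this and returns it next time

site = TrackingSite()
c = site.handle_request("/news")           # first visit: cookie issued
site.handle_request("/sports", cookie=c)   # later visit: cookie sent back
print(site.visits[c])  # → ['/news', '/sports']
```

Because the browser returns the same identifier on every visit, the site accumulates a browsing history keyed to that identifier without the user taking any visible action.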
Tracking cookies do more than enable organizations to identify users within the confines
of their own websites. Organizations also use them to track browsers across the websites
of third parties with which they have partnered (and which have added a piece of code
to their own websites to enable this). In this way, tracking organizations can keep track
of browsers’ activity across extensive networks of partnered sites. These cookies enable
the organizations to record information including the time of the access, the IP address
of the browser (which may reveal the approximate geographic location of the browser),
the URL of the pages visited, the contents of the pages visited and the unique identifier
stored in the browser’s cookie.5
Up to this point, I have referred primarily to “browsers” and only rarely to “users”. This
is intended to highlight the fact that tracking cookies see only browsers, not people.
Generally, a cookie is particular to a single browser on a single computer user account
(usually on a single computer). Accordingly, one person may be associated with many
tracking cookies, and a single tracking cookie can capture the personal information
of multiple individuals. The most relevant example here is of a family computer with
a single user account. To the extent that members of the family (as well as any guests)
use a common browser on the computer, they will be tracked together, and all of their
disclosed personal data will be lumped together under the cookie’s common identifier.
The final relevant consideration regarding browser cookie technology is that
cookies have expiry dates. When a cookie expires, it is deleted, meaning the issuing
organization must issue a new unique identifier the next time that the browser visits.
Similarly, most browsers allow users to manually delete cookies before their expiry dates.
This is an effective tabula rasa; having lost the key that ties your browser to your past
browsing behaviour, the organization must now start from scratch with a new identifier.
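The tabula rasa effect can be made concrete with a short sketch. The identifier and page history below are invented for illustration; no real tracking database works this simply.

```python
import uuid

# Hypothetical record of past browsing, keyed by cookie identifier.
records = {"abc123": ["/travel", "/mexico-history"]}

def on_visit(cookie):
    """If the cookie expired or was deleted, the browser presents none,
    and the site has no choice but to issue a fresh identifier."""
    return cookie if cookie is not None else uuid.uuid4().hex

returning = on_visit("abc123")  # cookie intact: history still linked
fresh = on_visit(None)          # cookie deleted: the link is severed

print(returning in records)  # → True
print(fresh in records)      # → False
```

The new identifier matches nothing in the existing records, so the accumulated browsing history can no longer be tied to the browser that generated it.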
ii. Case Study – Google AdSense
The most prominent system of tracking cookies is Google’s AdSense.6 Google serves
advertisements on the websites of its vast network of partners – by some estimates, nearly
one in five websites display Google AdSense advertisements.7 These advertisements are
3. For instance, my Google Chrome browser on my laptop computer has the unique identifier
4. Electronic Privacy Information Center, online: .
5. Cf. Google Privacy Center, online:
6. References to “AdSense” throughout this paper also refer to DoubleClick, a parallel advertising
network owned by Google that is based on the same technology and even uses the same
cookie. See also note 5.
7. W3Techs, Usage of advertising networks for websites, online: //w3techs.com/technologies/
not stored on the webservers of the website that users have chosen to visit; they are served
directly from Google’s servers to the browser, where they are displayed alongside the
contents of the website that was requested. In the process of fetching the advertisement
from Google’s servers, browsers dutifully send Google their tracking cookies. This
interaction provides Google with all of the above-mentioned information, including the
URL of the page that the user has chosen to view.8
As a consequence, not only can Google mine every search you perform on the Google
homepage9 for information about your interests and browsing habits, but it also knows
which of its partnered websites you visit independently. Google collects all of this
information and, based on the content of the sites that you frequent, infers which “interest
categories” you might be interested in. On the basis of these categories and the
contents of the page that you are presently viewing, Google can tailor the advertisements
it sends you on its partner sites.10 Thus, a user in Canada who frequently searches for
travel information on Google and chooses to view a website about Mexican history
might see advertisements about travelling to Mexico displayed on that site.
In the context of privacy law, it is significant to note that this browser data can be collected
even if the browser has never been to a Google-owned webpage or had the opportunity to
agree to Google’s privacy policy directly. Google requires that partners provide notice of
Google’s collection of browsing information from the partner’s site as well as other sites
across the web for the purpose of serving advertisements based on that behaviour; it
also requires partners to notify users of cookie management options.11 This is typically
accomplished via the incorporation of Google’s privacy policy into that of the partnered
website. In addition, because ads are served simultaneously with webpages, users may be
required to view ads – and thus disclose personal information – in order to find the third-
party’s privacy policy. Even if users disagree with the privacy policy of that third party,
Google has already collected their personal information.
Prior to 2007, Google’s tracking cookie was set to expire in 2038 (in effect, never), but
in response to privacy concerns it now has a two-year rolling expiry date that is renewed
every time the cookie gets used.12 In practice, this means that the cookie is unlikely to
expire before the user ceases using the browser permanently, whether due to switching
to a new browser, user account, or computer (at which point a new cookie is created).
Google stores user interest information for at least as long as the cookie’s active life, but
anonymizes server logs (which include IP and URL information) after 18 months as
a matter of policy.13 Google insists that a shorter retention period would reduce its
ability to protect user security and might put it in violation of the data retention laws
of some countries.14 Google’s retention policies are not codified in its privacy policy.15
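The rolling expiry described above amounts to resetting a two-year clock every time the cookie is used; a schematic sketch follows (the 730-day figure is my approximation of "two years", and the function is invented for illustration):

```python
from datetime import datetime, timedelta

TWO_YEARS = timedelta(days=730)  # rough two-year window

def renewed_expiry(used_at):
    """Each time the cookie is sent back, its expiry is pushed out two
    years from the moment of use, rather than from its creation date."""
    return used_at + TWO_YEARS

expiry = renewed_expiry(datetime(2012, 1, 1))  # set when first issued
expiry = renewed_expiry(datetime(2013, 6, 1))  # renewed by a later visit
print(expiry)  # the deadline keeps moving forward with each use
```

So long as the browser keeps visiting at least once every two years, the expiry never arrives, which is why the cookie effectively lasts for the working life of the browser.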
8. Google Privacy Center, supra note 5.
9. Google, online: .
10. Google Privacy Center, supra note 5.
11. AdSense Terms and Conditions, online: .
12. Peter Fleischer, “Cookies: expiring sooner to improve privacy” (16 July 2007), online: The Official
Google Blog
13. Peter Fleischer and Nicole Wong, “Taking steps to further improve our privacy practices” (14
March 2007), online: The Official Google Blog
steps-to-further-improve-our.html>.
14. Google Log Retention Policy FAQ, online: Public Intelligence
google-log-retention-policy-faq/>.
15. Google Privacy Policy, online: .
Google provides an opt-out mechanism for users. Users must opt out for each browser on
each computer that they use, due to the technical limitations discussed above.16 Google
also provides an interest-management tool that enables users to voluntarily disclose to
Google the types of advertisements in which they are interested (referred to as “interest
categories”) and to remove interest categories from that list of interests.17 Google is
careful to note that no personally identifiable information is collected “without your
explicit consent”18 and that it “will not associate sensitive interest categories with your
browser (such as those based on race, religion, sexual orientation, health, or sensitive
financial categories)”.19 Google does, however, track user product interests; for instance,
a perusal through my own aggregated list of interests revealed that Google was aware of
my fondness for purchasing computer hard drives online.
Google’s privacy policy states only that Google takes “appropriate security measures” to
safeguard data acquired through tracking cookies, that employees and contractors may
view it only on a need-to-know basis, and that third parties who do access it on this basis
are bound by confidentiality agreements and may even suffer criminal consequences for
a breach of security.20
B. Deep Packet Inspection
i. How Deep Packet Inspection Works
At a technical level, DPI is quite straightforward. Whenever you do anything on
the Internet – such as loading a webpage or sending an e-mail – you either send or
receive “packets” of digital information. Every packet you send goes directly to your
Internet service provider (“ISP”), which then sends it off in the direction of its intended
destination. Similarly, every packet you receive comes first to your ISP, which then
sends it straight to you. As a result, your ISP can see all of your unencrypted digital
communications directly, without resorting to the use of tracking cookies or the like.
This allows for much broader disclosure than tracking cookies, as DPI reveals not only
where users go, but also what they do.21
On the other hand, DPI is computationally expensive, meaning that it requires
substantial equipment and technical expertise to perform effectively. Most ISPs do not
have the equipment or the expertise to analyze the entire contents of every packet of
information that passes through their networks. Every packet of information contains
“header” information and “payload” information. Headers include the packet’s source
and destination IP addresses, the protocol being used, the port being used (which
roughly corresponds to the application that sent it),22 and other network-related technical
information. The payload is the information that is being delivered. This payload may
16. Google Privacy Center, supra note 5. Google also offers a downloadable tool that will opt all
browsers on a single computer out of AdSense’s tracking program.
17. Ibid.
18. Ibid.
19. Ibid.
20. Google Privacy Policy, supra note 15.
21. Assistant Commissioner recommends Bell Canada inform customers about Deep Packet Inspection (3
September 2009), PIPEDA Case Summary #2009-010 at paras 4-8, online: OPC
gc.ca/cf-dc/2009/2009_010_rep_0813_e.cfm>. This OPC decision goes into much more technical
detail regarding the workings of DPI, but reaches the same conclusion: DPI can give ISPs the
technical ability to see nearly everything.
22. I say “roughly” here because, ideally, each port number refers to one application. However,
applications can select their own port numbers, meaning that some will “spoof” another
application’s number in order to get preferential treatment. Cf. note 23 at paras 10600-10602.
not be readable on its own, however; a single piece of information can be split up between
several packets, so each of those packets can be collected and then read together. Bill
Keenan, Director of Technology for CTV, described the technical challenges involved
as follows:
[T]he expense involved in doing true Deep Packet Inspection – which
means not just inspecting the headers … which is, functionally, the
address on the envelope, but actually opening all of the envelopes and
pasting them together and seeing what it reads. Doing that for every piece
of content that comes over the network would absolutely be prohibitively
expensive.23
For this reason, DPI may consist either of merely reading packet headers or of reading
the entire contents of each packet. Throughout this paper, references to DPI will refer
to the latter method. An inspection of a consumer’s packet contents may reveal “photo
images, [or] financial and contact information”,24 in addition to the information revealed
in the packet headers. However, the reading of packets’ header information should not
be discounted. Canada’s Privacy Commissioner has previously noted that headers are
rich with personal information – if analyzed, they can identify the use of “most popular
services or applications”, “[s]ubscriber usage patterns”, “[a]pplication usage patterns”,
“competing services and their presence on the network” and “malicious traffic on the
network”.25
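The distinction between header-only inspection and full DPI can be made concrete with a toy sketch. The dictionary fields and addresses below are invented for illustration; real packets are binary IP datagrams and reassembly is far more involved.

```python
# Toy model of one HTTP request split across two packets (invented
# fields and example addresses; not a real DPI engine).
packets = [
    {"src": "192.0.2.10", "dst": "198.51.100.7", "port": 80, "seq": 0,
     "payload": "GET /mexico-hist"},
    {"src": "192.0.2.10", "dst": "198.51.100.7", "port": 80, "seq": 1,
     "payload": "ory HTTP/1.1"},
]

def inspect_headers(pkt):
    """Shallow inspection: reads only the 'address on the envelope' --
    who is talking to whom, over which port."""
    return (pkt["src"], pkt["dst"], pkt["port"])

def deep_inspect(pkts):
    """Full DPI: stitches the payload fragments back into order and
    reads the contents themselves."""
    return "".join(p["payload"] for p in sorted(pkts, key=lambda p: p["seq"]))

print(inspect_headers(packets[0]))  # → ('192.0.2.10', '198.51.100.7', 80)
print(deep_inspect(packets))        # → GET /mexico-history HTTP/1.1
```

Even the shallow pass reveals who communicated with whom and (roughly) via which application, which is why header analysis alone can expose the usage patterns the Commissioner describes; the deep pass additionally recovers what was actually requested.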
ii. Case Study – Phorm Inc.
DPI provides incredibly detailed information about consumers’ lives through their
use of the Internet. Accordingly, it can be applied in a variety of circumstances. For
instance, some Canadian ISPs routinely use DPI for traffic-management purposes (such
as by prioritizing the transfer of time-sensitive packets issued by internet telephony
applications).26 However, no Canadian ISPs are presently using DPI for advertising
purposes, and none examine the payload of packets for personal information – they read
only the headers.27 For examples of DPI-enabled advertising, we will need to look beyond
our borders.
Phorm Inc. is one of the most-publicized organizations pursuing DPI-enabled
advertising. Phorm contracts with ISPs to do the heavy lifting of DPI for them. In these
arrangements, the ISPs send Phorm all of their consumers’ packets, from which Phorm
generates a profile of each user’s interests. This requires performing at least a header-level
analysis on all packets sent by the user; Phorm also reads the contents of most packets
sent to the user.28 This allows Phorm to collect, at a minimum, “website addresses,
searches [and] browsing history” as well as the full page contents of nearly everything
that users read online.29
23. CRTC, Transcript of Proceedings, Canadian broadcasting in new media (10 March 2009) at para
24. Assistant Commissioner recommends Bell Canada inform customers about Deep Packet Inspection,
supra note 21 at para 16.
25. Ibid at para 15.
26. Cf. note 21.
27. CRTC, Transcript of Proceedings, Canadian broadcasting in new media
(27 February 2009) at para
28. Chris Williams, “How Phorm plans to tap your internet connection” The Register (29 February
2008), online: The Register .
29. Technology, online: Phorm Inc. .
To compensate for the broad scope of its data collection, Phorm has taken a strong
initiative in limiting the retention and use of this data. The company is careful to note
that users’ IP addresses, browsing history, search terms and the like are not stored.30
Phorm analyzes the information for indications of the user’s interests, stores that derived
user-interest information, and then deletes the information that was originally collected.31
Like Google, Phorm does not associate “sensitive” user interests (such as medical or
adult information) with consumers’ accounts.32 Phorm claims to substantially curtail the
invasiveness of its DPI analysis by excluding non-web packets (such as e-mail or VOIP),
certain web-based e-mail services and form submissions (that is, user content posted to
the web) from its analysis.33 As a result, although Phorm still collects far more personal
information than Google, Phorm uses information in a similar manner to Google and
claims to retain less of it.
Much like Google, Phorm uses the information it collects to serve ads on third-
party websites. It also offers an opt-in website-recommendation service directly to
users (dubbed PhormDiscover) and a security service directed at warning users about
potentially fraudulent websites (PhormSecure). Although Phorm originally intended to
use the information collected to serve ads as part of an opt-out scheme (rather than opt-
in), it has been required by UK regulators to adopt an opt-in program, which it now uses
in all markets.34 Phorm currently operates in Brazil, Korea, the United Kingdom and the
United States.35
Also similar to Google, Phorm’s Privacy Policy promises “security measures”, employee
training, and contractual safeguards to govern third parties. Phorm is careful to note that
no system is 100% safe, but reminds users that no personally identifiable information is
stored by Phorm.36
A. Jurisdiction and Reasons for Focusing on PIPEDA
PIPEDA is not the only private-sector priva cy statute in Canada, but it is the only
one discusse d in this paper. A lthough some provinces h ave enacted substantial ly
similar legislation that superse des PIPEDA within their jurisdictions, the federal Act is
generally applied aga inst collection, use or disclosure of information across provincia l or
national lines.37 is gener ally describes the activities of telecommun ications and online
behavioural advertising organizations such as Google and Phorm. Additionally, since
telecommunications and online advertisin g corporations are often federally incorporated
(if they are incorporated within Canad a at all), PIPEDA is the most consistently relevant
30. Phorm Service Privacy Policy, online:
31. “Andrew Walmsley on digital: Phorm and function fuel privacy fears” Marketing (26 March 2008),
14.
32. PhormDiscover: How it Works, online:
how_it_works/>. (NB: This page includes information not only on PhormDiscover, but on
Phorm’s advertising program as well.)
33. Brooks Dobbs, “Phorm: A New Paradigm in Internet Advertising”, online: Office of the Privacy
Commissioner of Canada
34. “Controversy surrounds Phorm” Computer Fraud & Security 2008:5 (May 2008) 4.
35. About Us, online: Phorm Inc. .
36. Phorm Service Privacy Policy, supra note 30.
37. Stephanie Perrin et al, The Personal Information Protection and Electronic Documents Act: An
Annotated Guide (Toronto: Irwin Law, 2001) at 4-56.
privacy statute with respect to online behavioural advertising carried out by Canadian
organizations.
Although it is a federal statute, the Federal Court held in Lawson that PIPEDA (and
thus the jurisdiction that it grants to the Privacy Commissioner of Canada) also
applies to extraterritorial organizations that engage in “the transborder flow of personal
information”,38 such as Phorm and Google. Accordingly, the jurisdictional waters
surrounding foreign-incorporated organizations are less murky: PIPEDA plainly applies.
In the age of the supranational Internet, this is perhaps the single most compelling reason
to focus on PIPEDA in the context of online behavioural advertising.
B. Organization of PIPEDA
PIPEDA is organized around a set of ten “Principles” adopted from the Canadian
Standards Association’s Model Code for the Protection of Personal Information.39 These
Principles are codified in Schedule 1 to the Act, imported into law by s. 5 and modified
by ss. 6-9 of the Act.40 Some Principles, such as those mandating consent and limited
collection (Principles 3 and 4, respectively), impose broad and foundational obligations
on organizations within the behavioural advertising industry. Others, such as those
relating to accountability and challenges concerning compliance (Principles 1 and 10),
are unlikely to operate differently in the context of behavioural advertising than they do
generally. Principles falling under the former class will be described individually, roughly
in order of their significance in this context. Principles falling under the latter class will
be lumped together and only briefly mentioned.
C. Principle 3 – Knowledge and Consent Respecting Collection, Use
or Disclosure
This Principle stipulates that “knowledge and consent of the individual are required for
the collection, use, or disclosure of personal information, except where inappropriate.”41
The term “inappropriate” is given its meaning exhaustively by PIPEDA s. 7.42 That
section permits collection, use or disclosure without knowledge or consent only in
certain circumstances, such as where collection is clearly in the interests of the individual
and consent cannot be obtained in a timely way;43 where use is required for action in an emergency that
threatens an individual’s life, health or security;44 or where disclosure to a government
agency is required for national security reasons.45 In all other circumstances, some
measure of knowledge and consent must be provided.
The question, then, is what form (or degree) of knowledge and consent must be provided
in a particular circumstance. Consent may take a variety of forms, ranging from implied
consent on the low end (where no actual consent has been provided by the individual
affected) to explicit consent on the high end. PIPEDA summarizes this range in
Schedule 1:
38. Lawson v Accusearch Inc, 2007 FC 125, 2007 CarswellNat 247 at para 51 [Lawson].
39. CSA Standard Q830, online:
40. PIPEDA ss 5-9 and Schedule 1.
41. PIPEDA Schedule 1 clause 4.3.
42. Turner v Telus Communications Inc, 2007 FCA 21, 2007 CarswellNat 172 at para 23 [Turner].
43. PIPEDA s 7(1)(a).
44. PIPEDA s 7(2)(b).
45. PIPEDA s 7(3)(c.1)(i).
The way in which an organization seeks consent may vary, depending on
the circumstances and the type of information collected. An organization
should generally seek express consent when the information is likely to be
considered sensitive. Implied consent would generally be appropriate when
the information is less sensitive.46
The Privacy Commissioner of Canada has taken the following view as to the distinction
between express and implied consent, as a matter of policy:
Express consent is given explicitly, either orally or in writing. Express
consent is unequivocal and does not require any inference on the part of
the organization seeking consent. Implied consent arises where consent
may reasonably be inferred from the action or inaction of the individual.47
The Privacy Commissioner of Canada has expressed a low opinion of “opt-out” consent
schemes, calling them a “weak form of consent” and observing that “[o]pt-out consent is
in effect the presumption of consent.”48 The Commissioner incorporated elements of s.
5(3) of the Act (discussed under Principles 2, 4 and 5 – Purposes, below) in holding that
circumstances in which opt-out consent would be appropriate should “remain limited,
with due regard both to the sensitivity of the information at issue and to the reasonable
expectations of the individual”.49,50 The Commissioner then laid out criteria that an
organization would have to meet in order to lawfully pursue an opt-out scheme rather
than an opt-in scheme:
1. The personal information must be demonstrably non-sensitive in nature
and context.
2. The information-sharing situation must be limited and well defined as
to the nature of the personal information to be used or disclosed and the
extent of the intended use or disclosure.
3. The organization’s purposes must be limited and well-defined, stated
in a reasonably clear and understandable manner, and brought to the
individual’s attention at the time the personal information is collected.
4. The organization must establish a convenient procedure for easily,
inexpensively, and immediately opting out of, or withdrawing consent
to, secondary purposes and must notify the individual of the procedure
at the time the personal information is collected.51
In Aeroplan, the Privacy Commissioner considered the appropriate level of consent
regarding Air Canada’s sharing of customers’ information with Aeroplan, an advertising
partner, for the purpose of providing targeted advertisements to consumers. The
46. PIPEDA Schedule 1 clause 4.3.6. Cf. paras 4.3.4 and 4.3.7.
47. Office of the Privacy Commissioner, Your Privacy Responsibilities: Canada’s Personal Information
Protection and Electronic Documents Act – A Guide for Businesses and Organizations, online:
at 2.
48. Air Canada allows 1% of Aeroplan membership to “opt out” of information sharing practices (11
March 2002), PIPEDA Case Summary #2002-42, online: OPC
cf-dc_020320_e.cfm> [Aeroplan]. (As early OPC decisions are not given paragraph numbers, no
pinpoint has been provided.)
49. Ibid.
50. Cf. PIPEDA Schedule 1 clause 4.3.5.
51. Bank does not obtain the meaningful consent of customers for disclosure of personal information
(23 July 2003), PIPEDA Case Summary #2003-192, online: OPC
dc/2003/cf-dc_030723_01_e.cfm>.
Commissioner concluded that express consent was necessary where there was a potential
for “use and disclosure of information customized according to individual plan
members’ purchasing habits and preferences”.52 Although using personal information
for the purpose of advertising is not objectionable per se, the Commissioner applied a
reasonableness standard in concluding that the potential sensitivity of the information
caused that purpose to fall short of reasonableness:
[A] reasonable person would not expect such practice to extend to the
“tailoring” of information to the individual’s potentially sensitive personal
or professional interests, uses of or preferences for certain products
and services, and financial status, without the positive consent of the
individual.53
Similarly, knowledge must inform consent; an organization’s description of the purposes
for which information will be used must be “sufficiently conducive to [imparting]
knowledge on the part of the individual” or the consent that was provided may be
invalid.54 That is, the organization must “clearly explain to all [affected individuals]
the purposes for the collection, use, and disclosure of their personal information”55
[emphasis added]. This requirement draws in elements of Principle 2, which deals with
the obligation to identify such purposes.56
Organizations may not require consent to the collection, use or disclosure of personal
information beyond that required to fulfill the “explicitly specified” and “legitimate”
purposes.57 With respect to marketing, the Privacy Commissioner often draws
distinctions between so-called primary and secondary purposes. Primary purposes are
essential to the service provided, and therefore organizations are permitted to require
consent to those purposes as a condition of service.58 Secondary purposes are inessential
(and additional to the primary purposes), and therefore consent cannot be required
as a condition of service.59 Marketing is commonly considered a secondary purpose,
although in Facebook, demographically-targeted advertisements were considered a
primary purpose on the basis that Facebook provided its services for free and depended
on those advertisements for most of its revenue.60
In sum, the standard for consent is fairly high. In the field of behavioural advertising,
it is likely to default to express consent in the context of fulsome knowledge of the
organization’s purposes. The consent cannot be mandatory unless the advertising is
essential to the service provided. This standard is based, at least in part, on an assessment
of whether the notional “reasonable person” would assume that such purposes (and the
methods used to pursue them) are likely to be carried out without their knowledge. In
the case of targeted advertising based on personal preferences, the Privacy Commissioner
of Canada is of the view that reasonable people do not expect that organizations will use
their personal information in this way.
52. Aeroplan, supra note 48.
53. Ibid.
54. Ibid.
55. Ibid.
56. PIPEDA Schedule 1 clause 4.2.
57. PIPEDA Schedule 1 clause 4.3.3.
58. Report of Findings into the Complaint Filed by the Canadian Internet Policy and Public Interest Clinic
(CIPPIC) against Facebook Inc. (16 July 2009), PIPEDA Case Summary #2009-008 at para 130, online:
OPC [Facebook].
Cf. Chantal Bernier, “Online Behavioral Advertising and Canada’s Investigation on Facebook”
(Remarks at the Privacy Laws and Business 23rd Annual Conference, Cambridge, UK, 6 July 2010),
online: .
59. Ibid.
60. Ibid.
D. Principles 2, 4 and 5 – Purposes
Organizat ions must identify the purposes for w hich they intend to use indiv iduals’
personal information no later than the ti me of collection. 61 ey may not use or disclose
that in formation for any other purposes,62 and they may not collect more information
than is neces sary for those identif‌ied purposes (that is, the y may not collect information
“indiscrimi nately”).63 Section 5(3) of the Act, referenced above, di rectly inf‌luences
the a nalysis of a n organization’s stated (or perhaps un stated) purposes. at provision
requires that persona l information only be collected, used or disclosed “for purpose s
that a reasonable person would consider are appropriate in the circumstances.”64 As
a consequence, PIPEDA establishes a system in wh ich organiz ations’ stated purposes
def‌ine t he scope of allowable use, collection a nd di sclosure. Moreover, these purposes
may be reviewed on the ba sis of their reasonableness (or lack thereof).
In assessing reasonableness, the Privacy Commissioner has delineated a four-part test
that has been adopted by the Federal Court:
1. Is the measure demonstrably necessary to meet a specific need?
2. Is it likely to be effective in meeting that need?
3. Is the loss of privacy proportional to the benefit gained?
4. Is there a less privacy-invasive way of achieving the same end?65
In Eastmond, Canadian Pacific Railway had installed security cameras in one of its rail
yards. The cameras were installed for the identified purpose of preventing theft and
vandalism. The employees’ union argued that the resulting surveillance (of employees)
was not reasonable. The Federal Court concluded that it was in fact reasonable on the
basis that the impact on the employees’ privacy was not severe: employees knew which
areas were under surveillance, it would only occasionally capture employees’ work
activities, and, most importantly, CP had put a number of safeguards in place to ensure
that the records were not accessible unless an incident was reported. If no incidents were
reported, the video would be deleted within 30 hours of its recording, and it could not be
used for the purpose of evaluating employee work habits.66 These safeguards sufficiently
mitigated the loss of privacy experienced by the workers to render the surveillance
reasonable.
In Facebook, the Privacy Commissioner found that Facebook’s practice of sharing
“potentially unlimited” personal information with application developers without
actively monitoring the developers’ use of that information was not reasonable in the
circumstances. Relevant to the Commissioner’s finding was the fact that developers
needed much less information than they were given access to, and insufficient safeguards
were put in place by Facebook.67
61. PIPEDA Schedule 1 clause 4.2.
62. PIPEDA Schedule 1 clause 4.5.
63. PIPEDA Schedule 1 clauses 4.4 and 4.4.1.
64. PIPEDA s 5(3).
65. Employee objects to company’s use of digital video surveillance cameras (23 January 2003), PIPEDA
Case Summary #2003-114, online: OPC
cfm>, aff’d Eastmond v Canadian Pacific Railway, 2004 FC 852, 2004 CarswellNat 1842 at para 127 [Eastmond].
66. Ibid at para 176.
67. Facebook, supra note 59 at para 193.
The requirement that purposes be identified prior to collection is varied when
organizations intend to use previously collected information for a new purpose. In these
circumstances, the new purpose must be identified prior to the use of that information.68
Organizations are still required to obtain consent from each individual in the usual way
prior to using their information for a new purpose.69 In any event, whether the purpose
is identified prior to collection or prior to use, organizations are obliged to identify the
purpose in such a way that the knowledge requirement of Principle 3 is satisfied by the
time consent is obtained.70
E. Principle 5 – Retention of Information
Although this Principle has been included in the above discussion, retention is a sufficiently
significant issue in the context of behavioural advertising that it deserves to be singled
out at this stage. Personal information shall be retained only as long as is necessary for
the fulfillment of an organization’s identified purposes.71 When this information is no
longer necessary, it should be “destroyed, erased, or made anonymous.”72 The Privacy
Commissioner requires that organizations set a maximum period of retention, despite
the fact that the Act frames it as a suggestion.73 It may also be necessary to institute a
minimum length of retention in order to facilitate access to information that was involved
in making a decision about an individual,74 although it is not necessary to preserve that
information in its original form.75
In Credit Bureau, the Privacy Commissioner considered the imposition of a 20-year
retention policy for credit-related information to be sufficient for the purposes of the
Act in light of the fact that an extended retention period benefitted some individuals,
whereas others could still request to have their information disposed of prior to that
time.76 In Facebook, the organization had instituted an indefinite retention policy for
deactivated accounts. The Privacy Commissioner objected to this arrangement even after
Facebook created a process for account deletion, despite Facebook’s claims that it was
merely safeguarding the information for users and did not disclose or use it during the
deactivation period.77
F. Principle 9 – Individual Access
Individuals may request from an organization confirmation of the existence, use and
disclosure of their personal information as well as access to this information.78 The
Act permits exceptions to this rule, but requires that the individual be informed of the
reasons for denying access.79 Those exceptions are codified in s. 9(3), which exempts
organizations from providing access where it would “reveal confidential commercial
68. PIPEDA Schedule 1 clause 4.2.4.
69. Ibid.
70. Englander v Telus Communications Inc, 2004 FCA 387, 2004 CarswellNat 4119 at para 58 [Telus].
71. PIPEDA Schedule 1 clause 4.5.
72. PIPEDA Schedule 1 clause 4.5.3.
73. Credit bureau sets retention period for positive information (18 January 2006), PIPEDA Case
Summary #2006-326, online: [Credit Bureau].
74. PIPEDA Schedule 1 clauses 4.5.2 and 4.5.4.
75. Vanderbeke v Royal Bank, 2006 FC 651, 2006 CarswellNat 1550 at para 20.
76. Credit Bureau, supra note 74.
77. Facebook, supra note 59 at paras 249-254.
78. PIPEDA Schedule 1 clause 4.9.
79. Ibid.
information”80 (which refers here to information relating to commerce, and not merely
information with commercial value),81 along with a variety of other public-policy
exceptions, such as where access “could reasonably be expected to threaten the life or
security of another individual.”82 In addition, organizations are specifically prohibited
from providing individuals with access to their personal information if doing so would
reveal personal information about a third party.83 If the third party’s information is
severable from the record at issue, then the organization should sever it prior to giving
the individual access.84 If the third party consents, then access may be granted without
severance.85
Where information is inaccurate or incomplete, individuals have a right to challenge the
organization’s records and have their personal information amended accordingly.86
G. Other Principles
Not all Principles are as central to the issue of behavioural advertising as those listed above.
Institutional Principles such as Accountability (Principle 1), Openness (Principle 8) and
Challenging Compliance (Principle 10), though relevant to any organization subject
to the Act, do not take on an appreciably different form in the context of behavioural
advertising, as they are focused primarily on conventional organizational structures. It is
sufficient to note that all organizations subject to PIPEDA must provide an apparatus
that monitors privacy issues, informs individuals of the organization’s practices and
enables individuals to make complaints under the Act. In addition, although personal
information must be as “accurate, complete, and up-to-date”87 as the organization’s
identified purposes require (Principle 6), this requirement is directed at “objective,
verifiable fact”, and not subjective matters such as personality profiles.88 Organizations
must also put in place multi-layered safeguards89 and follow industry best practices to
protect individuals’ privacy (Principle 7).90
The legal analysis presented above draws in elements of the surrounding social context by
assessing circumstances on the basis of reasonableness, considering the sensitivity of the
information at issue and reviewing common practices and industry standards relevant
to the issue. These are all questions of fact arising from the surrounding social context.
Accordingly, being familiar with how Canadians behave and how they perceive these
issues is a critical part of a complete analysis of privacy issues under PIPEDA.
80. PIPEDA s 9(3)(b).
81. Air Atonabee Ltd v Canada (Minister of Transport) (1989), 37 Admin LR 245, 27 FTR 194,
27 CPR (3d) 180 at 36 (FC TD) [Atonabee].
82. PIPEDA s 9(3)(c).
83. PIPEDA s 9(1).
84. Ibid.
85. PIPEDA s 9(2).
86. PIPEDA Schedule 1 clause 4.9. C.f. PIPEDA Schedule 1 clause 4.9.5.
87. PIPEDA Schedule 1 clause 4.6.
88. Complaint under PIPEDA against Accusearch Inc., doing business as Abika.com (not dated),
at para 36, online: .
89. PIPEDA Schedule 1 clause 4.7.3.
90. Report of an Investigation into the Security, Collection and Retention of Personal Information
(25 September 2007) at paras 70, 76 and 82, online: OPC
rep_070925_e.cfm> [TJX].
A. The Internet and Canadian Habits
Canadians are voracious Internet users, with 80% of the Canadian population going
online for personal reasons91 and most of them logging in every day.92 Thirty-nine
percent of Canadians aged 16 or older shop online, collectively placing 95 million orders
and spending $15.1 billion.93 Over a quarter of adult Canadians access educational
resources online, as do 80% of students.94 More than a third of adult Canadians,
mostly women, access health-related information online.95 More than half of these users
looked up information on specific diseases or lifestyle information (e.g. relating to diet
or exercise).96 Canadians also engage in social, civic and political life online, with half
of all home Internet users going online to read about specific social or political issues
and 40% of home Internet users researching local community events.97 In light of the
significant portion of Canadians’ personal and professional lives spent online, the Privacy
Commissioner of Canada has expressed the view that “it is imperative, in our view, that
their privacy is protected when engaged in Internet activity.”98
B. The Public Debate Around Deep Packet Inspection
The debate around deep packet inspection reached a fever pitch during the CRTC’s
2009 hearings into ISPs’ use of the technology for non-advertising-related, network-
maintenance purposes. The Privacy Commissioner of Canada was sensitive to the
concerns of the Canadian public (or, at least, vocal parts thereof) and commissioned a
collection of essays from interested parties.99 Many of the essays cited deep reservations
about the use of DPI without consent or, worse, without users’ knowledge, calling it
“spy[ing]”,100 “intrusive”,101 and a violation of the Internet’s “presumption of privacy”.102
These deep reservations regarding a technology that the Privacy Commissioner has
likened to the steaming-open of sealed letters103 are indicative of the public’s strongly
held views about what constitutes a reasonable loss of privacy even in the context of a
meritorious purpose (such as maintaining network infrastructure).
91. Canadian Internet Use Survey, supra note 1.
92. Ibid.
93. Statistics Canada, E-commerce: Shopping on the Internet (Business Special Surveys and
Technology Statistics Division, 2010), online: The Daily
94. Statistics Canada, Study: Using the Internet for education purposes (Business Special Surveys
and Technology Statistics Division, 2005), online: The Daily
95. Statistics Canada, Study: Health information and the Internet (Business Special Surveys and
Technology Statistics Division, 2005), online: The Daily
96. Ibid.
97. Statistics Canada, Study: Internet use and social and civic participation (Business Special Surveys
and Technology Statistics Division, 2007), online: The Daily
98. Oce of the Privacy Commission er of Canada, Essay, “Review of the Internet tr ac management
practices of Internet ” (18 February 2009) at para 20, online: OPC
essays/review-of-the -internet-trac-manageme nt-practices-of-inter net-service-p roviders/>.
99. Oce of the Privacy Commission er, “Collection of Essays” (2009), online: OPC
gc.ca/index. php/essays/>.
100. Oce of the Privacy Commissioner, “T he Greatest Threat to Privacy ” (2009), online: OPC
dpi.priv.gc.ca/inde x.php/essays/the-gr eatest-threat-to-privac y/>.
101. Oce of the Privacy Commissioner, “Just Deliver t he Packets” (2009), online: OPC
priv.gc.ca/index. php/essays/just-deliver-the- packets/>.
102. Ibid.
103. Oce of the Privacy Commissioner, “Obje cting to Phorm” (2009), online: OPC
gc.ca/index. php/essays/objecting-to- phorm/>.
This debate is not limited to Canada. In the United States, the Federal Communications
Commission (“FCC”) has stated that the use of DPI in the context of network
maintenance must be disclosed to consumers so as to enable them to reasonably recognize
the effects of its use.104 The House Committee on Energy and Commerce, Subcommittee
on Telecommunications and the Internet opined in 2008 that, due to the “obvious
sensitivity” of the information being analyzed by DPI systems, consumers deserved
“clear, conspicuous, and constructive notice” of the use of DPI, “meaningful” opt-in
consent to that use, and no “monitoring or data interception” (i.e. collection) for users
who had not opted in.105 The Network Advertising Initiative, an American organization
that advocates self-regulation in the advertising industry, has recognized the public’s
uneasy regard for behavioural advertising with DPI by supporting an opt-in standard
for such advertising.106 The U.K.’s Information Commissioner’s Office has taken it a step
further by requiring Phorm to obtain opt-in consent from all of its customers.107
C. The Public Debate Around Tracking Cookies
In many respects, the public debate surrounding tracking cookies has been just as
impassioned as that surrounding DPI. Much of the controversy began in the United
States, where lawsuits against major firms such as Yahoo, Toys-R-Us and DoubleClick
(a targeted advertising firm that has since been acquired by Google) prompted those
companies to voluntarily update their privacy policies to create opt-out consent schemes.108
The FCC continues to endorse this self-regulating model.109 The EU, however, has put
regulations in place requiring opt-in consent for the use of tracking cookies.110
In Canada, most companies follow the opt-out approach popular in the United States.
There has been evidence of a concerted public will to avoid tracking cookies; estimates of
the proportion of users who clear their cookies on a monthly basis range from 39 to 50
percent of users, and 13.2 percent of users block third-party cookies outright.111 Not all
cookies are tracking cookies, however, and clearing all of one’s cookies actually degrades
some browser functionality. Still, this is a more practical route than opting out from
every tracking cookie that a user runs across. Tracking cookies are numerous; Yahoo
alone operates 34 advertising networks that use different tracking cookies.112
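The cross-site mechanism at the heart of this debate can be illustrated with a short sketch. This is not any real ad network’s code: all names are hypothetical, and it simply models how a single third-party cookie lets one advertiser recognize the same browser across unrelated publisher sites, and why clearing cookies severs that link.

```python
# Illustrative sketch only: how one third-party cookie builds a
# cross-site profile. All names and structures are invented.
import uuid

class AdNetwork:
    """Simulates the third party whose ad tag is embedded on many sites."""

    def __init__(self):
        self.profiles = {}  # cookie id -> list of (site, page) visits

    def serve_ad(self, cookie_id, site, page):
        # No cookie presented: issue a fresh identifier (the Set-Cookie
        # step); otherwise reuse the one the browser sent back.
        if cookie_id is None:
            cookie_id = uuid.uuid4().hex
        self.profiles.setdefault(cookie_id, []).append((site, page))
        return cookie_id  # the browser stores this and replays it later

network = AdNetwork()

# The same browser visits two unrelated sites carrying the network's ads.
cid = network.serve_ad(None, "news.example", "/politics")
cid = network.serve_ad(cid, "shop.example", "/running-shoes")
print(network.profiles[cid])  # one identifier, a cross-site profile

# Clearing cookies severs the link: the next visit gets a new identifier.
new_cid = network.serve_ad(None, "news.example", "/sports")
print(new_cid != cid)
```

This is why the statistics above matter: a user who clears cookies monthly repeatedly resets the profile, while blocking third-party cookies prevents the identifier from being set at all.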
104. US, Federal Communications Commission, Memorandum Opinion and Order In the Matters of
Formal Complaint of Free Press and Public Knowledge Against Comcast Corporation for Secretly
Degrading Peer-to-Peer Applications (20 August 2008), File No EB-08-Ih-1518, WC Docket No 07-52 at
40 and 58, online: .
105. US, Markey: Consumers Have Right to Know What Broadband Providers Know About Web Use:
Hearing Before the Subcommittee on Telecommunications and the Internet of the House Committee
on Energy and Commerce, 110th Cong (2008) (Rep Edward J Markey), online:
house.gov/press-release/july-17-2008-markey-consumers-have-right-know-what-broadband-
providers-know-about-web>.
106. “Network Advertising Initiative Affirms Support for Self-Regulation of Companies Using
‘Deep Packet Inspection’” Marketwire (25 September 2008), online: Marketwire
marketwire.com/press-release/Network-Advertising-Initiative-903861.html>.
107. Controversy surrounds Phorm, supra note 34.
108. Amir M Hormozi, “Cookies and Privacy” EDPACS 32:9 (March 2005) 1 at 9.
109. Ibid at 11.
110. Ibid.
111. Brian Morrissey, “Wary Consumers Ward Off Tracking Cookies” Adweek 46:31 (8 August 2005) 10.
112. Ibid.
In response to grassroots user demand, some of the Internet’s most popular browsers have
added a “do not track” feature (suggested by Stanford University researchers)113 to allow
users to pre-emptively opt out of some or all tracking cookies by simply requesting of
sites that they not track them.114
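Mechanically, the “do not track” feature works by having the browser attach a `DNT: 1` header to every HTTP request; honouring it is left to the receiving site. A minimal sketch of the cooperating site’s side (function names are illustrative, not any browser’s or server’s actual API):

```python
# Minimal sketch of the Do Not Track signal: the browser adds a
# "DNT: 1" header to each request, and a cooperating site checks it
# before setting any tracking cookie. Names are illustrative only.

def should_track(request_headers):
    """Honour the browser's Do Not Track preference, if expressed."""
    return request_headers.get("DNT") != "1"

# A browser with the feature enabled sends DNT: 1 on every request.
opted_out = {"Host": "ads.example", "DNT": "1"}
no_signal = {"Host": "ads.example"}

print(should_track(opted_out))  # False: do not set a tracking cookie
print(should_track(no_signal))  # True: no preference was expressed
```

Note that the signal is purely a request: nothing technically prevents a site from ignoring it, which is why it is significant as an expression of user preference rather than as an enforcement mechanism.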
Considering this context, it is clear that the public (in Canada and elsewhere) cares deeply
about the privacy issues arising from both DPI and tracking cookies.
As with the above discussion of the legal scheme, each of PIPEDA’s Principles will be
considered in turn (though some are grouped together for convenience). Due to the
substantial overlap between the use of tracking cookies and DPI, many points of the
legal analysis can be applied to both in similar fashions. Accordingly, the technologies
will be dealt with together for the most part. Where differences in PIPEDA’s treatment
of the two technologies are likely to arise, they will be discussed independently.
A. Principle 3 – Knowledge and Consent Respecting Collection,
Use or Disclosure
Some form of knowledge and consent is clearly required by the Act prior to the time of
collection (or use, if tracking for advertising purposes is a new use). In this commercial
context, it is unlikely that one of the statutory exceptions to the requirement for explicit
consent will apply. The largest question for operators of DPI- and tracking-cookie-based
advertising networks is whether opt-out consent satisfies the scheme of the Act. On the
basis of the Privacy Commissioner’s previous findings, this is unlikely in all but the most
limited behavioural advertising schemes.
i. The Sensitivity of the Personal Information at Issue
Three of the Commissioner’s four preconditions for imposing an opt-out scheme
are plainly met, leaving only the sensitivity of the personal information at issue. The
information in question must be “demonstrably non-sensitive in nature”.115 This imposes
a high bar, in part because it places the burden on the organization to demonstrate the
non-sensitive nature of the information, but also because the standard of “sensitivity”
is so easy to meet. In Aeroplan, the Commissioner held that information regarding an
individual’s “personal or professional interests, uses of or preferences for certain products
and services, and financial status” were “potentially sensitive”.116 Clearly, “potentially
sensitive” information cannot be “demonstrably non-sensitive”, and yet this is precisely
the sort of information that any effective behavioural advertising system is intended to
collect and use.
Advertisers such as Google and Phorm are careful to state that no “personally identifiable”
information is collected. The Act does equate anonymization with disposal of data (under
113. C.f. Do Not Track: Universal Web Tracking Opt-Out, online: .
114. Jared Newman, “Apple Prepares ‘Do Not Track’ Feature in Safari” PCWorld (14 April 2011), online:
115. Bank does not obtain the meaningful consent of customers for disclosure of personal information,
supra note 52.
116. Aeroplan, supra note 49.
Principle 5),117 so it could be argued that non-identifiable information ceases to be
sensitive, particularly if the reasonable expectations of the individual are the lens through
which sensitivity is adjudged. Generally speaking, there are two issues with this view.
The first is that supposedly anonymized data, when voluminous, is actually extremely
difficult to anonymize effectively. AOL famously released “anonymized” records of the
search history of hundreds of thousands of users, in which each user was identified only
by a number (much like Google identifies its users). It was not long before the New York
Times started attaching faces to numbers, starting with 62-year-old American widow
Thelma Arnold of Lilburn, Ga.118 This demonstrates the unsurprising proposition that
an individual’s behaviour can be an effective digital fingerprint. The second issue is that,
so long as an IP address can be attached to the record, the organization saving the
information will still be able to associate the information collected with the household
from which it originated, if not the specific person. This is a particularly weak form of
anonymity.
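The re-identification problem described above can be made concrete with a toy example. The data below is invented (loosely echoing the AOL incident), and the matching logic is deliberately crude; the point is only that replacing a name with a number is weak anonymization when the behaviour itself is distinctive.

```python
# Illustrative only: why pseudonymous logs can be re-identified.
# All user ids and queries below are invented for this sketch.

pseudonymous_log = {
    4417749: ["landscapers in lilburn ga",
              "dog that urinates on everything",
              "60 single men"],
    1234567: ["weather", "news", "sports scores"],
}

# Side information an investigator might hold: a few facts known
# to be true of one specific person.
known_facts = ["lilburn", "landscapers"]

def reidentify(log, facts):
    """Return the pseudonyms whose queries mention every known fact."""
    matches = []
    for user_id, queries in log.items():
        text = " ".join(queries)
        if all(fact in text for fact in facts):
            matches.append(user_id)
    return matches

print(reidentify(pseudonymous_log, known_facts))  # [4417749]
```

With enough queries per pseudonym, a handful of public facts about a person is often sufficient to single out their record, exactly as the New York Times did.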
Accordingly, the information collected is likely sensitive if it is retained in any
commercially useful form. This sensitivity is reinforced by the public’s apparent
expectation that their browsing habits should not be shared without their consent, as
evidenced by the recent shift by mainstream browsers and knowledgeable users towards
tracking-cookie avoidance. If so much of the public defaults to denying consent and
requiring explicit exceptions to allow organizations to track them, it is likely that the
“reasonable expectations” standard militates against an opt-out approach to consent. This
is reinforced by the fact that personal information is collected as soon as a webpage loads,
even before a user is given the chance to opt out. This is collection before consent, which
the Act prohibits. As a consequence, PIPEDA likely requires opt-in consent for all but
the most limited behavioural advertising services. This consent must be accompanied by
a clear explanation of the purposes to which individuals are consenting, which could be
as simple as enumerating the types of user activities that are tracked and an explanation
that they will be analyzed to infer the user’s interests for the purposes of advertising.
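The ordering the Act appears to require can be sketched in code: nothing is recorded until the individual has affirmatively consented to identified purposes. This is entirely hypothetical — the class, method names and purpose wording are invented — but it captures the contrast with opt-out schemes, where collection begins on page load.

```python
# Hypothetical sketch of opt-in-before-collection. All names are
# invented; the purpose statement stands in for the "clear explanation"
# the text describes.

IDENTIFIED_PURPOSES = (
    "pages visited will be analyzed to infer your interests "
    "for the purpose of selecting advertisements"
)

class OptInTracker:
    def __init__(self):
        self.consented = set()   # users who opted in, with knowledge
        self.events = []         # collected only after consent

    def record_consent(self, user_id, purposes_shown):
        # Knowledge requirement: consent counts only if the identified
        # purposes were actually presented when it was given.
        if purposes_shown == IDENTIFIED_PURPOSES:
            self.consented.add(user_id)

    def on_page_load(self, user_id, url):
        # An opt-out scheme would log this visit unconditionally;
        # here, pre-consent visits are simply discarded.
        if user_id in self.consented:
            self.events.append((user_id, url))

tracker = OptInTracker()
tracker.on_page_load("u1", "/first-visit")       # dropped: no consent yet
tracker.record_consent("u1", IDENTIFIED_PURPOSES)
tracker.on_page_load("u1", "/second-visit")      # collected
print(tracker.events)  # [('u1', '/second-visit')]
```

The key design point is where the consent check sits: before any event is stored, rather than as a later filter over data already collected.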
ii. Can Consent be Mandatory for the Provision of the Service?
Whether the provision of [opt-in] consent may be a mandatory precondition to service
depends on the facts. In the case of tracking cookies, users typically browse a site for
the purpose of consuming some content or service, as in Facebook.119 The user’s browser
receives a tracking cookie that is governed by the terms of the privacy policy on that
website, even if the cookie is from a third party (such as Google). In cases where the
site depends on that advertising to offer its services for free, this may be considered a
primary purpose, and thus consent may be mandatory for visiting users. Although it is
possible to advertise without behavioural analysis, Facebook reflects a willingness to allow
sites to protect their primary revenue streams as primary purposes (if those purposes are
themselves reasonable, as discussed below).
In the case of DPI, however, it is highly unlikely that consent could be a mandatory
requirement for service from an ISP. ISPs charge a fee for access to the Internet, and do
not depend on advertising to provide a free service. Accordingly, DPI-based behavioural
advertising is, like most advertising,120 a secondary purpose for which consent cannot
117. PIPEDA Schedule 1 clause 4.5.
118. Michael Barbaro and Tom Zeller, “A Face Is Exposed for AOL Searcher No. 4417749” New York
Times (9 August 2006), online:
119. Facebook, supra note 59.
120. Ibid.
be a mandatory requirement of service. This might change if an ISP chose to offer a free
Internet connection on the condition that DPI-based behavioural advertising be built in,
but thus far no ISPs have expressed an interest in such a system.
It bears noting, however, that permitting a mandatory consent requirement on most of
the web’s resources may run afoul of the overarching reasonableness requirement. In
a system where all free websites may demand a substantial loss of privacy in order to
obtain access, individuals could be left with the choice of surrendering their privacy or
surrendering their Internet connections. This ties in to the reasonableness assessment of
the purpose itself, below, as it could reduce the benefit to the individual and thus render
the purpose for collection, use and disclosure unreasonable.
iii. Case Studies
In light of the above analysis, it is likely that Google is violating PIPEDA by providing
opt-out (rather than opt-in) consent for its tracking cookie. Google collects personal
information across broad regions of the web and, although it does promise to avoid
connecting users’ identifiers with certain sensitive interests (such as “race, religion, sexual
orientation, health, or sensitive financial categories”),121 it does not avoid all categories
that the Privacy Commissioner considers sensitive. Accordingly, it likely fails to meet the
criteria for imposing opt-out consent.
Phorm, on the other hand, likely meets its obligations under this Principle of PIPEDA
by using a system of opt-in consent with appropriate knowledge prior to collection, use
or disclosure.
B. Principles 2, 4 and 5 – Purposes
An organization’s stated purposes define the scope of its lawful collection, use and
disclosure. These purposes must be reasonable, as defined by the Privacy Commissioner’s
four-part test.122 Taking the view that an organization adopts behavioural advertising in
order to raise revenues, and that many organizations (most notably Google) are highly
successful in that pursuit, the first two conditions (necessity and effectiveness) are plainly
met. The last condition, that there not be a less privacy-invasive way of achieving the
same end, is unlikely to be a serious issue; although it could be argued that organizations
could simply charge users directly rather than obtain funding through advertising, the
Commissioner declined to dictate radical changes in business models in Facebook and is
unlikely to start doing so. Accordingly, the crucial consideration is the third.
i. Is the Loss of Privacy Proportional to the Benefit Gained?
This is the question that divides critics of behavioural advertising. Both interests are
substantial: the individuals’ interest in protecting their privacy online, particularly
in light of the sensitivity of the information that behavioural advertising schemes are
capable of collecting, is highly compelling. So too is the business model of an entire
industry, the Internet, which runs on ads. This latter interest is weakened by the fact that
the benefit is merely increased revenue, and not the ability to earn revenue per se (after
all, organizations can always display non-behavioural advertisements). Nevertheless, the
commercial interest is not insignificant. Some industry representatives are quick to note
that users also derive an indirect benefit in the form of free content and more relevant
advertising.123
121. Google Privacy Center, supra note 5.
122. Eastmond, supra note 66.
123. Wary Consumers Ward Off Tracking Cookies, supra note 112.
Relevant to this balancing of interests is a consideration of the sensitivity of the
information, the safeguards in place,124 the organization’s retention policy, individuals’
actual knowledge of the loss of privacy, and the scope of the collection.125 With both
tracking cookies and DPI, the scope of collection is extremely large, so organizations
hoping to satisfy PIPEDA will need to offset that extensive collection by tightening up
the other factors to reduce the degree of privacy loss experienced by individuals.
Arguably, the most significant factor in favour of proportionality is fulsome, meaningful
consent obtained through an opt-in scheme. Unlike Eastmond, where employees had no
say in the matter,126 users may choose whether to participate and, should they choose
to opt in, they enter the program with full knowledge of their loss of privacy. This
consent, along with a robust, multi-level set of safeguards (including encryption and
secure storage facilities), a collection policy that avoids collecting the most sensitive types
of personal information and a retention policy that emphasizes speedy deletion, may
be sufficient to render this purpose reasonable. Note that the imposition of mandatory
consent (discussed above) may negatively impact this reasonableness assessment; it is
far less likely that a reasonable person would consider such a system appropriate in the
circumstances.
As DPI has a greater scope of disclosure, organizations employing DPI-based behavioural
advertising will likely need to take the strictest steps to reduce the loss of privacy. In
addition to the features mentioned above, such organizations may need to institute an
aggressively limited retention policy, where all personal information that is collected
is immediately aggregated into interest categories and then deleted, leaving only the
aggregate data behind. This is a necessary consequence of such broad collection; even
short-term retention can pose serious privacy risks when the data being retained is so
voluminous. Similarly, such organizations need to be incredibly delicate in selecting the
information that gets aggregated – having access to literally everything that an individual
does online makes it necessary to pick out only the least sensitive information available.
It is not enough that such organizations avoid serving ads based on a user’s financial
information, health records, political interests and the like; organizations that take it
upon themselves to sift through a person’s entire digital life should be careful never to
learn these things in the first place.
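The “aggregate then delete” retention pattern described above can be sketched as follows. The category map and sensitive-category list are invented for illustration; the two design points the text demands are visible in the code: sensitive categories are never derived at all, and the raw record is destroyed as soon as the aggregate is produced.

```python
# Hypothetical sketch of aggregate-then-delete retention. The topic
# and category names are invented; no real vendor's taxonomy is shown.

CATEGORY_MAP = {
    "running-shoes": "sports equipment",
    "flight-deals": "travel",
    "diabetes-symptoms": "health",   # sensitive: must never be derived
    "mortgage-rates": "finance",     # sensitive: must never be derived
}
SENSITIVE = {"health", "finance"}

def aggregate_and_delete(raw_events):
    """Return interest-category counts; the caller keeps nothing else."""
    counts = {}
    for topic in raw_events:
        category = CATEGORY_MAP.get(topic)
        if category is None or category in SENSITIVE:
            continue  # never learn the sensitive fact in the first place
        counts[category] = counts.get(category, 0) + 1
    raw_events.clear()  # immediate deletion of the granular record
    return counts

events = ["running-shoes", "diabetes-symptoms", "flight-deals",
          "running-shoes"]
profile = aggregate_and_delete(events)
print(profile)  # {'sports equipment': 2, 'travel': 1}
print(events)   # []: only the aggregate survives
```

Note that the sensitive visit is not merely excluded from ad serving after the fact; it is dropped before it ever enters the profile, which is the stricter standard the preceding paragraph argues for.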
This places these organizations in a fairly restricted position, as the Privacy Commissioner
recognizes broad (and, to some, apparently innocuous) classes of information as
“sensitive”, leaving a fairly limited class of data eligible for collection without requiring
stronger privacy protections than they presently implement. But this is the result of
casting a wide net; organizations must normally justify every piece of information that
they collect (indiscriminate collection being expressly forbidden),127 so it is not surprising
that a technology that is designed to collect everything will have comparatively onerous
restrictions imposed upon it.
ii. Case Studies
Both Google and Phorm have pledged to enforce powerful safeguards. Both companies
attempt to avoid associating sensitive interest categories with users’ identifiers, although
their conceptions of “sensitive information” are far more limited than that of the Privacy
Commissioner.
124. Facebook, supra note 59 at para 193.
125. Eastmond, supra note 66.
126. Ibid.
127. PIPEDA Schedule 1 clause 4.4.1.
Google’s AdSense is capable of indefinite retention of user interest categories (despite its
two-year rolling deletion policy), but only if users are consistently interacting with the
system and, as a consequence, interacting with that information. Google collects data
from numerous partner websites and retains most of the information it collects, such as
browser history and IP addresses, for a period of 18 months prior to anonymization. This
pattern of retention is troubling, particularly in light of the Facebook decision, which
casts suspicion on indefinite retention of personal information. It also lacks an opt-in
consent process to mitigate the severity of the privacy loss. However, Google claims to
strike a balance between legitimate interests – privacy and security. As in Credit Bureau,
this may go a long way towards establishing reasonableness (at least with respect to
retention). The aggregate interest category information that is indefinitely retained may
be sensitive, but it is less sensitive than the browsing history that Google eventually
anonymizes, and it is likely the minimal amount of information necessary to provide
behaviourally-targeted ads.
Google appears to be treading a thin line when it comes to balancing individuals’ privacy
interests against the benefits gained. Google anonymizes the most sensitive personal
information after 18 months, an apparently reasonable period of time, and retains user
interest information for the duration of its use plus two years. This policy satisfied the
Commissioner in Facebook, but the scope of collection (and thus loss of privacy) in
this case is considerably broader. Despite this, Google’s balancing appears to be largely
reasonable, and thus its purposes are likely PIPEDA-compliant. Such a finding is not
guaranteed, however; revising its retention policy to store less information for less time
or instituting an opt-in consent process would dramatically improve the likelihood that
Google’s purposes would be found to be in line with PIPEDA.
Phorm, in contrast, retains nothing but users’ aggregated interest categories and
their unique identifiers. The only issues that can be taken with Phorm’s approach are
that Phorm’s definition of sensitive information is much narrower than the Privacy
Commissioner’s, and that it stores users’ interest categories indefinitely. These concerns are
likely resolved by Phorm’s opt-in consent scheme, which reduces the severity of privacy loss
resulting from the collection, use and retention of sensitive information. Overall, Phorm
likely satisfies these Principles of PIPEDA.
C. Principle 9 – Individual Access
Access to personal information is a particularly problematic aspect of these technologies. Multiple individuals may contribute personal information to a single identifier, simply by virtue of using the same browser on the same computer (as is common in family homes). As a consequence, it is likely that providing an individual access to his or her personal information would reveal the personal information of a third party that cannot be severed. Worse, if a third party gains access to an individual's computer account, they would be able to view the interest categories associated with it even if none of the personal information collected was theirs. To get around this, an individual would have to be able to demonstrate that he or she was the originator of the personal information associated with a particular identifier, and either demonstrate that no other individual had used the same browser on the same computer (or, at least, that such an occurrence was unlikely) or obtain consent from all individuals who were likely to have access to the computer in order to gain access to the personal information. In view of the practical difficulties that arise, the most effective route to ensure PIPEDA compliance is to deny access entirely (absent convincing proof of the above requirements). This is not the only solution; in theory, organizations could allow users to authenticate their identities before browsing, but requiring users to log in to the service is precisely what most behavioural advertisers want to avoid.
Google and Phorm both allow users to view and edit their user preferences by visiting a particular webpage in their browser. The page recognizes the browser and provides access to the associated user interests. Although this functionality is likely provided in an attempt to satisfy access requirements, in many cases it may actually allow individuals to view the personal information of third parties. In order to be compliant with PIPEDA, Google and Phorm should either deny access to these records entirely or establish some mechanism by which users can authenticate their identities.
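The conflation problem described above can be illustrated with a minimal sketch. This is not any vendor's actual system; the profile store, function names and cookie value below are hypothetical, assuming only the behaviour the paper describes – a profile keyed solely on a per-browser identifier, so that interests contributed by every user of that browser are merged into one record:

```python
# Hypothetical sketch of a cookie-keyed interest profile store.
# The key is per-browser, not per-person, so data from all users
# of a shared browser is merged under one identifier.

profiles = {}  # cookie_id -> set of aggregated interest categories


def record_interest(cookie_id, category):
    """Attribute a browsing-derived interest to the browser's identifier."""
    profiles.setdefault(cookie_id, set()).add(category)


# Two family members share one browser, and therefore one cookie.
record_interest("cookie-123", "mortgages")    # first user's browsing
record_interest("cookie-123", "video games")  # second user's browsing

# An "access" request keyed on the cookie alone discloses both users' data:
print(sorted(profiles["cookie-123"]))  # ['mortgages', 'video games']
```

Because the lookup is satisfied by possession of the cookie alone, either household member (or any third party using the machine) retrieves the full merged record – precisely the non-severable third-party disclosure that the access principle is meant to prevent.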
PIPEDA anticipates the need for a delicate balance between individuals' reasonable expectation of privacy and organizations' legitimate business interests. In general, it does not aim to prevent consumers from trading their privacy for commercial benefits, but it does demand that individuals obtain full knowledge of the arrangements that they are entering, that the consent they provide be meaningful, and that the arrangements themselves strike a reasonable balance between the privacy lost and the benefit gained. Behavioural advertising technologies test this balance by being pervasive, surreptitious and highly invasive by nature. The Act is intended to guide organizations through these untested waters by providing a baseline of protection appropriate to the circumstances.
Under PIPEDA, users should consent to both tracking cookies and deep packet inspection via an opt-in process, due to the sensitive information that these technologies collect and use. Using these technologies for the purpose of targeted, behavioural advertising is not unreasonable per se, but failing to adopt stringent retention policies that reduce the amount of information stored and limited collection policies that avoid collecting the most sensitive classes of information may render it unreasonable. Limiting retention is also critical, in addition to the institutional and physical protections that all organizations handling sensitive information should take. Finally, as these technologies cannot distinguish between one individual and another if they are using the same browser, access to personal information should be limited to cases where it can be demonstrated that the only person who has contributed the personal information attached to a particular identifier is the person requesting it (or that all other contributing individuals have consented to the access).
On the basis of the above, I have concluded that Google may be violating PIPEDA due to its reliance on opt-out consent despite its collection of sensitive personal information, and its practice of permitting users to access personal information without demonstrating that the personal information of third parties is not likely to be disclosed without their consent. I recommend that Google adopt an opt-in consent process and either deny individuals access to personal information or put in place a process that enables them to authenticate their identities in a manner that satisfies the Act. It may also be appropriate for Google to limit its retention and collection of personal information more aggressively (particularly with respect to highly sensitive classes of information), although its current practices likely do not violate the Act.
I have also concluded that Phorm may be violating PIPEDA (or would be, if it did business in Canada) on the basis that it, too, is permitting users to access personal information without demonstrating that the personal information of third parties is not likely to be disclosed without their consent. I recommend that it either deny individuals access to personal information or put in place a process that enables them to authenticate their identities in a manner that satisfies the Act.
