MediaTech Law

By MIRSKY & COMPANY, PLLC

Do We Need to Appoint a (GDPR) Data Protection Officer?

Does your organization need to appoint a “Data Protection Officer”?  Articles 37-39 of the EU’s General Data Protection Regulation (GDPR) require certain organizations that process the personal data of individuals in the EU to appoint a Data Protection Officer (DPO) to oversee their data processing activities.  Doing so is a lot more than perfunctory – you can’t just say, “Steve, our HR Director, you’re now our DPO.  Congratulations!”  The qualifications for the job are significant, and the organizational impact of having a DPO is extensive.  You may be better off not appointing a DPO if you don’t have to; but if you do have to, failing to appoint one can expose your organization to serious enforcement penalties.

Read More

Blogs and Writings We Like

This week we highlight three writers discussing timely subjects in privacy and trademark law. Brandon Vigliarolo wrote in TechRepublic about Google’s new app privacy standards; Sarah Pearce from Cooley wrote a practical guide to the EU’s General Data Protection Regulation that includes a 6-month compliance plan; and Scott Hervey posted a piece on the IP Law Blog analyzing whether there was trademark infringement in an interesting situation involving a strain of pot.

Google’s new app privacy standards mean big changes for developers

In TechRepublic, Brandon Vigliarolo wrote about Google’s new app privacy standards that will begin on January 30, 2018. At the forefront, app developers will need to explain what data is being used, how it is used, and when it is used – and get user consent. Vigliarolo anticipates that most developers will need to make changes to their app design in order to comply with the new standards. In addition, any transmission of data (even in a crash report) has to be explained and accepted by the user. While Vigliarolo writes that it is not completely clear how Google will enforce these standards, beginning at the end of January users will be given warnings if an app (or a website leading to an app) is known by Google to collect user data without consent. Non-compliant developers could see lower ratings and less traffic.

Read More

Apple Touts Differential Privacy, Privacy Wonks Remain Skeptical, Google Joins In

(Originally published January 19, 2017, updated July 24, 2017)

Apple has traditionally distinguished itself from rivals like Google and Facebook by emphasizing its respect for user privacy. It has taken deliberate steps to avoid vacuuming up all of its users’ data, providing encryption at the device level as well as during data transmission. It has done so, however, at the cost of forgoing the benefits that pervasive data collection and analysis have to offer. Such benefits include improving the growing and popular on-demand search and recommendation services, like Google Now, Microsoft’s Cortana and Amazon’s Echo. Like Apple’s Siri technology, these services act as digital assistants, responding to search requests and making recommendations. Now Apple, pushing to remain competitive in this line of business, is taking a new approach to privacy, in the form of differential privacy (DP).

Announced in June 2016 during Apple’s Worldwide Developers’ Conference in San Francisco, DP is, as Craig Federighi, senior vice president of software engineering, stated, “a research topic in the area of statistics and data analytics that uses hashing, subsampling and noise injection to enable … crowdsourced learning while keeping the data of individual users completely private.” More simply put, DP is the statistical science of attempting to learn as much as possible about a group while learning as little as possible about any individual in it.
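The noise-injection idea can be illustrated with “randomized response,” a classic textbook mechanism often used to explain differential privacy. This is a minimal sketch for illustration only – it is not Apple’s actual implementation, and the survey scenario and numbers are hypothetical. Each user’s reported answer is randomized, so no individual response can be trusted, yet the group-level rate can still be recovered:

```python
import random

def randomized_response(truth: bool) -> bool:
    """With probability 1/2 report the truth; otherwise report a fair coin flip.
    Any single response is deniable, protecting the individual."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(responses) -> float:
    """Aggregate recovery: P(report yes) = 0.5*p + 0.25, so p = 2*(observed - 0.25)."""
    observed = sum(responses) / len(responses)
    return 2 * (observed - 0.25)

random.seed(42)
# Hypothetical population where ~30% would truthfully answer "yes"
true_answers = [random.random() < 0.3 for _ in range(100_000)]
noisy = [randomized_response(a) for a in true_answers]
print(round(estimate_true_rate(noisy), 2))  # close to 0.30
```

The trade-off Federighi describes is visible here: the noise hides each individual, while across 100,000 users the estimate still converges on the true group rate.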

Read More

We’ve Updated our Terms of Use!

Why are they sending me this information, and what am I supposed to do with it? You’ve just received an email like the one below from Uber, or from one of your various subscription services, credit card companies, banks, ISPs or any of a zillion different web applications:

SUBJECT: We’ve Updated our Terms of Use

Hi Andrew, we’ve been able to bring Uber to more than 400 cities in 72 countries. And that’s in just a little over 6 years. In light of that growth and some changes to our services, we’ve made some updates to our US Terms of Use

They have your attention. You sit up alert in your chair, you rub your eyes and read on. The company then sometimes offers a summary of the changes, often in as cheery and euphemistic a way as possible, with statements like “We revised our arbitration agreement which explains how legal disputes are handled”, or “We have updated our Terms of Use regarding the ways in which we may contact you.” All, no doubt, good things.

Turns out, no one actually reads these updates. That last sentence is not meant as sarcasm. The non-partisan Stanley Roper Polling Organization actually published a study that concluded “No one actually reads these updates.” Editor’s Note: There is no such organization and there was no such study. Evidently. Although Andrea Peterson reports in The Washington Post about a 2008 study (about privacy policies) that concluded “it would take a staggering 244 hours a year for the average American to read the privacy policies of every site they visit over the course of a year.”

Read More

Circuits Weigh In on PII Under the VPPA

The Video Privacy Protection Act (VPPA) was enacted in 1988 in response to Robert Bork’s Supreme Court confirmation hearings before the Senate judiciary committee, during which his family’s video rental history was used to great effect and in excoriating detail. This was the age of brick-and-mortar video rental stores, well before the age of instant video streaming and on-demand content. Nonetheless, VPPA compliance is an important component to any privacy and data security programs of online video-content providers, websites that host streaming videos and others that are in the business of facilitating consumers viewing streaming video.

Judicial application of the VPPA to online content has produced inconsistent results, including in how the statute’s definition of personally identifiable information (PII)—the disclosure of which triggers VPPA liability—has been interpreted. Under the VPPA, PII “includes information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.” 18 U.S.C. § 2710(a)(3). Courts and commentators alike have noted that this definition is vague, particularly when applied to new technological situations, because it describes what counts as PII rather than providing an exhaustive definition. In the streaming video context specifically, the dispute over the PII definition typically turns on whether a static identifier uniquely assigned to a consumer, like an internet protocol (IP) address, counts as PII under the VPPA.

Read More

Dataveillance Protection: The E.U.-U.S. Privacy Shield

For many years, technology outpaced policy when it came to standards and protections around ownership of and access to personal data. Privacy policies are not set by governments but rather by technology companies that created the digital world as it is experienced today. Many if not all of the dominant players in this space are American technology companies that include Alphabet (i.e. Google), Apple, Amazon, Facebook and Microsoft. These companies have more say about a user’s online life than any individual local, state or national government.

Read More

Appellate Court Upholds FTC’s Authority to Fine and Regulate Companies Shirking Cybersecurity

In a case determining the scope of the Federal Trade Commission’s (FTC) ability to govern data security, the 3rd U.S. Circuit Court of Appeals in Philadelphia upheld a 2014 ruling allowing the FTC to pursue a lawsuit against Wyndham Worldwide Corp. for failing to protect customer information after three data breaches that occurred in 2008 and 2009. The theft of credit card and personal details from over 600,000 consumers resulted in $10.6 million in fraudulent charges and the transfer of consumer account information to a website registered in Russia.

In 2012, the FTC sued Wyndham, whose brands include Days Inn, Howard Johnson, Ramada, Super 8 and Travelodge. The claim alleged that Wyndham’s conduct was an unfair practice and its privacy policy deceptive. The suit further alleged the company “engaged in unfair cybersecurity practices that unreasonably and unnecessarily exposed consumers’ personal data to unauthorized access and theft.”

The appellate court’s decision is important because it declares that the FTC has the authority to regulate cybersecurity under the unfairness doctrine within §45 of the FTC Act. This doctrine allows the FTC to declare a business practice unfair if it is oppressive or harmful to consumers even though the practice is not an antitrust violation. Under this decision, the FTC has the authority to levy civil penalties against companies found to have engaged in unfair practices.

What exactly did Wyndham do to possibly merit the claim of unfair practices?

According to the FTC’s original complaint, the company:

  • allowed for the storing of payment card information in clear readable text;
  • allowed for the use of easily guessed passwords to access property management systems;
  • failed to use commonly available security measures, like firewalls, to limit access between hotel property management systems, corporate networks and the internet; and
  • failed to adequately restrict and measure unauthorized access to its network.
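The first two alleged failures – clear-text storage and easily guessed passwords – contrast with a long-standard safeguard: storing only a salted, slow hash of each credential. The sketch below, written in Python for illustration, is not drawn from the complaint or from Wyndham’s systems; the passwords and iteration count are hypothetical examples of the general technique.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store only a random salt and a slow salted hash - never the clear text."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("password123", salt, digest))  # False
```

With this approach, a database breach exposes only salts and digests; recovering the original credentials requires a costly brute-force attack rather than simply reading them off the disk.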

Furthermore, the FTC alleged the company’s privacy policy was deceptive, stating:

“a company does not act equitably when it publishes a privacy policy to attract customers who are concerned about data privacy, fails to make good on that promise by investing inadequate resources in cybersecurity, exposes its unsuspecting customers to substantial financial injury, and retains the profits of the business.”

Wyndham requested that the suit be dismissed, arguing that the FTC did not have the authority to regulate cybersecurity. The appellate court found otherwise, however, stating that Wyndham failed to show that its alleged conduct fell outside the plain meaning of “unfair.”

The appellate court’s ruling highlights the need for companies to take special care in crafting a privacy policy to ensure it reflects the company’s cybersecurity standards and practices. This includes staying up-to-date on the latest best practices, and being familiar with the ever-changing industry standard security practices, including encryption and firewalls.

Read More

What’s Behind the Decline in Internet Privacy Litigation?

The number of privacy lawsuits filed against big tech companies has significantly dropped in recent years, according to a review of court filings conducted by The Recorder, a California business journal.

According to The Recorder, the period 2010-2012 saw a dramatic spike in cases filed against Google, Apple, or Facebook (as measured by filings in the Northern District of California naming one of the three as defendants). The peak year was 2012, with 30 cases filed against the three tech giants, followed by a dramatic drop-off in 2014 and 2015, with only five privacy cases filed across those two years naming one of the three as defendants. So what explains the sudden drop-off in privacy lawsuits?

One theory, according to privacy litigators interviewed for The Recorder article, is that the decline reflects the difficulty in applying federal privacy statutes to prosecute modern methods of monetizing, collecting, or disclosing online data. Many privacy class action claims are based on statutes passed in the 1980s like the Electronic Communications Privacy Act (ECPA), the Stored Communications Act (SCA), both passed in 1986, and the Video Privacy Protection Act (VPPA), passed in 1988. These statutes were originally written to address specific privacy intrusions like government wire taps or disclosures of video rental history.

Read More

Website Policies and Terms: What You Lose if You Don’t Read Them

When was the last time you actually read the privacy policy or terms of use of your go-to social media website or your favorite app? If you’re a diligent internet user (like me), it might take you an average of 10 minutes to skim a privacy policy before clicking “ok” or “I agree.” But after you click “ok,” have you properly consented to all the ways in which your information may be used?

As consumers become more aware of how companies profit from the use of their personal information, the way a company discloses its data collection methods and obtains consent from its users becomes more important, both to the company and to users.  Some critics even advocate voluntarily paying social media sites like Facebook in exchange for more control over how their personal information is used. In other examples, courts have scrutinized whether websites can protect themselves against claims that they misused users’ information, simply because they presented a privacy policy or terms of service to a consumer, and the user clicked “ok.”

The concept of “clickable consent” has gained more attention because of the cross-promotional nature of many leading websites and mobile apps. 

Read More

Privacy: Consent to Collecting Personal Information

Gonzalo Mon writes in Mashable that “Although various bills pending in Congress would require companies to get consent before collecting certain types of information, outside of COPPA, getting consent is not a uniformly applicable legal requirement yet. Nevertheless, there are some types of information (such as location-based data) for which getting consent may be a good idea.  Moreover, it may be advisable to get consent at the point of collection when sensitive personal data is in play.”

First, what current requirements – laws, agency regulations and quasi-laws – require obtaining consent, even if not “uniformly applicable”?

1. Government Enforcement.  The Federal Trade Commission’s November 2011 consent decree with Facebook required express user consent to sharing of nonpublic user information that “materially exceeds” the user’s privacy settings.  The FTC was acting under its authority under Section 5 of the FTC Act against an “unfair and deceptive trade practice”, an authority the FTC has liberally used in enforcement actions involving not just claimed breaches of privacy policies but also data security cases involving management of personal data without adequate security.

2. User Expectations Established by Actual Practice.  The mobile space offers some of the most progressive (and aggressive) examples of privacy rights seemingly established by practice rather than stated policy.  For example, on the PrivacyChoice blog, the CEO of PlaceIQ explained that “Apple and Android have already established user expectations about [obtaining] consent.  Location-based services in the operating system provide very precise location information, but only through a user-consent framework built-in to the OS.  This creates a baseline user expectation about consent for precise location targeting.”  (emphasis added)

Read More

Privacy For Businesses: Any Actual Legal Obligations?

For businesses, is there an obligation in the United States to do anything more than simply have a privacy policy?  The answer: not much of an obligation at all.

Put another way, is it simply a question of disclosure – so long as a business tells users what it intends to do with their personal information, can the business pretty much do anything it wants with personal information?  This would be the privacy law equivalent of the “as long as I signal, I am allowed to cut anyone off” theory of driving.

Much high-profile enforcement (via the Federal Trade Commission and State Attorneys General) has definitely focused on breaches by businesses of their own privacy statements.  Plus, state laws in California and elsewhere either require that companies have privacy policies or specify what types of disclosures those policies must contain, but again the focus is on disclosure rather than on mandating specific substantive actions that businesses must or must not take when using personal information.

As The Economist recently noted in its Schumpeter blog, “Europeans have long relied on governments to set policies to protect their privacy on the internet.  America has taken a different tack, shunning detailed prescriptions for how companies should handle people’s data online and letting industries regulate themselves.”   This structural approach (or lack of structure) in United States privacy regulation can also be seen – vividly – in the legal and business commentary that met Google’s recent privacy overhaul.  Despite howls of displeasure and the concerted voices of dozens of State Attorneys General, none of the complaints relied on any particular violations of law.  Rather, arguments (by the AGs) are made about consumer expectations and consumer advocacy, as in “[C]onsumers may be comfortable with Google knowing their search queries but not with it knowing their whereabouts, yet the new privacy policy appears to give them no choice in the matter, further invading their privacy.”

Again, there’s little reliance on codified law because, for better or worse, there is no relevant codified law to rely upon.  Google, Twitter and Facebook have been famously the subjects of enforcement actions by the states and the Federal Trade Commission, and accordingly Google has been careful in its privacy rollout to provide extensive advance disclosures of its intentions.

As The Economist also reported, industry trade groups have stepped in with self-regulatory “best practices” for online advertising, search and data collection, as well as “do not track” initiatives including browser tools, while the Obama Administration last month announced a privacy “bill of rights” that it hopes to move in the current or, more realistically, a future Congress.

This discussion also should not ignore common law rights against invasion of privacy, such as those underlying the criminal charges successfully brought in New Jersey against the Rutgers student who spied on his roommate.   These rights are not new, and for the time being they remain the main source of consumer recourse for privacy violations in the absence of meaningful contract remedies (for breaches of privacy policies) and legislative remedies targeted to online transactions.

More to come on this topic shortly.

Read More

Update: Privacy for Mobile Apps – The Limits of Transparency

In June of this year, Senator Al Franken (D-Minn.) introduced the “Location Privacy Protection Act of 2011” (S. 1223).  According to the bill summary available on Franken’s website, a 2010 investigation by the Wall Street Journal revealed that 47 of the top 101 mobile applications for Apple iPhones and Google Android phones disclose user location without the consent of the user.

According to Franken’s bill summary, current law prevents disclosure of user location during telephone calls without user consent. However, no similar protection applies when a user accesses information through a mobile web browser or mobile application. Franken claims that his bill will close the loopholes in the Electronic Communications Privacy Act that allow for this distinction.

If S. 1223 passes, companies will be required to obtain permission not only to collect mobile user location information but also to share that information with third parties. Additionally, the bill seeks to put in place measures to prevent stalking through location information.

As of this writing, Franken’s bill has been assigned to the Senate Judiciary Committee and is being cosponsored by Sens. Blumenthal, Coons, Durbin, Menendez, and Sanders.

Original Post (published 9/8/2011)

When was the last time you read a license agreement after installing software or downloading an app on your smartphone? If you’re like most people, the answer is probably never.

According to some estimates, fewer than 8 percent of us actually read the entirety of those agreements, despite rising concerns about digital privacy and data collection.

Read More