MediaTech Law

By MIRSKY & COMPANY, PLLC

Apple Touts Differential Privacy, Privacy Wonks Remain Skeptical, Google Joins In

(Originally published January 19, 2017, updated July 24, 2017)

Apple has traditionally distinguished itself from rivals like Google and Facebook by emphasizing its respect for user privacy. It has taken deliberate steps to avoid vacuuming up all of its users’ data, providing encryption on the device as well as during data transmission. It has done so, however, at the cost of forgoing the benefits that pervasive data collection and analysis have to offer. Those benefits include improving the growing and popular on-demand search and recommendation services, like Google Now, Microsoft’s Cortana, and Amazon’s Echo. Like Apple’s Siri, these services act as digital assistants, answering search requests and making recommendations. Now Apple, pushing to remain competitive in this line of business, is taking a new approach to privacy in the form of differential privacy (DP).

Announced in June 2016 at Apple’s Worldwide Developers Conference in San Francisco, DP is, as Craig Federighi, senior vice president of software engineering, described it, “a research topic in the area of statistics and data analytics that uses hashing, subsampling and noise injection to enable … crowdsourced learning while keeping the data of individual users completely private.” More simply put, DP is the statistical science of attempting to learn as much as possible about a group while learning as little as possible about any individual in it.
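
To make the statistics concrete, here is a minimal sketch of randomized response, one of the classic noise-injection mechanisms in the differential privacy literature. This illustrates the general technique, not Apple’s actual implementation; the probability p and the 30% rate below are hypothetical.

```python
import random

def randomized_response(true_answer: bool, p: float = 0.75) -> bool:
    """Report truthfully with probability p; otherwise report the opposite.

    Any single report is plausibly deniable, yet the population-level
    rate remains recoverable from many noisy reports.
    """
    return true_answer if random.random() < p else not true_answer

def estimate_rate(reports: list[bool], p: float = 0.75) -> float:
    # observed = p * true_rate + (1 - p) * (1 - true_rate); solve for true_rate.
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
reports = [randomized_response(random.random() < 0.30) for _ in range(100_000)]
print(f"estimated rate: {estimate_rate(reports):.3f}")  # close to 0.300
```

Each individual’s answer can always be disclaimed as noise, yet across 100,000 reports the aggregate estimate lands close to the true 30% – learning about the group while learning little about any one member.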

Read More

“Do Not Track” and Cookies – European Commission Proposes New ePrivacy Regulations

The European Commission recently proposed new regulations that would align privacy rules for electronic communications with the much-anticipated General Data Protection Regulation (GDPR) (the GDPR was fully adopted in May 2016 and goes into effect in May 2018). Referred to as the Regulation on Privacy and Electronic Communications, or “ePrivacy” regulation, these final additions to the EU’s new data protection framework make a number of important changes, including expanding privacy protections to over-the-top applications (like WhatsApp and Skype), requiring consent before metadata can be processed, and adding restrictions on spam. But the provisions relating to “cookies” and the tracking of consumers’ online activity are particularly interesting and applicable to a wide range of companies.

Cookies are small data files stored on a user’s computer or mobile device by a web browser. The files help websites remember information about the user and track a user’s online activity. Under the EU’s current ePrivacy Directive, a company must get a user’s specific consent before a cookie can be stored and accessed. While well-intentioned, this provision has caused frustration and resulted in consumers facing frequent pop-up windows (requesting consent) as they surf the Internet.
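
For a sense of what that consent requirement looks like in practice, here is a minimal server-side sketch using Python’s standard http.cookies module; the cookie name and value are hypothetical, and real consent banners involve much more than this gate.

```python
from http.cookies import SimpleCookie

def set_cookie_header(user_consented: bool) -> str | None:
    """Return a Set-Cookie header value, or None if the user has not consented."""
    if not user_consented:
        return None  # under the ePrivacy Directive, no consent means no cookie
    cookie = SimpleCookie()
    cookie["visitor_id"] = "abc123"  # hypothetical tracking identifier
    cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # one year
    cookie["visitor_id"]["path"] = "/"
    return cookie["visitor_id"].OutputString()

print(set_cookie_header(user_consented=True))
# visitor_id=abc123; Max-Age=31536000; Path=/
```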

Read More

Federal Judge Tosses Stingray Evidence

In a first, a federal judge ruled that evidence found through the use of a stingray device is inadmissible. Reuters reports on the case, United States v. Raymond Lambis, which involved a man targeted in a US Drug Enforcement Administration (DEA) investigation. The DEA used a stingray, a surveillance tool used to reveal a phone’s location, to identify Raymond Lambis’ apartment as the most likely location of a cell phone identified during a drug trafficking probe. Upon searching the apartment, the DEA discovered a kilogram of cocaine.

According to Ars Technica, the DEA sought a warrant for location information and cell-site data for a particular 646 area code phone number. The warrant was based on communications obtained from a wiretap order that suggested illegal drug activity. With the cell-site location information, the DEA was able to determine the general vicinity of the targeted cell phone, which pointed to the intersection of Broadway and 177th Street in Manhattan. The DEA then used a stingray device, which mimics a cell phone tower and forces cell phones in the area to transmit “pings” back to the device. This enabled law enforcement to pinpoint a particular phone’s location.
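
In very rough terms, the pinpointing step can be pictured like this – a deliberately simplified sketch assuming the operator walks the area and logs the strength of each forced ping. The coordinates and signal values are invented for illustration; real equipment uses far finer-grained measurement.

```python
def strongest_ping_location(readings: list[tuple[tuple[float, float], float]]):
    """Each reading pairs a (lat, lon) where the operator stood with the
    signal strength (dBm) of the phone's forced ping; the phone is presumed
    nearest the spot where its ping was strongest."""
    (lat, lon), _ = max(readings, key=lambda r: r[1])
    return lat, lon

# Walking the block near Broadway and 177th Street (illustrative values):
readings = [((40.8460, -73.9340), -85.0),
            ((40.8464, -73.9336), -62.0),  # strongest ping: likely apartment
            ((40.8468, -73.9330), -78.0)]
print(strongest_ping_location(readings))   # (40.8464, -73.9336)
```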

Read More

Protecting Children’s Privacy in the Age of Siri, Echo, Google and Cortana

“OK Google”, “Hey Cortana”, “Siri…”, “Alexa…”

These phrases are increasingly common as artificial intelligence (AI) becomes mainstream. They are the default wake words that kick off the myriad services offered by Google, Microsoft, Apple and Amazon, respectively, and they are at the heart of the explosion of voice-activated search and services now available through computers, phones, watches, and stand-alone devices. Once activated, these devices record the statements that follow and digitally process and analyze them in the cloud. The service then returns the results to the device in the form of answers, helpful suggestions, or an array of other responses.
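
A heavily simplified sketch of that pipeline is below. The wake phrases are the real ones; the matching logic and the cloud_process stand-in are hypothetical.

```python
WAKE_WORDS = ("ok google", "hey cortana", "siri", "alexa")

def detect_wake_word(transcript: str) -> bool:
    """On-device, always-listening match; audio without a wake word goes nowhere."""
    return transcript.lower().startswith(WAKE_WORDS)

def cloud_process(utterance: str) -> str:
    """Stand-in for the provider's cloud speech-to-text and answer pipeline."""
    return f"Here is what I found for {utterance!r}"

def assistant_loop(utterances: list[str]) -> list[str]:
    responses = []
    for utterance in utterances:
        if detect_wake_word(utterance):
            # Only after activation is the recording sent to the cloud;
            # the generated response then comes back to the device.
            responses.append(cloud_process(utterance))
    return responses

print(assistant_loop(["alexa, what's the weather?", "private conversation"]))
```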

A recent investigation by the UK’s Guardian newspaper, however, claims these devices likely run afoul of the U.S. Children’s Online Privacy Protection Act (COPPA), which regulates the collection and use of personal information from anyone younger than 13. If true, the companies behind these services could face multimillion-dollar fines.

COPPA spells out, among other things, an operator’s responsibilities to protect children’s online privacy and safety, and when and how to seek verifiable consent from a parent or guardian. COPPA also restricts marketing to children under the age of 13. Its purpose is to protect children when they are online or interacting with internet-enabled devices, and to prevent the rampant collection of their sensitive personal data and information. The Federal Trade Commission (FTC), the agency tasked with monitoring and enforcing COPPA, encourages industry self-regulation.

The Guardian investigation states that voice-enabled devices like the Amazon Echo, Google Home and Apple’s Siri are recording and storing data provided by children interacting with the devices in their homes. While the investigation concluded that these devices are likely collecting information from family members under the age of 13, it stops short of concluding that the services primarily target children under 13 as their audience – a key determining factor under COPPA. Furthermore, according to the FTC’s own COPPA FAQ page, even if a child provides personal information to a general-audience online service, COPPA is not triggered so long as the service has no actual knowledge that the particular individual is a child.

While the details of COPPA will need to be refined and redefined in the era of always-on digital assistants and AI, the harsh FTC crackdown the Guardian predicts is unlikely, and the potential multimillion-dollar fines are unlikely to materialize. More likely, the FTC will issue guidance and recommendations to such services, allowing those acting in good faith to modify their practices and stay within the bounds of the law. For example, Amazon, Apple and Google could update their services to request, on installation, the ages and number of individuals in the home, paired with an update to the terms of service requesting parental permission for the use of data provided by children under 13. For children outside the immediate family who access the device, the services could claim they lacked actual knowledge that a child interacted with the service, again satisfying COPPA’s requirements.
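
To make that proposal concrete, here is a hypothetical sketch of the gate such a setup flow could feed. The Household schema and function names are invented for illustration, not any company’s actual practice.

```python
from dataclasses import dataclass

@dataclass
class Household:
    ages: list[int]                 # collected at installation
    parental_consent: bool = False  # recorded via updated terms of service

def may_retain_recording(home: Household, speaker_age: int | None) -> bool:
    """Keep a recording only where COPPA permits it.

    With no actual knowledge the speaker is a child, COPPA is not triggered
    for a general-audience service; a known under-13 speaker requires
    verifiable parental consent on file.
    """
    if speaker_age is None:
        return True  # no actual knowledge of a child user
    if speaker_age < 13:
        return home.parental_consent
    return True

home = Household(ages=[41, 39, 8], parental_consent=True)
print(may_retain_recording(home, speaker_age=8))     # True: consent on file
print(may_retain_recording(home, speaker_age=None))  # True: no actual knowledge
```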

Read More

What’s Behind the Decline in Internet Privacy Litigation?

The number of privacy lawsuits filed against big tech companies has significantly dropped in recent years, according to a review of court filings conducted by The Recorder, a California business journal.

According to The Recorder, the period from 2010 to 2012 saw a dramatic spike in cases filed against Google, Apple, or Facebook (as measured by filings in the Northern District of California naming one of the three as defendants). The peak year was 2012, with 30 cases filed against the three tech giants, followed by a dramatic drop-off in 2014 and 2015, which together saw only five privacy cases naming one of the three as defendants. So what explains the sudden drop-off in privacy lawsuits?

One theory, according to privacy litigators interviewed for The Recorder article, is that the decline reflects the difficulty of applying federal privacy statutes to modern methods of collecting, monetizing, or disclosing online data. Many privacy class action claims are based on statutes passed in the 1980s: the Electronic Communications Privacy Act (ECPA) and the Stored Communications Act (SCA), both passed in 1986, and the Video Privacy Protection Act (VPPA), passed in 1988. These statutes were originally written to address specific privacy intrusions like government wiretaps or disclosures of video rental history.

Read More

License Plate Numbers: A Valuable Data Point in Big-Data Retention

What can you get from a license plate number?

At first glance, a person’s license plate number may not seem that valuable a piece of information. Tied to a formal Motor Vehicle Administration (MVA) request, it can yield the owner’s name, address, type of vehicle, vehicle identification number, and any lienholders associated with the vehicle. While this does reveal some sensitive information, such as a likely home address, there are generally easier ways to gather that information. Furthermore, states have made efforts to protect such data, revealing owner information only to law enforcement officials or certified private investigators. The increasing use of Automated License Plate Readers (ALPRs), however, is revealing a treasure trove of historical location information that is being used by law enforcement and private companies alike. And unlike historical MVA data, the policies and regulations surrounding ALPRs are in their infancy and provide far weaker safeguards for personal information.

ALPR – what is it?

ALPRs consist of stationary or vehicle-mounted cameras that use pattern recognition software to scan up to 1,800 license plates per minute, recording the time, date and location at which a particular car was encountered.
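
The privacy stakes are easiest to see in the shape of the data itself. Here is a hypothetical sketch of an ALPR read record and the one-line query that turns a stream of reads into a movement profile; the field and function names are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class PlateRead:
    """One record as an ALPR system might log it (hypothetical schema)."""
    plate: str           # recognized plate number
    timestamp: datetime  # when the car was encountered
    latitude: float      # where the camera saw it
    longitude: float
    camera_id: str

def location_history(reads: list[PlateRead], plate: str) -> list[PlateRead]:
    """Individually mundane reads, filtered by one plate and sorted by time,
    become a chronological movement profile for a specific vehicle."""
    return sorted((r for r in reads if r.plate == plate),
                  key=lambda r: r.timestamp)
```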

Read More

Website Policies and Terms: What You Lose if You Don’t Read Them

When was the last time you actually read the privacy policy or terms of use of your go-to social media website or your favorite app? If you’re a diligent internet user (like me), it might take you an average of 10 minutes to skim a privacy policy before clicking “ok” or “I agree.” But after you click “ok,” have you properly consented to all the ways in which your information may be used?

As consumers become more aware of how companies profit from the use of their personal information, the way a company discloses its data collection methods and obtains consent from its users becomes more important, both to the company and to its users. Some critics even advocate voluntarily paying social media sites like Facebook in exchange for more control over how personal information is used. Courts, meanwhile, have scrutinized whether websites can protect themselves against claims that they misused users’ information simply because they presented a privacy policy or terms of service and the user clicked “ok.”

The concept of “clickable consent” has gained more attention because of the cross-promotional nature of many leading websites and mobile apps. 

Read More

Please Don’t Take My Privacy (Why Would Anybody Really Want It?)

Legal issues with privacy in social media stem from the nature of social media – an inherently communicative and open medium. The cliché is that there is no expectation of privacy in social media because the very idea of privacy is inconsistent with a “social” medium. Scott McNealy of Sun Microsystems reportedly made this point with his famous aphorism: “You have zero privacy anyway. Get over it.”

But in evidence law, there’s a rule barring assumption of facts not in evidence. In social media, by analogy: Where was it proven that we cannot find privacy in a new communications medium, even one as public as the internet and social media?

Let’s go back to basic principles. Everyone talks about how privacy has to “adapt” to a new technological paradigm. I agree that technology and custom require adaptation by a legal system steeped in common law principles with foundations from the 13th century. But I do not agree that the legal system isn’t up to the task.

All you really need to do is take a wider look at the law.

Privacy writers talk about the law of appropriation, which varies from state to state but is a fairly established aspect of privacy law.

Read More

Privacy: Consent to Collecting Personal Information

Gonzalo Mon writes in Mashable that “Although various bills pending in Congress would require companies to get consent before collecting certain types of information, outside of COPPA, getting consent is not a uniformly applicable legal requirement yet. Nevertheless, there are some types of information (such as location-based data) for which getting consent may be a good idea. Moreover, it may be advisable to get consent at the point of collection when sensitive personal data is in play.”

First, what current requirements – laws, agency regulations and quasi-laws – require obtaining consent, even if not “uniformly applicable”?

1. Government Enforcement. The Federal Trade Commission’s November 2011 consent decree with Facebook requires express user consent to sharing of nonpublic user information that “materially exceeds” a user’s privacy settings. The FTC was acting under its authority under Section 5 of the FTC Act against an “unfair and deceptive trade practice”, an authority the FTC has liberally used in enforcement actions involving not just claimed breaches of privacy policies but also data security cases involving the management of personal data without adequate security.

2. User Expectations Established by Actual Practice. The mobile space offers some of the most progressive (and aggressive) examples of privacy rights seemingly established by practice rather than stated policy. For example, on the PrivacyChoice blog, the CEO of PlaceIQ explained that “Apple and Android have already established user expectations about [obtaining] consent. Location-based services in the operating system provide very precise location information, but only through a user-consent framework built-in to the OS. This creates a baseline user expectation about consent for precise location targeting.” (emphasis added)
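
A minimal sketch of that consent-at-the-point-of-collection pattern follows, assuming a hypothetical app-side check; read_gps stands in for the OS location API, and all names are invented for illustration.

```python
def read_gps() -> tuple[float, float]:
    """Stand-in for the OS-level precise-location API (hypothetical)."""
    return (38.9072, -77.0369)

def collect_location(profile: dict, ask) -> tuple[float, float] | None:
    """Prompt at the moment of first collection and record the answer;
    the sensitive read happens only after an explicit opt-in."""
    if "location_consent" not in profile:
        profile["location_consent"] = ask("Share precise location with this app?")
    return read_gps() if profile["location_consent"] else None

profile = {}
print(collect_location(profile, ask=lambda prompt: True))   # (38.9072, -77.0369)
print(collect_location({}, ask=lambda prompt: False))       # None
```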

Read More