Category Archives: Privacy

FTC Strengthens Kids’ Privacy, Gives Parents Greater Control

by Kevin Mills

Privacy continues to evolve into one of the most important legal issues of this decade. While we as Americans are wary of the government collecting our private information, we are comparatively complacent about private information collected by private businesses. It's a dangerous paradox. After all, the government is created for our benefit and is ultimately accountable to us; private business has no such inherent accountability and is dedicated to its own self-interest.

The Federal Trade Commission ("FTC") plays an important role in protecting the privacy of persons using the internet. The FTC recently adopted changes to its rule implementing the Children's Online Privacy Protection Act ("COPPA") to strengthen privacy protections for children and give parents greater control over the personal information that websites and online services may collect from children under thirteen. The information in this article, largely drawn from the FTC itself, explains these changes.

Congress passed COPPA in 1998. It requires that operators of websites or online services that are either directed to children under thirteen, or that have actual knowledge that they are collecting personal information from children under thirteen, give notice to parents and obtain their verifiable consent before collecting, using, or disclosing such personal information. Operators must also keep secure the information they collect from children, and they may not condition children's participation in activities on the collection of more personal information than is reasonably necessary for participation. COPPA contains a "safe harbor" provision that allows industry groups or others to seek FTC approval of self-regulatory guidelines.

In 2010, the FTC initiated a review to ensure that COPPA keeps up with evolving technology and changes in the way children use and access the internet, including the increased use of mobile devices and social networking.

The final amendments:

  • modify the list of “personal information” that cannot be collected without parental notice and consent, clarifying that this category includes geolocation information, photographs, and videos;
  • offer companies a streamlined, voluntary, and transparent approval process for new ways of getting parental consent;
  • close a loophole that allowed child-directed apps and websites to permit third parties to collect personal information from children through plug-ins without parental notice and consent;
  • extend coverage in some of those cases so that the third parties doing the additional collection also have to comply with COPPA;
  • extend COPPA to cover persistent identifiers that can recognize users over time and across different websites or online services, such as IP addresses and mobile device IDs;
  • strengthen data security protections by requiring that covered website operators and online service providers take reasonable steps to release children’s personal information only to companies that are capable of keeping it secure and confidential;
  • require that covered website operators adopt reasonable procedures for data retention and deletion; and
  • strengthen the FTC’s oversight of self-regulatory safe harbor programs.

Definitions

The Final Rule includes these modified definitions:

  • The definition of an “operator” has been updated to make clear that COPPA covers a child-directed site or service that integrates outside services, such as plug-ins or advertising networks, that collect personal information from its visitors.  This definition does not extend liability to platforms, such as Google Play or the App Store, when such platforms merely offer the public access to child-directed apps.
  • The definition of a “website or online service directed to children” is expanded to include plug-ins or ad networks that have actual knowledge that they are collecting personal information through a child-directed website or online service. In addition, in contrast to sites and services whose primary target audience is children, and who must presume all users are children, sites and services that target children only as a secondary audience or to a lesser degree may differentiate among users, and will be required to provide notice and obtain parental consent only for those users who identify themselves as being younger than thirteen.
  • The definition of “personal information” now also includes geolocation information, as well as photos, videos, and audio files that contain a child’s image or voice.
  • The definition of “personal information requiring parental notice and consent before collection” now includes “persistent identifiers” that can be used to recognize users over time and across different websites or online services. However, no parental notice and consent is required when an operator collects a persistent identifier for the sole purpose of supporting the website or online service’s internal operations, such as contextual advertising, frequency capping, legal compliance, site analysis, and network communications. Without parental consent, such information may never be used or disclosed to contact a specific individual, including through behavioral advertising, to amass a profile on a specific individual, or for any other purpose.
  • The definition of “collection of personal information” has been changed so that operators may allow children to participate in interactive communities without parental consent, so long as the operators take reasonable measures to delete all or virtually all of the children’s personal information before it is made public.

Parental Notice

The amended Final Rule revises the parental notice provisions to help ensure that operators’ privacy policies, and the direct notices they must give parents before collecting children’s personal information, are concise and timely.

Parental Consent Mechanisms

The Final Rule adds several new methods that operators can use to obtain verifiable parental consent: electronic scans of signed parental consent forms; video-conferencing; use of government-issued identification; and alternative payment systems, such as debit cards and electronic payment systems, provided they meet certain criteria.

The amendments retain "email plus" as an acceptable consent method for operators that collect personal information only for internal use. Under this method, an operator may obtain verifiable parental consent with an email from the parent, as long as the operator confirms consent by sending a delayed email confirmation to the parent, or by calling or sending a letter to the parent.

To encourage the development of new consent methods, the FTC establishes a voluntary 120-day notice and comment process so parties can seek approval of a particular consent method.  Operators participating in an FTC-approved safe-harbor program may use any consent method approved by the program.

Confidentiality and Security Requirements

COPPA requires operators to take reasonable steps to make sure that children’s personal information is released only to service providers and third parties that are capable of maintaining the confidentiality, security, and integrity of such information, and who assure that they will do so.  COPPA also requires operators to retain children’s personal information for only as long as is reasonably necessary, and to protect against unauthorized access or use while the information is being disposed of.

Safe Harbors

The FTC seeks to strengthen its oversight of the approved self-regulatory “safe harbor programs” by requiring them to audit their members and report annually to the FTC the aggregated results of those audits.

These changes will go into effect on July 1, 2013.

Kevin Mills is an owner of the law firm of Kaye & Mills, where his practice focuses on advising clients on transactions across a full range of issues in entertainment, media, technology, Internet and general business. His practice encompasses copyright; trademark; trade dress; trade secret; brand protection; content creation, protection and distribution; and general corporate, organizational and business matters.



Filed under Internet Law, Privacy, Technology

Prohibiting Employer Monitoring of Employees’ Social Networking Accounts

By Kevin Mills

It seems that in recent years there has been an increase in the number of news reports lauding law enforcement for using social networking sites to help catch criminals. It's often entertaining to hear about criminals posting self-incriminating evidence online, creating "LinkedIn to locked up" scenarios. However, when employers (or potential employers) use similar investigative tactics on employees (or potential employees), it is an entirely different matter. State and federal governments have grown increasingly alarmed by employers' practice of requesting access to current and prospective employees' social media accounts for investigative purposes, and they are moving to put an end to such practices.

In April 2012, Maryland became the first state to enact a law aimed at prohibiting employers from requiring current and prospective employees to provide employers with access to their social media accounts.  Maryland recognized the need for such laws when the ACLU filed a lawsuit after a job interviewer for the State Corrections Department asked a job applicant to provide his social network passwords and then logged on to the applicant’s Facebook account and reviewed his messages, wall posts, and photos.  The ACLU alleged that the conduct violated the Stored Communications Act, the First Amendment, and the Fourteenth Amendment, and constituted an invasion of privacy. The State defended its policy, stating that it needed to check job applicants’ Facebook pages in order to ensure that the applicants were not engaging in any gang-related activities.

In response, the Maryland legislature moved quickly and became the first state to enact a statute expressly prohibiting employers from requesting or requiring the disclosure of usernames or passwords to personal social media accounts. The statute also prohibits employers from taking (or threatening to take) any disciplinary action against employees or job applicants who refuse to disclose such information.

Over the course of the past few months, several other states, including New York, California, and Illinois (effective January 1, 2013), have followed Maryland's lead and passed similar legislation. Additionally, Delaware, Michigan, Minnesota, New Jersey, Texas, and Washington have all proposed similar laws.

The federal government has also taken steps to implement similar measures. In April 2012, the Social Networking Online Protection Act was introduced in the House. The Act would prohibit employers from requiring current or prospective employees to provide their usernames or passwords to access online content. In the Senate, the Password Protection Act of 2012 was introduced, with provisions similar to the House bill. In addition, Senators Richard Blumenthal (D-CT) and Charles Schumer (D-NY) have requested that the Department of Justice and the EEOC launch a federal investigation into these practices.

Employers and business owners should remain abreast of these developments.  Even if a particular state does not affirmatively ban an employer from requesting social media passwords, employers should still proceed with caution because the practice of requesting social media passwords may give rise to liability (including a potential violation of employee Section 7 rights under the National Labor Relations Act).

Businesses will have to learn how to address these types of social media issues. According to a recent survey by the Ponemon Institute, only 35% of companies have a social media policy, and only a fraction of those companies actually enforce it. One thing is clear: to be safe, businesses that currently ask employees or applicants to provide them with access to social media accounts should consider ending the practice.

It should also be noted that there are other potential liabilities arising out of an employer viewing a current or prospective employee's social media accounts and protected social media content (viewing publicly available information is not currently prohibited by any of the pending state and federal statutes). For example, what if an employer, while doing a background check on a prospective employee, encounters the following: "On the wagon, been sober for one whole month!" or "Having a bad day…looking to take it out on someone…WATCH OUT WORLD!" The issues raised by an employer's knowledge of these posts are beyond the scope of this piece, but they certainly have the potential to raise important employment law questions.

 

Kevin Mills is an owner of the law firm of Kaye & Mills, where his practice focuses on advising clients on transactions across a full range of issues in entertainment, media, technology, Internet and general business. His practice encompasses copyright; trademark; trade dress; trade secret; brand protection; content creation, protection and distribution; and general corporate, organizational and business matters.


Filed under Internet Law, Privacy

Are the Kids Alright? The FTC’s New Privacy Rules for the Protection of Children

by Kevin Mills

The Federal Trade Commission ("FTC") has recently proposed to modify the rules implementing the Children's Online Privacy Protection Act ("COPPA"). If these modifications are implemented, they would mark the first revision of the COPPA rules since 1999, a time when there was no Facebook or "an app for that"; even MySpace wasn't founded until 2003.

Today, there are countless ad networks, third-party tracking cookies, and information brokers that harvest personal data across the web and on smartphones; none of these existed when the COPPA rules were last issued. Although COPPA was designed to protect children's online experiences, loopholes in the current rules allow companies to gather children's personal information. A 2010 Wall Street Journal report found that some popular children's websites installed more data-gathering technology on visitors' computers than websites aimed at adults did.

The FTC wants to revise COPPA rules so that they apply to third party ad networks and app and plug-in developers, and to expand the definition of “personal information.” Specifically, the revisions aim to cover plug-ins and ad networks that know or have reason to know that they are collecting personal information through child-directed websites or online services.  The revisions could affect popular website features such as Facebook’s “Like” button, as well as new social networks for playing games on smartphones.

First, the proposed revised rules would allow sites with content designed to appeal to both young children and others (including parents) to "age-screen all visitors in order to provide COPPA's protections only to users under age 13."  These sites would not be allowed to collect any personal information from users identified as under thirteen without first obtaining parental consent.  Currently, many websites secure consent by sending an email to an address provided by the child.

Second, the proposed revised rules would create co-responsibility between companies that furnish apps or plug-ins and those that operate the platforms where the apps or plug-ins run.  The FTC states that “an operator of a child-directed site or service that chooses to integrate the services of others that collect personal information from its visitors should itself be considered a covered ‘operator’ under the Rule.”  The revised rules would not only hold third parties responsible for any unlawful data collection, but would also make the host website responsible for those infractions.

Third, the proposed revised rules would expand the definition of “personal information” to include “‘persistent identifiers’ that recognize a user over a period of time which are used for purposes other than ‘support for internal operations.’”  This revision is aimed at “tracking cookies” that are capable of delivering advertising within a single site and also of tracking people across sites to deliver targeted information.  In other words, the revised rules would restrict or prohibit advertising to children based on their previous online behavior.

Fourth, the proposed revised rules would treat geolocation data, defined to include "a home or other physical address including street name and name of a city or town," as personal information. Smartphone apps often collect such data along with phone numbers; under the proposed rules, they could no longer do so without parental notice and consent.

It is also important to note what is not covered by the new rules: they would apply to information collected for advertising or marketing purposes, not to information necessary to maintain a network or offer a service.

The revised rules are not aimed at sites that don't allow children, even though children do in fact use such sites. Facebook, for example, requires users to state their date of birth and does not allow users under thirteen to use the site. Of course, it is possible to lie about one's age (Consumer Reports estimates that 5.6 million of Facebook's users are under thirteen). It's also worth noting that any site that requires a user to sign in via Facebook is relying on Facebook's terms of service to establish that the user claims to be thirteen or older.

Of course, when evaluating new rules, one must consider their effectiveness. Privacy advocates are concerned that the FTC lacks the resources to vigorously enforce the law. And given the FTC's history of lax enforcement of COPPA, that is a valid concern.

 

Kevin Mills is an owner of the law firm of Kaye & Mills, where his practice focuses on advising clients on transactions across a full range of issues in entertainment, media, technology, Internet and general business. His practice encompasses copyright; trademark; trade dress; trade secret; brand protection; content creation, protection and distribution; and general corporate, organizational and business matters.


Filed under Internet Law, Privacy

State Initiatives for Internet Privacy

by Kevin Mills

Recently, to the presumed delight of attorneys working in the area of internet privacy, states have become increasingly involved in imposing privacy requirements on companies that collect information from users on the internet.  Perhaps as a reaction to the lack of a clear overall privacy scheme at the national level, many states are now taking action to protect the privacy of citizens who are using the internet.  Such state efforts come in various forms.

Privacy Policies of Internet Companies

In February 2012, the California Attorney General negotiated a deal with six of the largest companies operating mobile app platforms: Apple, Google, Amazon, Microsoft, RIM and Hewlett-Packard; Facebook became the seventh member of this group in June. The deal requires that the companies put forth a written privacy policy describing what information is collected and how it is shared. Because the AG lacks the power to write rules for mobile apps, the office asserts authority under a 2004 state law that broadly requires "online services" that collect personal information from consumers to maintain privacy policies. Failure to provide such a written policy may result in prosecutions against app makers that mislead California consumers about what uses are made of the personal information collected. Penalties may be as high as $5,000 per download.

It is readily acknowledged that such efforts by states are not as effective or as efficient as a national privacy policy might be. However, in the absence of such a national policy, states feel compelled to fill the void. On a practical level, the existence of fifty different sets of privacy laws can be confusing and can leave attorneys to sort out a patchwork of legislative initiatives.

Data Security Legislation

In 2003, California enacted a landmark security breach notification law.  Since then, nearly every state has adopted a similar law; today, forty-six states (as well as the District of Columbia, Puerto Rico, and the U.S. Virgin Islands) have security breach notification laws on the books, and in the past several years, many state legislatures have introduced amendments and updates to existing security breach notification laws.  Recent efforts in Connecticut and Vermont, and similar amendments made by other states last year, demonstrate a growing trend of enhancements to state data security legislation.

On May 5, 2012, Vermont approved a law that requires that notice of data security breaches be given to the Vermont AG.  Specifically, such a notice must include: (1) the date of the breach; (2) the date of discovery of the breach; (3) the number of Vermont consumers affected, if known; and (4) a copy of the notice provided to consumers.

On June 15, 2012, Connecticut replaced its security breach notification law.  The new law states that if a business is required to provide notice of a data security breach, the business also must notify the Connecticut AG.  While Vermont and Connecticut may be the most recent states to adopt AG breach notice requirements, they undoubtedly will not be the last.

On a practical level, it is important for businesses to keep state AG breach notice requirements in mind. If a business experiences a security incident that requires notice to consumers in one or more states, it also must consider whether those states require notice to the AG or another state entity.

Other State Privacy Law Efforts

Some state internet privacy laws push the envelope. For example, Facebook and MySpace already bar sex offenders from using their services, but Louisiana feared that the online companies wouldn’t be able to weed out all sex offenders.  A new Louisiana law requires sex offenders to state their criminal convictions on their social networking pages.  It also requires the offenders to disclose their addresses and describe their physical characteristics.  This requirement, scheduled to take effect in August of this year, is the first of its kind in the nation.

States have a legitimate interest in regulating privacy policies on the internet. It will be interesting to see how far the legislative interest extends. Will it reach the local level? When cable television was first introduced, city and local governments were aggressive in exercising their regulatory powers; it remains to be seen whether they will similarly advance some kind of dominion over the internet and, if they do, what form it will take. Regardless of whether they join the cause, privacy rights will continue to be an area rife with conflict and in need of uniformity.

Kevin Mills is an owner of the law firm of Kaye & Mills, where his practice focuses on advising clients on transactions across a full range of issues in entertainment, media, technology, Internet and general business. His practice encompasses copyright; trademark; trade dress; trade secret; brand protection; content creation, protection and distribution; and general corporate, organizational and business matters.


Filed under Privacy, Software, Technology