    Wisconsin Lawyer
    Vol. 81, No. 3, March 2008

    The FTC's Web Site Privacy and Security Rules for Every Business

    Every business engaged in Internet commerce or using a Web site to collect personal information - not just those businesses subject to financial services industry regulations - must comply with Federal Trade Commission rules governing the use and protection of personal information. Failure to comply can be very costly.

    by Mark F. Foley

    The Federal Trade Commission (FTC) has authority under the Federal Trade Commission Act to bring enforcement actions1 to stop unfair and deceptive acts or practices.2 Through the filing, or the threat of filing, just 20 administrative and civil complaints, the FTC has used this power to establish minimum requirements for data privacy and security practices for the online world.3 This article explores the scope and content of these rules as they affect entities engaged in Internet commerce.

    Do What You Say

    The first lesson that emerges from the FTC cases is one that seems obvious to everyone except, apparently, online data collectors: Do what you say. Anything else is unfair and deceptive.

    In its first enforcement action involving online privacy practices, the FTC issued a draft administrative complaint against GeoCities, the operator of a Web site that hosted personal home pages and provided email addresses to registered adults and children. GeoCities' "New Member Application" required users to provide personal identifying information (name, address, gender, and age) and requested additional information about user interests. Applicants were asked to select from a list of special offer topics and to designate whether they wished to receive specific products or services from individual companies.

    GeoCities' published privacy policy promised that "[w]e will not share this information with anyone without your permission…."4 In truth, GeoCities sold, rented, or disclosed the collected personal identifying information to third parties who used it for purposes not approved by the data subjects.

    GeoCities capitulated in the face of the FTC's threats and resulting bad publicity, entering into a 20-year consent decree establishing what would become a familiar pattern in FTC enforcement cases.5 GeoCities agreed not to make any misrepresentation, expressly or by implication, about its collection or use of information from or about consumers. GeoCities agreed not to collect information from children if GeoCities had actual knowledge that a parent had not given permission to provide the information. GeoCities also agreed to provide a clear and prominent notice to consumers about its practices regarding the collection and use of personal identifying information, including:

    • what information is collected;
    • its intended uses;
    • third parties to whom it will be disclosed;
    • consumers' ability to access the information; and
    • consumers' ability to remove information from GeoCities' databases.

    The decree requires this information to appear on GeoCities' home page or a page accessible from a home page hyperlink and at each location on the Web site at which personal identifying information is collected. Finally, the decree requires GeoCities to establish a procedure for obtaining express parental consent before collecting and using personal identifying information from children.

    Mark F. Foley

    Mark F. Foley, Michigan 1981, is a partner at Foley & Lardner LLP, Milwaukee, practicing in the litigation and data privacy and security practice groups. He counsels domestic, foreign, and multinational companies on domestic and international data privacy and security compliance.

    The GeoCities case establishes that it is an unfair or deceptive trade practice to mislead consumers about online data privacy practices. The case also illustrates the FTC's special sensitivity to the collection and use of information about children and establishes a standard for minimum fair information privacy principles (FIPPs).

    The FTC repeated these themes in subsequent cases. Exactly three months after the GeoCities consent decree, the FTC reached a settlement of threatened charges against Liberty Financial Companies Inc.6 Liberty created Web pages directed at children. Through this Web site, known as "The Young Investor Measure Up Survey," Liberty collected information about allowances, financial gifts, spending, work habits, college plans, and family finances. The survey stated that "all of your answers will be totally anonymous."7 The children's answers were merged with contact information for a promised newsletter and quarterly prize drawings, but no newsletter was ever created and no prizes were awarded. The FTC's core complaint, as in GeoCities, was that the Web site operator had not done what it promised to do. The resulting 20-year consent decree prohibited future misrepresentations and required Liberty's compliance with the GeoCities FIPPs.

    Similarly, the FTC sued to prevent the bankruptcy trustee of online retailer Toysmart.com from selling a customer contact list despite the company's express promise that personal information collected through its Web site "is never shared with a third party … [and] is used only to personalize your experience online."8 In fact, every FTC privacy case involves an allegation that the target company failed to do what it expressly or impliedly promised.

    Say What You Do

    A second lesson from the FTC enforcement cases is that it is not enough that a company do what it says; it also must say what it does in a clear and conspicuous way. In two related cases, Educational Research Center9 and National Research Center,10 the FTC complained about data uses that went beyond what the Web site operator had disclosed. Both entities collected information from students, representing that it would be tabulated into a report used by colleges and universities to "keep in touch with the interests and trends among today's high school students" and to "make funding available for students' post-secondary education."11 Although the information was shared with such educational institutions, it also was shared with commercial entities for marketing purposes. The FTC alleged that the failure to include complete information about how data would be used constituted an unfair and deceptive trade practice.12

    A new permutation of this "say what you do" principle appeared in Cartmanager International.13 Cartmanager provided shopping cart software and related services to thousands of online retail merchants. The software generated customized shopping cart and checkout Web pages for use on merchants' Web sites. These pages resided on Cartmanager's Web site, but they were designed to look like the other pages on the merchant's site and typically displayed the merchant's name and logo. Information collected through the Cartmanager software, including customers' names, billing and shipping addresses, phone numbers, email addresses, credit card information, and merchandise ordered, was transmitted to Cartmanager, which then notified the merchant so it could fulfill the customers' orders.14

    Some of the merchants had published privacy policies promising not to share personal information with third parties. But in January 2003 Cartmanager began renting to third parties for marketing purposes consumers' personal information that it collected through shopping cart and checkout pages. The FTC alleged that this constituted an unfair and deceptive practice because Cartmanager's pages appeared to be part of the merchants' individual pages, and consumers were not notified that different privacy policies applied to information provided through the sales and checkout pages. The FTC also complained that Cartmanager failed to disclose to the merchants its intention to share such information. Although Cartmanager's software license agreement provided that "Cartmanager shall retain full ownership of all data submitted by either Merchant or Purchaser," this was "buried in the middle of the online agreement and does not explain how [Cartmanager] intends to use the information or that such use may conflict with the merchants' privacy policies."15

    Have Reasonable and Appropriate Security Practices

    A third lesson established by the FTC cases is that strong privacy practices are not enough; a business also must have security practices that are reasonable and appropriate to the nature of the data. In early 2000 the FTC filed a lawsuit against ReverseAuction.com16 alleging that the company had become an eBay user in order to obtain other people's eBay user IDs, email addresses, and feedback ratings in violation of eBay's terms and conditions of use. ReverseAuction.com then sent email to the other eBay users suggesting that their eBay membership IDs would expire if the user did not update his or her information. ReverseAuction, in a precursor to today's phishing activities, did this in order to get eBay users to provide personal identifying information to ReverseAuction, which used the data for its own purposes. Once again, the FTC demanded that the company cease the deceptive practices, divest itself of its ill-gotten information, and promise to adopt the same FIPPs expressed in GeoCities and Liberty.

    Even though no security breach was involved in ReverseAuction's unfair practices, the FTC added a requirement that the company disclose "the steps defendant has taken to ensure the security of the information collected and/or maintained at the site." This was the first indication that the FTC would require security mechanisms for Web site operators not covered by substantive legislation such as the Gramm-Leach-Bliley Act (GLBA) or the Fair Credit Reporting Act (FCRA).17

    Having already established that Web site operators had to disclose their practices, the FTC took the next logical step by adding to its list of prohibited practices the making of misleading express or implied statements about Web site security. In the Microsoft case, the FTC's complaint alleged that the company had represented "expressly or by implication, that it maintained a high level of online security by employing sufficient measures reasonable and appropriate under the circumstances to maintain and protect the privacy and confidentiality of personal information obtained from or about consumers in connection with the Passport and Passport Wallet services."18 Specifically, Microsoft had said that ".NET Passport achieves a high level of Web Security by using technologies and systems designed to prevent unauthorized access to your personal information … is protected by powerful online security technology and … is stored on secure … servers … in controlled facilities."19 The FTC complained that Microsoft did not fulfill these express promises.

    The FTC complaint about what Microsoft had failed to do creates, by implication, a list of what the FTC thinks a company must do to have adequate security policies, even when the Web site operator is not covered by specific legislative or regulatory requirements:

    "[R]espondent failed to implement and document procedures that were reasonable and appropriate to: (1) prevent possible unauthorized access to the Passport system; (2) detect possible unauthorized access to the Passport system; (3) monitor the Passport system for potential vulnerabilities; and (4) record and retain system information sufficient to perform security audits and investigations."20

    In its next administrative proceeding, Guess?, Inc.,21 the FTC revealed its thinking about the substantive content of a reasonable and appropriate security policy. Guess? sold its clothing and accessories through various outlets, including its Guess.com Web site. To make purchases on the Web site, consumers were required to use a credit or debit card and to divulge their names, addresses, and credit or debit card numbers and expiration dates. The company stored this information in databases that supported or were connected to the Web site. Guess.com's privacy policy said:

    "This site has security measures in place to protect the loss, misuse and alteration of the information under our control. All orders are transmitted over secure Internet connections using SSL (Secure Sockets Layer) encryption technology. All of your personal information including your credit card information and sign-in password are stored in an unreadable, encrypted format at all times. This Website and more importantly all user information, is further protected by a multi-layer firewall based security system."22

    In fact, the company did not encrypt stored data. Guess.com's software was designed to automatically present in readable text any information retrieved from or supplied to the databases.23 Thus, the databases were vulnerable to the use of a structured query language (SQL) injection string. By inserting an SQL query into the URL address bar of a standard browser, an unauthorized individual could retrieve any data held in the Web-connected databases.
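
    The mechanics of such an attack are easy to demonstrate. The sketch below is a hypothetical illustration, not Guess.com's actual code or database schema, of why splicing user input into SQL text is dangerous and why parameterized queries are the standard defense; Python and SQLite merely stand in for whatever technology the site actually used.

        # Hypothetical illustration of an SQL injection flaw and its standard fix;
        # the table, columns, and sample values are invented for the example.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE customers (name TEXT, card_number TEXT)")
        conn.execute("INSERT INTO customers VALUES ('A. Buyer', '4111111111111111')")

        user_input = "x' OR '1'='1"   # attacker-supplied value typed into a URL or form

        # Vulnerable: the value is spliced into the SQL text, so the injected OR
        # clause becomes part of the query and every row is returned.
        rows = conn.execute(
            "SELECT * FROM customers WHERE name = '" + user_input + "'").fetchall()
        print(len(rows))   # 1 -- the entire (one-row) table, card number included

        # Safer: the driver passes the value separately, so it is treated as data only.
        rows = conn.execute(
            "SELECT * FROM customers WHERE name = ?", (user_input,)).fetchall()
        print(len(rows))   # 0 -- no customer is literally named "x' OR '1'='1"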

    The FTC complaint alleged that to avoid violating the Federal Trade Commission Act, Web site operators collecting personal identifying information had to implement a security policy that would include procedures "reasonable and appropriate to: (1) detect reasonably foreseeable vulnerabilities of their Website and application and (2) prevent visitors to the Website from exploiting such vulnerabilities and gaining access to sensitive information."24

    Guess?'s 20-year consent decree requires adoption of a security program having:

    "[A]dministrative, technical, and physical safeguards appropriate to Respondents' size and complexity, the nature and scope of Respondents' activities, and the sensitivity of the personal information collected from or about consumers, including:

    "A. the designation of an employee or employees to coordinate and be accountable for the information security program.

    "B. the identification of material internal and external risks to the security, confidentiality, and integrity of personal information … and assessment of the sufficiency of any safeguards in place to control these risks …

    "C. the design and implementation of reasonable safeguards to control the risks identified … and regular testing or monitoring of the effectiveness of the safeguards' key controls, systems, and procedures …

    "[and] that Respondents obtain an assessment and report from a qualified, objective, independent third-party professional, [to examine, assess, and certify] that Respondents' security program is operating with sufficient effectiveness to provide reasonable assurance that the security, confidentiality, and integrity of personal information is protected…." 25

    The FTC added in a later case that such security assessments must be completed by a person "qualified as a Certified Information System Security Professional (CISSP); … a Certified Information Systems Auditor (CISA); a person holding Global Information Assurance Certification (GIAC) …, or a similarly qualified person or organization approved by the Associate Director for Enforcement."26

    In subsequent cases, the FTC expanded its definition of what constitutes reasonable and appropriate security. In Tower Records,27 the FTC took the position that companies must implement fixes for "widely known" security threats and must implement appropriate change controls to ensure that existing privacy and security practices are continued. In Cardsystems, the FTC added requirements that "(i) companies should not store sensitive information for unnecessarily long periods of time or in a vulnerable (i.e., unencrypted) format, (ii) must use strong passwords to prevent a hacker from gaining control over computers and access to personal information stored on a network, (iii) must use readily available security measures to limit access between computers on its network and with the internet; and (iv) must employ sufficient measures to detect unauthorized access to personal information or to conduct security investigations."28
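
    Item (i) in the Cardsystems list, not keeping sensitive data in a vulnerable, unencrypted format, can be illustrated with a short sketch. The example below assumes the widely used third-party Python "cryptography" package; the protect and recover helpers are hypothetical, and real key management (where the key lives and who may use it) is deliberately left out of the sketch.

        # A minimal sketch of storing a card number encrypted rather than in clear
        # text, assuming the third-party "cryptography" package is installed.
        # Key management (where KEY actually lives) is outside this sketch and is
        # the hard part in practice; keeping the key beside the data defeats the purpose.
        from cryptography.fernet import Fernet

        KEY = Fernet.generate_key()   # in practice, held in a key store, not with the data
        fernet = Fernet(KEY)

        def protect(card_number: str) -> bytes:
            """Return the ciphertext that would be written to the database."""
            return fernet.encrypt(card_number.encode())

        def recover(token: bytes) -> str:
            """Decrypt only when a legitimate business purpose requires the clear value."""
            return fernet.decrypt(token).decode()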

    The FTC's imposition on companies of a duty to implement reasonable and appropriate data security practices stems from the agency's work under the GLBA. Pursuant to the GLBA, the FTC and several other federal agencies overseeing the financial services industry issued identical regulations titled "Guidelines Establishing Standards for Safeguarding Customer Information." According to these guidelines, later adopted by the FTC as its GLBA Safeguards Rule in 2002, "security is more a process than a state."29 The Department of Health and Human Services adopted the same approach in the HIPAA Security Standards for health care information.30 The FTC has taken these process-oriented, fact-driven standards, which were created under industry-specific regulations, and established them as a general standard for data security.

    Training and Oversight Are Required

    In the Eli Lilly case, the FTC taught that merely having a suitable privacy policy is not sufficient; companies must take appropriate steps to implement their policies.31

    The FTC complained that Eli Lilly had inadvertently disclosed personal identifying information about users of an antidepressant drug, Prozac, by sending an email with every user's address in the "To" box. This made all the email addresses viewable by all the recipients and therefore arguably disclosed the addressees' use of the drug. The agency complained that this error had occurred as a result of inadequate training and oversight of the personnel who sent the email, and the FTC required the company to improve training and supervision. Having the right policy was not enough; the company also had to take reasonable steps to make sure the policy was properly implemented.
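
    The fix for an error like Lilly's is simple, which underscores the FTC's point that the failure was one of training and oversight rather than technology. The sketch below, using hypothetical relay, sender, and recipient values, shows one conventional way to send a bulk notice so that no recipient can see any other recipient's address.

        # A minimal sketch (hypothetical relay, sender, and recipients) of sending
        # a bulk notice without exposing subscribers to one another: each address
        # appears only in its own individually addressed message, never in a
        # shared "To" header visible to every recipient.
        import smtplib
        from email.message import EmailMessage

        def send_notice(subject: str, body: str, recipients: list[str]) -> None:
            with smtplib.SMTP("mail.example.com") as server:      # hypothetical relay
                for addr in recipients:
                    msg = EmailMessage()
                    msg["From"] = "notices@example.com"           # hypothetical sender
                    msg["To"] = addr                              # only this recipient is visible
                    msg["Subject"] = subject
                    msg.set_content(body)
                    server.send_message(msg)                      # one message per recipient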

    Don't Change the Rules Retroactively

    The fifth lesson is that a company cannot retroactively change the rules of the privacy and security game to the detriment of consumers. In Gateway,32 the FTC objected to the "Hooked-on-Phonics" company's use of personal identifying information collected from parents in violation of previously published privacy policies. Gateway had said that it would not sell, rent, or loan personally identifiable information to any third party without receiving the customer's explicit consent.33 Those same policies informed users that the policy might change in the future, but promised that Gateway would notify consumers of such changes "on this Site or by e-mail. You will then be able to opt-out of this information usage by sending an email."34

    In April 2003, Gateway began renting personal information provided by consumers on the Gateway Learning Web site without seeking or receiving consumers' consent. On June 20, 2003, Gateway posted on its Web site a new privacy policy that contained a revised statement permitting the sharing of personal information with third parties and requiring consumers to write to Gateway to object if they wished to opt out of the new policy. Gateway later made additional changes and added "updated July 17, 2003" to its privacy policy. But Gateway took no additional steps to alert customers that it had changed its policy to permit third-party sharing of personal information without explicit consent.

    The FTC complained that the retroactive application of privacy policy changes caused or is likely to cause substantial injury to consumers. The FTC said that Gateway should have provided additional notice that its policy had materially changed and what aspects of the policy had changed.35 The resultant 20-year consent decree prohibits Gateway from applying material changes in its privacy policy to information collected before the posting and notification of the new policy, unless Gateway obtains the express affirmative (opt-in) consent of the affected consumers.36

    The High Cost of Noncompliance

    As the cases discussed above demonstrate, the FTC commonly resolves complaints by requiring a consent decree describing in detail specific steps the target company must take, subject to agency oversight, typically for a 20-year period.

    If that is not enough by itself to encourage compliance, the agency demonstrated in ChoicePoint37 just how aggressive it can be in seeking to rectify unfair and deceptive practices. ChoicePoint collected information from consumer reporting agencies and public sources, not the consumers themselves. ChoicePoint sold compilations of this information to fee-paying subscribers, qualifying certain of ChoicePoint's subsidiaries as "consumer reporting agencies" under the FCRA.38 To become a subscriber, a business had to submit an application that included information and documentation to establish that the applicant is a legitimate business with a lawful purpose for purchasing consumer data.

    In early 2005 ChoicePoint discovered that it may have disclosed the personal information of 163,000 consumers to persons who did not have a lawful purpose for acquiring the data. The information disclosed included birth dates, Social Security numbers, and, in many cases, credit reports. At least 800 cases of identity theft arose out of these disclosures.

    According to the FTC complaint, this disclosure occurred because ChoicePoint had failed to implement reasonable procedures to verify or authenticate the identities and qualifications of prospective subscribers39 and had failed to monitor unauthorized activity by subscribers, even though law enforcement subpoenas alerting it to fraudulent accounts, together with its own experience with a subscriber, should have raised doubts about the legitimacy of that subscriber's business.40

    The FTC and ChoicePoint stipulated to entry of a civil judgment imposing what had become the FTC's standard 20-year consent decree oversight terms. The judgment also required ChoicePoint to pay a $10 million civil penalty and to deposit $5 million into a fund administered by the FTC for equitable relief, including consumer redress. The court ordered the company to adopt specific internal procedures for investigating subscribers and a comprehensive information security program, fully documented in writing. As part of this program, the company had to designate an employee to coordinate and be held accountable for the information security program; identify the material internal and external risks to security, confidentiality, and integrity of personal information that could result in unauthorized disclosures, misuse, loss, alteration, destruction, or other compromise of such information; and design and implement reasonable safeguards to control the risks through assessment and regular testing. ChoicePoint also reportedly spent $9 million in legal and technical fees as a result of the breach and FTC action and suffered significant declines in its stock price. These costs and fines should be large enough to get a business's attention.

    The nature of ChoicePoint's deficiencies also is instructive. This was not a case of a sophisticated hacker penetrating technical defenses, but of plain old con artists using simple, sloppy tricks easily detected by anyone paying attention. ChoicePoint's lapse was not so much in failing to have privacy and security policies in place as in failing to administer them in a diligent way.41

    Finally, it also is noteworthy that the FTC raised these issues and imposed these sanctions both under the FCRA regulations applicable to consumer reporting agencies and pursuant to its general powers to prohibit unfair and deceptive trade practices. That is, the agency has made clear that it believes all companies should adopt security practices like those required under financial industry regulations, even if those regulations do not specifically apply.

    Conclusion

    The FTC's enforcement actions establish important lessons for every company collecting or using personal identifying information. While the FTC has not established specific minimum substantive content for privacy policies, it has established procedural minimums. A company must tell data subjects what information it is collecting about them and how it is going to use the information. A company must do what it says, not just in theory, but in practice. It is not enough to have a published privacy and security policy; a company also must provide appropriate training and oversight to make policy implementation a reality, and it must not apply to data a less restrictive usage policy if the data was collected under a more restrictive policy.

    The FTC cases and recently published guidelines also establish specific minimum content for security policies. Every company should:

    • know what information it has in its files and on its computers;
    • keep only the information it needs for a specific, legitimate business purpose;
    • use strong passwords and controls to prevent unauthorized access to systems, data, and communications (a brief sketch follows this list);
    • establish technical and nontechnical methods to detect unauthorized access, use, or alteration of data;
    • record and retain system information sufficient to perform security audits and investigations;
    • store sensitive data only for so long as it is needed;
    • encrypt sensitive data when stored or transmitted;
    • establish personal responsibility for data security;
    • perform risk and vulnerability assessments and make adjustments based on the results;
    • test and monitor the effectiveness of the safeguards' key controls, systems, and procedures;
    • promptly apply industry-recognized procedures and fixes;
    • document the security system in writing;
    • use qualified, credentialed, independent third parties to assess and test its systems; and
    • develop plans for responding to security incidents if they occur.42
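
    As one illustration of the "strong passwords and controls" item, the sketch below shows the conventional practice of storing only salted, slow hashes of account passwords, so that a compromised database does not directly expose credentials; the cost parameters and helper names are illustrative rather than a tuned recommendation.

        # A minimal sketch of one access control from the list above: account
        # passwords are stored only as salted scrypt hashes, never in clear text.
        # The scrypt cost parameters are illustrative, not a tuned recommendation.
        import hashlib
        import hmac
        import os

        def hash_password(password: str) -> tuple[bytes, bytes]:
            salt = os.urandom(16)
            digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
            return salt, digest          # store both; the clear password is discarded

        def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
            candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
            return hmac.compare_digest(candidate, digest)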

    Most important, the FTC has established the requirements that privacy and security policies must be based on the sensitivity of the data at issue, and that such policies and practices must evolve continually in light of the ever-changing nature of the threats. That is, security is a process, not a state or destination.

    The final lesson is that all companies must be aware of these rules, not just those companies specifically subject to detailed financial services industry regulations. Failure to comply with the FTC's data privacy and security rules can lead to very costly lessons.

    Endnotes

    1The FTC may bring suit in federal court to obtain a temporary restraining order or preliminary injunction pending the commencement of an administrative proceeding. The FTC also may file an administrative complaint. See 15 U.S.C. § 53(b). In most of the FTC privacy and security matters the mere threat of a formal complaint has led to the negotiation of a consent decree with the target company. These decrees typically impose 20-year-long requirements that companies engage in particular practices, avoid specified practices, or obtain third-party oversight of their activities.

    2See 15 U.S.C. § 45(a)(2).

    3The FTC also has authority to issue privacy and security regulations under the Gramm-Leach-Bliley Act (GLBA) (see 15 U.S.C. §§ 6821-6827 and implementing regulations at 16 C.F.R. parts 313-314) and the Children's Online Privacy Protection Act (COPPA) (see 15 U.S.C. §§ 6501-6508 and implementing regulations at 16 C.F.R. part 312). Those rules apply only to persons engaged in specific types of financial services or to Web sites that collect information from or about children, respectively. Detailed analysis of these regulations is beyond the scope of this article.

    4See In re GeoCities, Docket No. C-3850, Compl. at ¶ 12, Ex. A at 3. All the FTC complaints and consent decrees referenced in this article are available at http://www.ftc.gov.

    5Contrary to what has happened since, FTC Commissioner Orson Swindle said in his concurring statement at the time of the GeoCities consent decree: "I have voted in favor of final issuance of the consent order in this matter because its provisions are appropriate to remedy the alleged violations of the law by GeoCities, Inc. However, I want to emphasize that my support for these provisions as a remedy for alleged law violations in this particular case does not necessarily mean that I would support imposing these requirements on other commercial Internet sites through either legislation or regulation."

    6In re Liberty Fin. Cos., Docket No. C-3891.

    7See In re Liberty Fin., Compl. at ¶ 4, Ex. A. at 2.

    8See FTC v. Toysmart.com L.L.C., No. 00 CV 11341 RGS, Compl. for permanent inj. and other equitable relief, at ¶ 9, Ex. 1 at 1 (D. Mass. July 7, 2000).

    9In re Educational Research Ctr. of Am., Inc., No. C-4079 (May 2003).

    10In re National Research Ctr. for College & University Admissions Inc., File No. 022 3005 (2002).

    11See In re Educational Research Ctr., Compl. at 3, Ex. A at 4.

    12Id. Compl. at 3-4.

    13In re Vision I Props. L.L.C., File No. 042 3068, Compl. (Mar. 2005).

    14Id. at 1-2.

    15Id. at 3.

    16FTC v. ReverseAuction.com Inc., stipulated consent agreement and final order, Privacy Notice VI, at F (D.D.C. 2000).

    17See 15 U.S.C. § 1681a(f). The ReverseAuction.com consent decree coincided with the FTC's publication of its proposed data security standard under the GLBA. See Standards for Safeguarding Customer Information, 67 Fed. Reg. 36,484 (May 23, 2002). The regulations eventually adopted by the FTC to implement the GLBA's data security requirements contain many of the same concepts and provisions found in the ReverseAuction.com consent decree. See 16 C.F.R. pt. 314.

    18In re Microsoft Corp., Docket No. C-4069, Compl. at 2 (Dec. 24, 2002).

    19Id. at 1-2.

    20Id. at 2.

    21See In re Guess?, Inc., Docket No. C-4091, Compl. at 2 (July 2003).

    22Id. at 2.

    23Id. at 1-2.

    24Id., agreement containing consent order, at 4.

    25See In re Guess?, Inc., Docket No. C-4091, commission decision and order, at 3-4 (July 30, 2003). The FTC made similar allegations on similar facts with similar results in another case 18 months later and again in 2006. See In re PETCO Animal Supplies, Inc., Docket No. C-4133, Compl. (undated in 2004) and commission decision and order (Mar. 4, 2005), and again in late 2005; see also In re Cardsystems Solutions Inc., File No. 052 3148, draft Compl. and consent decree (posted Jan. 2006).

    26See In re DSW Inc., Docket No. C-4157, decision and order at 4 (2006).

    27See In re MTS Inc., Docket No. C-4110, Compl. (May 2004).

    28See In re Cardsystems Solutions Inc., File No. 052-3148, Compl. at 2; see also In re DSW Inc., Docket No. C-4157, Compl. at 2. Lest anyone think that only technically challenged companies fail to satisfy these requirements, the FTC entered into a similar consent decree with a company that "sells software and related training, materials, and services that customers use [to] investigate and respond to computer breaches and other security incidents." See In re Guidance Software Inc., File No. 062 3057, Compl. and consent decree (2006).

    29See Final Report of the FTC Advisory Committee on Online Access and Security 26 (May 15, 2000), available at www.ftc.gov.

    3045 C.F.R. pt. 164.

    31In re Eli Lilly & Co., Docket No. C-4047 (May 2002).

    32See In re Gateway Learning Corp., Docket No. C-4120, Compl. (Sept. 2004).

    33Id., Compl. at 2, Ex. A at 2.

    34Id., Ex. A at 4.

    35Id., Ex. A at 5.

    36See In re Gateway Learning Corp., Docket No. C-4120, decision and order (Sept. 2003).

    37United States v. ChoicePoint Inc., No. 06-CV-0198, Compl. (N.D. Ga. 2006).

    38See complaint, id. at 3. See also 15 U.S.C. § 1681a(f).

    39See complaint, id. at 4-5.

    40See complaint, id. at 5-7. The FTC also alleged that ChoicePoint had adopted and published various privacy principles that created the false impression it had implemented effective privacy and security practices. However, it is hard to understand how these representations were material, because they were not made to consumers, and consumers were not allowed to modify or remove their data from ChoicePoint's databases.

    41According to the FTC, ChoicePoint furnished to a purported apartment leasing subscriber, over a short period of time, consumer reports that substantially exceeded in number the total number of rental units stated in the subscriber's application. ChoicePoint continued to furnish consumer reports to a subscriber whose telephone had been disconnected, whose address was incorrect, and whose credit card number used for payment was in the name of an individual not associated with the subscriber's ChoicePoint account. The company continued to provide reports to subscribers who made multiple changes of address over a short period of time and paid ChoicePoint using commercial money orders drawn on multiple issuers. ChoicePoint also allegedly accepted untrustworthy documents as verification of application information, including documents that contained self contradictory information, documents indicating that the subscriber was suspended or inactive, documents inconsistent with an applicant's stated type of business, applications transmitted by fax from public commercial locations, applications from putatively separate businesses sent from the same fax numbers, and applications from subscribers that were linked by ChoicePoint's own internal reports to possible fraud associated with the Social Security number of another individual.

    42In its recently published guidelines, the FTC summarizes these recommendations as "1. Take stock. 2. Scale down. 3. Lock it. 4. Pitch it. 5. Plan ahead." See Protecting Personal Information: A Guide for Business (Federal Trade Commission, October 2007), available at www.ftc.gov.

