The Green Sheet, Inc


Table of Contents

Lead Story

EMV, four months on

Patti Murphy

Thicken that skin


Industry Update

FTC takes on big data

U.S. adds six Russian banks to OFAC banned list

MasterPass joins Wal-Mart payment mix

New DOT standards reach airport kiosks


The automated ISO

Phablet popularity soars this holiday season

Felix Richter
Statista Inc.

Smartphone-driven commerce


Choice not chance

Dale S. Laszig
DSL Direct LLC

Will we be Uberized?

Ken Musante
Eureka Payments LLC


Street Smarts℠:
Facts and figures of the MLS

Jeffrey I. Shavitz
TrafficJamming LLC

The M&A market 2016: 10 things to know to best position your business

Adam Hark

Real capabilities of tokenization in mobile payments

David Poole

Termination: The end or a new beginning?

Adam Atlas
Attorney at Law

The high-risk merchant services opportunity

Matt O'Shea
National Bank Services

The time is right for second generation P2PE

Ruston Miles
Bluefin Payment Systems LLC

Company Profile


New Products

Holistic approach to cybersecurity

Next Generation Security Assessment Services
Redhawk Network Security LLC

Future-proof, obsolescence-free POS

CardWare International Inc.


Letter From the Editors

Readers Speak: Much ado about faster payments

Boost Your Biz: Earn respect with your website

ISOMetrics: Online retailer status update

GS Book Notes: Powerful presence, powerful stories

Resource Guide



The Green Sheet Online Edition

January 25, 2016  •  Issue 16:01:02


FTC takes on big data

A 50-page report released Jan. 6, 2016, by the Federal Trade Commission challenges business owners and consumers to evaluate the benefits and risks of data analytics. Big Data: A Tool for Inclusion or Exclusion? poses tough questions about how data is collected and analyzed and where it is stored at the end of its useful life. The report stated that big data's lifecycle typically spans four stages: collection, compilation and consolidation, analysis, and use.

"Big data's role is growing in nearly every area of business, affecting millions of consumers in concrete ways," said FTC Chairwoman Edith Ramirez. "The potential benefits to consumers are significant, but businesses must ensure that their big data use does not lead to harmful exclusion or discrimination."

Public forum

The issues covered in the FTC report were initially explored in a workshop of the same title held Sept. 15, 2014, and have remained at the forefront of FTC efforts to enforce best practices in big data usage in conformance with the Fair Credit Reporting Act.

Topics covered in the 2014 workshop and subsequent 2016 report included how organizations use big data to categorize consumers.

Big data benefits, risks

The FTC solicited opinions from the public pertaining to data analytic trends in business and private sectors, particularly in the areas of health care, education and credit scoring. The commission also cited numerous ways in which big data can help underserved communities improve access to services in health care, education, employment, and alternative forms of credit and nonbank financing.

The report also took a cautionary stance regarding how inaccurate profiles and biases used in data analytics and credit reporting can marginalize individuals and groups. These outcomes can range from denial of credit and loss of privacy to cybercriminals targeting vulnerable consumers.

The FTC urges business owners to adhere to the Fair Credit Reporting Act, the FTC Act and equal opportunity laws as they govern the use of big data. The report provides guidance on how to assess levels of compliance with these laws.

Four questions

The following four policy questions cited in the report are designed to help companies examine potential biases and determine their level of compliance with legal and ethical guidelines related to big data usage:

  1. How representative is your data set? Companies should consider whether their data sets are missing information about certain populations and take steps to address issues of underrepresentation and overrepresentation. For example, if a company targets services to consumers who communicate through an application or social media, it may be neglecting populations that are not as tech-savvy.
  2. Does your data model account for biases? Companies should consider whether biases are being incorporated at both the collection and analytics stages of big data's life cycle, and where biases exist, develop strategies to overcome them. For example, if a company uses a big data algorithm that only considers applicants from top-tier colleges to help it make hiring decisions, it may be incorporating previous biases in college admission decisions.
  3. How accurate are your predictions based on big data? Companies should remember that while big data is very good at detecting correlations, it does not explain which correlations are meaningful. A prime example that demonstrates the limitations of big data analytics is Google Flu Trends, a machine-learning algorithm for predicting the number of flu cases based on Google search terms.

    While the algorithm at first appeared to produce accurate predictions of where the flu was most prevalent, its estimates grew highly inaccurate over time. This could be because the algorithm failed to take certain variables into account. For example, it may not have accounted for the fact that people would be more likely to search for flu-related terms if the local news ran a story on a flu outbreak, even if the outbreak occurred halfway around the world.

  4. Does your reliance on big data raise ethical or fairness concerns? Companies should assess the factors that go into an analytics model and balance the predictive value of the model with fairness considerations. For example, one company determined that employees who live closer to their jobs stay at these jobs longer than those who live farther away. However, another company decided to exclude this factor from its hiring algorithm because of concerns about racial discrimination, particularly since different neighborhoods can have different racial compositions.
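The representativeness check in question 1 can be sketched in code. The sketch below is illustrative only and not from the FTC report: it compares a sample's group shares against reference-population shares and flags groups whose representation differs by more than a threshold. The group names and the 10 percent threshold are assumptions chosen for the example.

```python
# Hypothetical representativeness check: flag groups whose share of a
# sample differs from a reference population by more than `threshold`.
# Group labels and the 0.10 threshold are illustrative assumptions.

from collections import Counter

def representation_gaps(sample_groups, reference_shares, threshold=0.10):
    """Return {group: sample_share - reference_share} for every group
    whose absolute gap exceeds `threshold`."""
    counts = Counter(sample_groups)
    total = len(sample_groups)
    gaps = {}
    for group, ref_share in reference_shares.items():
        sample_share = counts.get(group, 0) / total
        gap = sample_share - ref_share
        if abs(gap) > threshold:
            gaps[group] = round(gap, 3)
    return gaps

# Example: a sample drawn from app users skews toward the tech-savvy
# relative to assumed population shares, echoing the report's example.
sample = ["app_user"] * 80 + ["non_app_user"] * 20
reference = {"app_user": 0.55, "non_app_user": 0.45}
print(representation_gaps(sample, reference))
# → {'app_user': 0.25, 'non_app_user': -0.25}
```

A positive gap marks overrepresentation and a negative gap underrepresentation; a real audit would use demographic attributes and census or customer-base shares in place of these toy values.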

Notice to readers: These are archived articles. Contact names or information may be out of date. We regret any inconvenience.

