The University of South Carolina is offering licensing opportunities for a method for assessing disparate impact in internet markets.
Companies increasingly use computer algorithms to make automated business decisions (e.g., prices) and to personalize those decisions for each consumer. Algorithms, however, may unknowingly learn to discriminate against consumers on the basis of their gender, race, socioeconomic status, etc. Yet evaluating algorithms for such bias is difficult and risky, partly because collecting and holding demographic data can itself be a liability for businesses.
The proposed innovation is a method for collecting data that can be used to reliably assess the fairness of automated decisions made by computer algorithms in internet markets (e.g., service platforms such as Uber or online retailers such as walmart.com). The approach impersonates consumers to elicit and capture the behavior of these algorithms at scale, so that the data collected are large and rich enough to support reliable conclusions.
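The impersonation-audit idea can be illustrated with a minimal sketch. Everything here is an assumption for illustration: `quote_price` stands in for scripted consumer sessions against a platform's public-facing interface, the `zip_group` field is an invented demographic proxy, and the simple ratio of mean quoted prices is just one possible disparity measure, not the patented method itself.

```python
# Minimal sketch of an external fairness audit via consumer impersonation.
# All names, profile fields, and the pricing rule are illustrative assumptions.
from statistics import mean

def quote_price(profile):
    """Stand-in for a scripted browsing session that records the price the
    platform's algorithm quotes to a given simulated consumer profile.
    Here we fake an algorithm that charges one ZIP-code group 15% more."""
    base = 100.0
    return base * (1.15 if profile["zip_group"] == "A" else 1.0)

def audit_price_disparity(profiles_a, profiles_b):
    """Compare mean quoted prices across two demographic-proxy groups.
    A ratio far from 1.0 would flag a potential disparate impact."""
    mean_a = mean(quote_price(p) for p in profiles_a)
    mean_b = mean(quote_price(p) for p in profiles_b)
    return mean_b / mean_a

# Simulated consumer panels built by the auditor, not by the audited firm.
group_a = [{"zip_group": "A"} for _ in range(100)]
group_b = [{"zip_group": "B"} for _ in range(100)]
ratio = audit_price_disparity(group_a, group_b)
```

Because the probe profiles are synthetic and generated by the auditor, no real consumer demographic data ever needs to be collected or stored by the firm being evaluated.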
The proposed innovation can help businesses evaluate the fairness of their algorithms without requiring them to collect and store consumer demographic data. It could be used by large firms serving consumers directly, or by consulting companies serving smaller internet firms that cannot afford an in-house team of specialists but have enough revenue to pay for consulting services.
Advantages and Benefits:
Existing firms dedicated to algorithm testing are staffed by professionals with technical backgrounds (mathematicians, statisticians, computer scientists) who often lack the domain expertise to deeply understand the business contexts in which algorithms are embedded. They take on all kinds of projects, although current demand is strongest for hiring, housing, and credit decisions, as those are the areas currently subject to regulation. The proposed innovation instead focuses initially on consumer markets (e.g., online retailing) and embeds substantive knowledge of the retailing industry.
In addition, available solutions require internal data and full collaboration from the firms being audited, which imposes a burden: the firms must allocate internal resources to support the fairness assessments. In contrast, the proposed innovation uses public data rather than internal data, so the conclusions it generates may be both safer and more credible. Firms being evaluated need not allocate resources or worry about collecting and storing demographic data (which may create liabilities).