This invention relates generally to human resource applications, and more specifically to removing data from candidate profiles that may influence bias.
Many companies and organizations use various HR applications to identify candidates for open job positions. An example of a system that identifies candidates for job openings is described in U.S. patent application Ser. No. 16/121,401, filed on Sep. 4, 2018, and titled “System, Method, and Computer Program for Automatically Predicting the Job Candidates Most Likely to be Hired and Successful in a Job,” the contents of which are incorporated by reference herein. Such systems typically display a list of potential candidates and enable a user (e.g., an HR manager) at the organization to view profiles for the potential candidates. The profiles may be resumes, talent profiles, or enhanced talent profiles as described in the incorporated U.S. patent application Ser. No. 16/121,401.
In reviewing candidate profiles, conscious or unconscious biases may cause reviewers at the organization to overlook candidates who are well qualified for a job position. A bias may be of a type that is generally known to organizations, such as bias related to gender, race, or age. Reviewers may also have biases that are unknown to the organization. For example, a reviewer may make assumptions about a candidate based on a hobby listed on the resume: a reviewer looking for a salesperson with an outgoing, "people person" personality may assume that someone who plays chess is an introvert, which may or may not be true. In such a case, if the candidate otherwise meets the qualifications for the job, it would be better to interview the candidate than to dismiss the candidate based on his/her profile.
To reduce the likelihood of bias in the initial screening process, it would be helpful to remove data from a candidate's profile that may influence bias. Therefore, there is demand for an HR application that not only identifies potential candidates for open job positions, but also creates new profiles for those candidates that exclude data that may influence bias.
The present disclosure describes a system, method, and computer program for removing or replacing information in candidate profiles that may influence bias. The method is performed by a computer system that identifies potential candidates for open job positions and displays profiles for the identified candidates (“the system”).
In one embodiment, the system first creates or obtains a “full profile” for a candidate and then creates a “new profile” for the candidate that excludes or substitutes data in the full profile that may influence reviewer bias. The “new profiles” may be the profiles initially displayed to a user screening candidates for an open job position.
In one embodiment, the system excludes or substitutes the following types of information in creating the new profile: (1) data that personally identifies the candidate, such as the candidate's name, and (2) data that is indicative of gender, race, and/or age (or another defined class of bias). In one embodiment, a method for creating a profile for a job candidate that excludes data that may influence reviewer bias comprises the steps described in the detailed description below.
In a further embodiment, the system may also exclude any information in the new profile that is not relevant to the job role for which the candidate is applying. In such embodiments, the system determines what data is relevant for the job for which the candidate is applying and, for data that is relevant, at what level it is relevant (i.e., at the base value or an abstracted level). The system then removes data from the profile that is not relevant for the job role and, where applicable, abstracts some of the remaining data to levels that are relevant for the job role.
The present disclosure describes a system, method, and computer program for removing or replacing information in candidate profiles that may influence bias. The method is performed by a computer system that identifies potential candidates for open job positions and displays profiles for the identified candidates (“the system”). An example of the system is described in U.S. patent application Ser. No. 16/121,401 (incorporated herein above).
For each open job position in an organization, the system displays a list (typically ranked) of identified candidates for the job position. A user (e.g., an HR manager) can select any of the candidates and see a profile associated with that candidate. In one embodiment, the initial profile viewed by the user is a "new profile" created by the system from a "full profile" for the candidate. The "full profile" may be the candidate's resume, talent profile, or enhanced talent profile (e.g., the "enhanced talent profile" described in U.S. patent application Ser. No. 16/121,401, incorporated herein above). The new profile is based on the candidate's full profile, but excludes data that may influence bias. An organization may configure the system to enable the user to see the full profile at a later point in the interviewing/screening process. A method for creating the new profile is described below.
In creating the new profile, the system excludes personally identifiable information that is in the full profile, such as name and/or street address information (step 120). The system also excludes or replaces candidate data that may influence bias with respect to one or more defined classes of bias (e.g., gender, race, or age) (step 130). This step is described in more detail below.
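The disclosure does not include source code, but a minimal sketch of step 120 might look like the following, assuming that profiles are represented as dictionaries of key-value pairs and that the set of personally identifying keys is configured by the organization (both are assumptions for illustration, not details from the disclosure):

```python
# Hypothetical sketch of step 120. The key names are illustrative only.
PII_KEYS = {"name", "street_address", "email", "phone"}

def exclude_pii(full_profile: dict) -> dict:
    """Return a copy of the full profile with personally identifying keys removed."""
    return {k: v for k, v in full_profile.items() if k not in PII_KEYS}
```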
The system obtains candidate data for a set of training candidates, preferably from across a plurality of organizations and a variety of professions (step 220). The training candidate data includes the candidates' full profiles (e.g., a resume, talent profile, or enhanced talent profile) and data that enables each of the candidates to be classified with a class value (e.g., name or school graduation date). The system may obtain the training candidate data from a talent repository managed by the system (or an organization/company) or from public data sources that store job-candidate/employment profiles. The system classifies each of the training candidates with a class value (e.g., male or female) (step 230).
The system obtains key-value pairs from the full candidate profiles of the training candidates (step 240), and for each of a plurality of key-value pairs and combinations of key-value pairs, the system determines whether the key-value pair or combination of key-value pairs is indicative of a particular class value (step 250). In response to a key-value pair or a combination of key-value pairs being indicative of a particular class value, the system concludes that the key-value pair or combination of key-value pairs may influence bias with respect to the defined class (step 260). In creating a new profile for a candidate, the system removes or substitutes (with class-neutral data) the key-value pairs and combinations of key-value pairs identified as influencing bias with respect to the defined class (step 270). "Neutral" data that serves as a substitute for a key-value pair may be an abstracted form of the key-value pair. For example, a particular US college may be replaced with an abstracted value of the college, such as "4-year US college."
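As an illustration only, step 270 could be sketched as follows, assuming a profile is a dictionary of key-value pairs, `flagged` is the set of pairs found indicative of a class value in steps 250-260, and `neutral_substitutes` is a configured mapping from flagged pairs to abstracted, class-neutral replacements (all names are hypothetical):

```python
# Hypothetical sketch of step 270: build the new profile by removing or
# substituting the key-value pairs identified as bias-influencing.
def create_new_profile(full_profile, flagged, neutral_substitutes):
    new_profile = {}
    for key, value in full_profile.items():
        if (key, value) in flagged:
            substitute = neutral_substitutes.get((key, value))
            if substitute is not None:
                # e.g., a specific college -> "4-year US college"
                new_profile[key] = substitute
            # with no configured substitute, the pair is simply excluded
        else:
            new_profile[key] = value
    return new_profile
```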
The methods described below illustrate how the system identifies key-value pairs that may influence bias with respect to particular classes of bias, namely gender, race, and age.
For each of a plurality of key-value pairs and combinations of key-value pairs in the full profiles of the training candidates, the system maintains a count of the number of times the key-value pair (or the combination) appears for male candidates and the number of times it appears for female candidates (step 530), and determines whether the key-value pair or the combination (whichever is applicable) is associated with a particular gender for more than a threshold percentage (e.g., 80%) of candidates in the training set (step 540). If a key-value pair or a combination of key-value pairs is associated with a particular gender for more than the threshold percentage of candidates, the system concludes that the key-value pair or the combination of key-value pairs (whichever is applicable) is indicative of the class value and, thus, may influence gender bias (step 550).
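A minimal sketch of steps 530-550 follows, written generically over class values so the same counting logic also covers the age-range variant described below; the function name and data layout are assumptions for illustration:

```python
from collections import Counter, defaultdict

def find_class_indicative_pairs(profiles, class_values, threshold=0.8):
    """Flag key-value pairs that appear for one class value (e.g., one gender)
    in more than `threshold` of the training candidates exhibiting the pair.
    `profiles` is a list of dicts; `class_values` is the parallel list of labels.
    Combinations of pairs could be counted the same way, keyed by pair tuples."""
    counts = defaultdict(Counter)  # (key, value) -> count per class value
    for profile, cls in zip(profiles, class_values):
        for pair in profile.items():
            counts[pair][cls] += 1
    flagged = set()
    for pair, by_class in counts.items():
        # Steps 540-550: flag the pair if one class value dominates.
        if max(by_class.values()) / sum(by_class.values()) > threshold:
            flagged.add(pair)
    return flagged
```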
For each of the training candidates, the system creates an input vector for the training candidate with a plurality of key-value pairs and combinations of key-value pairs obtained from the training candidate's full profile (step 630). To train a neural network, the system inputs the vectors for each of the training candidates into the neural network, along with the candidate's race value (step 640). The result is a neural network that is trained to predict the probability that a key-value pair or a combination of key-value pairs is associated with a particular race value (step 650). For a key-value pair or a combination of key-value pairs having more than a threshold probability (e.g., 90%) of being associated with a particular race value, the system concludes that the key-value pair or the combination of key-value pairs (whichever is applicable) is indicative of a race value and, thus, may influence racial bias (step 660).
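The disclosure does not specify a network architecture or library. One way to read steps 630-660 is sketched below, using scikit-learn's `DictVectorizer` and `MLPClassifier` as stand-ins: each key-value pair becomes a one-hot input feature, and each pair is then probed in isolation to read off its predicted class probability (the probing strategy is an assumption):

```python
import numpy as np
from sklearn.feature_extraction import DictVectorizer
from sklearn.neural_network import MLPClassifier

def find_race_indicative_pairs(profiles, race_values, threshold=0.9):
    # Step 630: one input feature per key-value pair in the training profiles.
    vec = DictVectorizer()
    X = vec.fit_transform([{f"{k}={v}": 1 for k, v in p.items()} for p in profiles])
    # Step 640: train the network on the vectors and each candidate's race value.
    net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X, race_values)
    # Steps 650-660: probe each pair in isolation and flag pairs whose predicted
    # probability of some race value exceeds the threshold (e.g., 90%).
    flagged = []
    for i, feature in enumerate(vec.get_feature_names_out()):
        probe = np.zeros((1, X.shape[1]))
        probe[0, i] = 1.0
        if net.predict_proba(probe).max() > threshold:
            flagged.append(feature)
    return flagged
```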
For each of a plurality of key-value pairs and combinations of key-value pairs in the full profiles of the training candidates, the system maintains a count of the number of times the key-value pair (or the combination) appears for each of the age ranges (step 720), and determines whether the key-value pair or the combination (whichever is applicable) is associated with a particular age range for more than a threshold percentage (e.g., 80%) of candidates in the training set (step 730). If a key-value pair or a combination of key-value pairs is associated with a particular age range for more than the threshold percentage of candidates, the system concludes that the key-value pair or the combination of key-value pairs (whichever is applicable) is indicative of age and, thus, may influence age bias (step 740).
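Because this age-based method is structurally identical to the gender-based method above, the earlier `find_class_indicative_pairs` sketch can be reused with age-range labels in place of gender labels (again an illustration, not a detail from the disclosure):

```python
# `age_ranges` is a hypothetical list of labels, e.g., "25-34" or "35-44",
# parallel to `profiles`.
age_flagged = find_class_indicative_pairs(profiles, age_ranges, threshold=0.8)
```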
In certain embodiments, creating a new profile for a candidate also includes removing any data that is not relevant to the job role for which the candidate is applying. The system first identifies which keys in the candidate data are relevant for the job role; example methods for doing so are described further below.
For each of the relevant keys, the system identifies at what level the key matters most for the job role (step 820). In other words, for each of the relevant keys, the system identifies whether the actual value for the key matters most or whether an abstracted value for the key matters most. For example, for the "university" key, does the particular university attended by a candidate matter, or does it matter only whether the candidate attended a top 20% school?
In creating the new profile for the candidate, the system excludes any key-value pairs that are in the full candidate profile but are irrelevant for the job role (step 830). For each of the relevant keys for which an abstracted value matters most for the job role, the system determines whether the candidate's actual value for the key is encompassed by the relevant abstracted value (step 840). For example, if what matters most for the "university" key is whether a candidate went to a top 20% school, then the system determines whether the university attended by the candidate is a top 20% school. The system may use published or user-provided university rankings to make this determination.
If the candidate's actual value in his/her full profile is encompassed by the relevant abstracted value, the system replaces the key-value pair in the candidate's full profile with the relevant abstracted value in the new profile (step 850). For example, the system may replace “Massachusetts Institute of Technology” in a full profile with “top 20% of engineering schools” in the new profile. Otherwise, the system either excludes the key-value pair from the new profile or replaces the key-value pair with an abstracted value relevant to the candidate in the new profile, depending on how the system is configured (also step 850). For example, if the candidate attended a 4-year college that is not ranked in the top 20% of schools (according to the ranking(s) used by the system), then the system may not specify college information for the candidate or the system may replace the candidate's specific college with something like “US university.” Key-value pairs that are not abstracted or removed in accordance with step 850 remain in the new profile.
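Steps 830-850 could be sketched as follows; the representation of relevance levels and the configurable fallback behavior are assumptions chosen for illustration:

```python
# Hypothetical sketch of steps 830-850. `relevant` maps each job-relevant key
# to either the string "base" (the actual value matters most) or an abstraction
# function that returns the abstracted value when the candidate's value is
# encompassed by it, and None otherwise.
def tailor_profile_to_role(full_profile, relevant, fallback="exclude"):
    new_profile = {}
    for key, value in full_profile.items():
        if key not in relevant:
            continue  # step 830: irrelevant key-value pairs are excluded
        level = relevant[key]
        if level == "base":
            new_profile[key] = value  # the actual value matters most
            continue
        abstracted = level(value)  # step 840: is the value encompassed?
        if abstracted is not None:
            new_profile[key] = abstracted  # step 850: abstracted replacement
        elif fallback == "generalize":
            new_profile[key] = "US university"  # configuration-dependent substitute
        # else: the pair is excluded entirely, per system configuration
    return new_profile
```

For instance, the "university" abstraction might be `lambda u: "top 20% of engineering schools" if u in TOP_20_SCHOOLS else None`, where `TOP_20_SCHOOLS` is a hypothetical set built from whatever rankings the system is configured to use.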
If a candidate applies for multiple job positions at an organization, then the system may create a new profile for the candidate for each of the job roles, as what is relevant for one job role may not be relevant for another job role.
Turning to a first method for identifying the relevant keys and levels, the system obtains, for a set of training candidates, candidate data and the job roles held by those candidates, and trains a neural network to predict a candidate's job role from the candidate's data.
For each of a plurality of keys in the candidate data, the system computes, using the neural network, how well actual values for the keys and abstractions of values for the keys predict the job role (step 940). The “abstracted values” may be preconfigured by a system administrator or may be determined automatically by clustering values for keys into groups, where a group encompasses multiple values. The system may test multiple abstraction levels for a key.
If an actual value for a key or an abstraction of a key's value is a good predictor for the job role, the system concludes that the key is relevant (step 950). The level (i.e., the base level or an abstracted level) that is the best predictor of the job role is the level that matters most for the key. For example, for the "undergraduate college" key, if "top 20% school" is a better predictor than a particular university value (e.g., "Stanford"), then "top 20% school," which is an abstracted value, is the level that matters most for that key. Conversely, if for the "skills" key the value "java" is a better predictor for a job role than an abstracted value that encompasses a wider range of skills, then the base level (i.e., the actual value) for the key is the level that matters most for that key. If neither the actual values for a key nor abstractions of those values are good predictors, the system concludes that the key is not relevant for the job role (step 960).
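As a sketch only (the disclosure names neither a model nor a scoring metric), steps 940-960 might compare, for each key, how well each level predicts the job role under cross-validation; the threshold for a "good predictor" is an assumption:

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def best_level_for_key(profiles, job_roles, key, abstractions, min_score=0.7):
    """Score each level of `key` by how well it alone predicts the job role.
    `abstractions` maps a level name to a function value -> abstracted value.
    Returns the best-scoring level name, or None if the key is not relevant."""
    levels = {"base": lambda v: v, **abstractions}
    scores = {}
    for name, fn in levels.items():
        # Step 940: one-hot encode the (possibly abstracted) value for the key.
        X = DictVectorizer().fit_transform(
            [{key: str(fn(p.get(key)))} for p in profiles])
        net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
        scores[name] = cross_val_score(net, X, job_roles, cv=3).mean()
    best = max(scores, key=scores.get)
    # Steps 950-960: the key is relevant at the best level only if that level
    # predicts the job role well enough.
    return best if scores[best] >= min_score else None
```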
For each of a plurality of keys in the ideal candidate data, the system determines whether any actual value for the key or any abstraction of values for the key applies to at least a threshold percentage (e.g., 80%) of ideal candidates (step 1030). If no actual value or abstraction of values is applicable to at least the threshold percentage of ideal candidates, the system concludes that the key is not relevant for the job role (step 1040). If one or more actual values or abstracted values is applicable to at least the threshold percentage of ideal candidates, the system concludes that the key is relevant for the job role (step 1050). The lowest level applicable to the threshold percentage of ideal candidates is the level that matters most for the job role. For example, if both an abstracted value for a key and a particular actual value for the key apply to at least the threshold percentage of ideal candidates, the base level (actual value) for the key is the level that matters most for that key.
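Steps 1030-1050 lend themselves to a simple counting sketch; ordering the abstraction levels from least to most abstract is an assumption used here to realize the "lowest level" rule:

```python
from collections import Counter

def relevant_level_from_ideal_candidates(ideal_profiles, key, abstractions,
                                         threshold=0.8):
    """Return (level_name, value) for the least-abstracted level whose single
    most common value covers at least `threshold` of the ideal candidates, or
    None if the key is not relevant. `abstractions` is a list of
    (name, function) pairs ordered from least to most abstract."""
    n = len(ideal_profiles)
    levels = [("base", lambda v: v)] + list(abstractions)
    for name, fn in levels:  # least abstract first: the "lowest level" rule
        counts = Counter(fn(p.get(key)) for p in ideal_profiles)
        value, count = counts.most_common(1)[0]
        if value is not None and count / n >= threshold:
            return name, value  # step 1050: key relevant at this level
    return None  # step 1040: key not relevant for the job role
```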
The methods described herein are embodied in software and performed by a computer system (comprising one or more computing devices) executing the software. A person skilled in the art would understand that a computer system has one or more memory units, disks, or other physical, computer-readable storage media for storing software instructions, as well as one or more processors for executing the software instructions.
As stated above, an example of a computer system for performing the methods described herein is set forth in U.S. patent application Ser. No. 16/121,401 (incorporated herein above). In addition to the software modules described in U.S. patent application Ser. No. 16/121,401, the system may have a software module for performing the methods described herein.
As will be understood by those familiar with the art, the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the above disclosure is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
This application is a continuation of U.S. patent application Ser. No. 16/209,834 filed Dec. 4, 2018, and is a continuation-in-part of U.S. patent application Ser. No. 17/033,575 filed Sep. 25, 2020, which is a continuation of International Patent Application No. PCT/US2020/012317 filed Jan. 6, 2020. The contents of the above-mentioned applications are hereby incorporated by reference in their entireties.
Prior Publication Data:

Number | Date | Country
---|---|---
20210264373 A1 | Aug 2021 | US

Related U.S. Application Data:

Relation | Number | Date | Country
---|---|---|---
Parent | PCT/US2020/012317 | Jan 2020 | WO
Child | 17033575 | | US
Parent | 16209834 | Dec 2018 | US
Child | 17319524 | | US
Parent | 17033575 | Sep 2020 | US
Child | 17319524 | | US