

Podcast: Can AI fix your credit?




Credit scores have been used for decades to assess consumer creditworthiness, but their scope is far greater now that they are powered by algorithms. Not only do they consider vastly more data, in both volume and type, but they increasingly affect whether you can buy a car, rent an apartment, or get a full-time job. In this second of a series on automation and our wallets, we explore just how much the machines that determine our creditworthiness have come to affect far more than our financial lives.

We Meet:

  • Chi Chi Wu, staff attorney at National Consumer Law Center  
  • Michele Gilman, professor of law at University of Baltimore
  • Mike de Vere, CEO Zest AI


This episode was produced by Jennifer Strong, Karen Hao, Emma Cillekens and Anthony Green. We’re edited by Michael Reilly.



Miriam: It was not uncommon to be locked out of our hotel room or to have a key not work and him have to go down to the front desk and handle it. And it was not uncommon to pay a bill at a restaurant and then have the check come back. 

Jennifer: We’re going to call this woman Miriam to protect her privacy. She was 21 when she met the man she would marry… and.. within a few short years.. turn her life… and her financial position… upside down.

Miriam: But he always had a reason and it was always someone else’s fault.

Jennifer: When they first met, Miriam was working two jobs, she was writing budgets on a whiteboard, and she was making a dent in her student debt.

Her credit was clean.

Miriam: He took me out to dinner and he took me on little trips, you know, two or three night vacation deals to the beach or, you know, local stuff. And he always paid for everything and I just thought that was so fun.

Miriam: And then he started asking if he could use my empty credit cards for one of his businesses. And he would charge to the full amount, about 5,000 and then pay it off within, I mean, two or three days every time. And he just called it flipping. That happened for a while. And during that, that just became a normal thing. And so I kind of stopped paying attention to it. 

Jennifer: Until one day…her entire world came crashing down.

Miriam: I had, let’s see a six year old, a two year old and a four year old and it’s Halloween morning and we’re in the dining room getting ready to take her to preschool. And, um, the FBI came and arrested my husband and like, it’s just like the movies, you know, they go through all your stuff and they send a bunch of men with muddy boots and guns into your house. 

Jennifer: A federal judge convicted her husband of committing a quarter million dollars of wire fraud… and Miriam discovered tens of thousands of dollars of debt in her name. 

She was left to pick up the pieces… and the finances.

Miriam: I mean my credit score was below 500 at one point. I mean, it just plummeted and that takes a long time to dig out of, but I have learned that it’s sort of a little by little thing… which I had to educate myself on.  I mean, since this whole debacle here, um, I’ve never missed anything. It’s like… more important to me than most things… is keeping my credit score golden.

Jennifer: She’s a survivor of what’s known as “coerced debt.” It’s a form of economic abuse… usually by a partner or family member.

Miriam: There’s no physical wounds. Right. And there’s, this, isn’t something you can just like call the police on somebody. And, and also it’s not usually a hostile situation. It’s usually pretty, it’s a calm conversation where he works his way in and then gets what he wants.

Jennifer: Economic abuse isn’t new… but like identity theft, it’s become a whole lot easier in a digital world of online forms and automated decisions.

Miriam: I know what an algorithm is. I get that. But like, what do you mean my credit algorithm? 

Jennifer: She got back on her feet… but many don’t… and as algorithms continue to take over our financial credit system…some argue this could get a lot worse.

Gilman: We have a system that makes people who are experiencing hardship out of their control look like deadbeats, which in turn impacts their ability to gain the opportunities necessary to escape poverty and gain economic stability. 

Jennifer: But others argue the right credit-scoring algorithms… could be the gateway to a better future… where biases can be eradicated… and the system made fairer. 

De Vere: So from my perspective, credit equals opportunity. It’s really important as a society that we get that right. We believe there can be a 2.0 version of that, leveraging machine learning. 

Jennifer: I’m Jennifer Strong and in this second of a series on automation and our wallets… we explore just how much the machines that determine our creditworthiness.. have come to affect far more than our financial lives. 


Jennifer: It used to be when someone wanted a loan…they formed relationships with people at a bank or credit union who made decisions about how safe, or risky, that investment seemed.

Like this scene from the 1940’s Christmas classic, It’s a Wonderful Life… where the film’s main character decides to loan his own money to customers to keep his business afloat…. after an attempted run on the bank.

George: I got $2,000! Here’s $2,000. This will tide us over until the bank reopens. All right, Tom, how much do you need?

Tom: $242.

George: Oh Tom. Just enough to tide you over until the bank reop—.

Tom: I’ll take $242!

George: There you are. 

Tom: That’ll close my account. 

George: Your account is still here. That’s a loan!

Jennifer: These days banks make loans without ever meeting many of their customers… Often, these decisions are automated… based on data from your credit report… which tracks things like credit card balances, car loans, student debt… and includes a mix of other personal data…   

In the 1950s the industry wanted a way to standardize these reports… so data scientists figured out a way to take that information… run it through a computer model and spit out a number…. 

That’s your credit score… and it’s not just banks who use them to make decisions. Depending on where you live, all sorts of groups refer to this number… including landlords…insurance companies… even, employers.

Wu: Consumers are not the customers for credit bureaus. We are, or our data is the commodity. We’re not the customers, we’re the chicken. We, we’re the thing that gets sold….

Jennifer: Chi Chi Wu is a consumer advocate and attorney at the National Consumer Law Center. 

Wu: And so, as a result, the incentives in this market are kind of messed up. The incentives are to serve the needs of creditors and other users of reports and not consumers.

Jennifer: When it comes to credit reports, there are three keepers of the keys…. Equifax, Experian, and TransUnion. 

But these reports are far from comprehensive… and they can be inaccurate. 

Wu: There are unacceptably high levels of errors in credit reports. Um, now the data from the definitive study by the Federal Trade Commission found that, uh, one in five consumers had a verified error on their credit report. And one in 20 or 5% had an error so serious it would cause them to be denied for credit, or they would have to pay more. 

Jennifer: Complaints to the federal government about these reports have exploded in recent years…  and last year during the pandemic? Complaints about errors doubled.

These make up more than half of all complaints filed with the C-F-P-B — or the Consumer Financial Protection Bureau of the U-S government.

But Wu believes even without any errors, the way credit scores are used… is a problem. 

Wu: So the problem is employers… landlords. They start looking at credit reports and credit scores as some sort of reflection of a person’s underlying responsibility, their value as a person, their character. And that’s just completely wrong. What we see is people end up with negative information on their credit report because they’ve struggled financially because something bad has happened to them. So people who’ve lost their jobs, who’ve gotten sick. Um, they can’t pay their bills. And this pandemic is the perfect illustration of that and you can really see this in the racial disparities in credit scoring. The credit scores for Black communities are much lower than for white communities and for Latinx communities, it’s somewhere in between. And has nothing to do with character. It has everything to do with inequality.

Jennifer: And as the industry replaces older credit-scoring methods with machine learning…she worries this could entrench the problem. 

Wu: And if left unchecked, if there is no intentional control for this, if we are not wary of this, the same thing will happen to those algorithms that happened to credit scoring, which will be, they will impede the progress of the historically marginalized communities.

Jennifer: She especially worries about companies who promise their credit-scoring algorithms are more fair because they use alternative data….data that’s supposedly less prone to racial bias…

Wu: Like your cell phone bill, or your rent, um, to the more funky, fringy big data: what’s in your social media feed. For the first type of alternative data, that is sort of conventional or financial, um, my mantra has been the devil’s in the detail. Some of that data looks promising. Other types of that data can be very risky. So that’s my concern about artificial intelligence and machine learning. Not that we should never use them. You just, you have to use them right. You have to use them with intentionality. They could be the solution. If they’re told one of your goals is to minimize disparities for marginalized groups. You know, your goal is to be as predictive or more predictive with less disparities.

Jennifer: Congress is considering restricting employers’ use of credit reports… and some states have moved to ban them in setting insurance rates… or  access to affordable housing.

But awareness is also an issue.

Gilman: There are a lot of credit reporting harms that are impacting people without their knowledge. And if you don’t know that you’ve been harmed, you can’t get assistance or remedies,

Jennifer: Michele Gilman is a clinical law professor at the University of Baltimore…

Gilman: I wasn’t taught about algorithmic decision-making in law school and most law students still aren’t. And they can be very intimidated by the thought of having to challenge an algorithm.

Jennifer: She’s not sure when she first noticed that algorithms were making decisions for her clients. But one case stands out… of an elderly and disabled client whose home health care hours under the Medicaid program were drastically cut.. even though the client was getting sicker…

Gilman: And it wasn’t until we were before an administrative law judge in a contested hearing that it became clear the cut in hours was due to an algorithm. And yet the witness for the state who was a nurse, couldn’t explain anything about the algorithm. She just kept repeating over and over that it was internationally and statistically validated, but she couldn’t tell us how it worked, what data was fed into it, what factors it weighed, how the factors were weighed. And so my student attorney looks at me and we’re looking at each other thinking, how do we cross examine an algorithm?

Jennifer: She connected with other lawyers around the country who were experiencing the same thing. And she realized the problem was far bigger …

Gilman: And when it comes to algorithms, they are operating across almost every aspect of our client’s lives.

Jennifer: And credit reporting algorithms are the most pervasive.

Her firm sees victims who get saddled with unexpected debt…sometimes due to hardship…other times from medical bills…or… because of identity theft, where someone else takes loans in your name… 

But the impact is the same…it weighs down credit scores… and even when the debt is cleared, it can have long-term effects.

Gilman: As a good consumer lawyer, we need to know that sometimes just resolving the actual litigation in front of you, isn’t enough. You have to also go out and clean up the ripple effects of these algorithmic systems. A lot of poverty lawyers share the same biases that the general population does in terms of seeing a computer generated outcome and thinking it’s neutral, it’s objective, it’s correct. It’s somehow magic. It’s like a calculator. And none of those assumptions are true, but we need the training and the resources to understand how these systems operate. And then we need as a community to develop better tools so that we can interrogate those systems so that we can challenge these systems.

<music transition> 

Jennifer: After the break… We look at the effort to automate fairness in credit reporting.


De Vere: AI helps in two ways: it’s more data and better math. And so if you think of limitations on current math, you know, they can pull in a couple of dozen variables. And, uh, if I tried to describe to you Jennifer, uh, with two dozen variables, you know, I could probably get to a fairly good description, but imagine if I could pull in more data and I was describing you with 300 to a thousand variables that signal and resolution results in a far more accurate prediction of your credit worthiness as a borrower.

Jennifer: Mike de Vere is the CEO of Zest AI. It’s one of several companies seeking to add transparency to the credit and loan approval process… with software designed to account for some of the current issues with credit scores… including racial, gender and other potential bias.

To understand how it works…we first need a little context. In the U-S it’s illegal for lenders (other than mortgage lenders) to gather data on race. This was originally meant to prevent discrimination.

But a person’s race has a strong correlation with their name…where they live… where they went to school…and how much they’re paid. That means…even without race data…a machine learning algorithm can learn to discriminate anyway…simply because it’s baked in.

So, lenders try to check for this and weed out the discrimination in their lending models. The only problem? To verify how you’re doing you kind of need to know the borrowers’ race… without that…lenders are forced to make an educated guess. 

De Vere: So the accepted approach is an acronym BISG and it basically uses two variables, your zip code and your last name. And so my name is Mike De Vere and the part of California I’m from, with a name like that I would come out as Hispanic or Latinx, but yet I’m Irish.
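BISG, or Bayesian Improved Surname Geocoding, combines race probabilities for a surname with race probabilities for a geography using Bayes’ rule. Here’s a minimal sketch of that combination in Python. The probability tables are made-up numbers for illustration only, not real Census data, but they show how an Irish surname that reads as Spanish, in a heavily Hispanic zip code, produces a confident wrong guess:

```python
# A toy BISG-style estimator. Real BISG draws its tables from the
# Census surname list and Census geography data; these are fabricated
# numbers chosen to illustrate the failure mode.

# P(race | surname)
P_RACE_GIVEN_SURNAME = {
    "de vere": {"white": 0.55, "hispanic": 0.35, "black": 0.10},
}

# P(race | zip code)
P_RACE_GIVEN_ZIP = {
    "90022": {"white": 0.03, "hispanic": 0.95, "black": 0.02},
}

def bisg_estimate(surname, zip_code):
    """Multiply the two evidence sources per race, then renormalize."""
    s = P_RACE_GIVEN_SURNAME[surname.lower()]
    z = P_RACE_GIVEN_ZIP[zip_code]
    raw = {race: s[race] * z[race] for race in s}
    total = sum(raw.values())
    return {race: p / total for race, p in raw.items()}

probs = bisg_estimate("De Vere", "90022")
best_guess = max(probs, key=probs.get)  # "hispanic" with these toy tables
```

With only two coarse inputs, the method assigns well over 90% probability to the wrong label, which is exactly the kind of misclassification de Vere describes.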

Jennifer: In other words…the industry standard for how to do this is often flat out wrong. So his company takes a different approach.

De Vere: We believe there can be a 2.0 version of that—leveraging machine learning. 

Jennifer: Rather than predict race on only two variables…it uses many more…like the person’s first and middle names…and other geographic data – like their census tract… or school board district.

He says in a recent test in Florida, this method outperformed the standard model by 60-percent.

De Vere: Why does that matter? That matters because it’s your yard stick to how you’re doing.  

Jennifer: Then, he takes an approach called adversarial de-biasing.

The basic idea is this. The company starts with one machine learning model that’s trained to predict how risky a given borrower is.

De Vere: Let’s say it has 300 to 500 data points to assign risk for an individual.

Jennifer: It then has a second machine learning model that tries to guess the race of that borrower… (based on the findings of the first one). 

If that second model can successfully guess race from the first model’s outputs… he says it means the system is encoding bias… and should be adjusted… by tweaking how much the first model weighs each of the data points.

De Vere: So those 300 to 500 signals we can tune up or tune down if it becomes a proxy for race. And so what you end up with is not only a performant model that delivers good economics, but at the same time, you have a model that is nearly colorblind in that process. 
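The tune-up-or-tune-down loop he describes can be sketched with off-the-shelf tools. The following is a toy illustration on synthetic data, not Zest AI’s actual system; the five features, the 0.55 leak threshold, and the correlation-based update rule are all assumptions made for the example:

```python
# Toy adversarial de-biasing loop: a primary model scores risk,
# an adversary probes the scores for race signal, and features that
# act as race proxies get tuned down.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 5))  # stand-in for the 300-500 borrower signals
# Synthetic protected attribute, correlated with feature 1.
race = (X[:, 1] + 0.5 * rng.normal(size=n) > 0).astype(int)
# Synthetic repayment outcome, partly driven by the race-proxy feature.
default = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

weights = np.ones(5)  # per-signal "tune up or tune down" knobs

for _ in range(10):
    # 1) Primary model: predict default risk from re-weighted signals.
    risk = LogisticRegression().fit(X * weights, default)
    scores = risk.predict_proba(X * weights)[:, [1]]

    # 2) Adversary: try to recover race from the risk scores alone.
    adversary = LogisticRegression().fit(scores, race)
    leak = adversary.score(scores, race)  # ~0.5 means no race signal

    if leak < 0.55:
        break  # scores are nearly "colorblind" -- stop tuning

    # 3) Tune down signals that correlate with race (crude proxy check).
    proxy = np.abs([np.corrcoef(X[:, j], race)[0, 1] for j in range(5)])
    weights *= 1.0 - 0.3 * proxy
```

Production systems typically train the two models jointly with a shared loss rather than re-weighting features by raw correlation, but the structure is the same: the adversary’s accuracy is the yardstick for how much race leaks into the score.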

Jennifer: He says it’s led to more inclusive lending practices.

De Vere: We work with one of the largest credit unions in the U-S out of Florida. And so what that means for our credit union is more yeses for more of their members. But what they were really excited about is it was a 26% increase in approval for women. Twenty-five percent increase in approval for members of color.

Jennifer: While it’s encouraging… anyone claiming to have a fix for decades of harm caused by algorithmic decision-making… will have a lot to overcome to win people’s trust.

It’s a task made even harder when the proposed fix to a bad algorithm… is another algorithm.

The Treasury Department recently issued guidance – highlighting the use of AI credit underwriting as a key risk for banking… warning of the costs that come with these models’ opaque nature… and adding a note that, quote, “Bank management… should be able to explain and defend underwriting and modeling decisions.” 

Which… even with the most transparent tools… still feels like a tall order. 

And without modern regulation it’s also unclear just who monitors these credit-scoring monitors… and who decides whether things like phone data or information from social media are fair play.

Especially while the end results continue to be used for non-credit purposes… like employment or insurance.


This episode was produced by me, Karen Hao, Emma Cillekens and Anthony Green. We’re edited by Michael Reilly.

Thanks for listening, I’m Jennifer Strong. 



The hunter-gatherer groups at the heart of a microbiome gold rush




The first step to finding out is to catalogue what microbes we might have lost. To get as close to ancient microbiomes as possible, microbiologists have begun studying multiple Indigenous groups. Two have received the most attention: the Yanomami of the Amazon rainforest and the Hadza, in northern Tanzania. 

Researchers have made some startling discoveries already. A study by Sonnenburg and his colleagues, published in July, found that the gut microbiomes of the Hadza appear to include bugs that aren’t seen elsewhere—around 20% of the microbe genomes identified had not been recorded in a global catalogue of over 200,000 such genomes. The researchers found 8.4 million protein families in the guts of the 167 Hadza people they studied. Over half of them had not previously been identified in the human gut.

Plenty of other studies published in the last decade or so have helped build a picture of how the diets and lifestyles of hunter-gatherer societies influence the microbiome, and scientists have speculated on what this means for those living in more industrialized societies. But these revelations have come at a price.

A changing way of life

The Hadza people hunt wild animals and forage for fruit and honey. “We still live the ancient way of life, with arrows and old knives,” says Mangola, who works with the Olanakwe Community Fund to support education and economic projects for the Hadza. Hunters seek out food in the bush, which might include baboons, vervet monkeys, guinea fowl, kudu, porcupines, or dik-dik. Gatherers collect fruits, vegetables, and honey.

Mangola, who has met with multiple scientists over the years and participated in many research projects, has witnessed firsthand the impact of such research on his community. Much of it has been positive. But not all researchers act thoughtfully and ethically, he says, and some have exploited or harmed the community.

One enduring problem, says Mangola, is that scientists have tended to come and study the Hadza without properly explaining their research or their results. They arrive from Europe or the US, accompanied by guides, and collect feces, blood, hair, and other biological samples. Often, the people giving up these samples don’t know what they will be used for, says Mangola. Scientists get their results and publish them without returning to share them. “You tell the world [what you’ve discovered]—why can’t you come back to Tanzania to tell the Hadza?” asks Mangola. “It would bring meaning and excitement to the community,” he says.

Some scientists have talked about the Hadza as if they were living fossils, says Alyssa Crittenden, a nutritional anthropologist and biologist at the University of Nevada, Las Vegas, who has been studying and working with the Hadza for the last two decades.

The Hadza have been described as being “locked in time,” she adds, but characterizations like that don’t reflect reality. She has made many trips to Tanzania and seen for herself how life has changed. Tourists flock to the region. Roads have been built. Charities have helped the Hadza secure land rights. Mangola went abroad for his education: he has a law degree and a master’s from the Indigenous Peoples Law and Policy program at the University of Arizona.



The Download: a microbiome gold rush, and Eric Schmidt’s election misinformation plan




Over the last couple of decades, scientists have come to realize just how important the microbes that crawl all over us are to our health. But some believe our microbiomes are in crisis—casualties of an increasingly sanitized way of life. Disturbances in the collections of microbes we host have been associated with a whole host of diseases, ranging from arthritis to Alzheimer’s.

Some might not be completely gone, though. Scientists believe many might still be hiding inside the intestines of people who don’t live in the polluted, processed environment that most of the rest of us share. They’ve been studying the feces of people like the Yanomami, an Indigenous group in the Amazon, who appear to still have some of the microbes that other people have lost. 

But there is a major catch: we don’t know whether those in hunter-gatherer societies really do have “healthier” microbiomes—and if they do, whether the benefits could be shared with others. At the same time, members of the communities being studied are concerned about the risk of what’s called biopiracy—taking natural resources from poorer countries for the benefit of wealthier ones. Read the full story.

—Jessica Hamzelou

Eric Schmidt has a 6-point plan for fighting election misinformation

—by Eric Schmidt, former CEO of Google and cofounder of the philanthropic initiative Schmidt Futures

The coming year will be one of seismic political shifts. Over 4 billion people will head to the polls in countries including the United States, Taiwan, India, and Indonesia, making 2024 the biggest election year in history.



Navigating a shifting customer-engagement landscape with generative AI




A strategic imperative

Generative AI’s ability to harness customer data in a highly sophisticated manner means enterprises are accelerating plans to invest in and leverage the technology’s capabilities. In a study titled “The Future of Enterprise Data & AI,” Corinium Intelligence and WNS Triange surveyed 100 global C-suite leaders and decision-makers specializing in AI, analytics, and data. Seventy-six percent of the respondents said that their organizations are already using or planning to use generative AI.

According to McKinsey, while generative AI will affect most business functions, “four of them will likely account for 75% of the total annual value it can deliver.” Among these are marketing and sales and customer operations. Yet, despite the technology’s benefits, many leaders are unsure about the right approach to take and mindful of the risks associated with large investments.

Mapping out a generative AI pathway

One of the first challenges organizations need to overcome is senior leadership alignment. “You need the necessary strategy; you need the ability to have the necessary buy-in of people,” says Ayer. “You need to make sure that you’ve got the right use case and business case for each one of them.” In other words, a clearly defined roadmap and precise business objectives are as crucial as understanding whether a process is amenable to the use of generative AI.

The implementation of a generative AI strategy can take time. According to Ayer, business leaders should maintain a realistic perspective on the duration required for formulating a strategy, conduct necessary training across various teams and functions, and identify the areas of value addition. And for any generative AI deployment to work seamlessly, the right data ecosystems must be in place.

Ayer cites WNS Triange’s collaboration with an insurer to create a claims process by leveraging generative AI. Thanks to the new technology, the insurer can immediately assess the severity of a vehicle’s damage from an accident and make a claims recommendation based on the unstructured data provided by the client. “Because this can be immediately assessed by a surveyor and they can reach a recommendation quickly, this instantly improves the insurer’s ability to satisfy their policyholders and reduce the claims processing time,” Ayer explains.

All that, however, would not be possible without data on past claims history, repair costs, transaction data, and other necessary data sets to extract clear value from generative AI analysis. “Be very clear about data sufficiency. Don’t jump into a program where eventually you realize you don’t have the necessary data,” Ayer says.

The benefits of third-party experience

Enterprises are increasingly aware that they must embrace generative AI, but knowing where to begin is another thing. “You start off wanting to make sure you don’t repeat mistakes other people have made,” says Ayer. An external provider can help organizations avoid those mistakes and leverage best practices and frameworks for testing and defining explainability and benchmarks for return on investment (ROI).

Using pre-built solutions by external partners can expedite time to market and increase a generative AI program’s value. These solutions can harness pre-built industry-specific generative AI platforms to accelerate deployment. “Generative AI programs can be extremely complicated,” Ayer points out. “There are a lot of infrastructure requirements, touch points with customers, and internal regulations. Organizations will also have to consider using pre-built solutions to accelerate speed to value. Third-party service providers bring the expertise of having an integrated approach to all these elements.”


Copyright © 2021 Seminole Press.