A woman walks past tents for homeless people lining the streets of Los Angeles, California, on February 1, 2021.
Frederick J. Brown | AFP | Getty Images
In December, single mother Courtney Peterson was laid off from her job at a now-shuttered residential transitional living program. The job offered flexibility that let her sometimes bring her seven-year-old son to work, and the pay covered rent at an apartment building in Los Angeles’s Van Nuys neighborhood, where they had lived for a year and a half.
Peterson said she immediately worried about January’s rent and began researching potential avenues for help. She and her son had lived in a travel trailer when he was a baby, she said, and she didn’t want to go back to that situation.
“I started contacting local churches or places that claimed to offer rental assistance,” Peterson told CNBC. “But many of them want me to receive an active eviction notice so they can help me. I feel like I’m running out of options. I’ve contacted just about everyone I can think of with no luck.”
Instead of receiving an eviction notice, Peterson received a letter from the Homeless Prevention Division of the Los Angeles County Department of Health Services offering a lifeline. The pilot program uses predictive artificial intelligence to identify individuals and families at risk of homelessness and provide assistance to help them stabilize and stay housed.
According to the U.S. Department of Housing and Urban Development, there were more than 181,000 homeless people in California in 2023, an increase of more than 30% from 2007. A report from the California Auditor found that the state spent $24 billion on homelessness from 2018 to 2023.
Dana Vanderford, deputy director of the county’s Homeless Prevention Division, said the program, launched in 2021, has helped the department serve nearly 800 individuals and families at risk of homelessness, with 86% of participants retaining permanent housing upon exiting the program.
Individuals and families can receive between $4,000 and $8,000, she said, with much of the program’s funding coming from the American Rescue Plan Act. Tracking people down to offer help, and convincing them that the services are genuine rather than a scam, can be a challenge, but once contact is made, assistance moves quickly.
“We often meet with clients within days of losing their housing or days after they have a medical emergency. The time we spend meeting people feels critical,” Vanderford said. “Our ability to swoop in, call a person, connect them with resources, and prevent 86 percent of the people we work with from imminent loss of housing feels pretty remarkable.”
Peterson said she and her son received about $8,000 to cover rent, utilities and basic needs so she could stay in the apartment while she looked for a new job. The program works with clients for four months and then follows up at six months, 12 months and 18 months after discharge. Amber Lung, a caseworker who helped Peterson, said she has seen firsthand how important prevention is.
“Once people do lose their housing, it feels like there are a lot more barriers to regaining it, so if we can fill in a little bit of that gap, it might help them retain their housing. Stabilizing the situation is better than people ending up in shelters or on the streets, and it’s much easier to get them back to where they were,” Lung said.
Anticipating risk
The artificial intelligence model was developed over several years by UCLA’s California Policy Lab using data provided by the Los Angeles County Chief Information Office. Janey Rountree, the Policy Lab’s executive director, said the office consolidated data from seven county departments, de-identified for privacy reasons, including emergency room visits, behavioral health care, and large public benefit programs ranging from food stamps to income support and homeless services. The model also draws on data from the criminal justice system.
The data is linked together over time, allowing the model to learn which patterns precede homelessness, and the Policy Lab tested the model’s predictive accuracy against actual outcomes.
Once the model has identified those patterns, the lab uses it to make forward-looking predictions, creating an anonymized list of individuals ranked from highest to lowest risk. The lab provides the list to the county, which tries to reach people who may be at risk of losing their housing before they do.
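The pipeline described here, scoring linked, de-identified records and handing the county a ranked outreach list, can be sketched roughly as follows. This is purely illustrative: the feature names, weights and logistic scoring are invented for the example and are not the California Policy Lab’s actual model.

```python
import math

# Illustrative weights over hypothetical features drawn from linked
# county records; these are NOT the Policy Lab's real features or values.
WEIGHTS = {
    "er_visits_last_year": 0.6,
    "behavioral_health_contacts": 0.4,
    "on_food_stamps": 0.3,
    "jail_booking_last_year": 0.8,
}
BIAS = -3.0


def risk_score(record):
    """Return a logistic risk score in [0, 1] for one de-identified record."""
    z = BIAS + sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))


def ranked_outreach_list(records):
    """Sort anonymous IDs from highest to lowest predicted risk."""
    scored = [(rid, risk_score(feats)) for rid, feats in records.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)


# Toy records keyed by anonymous IDs, standing in for the de-identified
# data consolidated across county departments.
records = {
    "anon-001": {"er_visits_last_year": 4, "on_food_stamps": 1},
    "anon-002": {"behavioral_health_contacts": 1},
    "anon-003": {"er_visits_last_year": 2, "jail_booking_last_year": 1},
}

for rid, score in ranked_outreach_list(records):
    print(rid, round(score, 3))
```

The county would then work down such a list from the top, which is why the ordering, rather than any individual score, is the output that matters for outreach.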
However, past research has found that supposedly anonymous data can be traced back to individuals using demographic information. A landmark study of data privacy based on 1990 U.S. Census data found that 87% of Americans could be identified from their zip code, date of birth and gender alone.
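The re-identification risk works by joining an “anonymized” dataset against a public one on those shared demographic fields, the so-called quasi-identifiers. A minimal sketch with entirely made-up records:

```python
# Hypothetical data: an "anonymized" health dataset that still carries
# zip code, date of birth and gender, plus a public roll with names.
anonymized_health = [
    {"zip": "91405", "dob": "1990-03-14", "gender": "F",
     "diagnosis": "sprained ankle"},
    {"zip": "91405", "dob": "1982-07-02", "gender": "M",
     "diagnosis": "hypertension"},
]
public_roll = [
    {"name": "Jane Doe", "zip": "91405", "dob": "1990-03-14", "gender": "F"},
]


def reidentify(anon_rows, public_rows):
    """Link rows that share the quasi-identifier (zip, dob, gender)."""
    index = {(p["zip"], p["dob"], p["gender"]): p["name"] for p in public_rows}
    matches = []
    for row in anon_rows:
        key = (row["zip"], row["dob"], row["gender"])
        if key in index:
            # The "anonymous" record is now tied to a named person.
            matches.append((index[key], row["diagnosis"]))
    return matches


print(reidentify(anonymized_health, public_roll))
```

Because the join needs only three common fields, stripping names from a dataset is not by itself enough to protect the people in it.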
“California has a severe, decades-long housing shortage and rising housing costs, which is why people are experiencing homelessness,” Rountree said. “The biggest misconception is that homelessness is driven by individual risk factors, when in fact it is clear that the root cause is structural economic problems.”
Rountree said the Policy Lab is providing the software to the county for free and has no plans to monetize it. When paired closely with people who have relevant subject-matter expertise, from teachers to social workers, artificial intelligence can help promote positive social outcomes, she said.
“I just want to emphasize how important it is for every community experiencing homelessness to test and innovate around prevention,” she said. “This is a relatively new strategy in the life cycle of homelessness services. We need more evidence. We need more experimentation around how to find people at risk. This is just one way of getting there.”
The National Alliance to End Homelessness found in 2017 that a chronically homeless person costs taxpayers an average of $35,578 per year, and those costs are reduced by nearly half on average when they are placed in supportive housing.
Los Angeles County has had preliminary conversations with Santa Clara County about the program, and San Diego County is exploring a similar approach, Vanderford said.
Government use of artificial intelligence
Artificial intelligence in the hands of government agencies faces scrutiny because of its potential consequences. Police reliance on AI technology has led to false arrests, and in 2020 California voters rejected a plan to replace the state’s cash bail system with an algorithm that determines individual risk, over concerns it would increase bias in the judicial system.
Broadly speaking, the ethics of government AI use depend on the context and on how securely identifiable information is handled, even when it is anonymized, said Margaret Mitchell, chief ethics scientist at AI startup Hugging Face. Mitchell also noted the importance of obtaining informed consent from people seeking help from government programs.
“Are people aware of all the signals that are being collected and the risks associated with them, including the dual-use issues of that data being turned against them maliciously?” Mitchell said. “There’s also the question of how long the data will be kept and who might ultimately see it.”
While the purpose of the technology is to help people in Los Angeles County before they lose their housing, Mitchell said that although it looks positive from a “virtue ethics” perspective, broader questions emerge from a utilitarian one.
“The questions might be, ‘What’s the cost to taxpayers, and what’s the likelihood that this system will actually prevent homelessness?'” she said.
As for Peterson, she is looking for a job, hoping to find a remote position that will give her flexibility. In the future, she hopes to obtain a vocational nursing license and one day buy a house where her son can have his own room.
“It means a lot because you know my son wasn’t always stable. I wasn’t always stable,” she said of the program’s assistance. “Being able to call this place home and know that I don’t have to move out tomorrow and my son doesn’t have to find new friends right away… it means a lot to me and my son.”