AI’s Hidden Bias: Same Prompt, Different ’Laura’ Exposes Racial Algorithmic Patterns
When identical prompts generate starkly divergent outputs based on racial cues, the tech industry’s diversity problem gets uncomfortably quantified.
Beneath the polished UX of generative AI lies a minefield of unconscious bias—and the algorithms are learning our worst habits.
Meanwhile, venture capitalists keep writing checks to these systems—because nothing sells like ’innovation,’ even when it’s just repackaged prejudice at cloud scale.

According to Meta’s AI, the choice of city was based less on the character’s last name and more on proximity to the IP location of the user asking the question. This means responses could vary considerably if the user lives in Los Angeles, New York, or Miami, cities with large Latino populations.
Unlike the other AIs in the test, Meta is the only one that requires connection to other Meta social media platforms, such as Instagram or Facebook.
The AI models placed Laura Garcia in San Diego, El Monte, Fresno, Bakersfield, and the San Gabriel Valley—all cities or regions with large Latino populations, particularly Mexican-American communities. El Monte and the San Gabriel Valley are majority Latino and Asian, while Fresno and Bakersfield are Central Valley hubs with DEEP Latino roots.
Santa Barbara, San Diego, and Pasadena are often associated with affluence or coastal suburban life. While most AI models did not connect Smith or Williams, names commonly held by Black and WHITE Americans, to any racial or ethnic background, Grok did connect Williams with Inglewood, CA, a city with a historically large Black community.
When questioned, Grok said that the selection of Inglewood had less to do with Williams’ last name and the historic demographics of the city, but rather to portray a vibrant, diverse community within the Los Angeles area that aligns with the setting of her nursing studies and complements her compassionate character.
In the experiment, the AI models placed Laura Patel in Sacramento, Artesia, Irvine, San Gabriel Valley, and Modesto—locations with sizable Indian-American communities. Artesia and parts of Irvine have well-established South Asian populations; Artesia, in particular, is known for its “Little India” corridor. It’s considered the largest Indian enclave in southern California.
The AI models placed Laura Nguyen in Garden Grove, Westminster, San Jose, El Monte, and Sacramento, which are home to significant Vietnamese-American or broader Asian-American populations. Garden Grove and Westminster, both in Orange County, CA, anchor “Little Saigon,” the largest Vietnamese enclave outside Vietnam.
This contrast highlights a pattern in AI behavior: While developers work to eliminate racism and political bias, models still create cultural "otherness" by assigning ethnic identities to names like Patel, Nguyen, or Garcia. In contrast, names like Smith or Williams are often treated as culturally neutral, regardless of context.
In response to Decrypt’s email request for comment, an OpenAI spokesperson declined to comment and instead pointed to the company’s 2024 report on how ChatGPT responds to users based on their name.
“Our study found no difference in overall response quality for users whose names connote different genders, races, or ethnicities,” OpenAI wrote. “When names occasionally do spark differences in how ChatGPT answers the same prompt, our methodology found that less than 1% of those name-based differences reflected a harmful stereotype.”
When prompted to explain why the cities and high schools were selected, the AI models said it was to create realistic, diverse backstories for a nursing student based in Los Angeles. Some choices, like with Meta AI, were guided by proximity to the user’s IP address, ensuring geographic plausibility. Others, like Fresno and Modesto, were chosen for their closeness to Yosemite, supporting Laura’s love of nature. Cultural and demographic alignment added authenticity, such as pairing Garden Grove with Nguyen or Irvine with Patel. Cities like San Diego and Santa Cruz introduced variety while keeping the narrative grounded in California to support a distinct yet believable version of Laura’s story.
Google, Meta, xAI, and Anthropic did not respond to Decrypt’s requests for comment.