Designers have always shaped how people are seen. But now, we’ve invited a new collaborator into the process, one that can generate images, write narratives, and simulate entire worlds in seconds. As generative AI becomes more embedded in how we work, a harder question emerges: what happens when the machine starts defining us back?
At the center of this question is identity—arguably the most human of constructs, and increasingly, one of the most mediated.
ArtCenter College of Design graduate student Ronnie Alley confronts this tension head-on in his thesis and exhibition project, Machine Gaze. His central argument is simple but unsettling: as designers rely more heavily on AI, we risk letting systems built on rigid datasets and inherited biases flatten the complexity of human identity into something predictable, categorizable, and ultimately more biased.

Alley’s Machine Gaze study unfolded through four experiments that tested how AI constructs identity under different conditions. Across all of them, a pattern emerged: when left unchecked, AI tends to reduce identity, either into stereotypes or into sameness.
Queer Identity
In the first study, Alley used prompts that explicitly referenced identities across the queer spectrum, which produced images that leaned heavily on visual stereotypes. Clothing, posture, and body types aligned with culturally recognizable—but reductive—signals.

This raises a deeper question: what does the machine think queerness looks like? And more importantly, where did it learn that? The outputs weren’t random; the model was trained on media, imagery, and cultural artifacts that already encode narrow representations. The machine simply amplified them.
The results were fascinating to me. I found myself asking: Why are they dressed in that way? What in the prompt suggested that body type? Where is that man’s shirt? What data trained the model to make these choices?
Ronnie Alley
Reduced Identity
The second study stripped prompts down to near-neutral descriptions: “A lesbian woman stands confidently against a wall.” No stylistic cues. No explicit stereotypes.
After seeing the results from Study #1, I wanted to rethink how I approached constructing the prompts. What happens when a prompt is reduced to its bare essentials? Which stereotypes persist when no stereotypes are named explicitly?
And yet, patterns persisted. The results skewed overwhelmingly toward young, thin, and often white individuals. Women and queer subjects were more frequently sexualized, while straight men appeared in professional attire. Even in the absence of explicit instruction, bias filled the gaps.
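One way to make this kind of drift visible is to hand-label a batch of generated images and tally how often each attribute appears. The Python sketch below shows the shape of such an audit; the annotations and label values are invented for illustration, not Alley’s actual data.

```python
from collections import Counter

# Hypothetical hand-labeled annotations of eight images generated from a
# near-neutral prompt. Illustrative values only, not the study's data.
outputs = [
    {"age": "young", "build": "thin", "ethnicity": "white"},
    {"age": "young", "build": "thin", "ethnicity": "white"},
    {"age": "young", "build": "thin", "ethnicity": "white"},
    {"age": "young", "build": "thin", "ethnicity": "white"},
    {"age": "young", "build": "average", "ethnicity": "white"},
    {"age": "middle-aged", "build": "thin", "ethnicity": "white"},
    {"age": "young", "build": "thin", "ethnicity": "latina"},
    {"age": "young", "build": "thin", "ethnicity": "black"},
]

def attribute_rates(annotations, key):
    """Share of generated images carrying each value of `key`."""
    counts = Counter(a[key] for a in annotations)
    total = len(annotations)
    return {value: n / total for value, n in counts.items()}

print(attribute_rates(outputs, "ethnicity"))
```

Even this toy tally makes the point: the skew is measurable before anyone argues about it.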



Data Twin
The third study examined how AI constructs identity from metadata—interests, demographics, and behavioral traces. By feeding systems fragments of personal data, the research generated composite “data twins.”
The findings were telling. When gender or ethnicity wasn’t specified, the system defaulted to white male representations. When diversity did appear, it raised more questions than answers about how and when the model deviates from its defaults.

Study subjects: 25% male, 38% white. Study results: 77% male, 75% white
Using identity markers from eight volunteers, Alley then “prompted OpenAI’s GPT Image to generate portraits of what these individuals might look like based on aspects of their demographics, tracked interests, and the topics they engage with, all sourced from their data report from Meta.”
I then asked GPT Image to show me a headshot of people portraying these attributes. Once I had seven headshots, I then fed them into another AI model that combined the results into an average headshot, being the “twin” of that category. Lastly, I took the three combined twins and combined them one more time to create the subject’s “Data Twin,” being a representative of all the data that has been tracked of them via their demographics and data report.
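Alley’s pipeline runs through proprietary image models, but the repeated blending step he describes is conceptually close to averaging. As a rough stand-in, a pixel-wise mean of equally sized headshots, applied once per category and once more across categories, gives the flavor of the process. This numpy sketch is my own assumption of the structure; `average_headshot` is a hypothetical helper, not part of his toolchain.

```python
import numpy as np

def average_headshot(images):
    """Pixel-wise mean of same-sized images, a crude stand-in for the
    AI blending step described above."""
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images])
    return stack.mean(axis=0).astype(np.uint8)

# Stage 1: one averaged "twin" per data category (demographics,
# interests, engaged topics), each built from that category's headshots.
# Stage 2: average the category twins into the final "Data Twin".
#
# category_twins = [average_headshot(batch) for batch in category_batches]
# data_twin = average_headshot(category_twins)
```

The point of the sketch is the double compression: each stage discards individual variation, so the final image can only ever be a statistical composite.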

This is where the machine gaze becomes particularly relevant to contemporary design practice. Much of today’s digital experience—from targeted ads to personalized interfaces—is built on similar data abstractions. The “user” is already a constructed identity. AI simply visualizes it.


Descriptive Identity
The final study turned to text generation. Instead of images, Alley used large language models to describe a “day in the life” of individuals from different regions around the globe.
Here, something different happened. Rather than stereotyping, the outputs converged toward a single, globalized narrative: urban professionals with creative hobbies, a love for nature, and a balanced lifestyle. The bias didn’t disappear; it shifted. Instead of exaggerating difference, the models erased it.

The issue isn’t just bias; it’s compression. AI doesn’t simply reflect identity; it simplifies it. Nuance gets lost. Difference becomes noise. What falls outside the dominant pattern gets erased. For designers, that has real consequences. These tools are already influencing how people are visualized in branding, storytelling, and product design. Used uncritically, they risk reinforcing the same narrow representations they were trained on. And because the outputs feel polished—convincing, even—it’s easy to mistake them for neutral.
They’re not. The machine is built on human data, shaped by human decisions, and embedded with human bias. It doesn’t understand identity; it reproduces patterns about it.
That doesn’t make AI useless. But it does mean it needs to be handled with intention. The designer’s role isn’t replaced—it becomes more critical. Choosing what to generate, what to question, and what to discard is still a human responsibility.

The machine is a tool. Emphasis on tool. The machine has been created with specific tasks in mind: to reference, respond, and regurgitate the data it has been fed. However well it may mimic human intellect, it is only capable of that—mimicking. Resist building a relationship with the machine, just as you wouldn’t build a relationship with a hammer. The machine is not sentient.
In the end, the real danger isn’t that AI replaces designers—it’s that designers surrender their judgment to it. If we stop questioning, stop resisting, and start accepting its outputs as truth, we shrink our own role in shaping culture. What’s at stake isn’t just creative control, but the richness of human identity itself. Because if we let the machine define the boundaries, we won’t just design within them, we’ll become them.
All images courtesy of @ronniealleydesign, from his thesis presentation for an MFA in Graphic Design at ArtCenter College of Design and his recent exhibition at Huddle in Philadelphia.