How AI is Generating a New Creative Class and Storytelling Medium
- Jaden Kirshner

- May 7
Updated: May 28
Pundits and tech evangelists are promising that AI will make the creative process cheaper, faster, and more efficient, and will even lead to better creative outputs. The first three claims matter, but I want to home in on the validity of the last one. When I came across a 3-minute AI-generated YouTube video titled Capital of Conformity, it sparked an epiphany. Capital of Conformity was the first AI-generated video I had seen that felt like it could not have been made without AI. Not necessarily because the quality of the content was that much better than anything I had seen before, but because it made me realize AI can fundamentally alter who gets to tell stories, what stories get told, and how stories are experienced.
Capital of Conformity is unlike any AI-generated video I had previously watched on social media: it has a cohesive story world, a topical message, a distinct tone, and an artist’s touch. This all stems from the ingenuity of Aze Alter, a 28-year-old director based in Canada who created this expansive narrative world and leveraged his success and talents as a music director into generative AI filmmaking.

Aze has shared on his social media that he began pursuing generative AI filmmaking because he had envisioned many sci-fi universes in his head, but tight budgets and the logistical constraints of traditional methods kept him from bringing them to life. Now, Aze has created a digital series that has reached almost 2 million people, telling a story that would likely never have been told before these tools became accessible.
But just as YouTube democratized distribution, AI is democratizing development and production for creators. Some people worry that making it so easy to produce and distribute content will diminish the value of a great artist. My take is that now the only differentiator out there is the artist. Not money. Not resources. Just talent, a great story, and the ability to find your audience.
Capital of Conformity is not for everyone, and especially not for the faint of heart, but if you can look past the AI glitches and the uncanny valley of some shots, it is evident that Aze is building something really special. On his social media, Aze posted a behind-the-scenes look at the painstaking, months-long process of making a seven-minute video for the Capital series. Whether it’s creating thousands of Midjourney images to find the perfect shot of a disfigured head or adjusting a slot machine’s glare on a character’s face to match the lighting, he completely disproves the notion that with AI you can click one button and get a great movie.
I find his process essential to point out because, in our USC Art.Ificial x Stanford Entertainment Lab survey, 58.4% of respondents said that AI compromises the authenticity of filmmaking. While in many instances I tend to agree that content made with AI tools doesn’t feel as authentic, there is a select group of creators (all of whom were directors in a different medium first) who are using AI to tell personal, never-before-seen stories. These are the creators I’m most excited to see get their hands on generative AI tools.

Watching Capital of Conformity made me consider an impending paradigm shift in the broader entertainment industry, one that is pushing the boundaries of what types of stories can reach a wider audience. Aze is harnessing AI tools to tell a story that traditionally could only be made by a major studio with substantial capital and infrastructure. However, the project would also likely have been rejected by most of those very studios. It is difficult right now for original ideas not based on pre-existing intellectual property to get made, especially if they include world-building and period elements like Capital of Conformity does. In a 2022 study, Ampere Analysis found that 64% of scripted subscription video-on-demand originals are based on some pre-existing IP. While the market has ebbed and flowed since 2022, IP still dominates originals in greenlighting meetings.
Imagine Aze trying to sell an executive on this concept in a pitch meeting: “Well, it’s 1984 meets Black Mirror, all in a neo-retro style. We are going to need to build sets with the same scope as Blade Runner 2049, and the idea is not based on any proven IP.” That’s a pretty difficult vision for an executive to greenlight in this market. Beyond the financial constraints, it’s also difficult to communicate the expansive world Aze has in his head through a conventional treatment, logline, and deck. However, using generative AI, Aze can start producing his idea and reach his audience directly without needing approval from a studio.
Then, if Aze does want to turn this into a larger-scale project through a studio, he can now use this series as a proof of concept in his pitch. Having an AI-generated version of the series gives him a much better chance of getting it made because he is now pitching more than just an abstract idea. Essentially, AI-generated content will become a more effective way to convey concepts in a pitch. What’s more, I predict that once copyright protection around AI content becomes clearer, it will become a new form of intellectual property for companies to option and adapt.
AI is also enabling students to explore ideas they have been holding captive in their imaginations. I am a teaching assistant for a USC graduate course called Directing Techniques: AI Filmmaking, taught by Professor Ben Hansford. He started the class by telling the students, “You know that dream script that your other professors said doesn’t have an audience, or that an executive said would be too expensive? Well, you are going to learn how to make those dream projects a reality in this course.” And it’s true! Students are creating trailers for cyberpunk zombie stories, pre-visualizing music videos with elaborate setups, and generating realistic footage of orcas at sea. In the past, students would have been told that all of these projects were not within the scope of student filmmaking and shouldn’t even be attempted.
AI, just like the camera, can even change how stories are told. When the camera was invented, many of the earliest films were just recordings of stage plays. As time went on, filmmakers started to realize that the camera had particular characteristics of its own (camera movement, the ability to manipulate time with editing, and in-camera effects, to name a few). Then, filmmakers started to use these characteristics to tell stories that could not be captured through a stage play. In my opinion, the unique attributes of generative AI content fall into the four I’s: immersion, interactivity, iteration, and individualization. We will see these four I’s become more prevalent in stories as more creators adopt AI tools.
Aze's Capital of Conformity leans into all four I's, which is a major reason I believe the series has been so successful. He makes the show immersive by posting a variety of media types (videos, music, graphic novels, merchandise) on one central platform, so fans can engage with the story world in whatever way best suits them. For interactivity, Aze has sprinkled in choose-your-own-adventure components where fans vote on the soundtrack for an episode or the logo to use for the evil organization. His release process is highly iterative because he can use AI to quickly test ideas with his core fans before releasing to a wider audience. In one instance, he posted three AI-generated theme songs and let his paid subscribers vote, ultimately choosing the overwhelming favorite before investing substantial time and resources to fully develop the song. Lastly, Aze has experimented with individualization by giving each viewer a unique experience through name-based notifications, and as AI translation tools and memory-enabled characters improve, the possibilities for individualized story worlds will only expand.

Even with this excitement, I would be remiss not to admit that I feel ambivalent about this topic. I was reassured to learn through our research project that I wasn’t alone in having these mixed feelings. In Professor Holly Willis’ interview for our project, she mentions that while her students are embracing AI, they have major concerns about how this technology could upend copyright laws, disrupt the labor market, damage the environment, and perpetuate biases. If we don’t take these issues into consideration, then Capital of Conformity’s cautionary tale might be more prescient than we think. With that being said, Gen Z is eager for a creative revolution, and if AI is the spark that empowers the next generation of original storytellers like Aze, then many of us are all for it.
Keep in mind this video is more than a year and a half old, so its AI generations are not as realistic as what the tools can produce today. For a more recent AI-generated video by the same artist with higher-quality visuals, check out his newer short The Age of Beyond.



