We Cannot Offload the Narrative
Reflections on music, education, and human agency in the age of AI
Written by: Jorge Costa
On December 18, I had the opportunity to speak at an H-CORPS event in San Jose about a subject I have been exploring (and wrestling with) for quite some time: artificial intelligence in music-making.
As a musician, producer, sound engineer, and educator, this conversation is deeply personal to me. My relationship with AI in music did not begin with excitement. It began with discomfort. Fear, even.
I think many people in creative fields can relate to that.
There is something unsettling about spending years developing a craft, only to be told that a machine can now generate something that resembles the result. It raises difficult questions, not only about work and technology, but about meaning, creativity, and identity.
At the same time, I do not believe pretending these tools do not exist is an appropriate response. They are already here. They are already shaping professional workflows. And they are already becoming part of the world our students will inherit.
So for me, the real question is not whether AI exists. The question is:
How do we engage with it without losing the human core of what we do?
That question led me to an experiment at Arizona State University.
A couple of very talented students had an incomplete but very promising song and were not sure how to move it forward. Instead of asking AI to replace the songwriting process, and with the songwriters' permission, I invited students from my AI in Music Production class to use AI tools to prototype possible directions. The goal was not to let the machine write the song for them. The goal was to explore, generate options, and help the students get unstuck.
Then we handed everything back to the original songwriters.
Because it was always their song.
What mattered most to me was what happened next: the students chose not to let the AI prototype become the destination. They finished the song themselves. They arranged it, developed it, and brought it back into human hands. Then students from my recording class helped bring it to life in the studio through a live ensemble performance.
In the end, what emerged was not really an AI story. It was a human story.
A story about collaboration.
A story about learning.
A story about creative agency.
A story about using technology as a catalyst, not a substitute.
That distinction matters to me.
If AI helps us explore ideas, deepen collaboration, and become better creators, then maybe it has a place. If it mainly helps us remove people from the process and offload meaning to an algorithm, then I am not merely uninterested; I am opposed to it.
As educators, we do not get to decide whether AI will be part of the future our students enter. But we do get to decide how we prepare them to think about it.
For me, this is not just about technology. It is about pedagogy. It is about critical thinking. It is about helping students ask not only "Can I use this tool?" but also "Why would I use it? What do I gain? What do I lose? And who might be left behind?"
This is the heart of it for me:
We cannot offload the narrative to an algorithm.
We must lead with our stories, our voices, our values, and our humanity. Then, if we choose, we can use AI to help us explore and prototype. But it cannot become a substitute for the human experience of making meaning.
That is true in music. I believe it is true far beyond music as well.
If you’d like, watch the presentation linked here for the fuller story, including the student project and the music that emerged from it.
These are conversations I believe we need to keep having — in public, across disciplines, and with human agency at the center.
I would love to hear your thoughts.