Turns out the tool was never the point.
A friend of mine — let's call him Marc — showed me something at dinner last month.
He pulled out his phone, grinned like a kid on Christmas morning, and hit play.
"I made this with Suno. Took me ten minutes."
It was a pop track. Upbeat. Clean production. A catchy hook. Lyrics about chasing dreams under city lights.
It sounded like… every song you've ever forgotten.
Nice enough to play in the background of a TikTok ad. Not good enough to make you feel anything. The musical equivalent of a stock photo.
"Pretty cool, right?" he said.
"Yeah," I lied. "Very cool."
Two weeks later, I was on a call with a client — a professional composer who scores short films. She'd been experimenting with the same tool. Same Suno. Same $10/month subscription. Same AI.
She played me a track.
I stopped talking. I actually forgot I was on a work call for about 30 seconds.
It was a piano piece that started sparse, almost broken, then slowly layered in strings that swelled at exactly the right moment — the kind of moment that makes your chest tighten for reasons you can't explain. The melody resolved in a way that felt inevitable and surprising at the same time.
"How did you make that?" I asked.
She shrugged. "I knew what I wanted. I just had to tell it."
Same tool. Same AI. Same subscription price.
Completely different universe of output.
And that difference had nothing to do with the AI.
I've experienced this firsthand, comparing my own AI image and video outputs to those of experts like Rory Flynn. When I took his courses, I understood why.
The Paradox Nobody Talks About
Here's the line I keep coming back to:
The best AI-generated music is created by the best musicians. The best AI-generated videos are created by the best creators. The best AI-generated software is created by the best engineers.
Read that again. Let it sit.
Because it breaks the main narrative we've been sold about AI.
The mainstream story goes like this: "AI is the great equalizer. Now anyone can make music, write code, create videos, design products — no experience required."
And technically, that's true. Anyone can.
But here's the part they leave out:
Anyone can produce output. Almost nobody can produce something that matters.
AI didn't erase the gap between amateurs and experts. It made the gap visible.
Before AI, an amateur's lack of skill was hidden behind the wall of inability.
Now anyone can produce "plausible" output. The world is flooded with "plausible," "good enough" artifacts, which makes genuinely high-quality work more valuable than ever.
And with or without AI, high quality requires highly skilled people. That's why learning is not outdated, as I tell my students every week.
Three Fields. Same Pattern.
Let me show you what I mean.
1. Music
Marc typed: "Make me an upbeat pop song about chasing dreams."
The composer typed something closer to: "A sparse piano intro in C minor, 72 BPM. Melancholic but not sentimental. Introduce muted strings at bar 16. Build tension with a suspended 4th before resolving to the relative major at the bridge. Think Nils Frahm meets Ólafur Arnalds."
Same tool. But the composer brought 15 years of understanding why certain chord progressions create tension, why a suspended resolution makes your brain lean forward, why sparse arrangements feel more emotional than dense ones.
She didn't need AI to make music. She used AI to make her music faster.
Marc needed AI to make music at all. And the AI gave him exactly what he asked for: a generic, statistically average song. Because that's all he knew to ask for.
2. Video & Visual Creation
I see this constantly in my agency.
A junior designer asks Midjourney for "a professional website hero image, modern, clean." They get something that looks like it was made for a dental clinic in 2019. It's fine. It's forgettable. It communicates nothing.
A senior designer asks for something like: "Editorial-style hero image. Asymmetric composition, negative space on the left third for headline overlay. Warm desaturated palette — think Kinfolk magazine. Subject is a single ceramic mug on raw linen, shot from 45 degrees with soft directional light from the upper right. Grain. No people."
The difference isn't the prompt length. It's the visual vocabulary behind it.
The senior designer has spent years studying composition, color theory, editorial photography, and brand systems. When they talk to the AI, they're speaking a language the AI understands — because that language was in the training data. They know the references. They know the terms. They know what "good" looks like, so they can describe it precisely enough for the model to find it in its probability space.
The junior designer doesn't know what they want. So they describe vibes. And vibes produce average.
AI speaks fluent expertise. It barely understands vibes.
3. Software Engineering
This one is personal.
I've seen junior developers use Cursor or Claude Code to generate entire features in an afternoon. The code compiles. The tests pass. The demo looks great.
Then a senior engineer reviews it and finds:
- A library that was deprecated 6 months ago
- A race condition hidden behind what looks like clean async/await
- A complete (overkill) re-implementation of something that already exists
- An authentication flow that works perfectly — unless someone sends a malformed token, in which case it silently fails and grants access anyway
- Architecture that violates every pattern the team agreed on
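To make the fourth failure mode concrete, here is a minimal sketch of a "fail open" authorization bug, the kind that reads cleanly and passes the happy-path tests. Everything here is hypothetical: `parse_token` is an invented toy decoder, not a real auth library.

```python
from typing import Optional

def parse_token(token: str) -> Optional[dict]:
    """Toy decoder: returns claims, or None when the token is malformed."""
    try:
        user, role = token.split(":")
        return {"user": user, "role": role}
    except ValueError:
        return None  # malformed input

# BUGGY (fail open): only explicitly denied roles are rejected, so a
# malformed token (claims is None) falls through to the final `return True`.
def is_authorized_buggy(token: str) -> bool:
    claims = parse_token(token)
    if claims and claims.get("role") == "guest":
        return False       # guests explicitly denied
    return True            # everything else, including garbage, gets in

# CORRECT (fail closed): deny by default, allow only verified claims.
def is_authorized(token: str) -> bool:
    claims = parse_token(token)
    return claims is not None and claims.get("role") == "admin"

print(is_authorized_buggy("not-a-token"))  # True: silent security hole
print(is_authorized("not-a-token"))        # False: malformed input denied
```

The junior ships the first version because the demo works; the senior catches it because they know to ask what happens on the unhappy path.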
The junior didn't know to check for any of this. They didn't even know these problems existed (unconscious incompetence).
The AI produced code that looked correct — because it is statistically correct. But "statistically correct" and "production-ready" in MY CONTEXT are two very different things.
Meanwhile, the senior engineer uses the same AI to scaffold boilerplate, then spends their time on what actually matters: edge cases, security, architecture decisions, performance under load.
The AI does 40% of the senior's work.
The AI does 150% of the junior's work — and that's the problem.
When AI does everything, you learn nothing. When AI does the boring parts, you focus on what matters.
"But AI Democratizes Creativity!"
I hear this one a lot. Usually from people selling AI tools.
And I want to be careful here, because there's a kernel of truth.
AI does democratize access.
Before Suno, you needed thousands of dollars of equipment, years of practice, and a recording studio to produce a song. Now you need a laptop and a subscription.
Before Midjourney, you needed a design degree or years of self-study to create a professional-looking image. Now you need a text box.
Before Cursor, you needed months of learning to write functional code. Now you need a description.
That's real. That matters. I'm not dismissing it.
But here's what AI does not democratize:
- Taste — knowing what's good vs. what's merely acceptable
- Judgment — knowing which problems to solve and why
- Domain knowledge — understanding the deep mechanics of your craft
- Creative vision — the ability to imagine something that doesn't exist yet
AI gives everyone a brush. It doesn't give everyone an eye.
And in a world where everyone has a brush, the eye is what people pay for.
"But Charafeddine, I'm not an expert yet. Are you saying AI is useless for me?"
No. I'm saying AI is different for you.
If you're a beginner, AI is a LEARNING tool — not a PRODUCTION tool. Use it to study, to experiment, to understand. Ask it "why does this chord progression work?" not "write me a chord progression." Ask it "what's wrong with this code?" not "write me the code."
The danger isn't using AI as a beginner. The danger is using AI to skip being a beginner.
Because the skills you skip are exactly the skills that make AI useful later.
The Taste Gap
So why does this happen? Why does the same tool produce masterpieces for some and mediocrity for others?
It comes down to something I call the Taste Gap.
AI models are trained on the entire distribution of human output — from garbage to genius. When you give the model a vague prompt, it does what any statistical engine does: it regresses to the mean. It gives you the average.
Average music. Average design. Average code. Average writing.
Average is what happens when there's no strong signal pulling the output toward a specific point in quality space.
Expertise IS that signal.
When the composer specifies a suspended 4th resolving to the relative major, she's not just using fancy terminology. She's giving the model a precise coordinate in its probability space — a coordinate that lives far from the average, in the region where the actually good stuff lives.
When the senior engineer specifies error handling patterns, she's constraining the model away from the "common path" (which is usually the insecure path) and toward the "correct path" (which is rarer in training data and therefore harder for AI to find on its own).
Your expertise is a GPS coordinate. Without it, AI wanders the map.
Think of it like a museum with a million rooms. AI has access to all of them. But without a guide, it takes you to the lobby — the room everyone visits, the room with nothing interesting in it.
Your knowledge is the guide. It says: "Skip the lobby. Third floor, east wing, the small room with the Vermeer that nobody talks about."
And suddenly, the AI takes you somewhere extraordinary.
What This Means for You
If you're reading this and feeling a little uncomfortable, good. That's the point.
Here's what this paradox means in practice:
1. Investing in your craft is more valuable than ever — not less.
Every hour you spend actually learning music theory, design principles, mathematics, system architecture, or the depths of your own domain is now leveraged by AI. It's not wasted time. It's the thing that makes AI work for you instead of against you.
The people who will dominate the next decade aren't the ones who learned AI first. They're the ones who learned their craft first — and then picked up AI as an amplifier.
2. "Good enough" is a trap.
AI made "good enough" trivially easy to produce. Which means "good enough" is now worthless. It's table stakes. It's the baseline.
If your standard is "does this look professional?" — congratulations, you're competing with every teenager who has a ChatGPT subscription.
The new standard is: "Does this have a point of view? Does it reveal something? Does it make someone feel something they didn't expect to feel?"
That can't be prompted. That has to be lived.
3. Learn the tool THROUGH the craft, not instead of it.
Here's my recommendation for anyone at any level:
- If you're a beginner: Use AI to accelerate learning, not to bypass it. Ask it to explain, to critique, to teach. Don't ask it to do the work for you. You need the reps. (If you read my letter about banning AI for juniors, you know how strongly I feel about this.)
- If you're intermediate: Use AI for the boring 40% — the scaffolding, the first drafts, the boilerplate. Then spend your time on the 60% that actually requires judgment. This is where you grow fastest.
- If you're an expert: AI is your superpower. You already have the taste, the vocabulary, and the vision. AI just removes the friction between your imagination and the final product. Use it aggressively. But never stop sharpening the eye that guides it.
The Uncomfortable Conclusion
I'll say the quiet part out loud.
AI didn't make talent optional.
AI made talent the only thing that matters.
When everyone has access to the same tools, the tool is no longer the differentiator. YOU are.
Your taste. Your judgment. Your domain expertise. Your ability to look at an AI output and feel — in your gut, before you can even articulate why — that something is off. Or that something is exactly right.
That feeling isn't magic. It's the accumulated residue of thousands of hours of practice, study, failure, and refinement.
AI can't replicate that. AI can only amplify it.
And if there's nothing to amplify?
You get Marc's pop song. Nice enough. Forgettable. A digital ghost that exists for 3 minutes and then dissolves into the noise of a billion other "good enough" outputs that nobody will ever remember.
Or — you get a piano piece that makes someone forget they're on a work call.
Same tool.
Different human.
That difference is you. Invest in it.
If you only take one thing from this letter:
AI is a microphone. It doesn't write the song. It just makes sure everyone hears exactly how good — or how mediocre — your song already was.
Get good first. Then get loud.
Have a great weekend.
— Charafeddine (CM)