
Part 1: What Do AI and Jazz Musicians Have in Common?

  • Writer: Mark Eastwood
  • Apr 8
  • 2 min read

Learning the Chart


Most people don’t understand jazz. That’s not a knock — it’s just true. To the untrained ear, it can sound chaotic, even random. Instruments seem to collide. Solos come out of nowhere. It’s common to hear: “I don’t get it — it sounds like everyone’s just doing their own thing.”


But good jazz is anything but random. Underneath what sounds like improvisation is structure. A deep foundation. Players aren’t winging it — they’re working within a shared framework: time signatures, chord progressions, modal shifts. The better the foundation, the more freedom there is to explore.


AI works the same way.


When we started building Genversation Lab, we weren’t sure exactly what our AI could do — only that we wanted it to have real conversations over SMS and VoIP. That was always the vision. But early on, we gave it broad, open-ended tasks: “Ask this person — someone dealing with insomnia, with this background and this daily routine — for morning and evening updates. Get the data however you think is best.”


What we got back was a mess. The language was strange. The timing was off. It misinterpreted answers and jumped to awkward questions. It was like handing a saxophone to someone and saying: “Play something beautiful that everyone will love.”


Too broad. Too vague. No structure. Set up for failure.


We stopped treating the AI like some invincible super genius that could walk on water and solve anything we threw at it.


Instead, we started treating it like a musician in training.


We worked on fundamentals — tone, pacing, question structure, data extraction, how to say hello and goodbye, how to listen before speaking. We trained the model to follow a progression.
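
To make that concrete, here is a rough sketch of the shift, from the open-ended ask to a structured "chart" the model plays inside. Everything in it is illustrative: the field names, the questions, and the render_system_prompt helper are hypothetical stand-ins, not our production prompts.

```python
# Illustrative sketch only: field names, questions, and wording are
# hypothetical stand-ins, not Genversation Lab's production prompts.

# The open-ended ask we started with (kept here for contrast):
OPEN_ENDED_PROMPT = (
    "Ask this person, someone dealing with insomnia, for morning "
    "and evening updates. Get the data however you think is best."
)

# The structured "chart": fundamentals pinned down before the model plays.
STRUCTURED_CHART = {
    "role": "sleep check-in assistant",                # who the model is
    "tone": "warm, plain language, no jargon",         # tone
    "pacing": "one question per message",              # pacing
    "opening": "Good morning! Quick sleep check-in.",  # how to say hello
    "questions": [                                     # a fixed progression
        "What time did you fall asleep last night?",
        "How many times did you wake up?",
        "On a scale of 1 to 5, how rested do you feel?",
    ],
    "extract": {                                       # the data we need back
        "sleep_onset_time": "HH:MM",
        "awakenings": "integer",
        "restedness": "1 to 5",
    },
    "rules": [                                         # listening before speaking
        "Acknowledge each answer before asking the next question.",
        "If an answer is unclear, ask one short clarifying question.",
    ],
    "closing": "Thanks! I'll check in again this evening.",  # how to say goodbye
}

def render_system_prompt(chart: dict) -> str:
    """Flatten the chart into a system prompt for whatever model backend you use."""
    lines = [
        f"You are a {chart['role']}. Tone: {chart['tone']}. Pacing: {chart['pacing']}.",
        f"Open with: {chart['opening']!r}",
        "Ask these questions, in order:",
    ]
    lines += [f"  {i + 1}. {q}" for i, q in enumerate(chart["questions"])]
    lines.append("Record: " + ", ".join(f"{k} ({fmt})" for k, fmt in chart["extract"].items()))
    lines += chart["rules"]
    lines.append(f"Close with: {chart['closing']!r}")
    return "\n".join(lines)

print(render_system_prompt(STRUCTURED_CHART))
```

The difference is the constraint: instead of a blank page, the model gets a key, a tempo, and a set progression to improvise inside.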


Our model didn’t start out as a genius. It had to be shaped, slowly and deliberately. Scales before solos. Structure before style.


And in fields like healthcare, structure matters. When someone is logging sleep data, reporting side effects, or sharing their mental health symptoms, the role of the AI is to stay on rhythm. To play the notes clearly. To not miss a beat.


But over time, something else started to happen.


Not everyone keeps time the same way. Some skip beats. Others linger. Some don’t even know what song they’re in.


That’s when our model began to evolve. It stopped just playing the notes.


It started following the chart and reading the music.



