DARPA has funded a new project called MUSICA (Musical Improvising Collaborative Agent), which aims to explore more natural and deeper relationships between computers and humans. Taking its cue from how jazz musicians riff off one another, the project hopes to develop a similar dynamic by teaching machines to play improvised jazz. The machines are first loaded with jazz solo recordings from different musicians; by analyzing musical components such as harmony and rhythm, the computers will begin to come up with appropriate musical responses in real time. Ben Grosser of the University of Illinois at Urbana-Champaign says that hopefully by next summer the team will have a "call and answer" system where "I can play a line of music, and the system will analyze that line and give an answer as close to real time as possible."
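MUSICA's actual methods have not been published, but the "call and answer" idea Grosser describes can be illustrated with a deliberately simple sketch: analyze a played line for its interval contour and rhythm, then generate a reply that keeps the rhythm while inverting the melodic contour. Everything here (the note representation, the `analyze` and `respond` functions, the inversion strategy) is a hypothetical toy, not the project's real system.

```python
import random

# A "line" is a list of (MIDI pitch, duration-in-beats) pairs --
# a toy stand-in for the harmonic/rhythmic analysis MUSICA would do.

def analyze(line):
    """Extract simple features from a call: pitch intervals and durations."""
    pitches = [p for p, _ in line]
    durations = [d for _, d in line]
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]
    return intervals, durations

def respond(line, seed=None):
    """Answer the call: keep its rhythm, invert its melodic contour,
    and vary the starting pitch slightly so the reply isn't a mirror echo."""
    rng = random.Random(seed)
    intervals, durations = analyze(line)
    start = line[0][0] + rng.choice([-2, 0, 3])
    pitches = [start]
    for step in intervals:
        pitches.append(pitches[-1] - step)  # melodic inversion
    return list(zip(pitches, durations))

# "Call": an ascending four-note lick starting on middle C
call = [(60, 0.5), (63, 0.5), (65, 0.5), (67, 1.0)]
answer = respond(call, seed=1)
```

The answer preserves the call's rhythm exactly and descends where the call ascends, which is one classic trick human improvisers use; a real system would need far richer models of harmony and timing to do this convincingly in real time.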
Grosser notes that language-based interfaces are quite limited, and that teaching machines to communicate without an actual language might make "interactions between humans and machines a lot deeper." For machines and humans to work well together, the former must first learn how to pick up "what the humans are putting down." Grosser admits the project is a "crazy idea," but he believes that probing the limits of computational creativity will yield a different understanding of the human creative process.