What We Do
Claypot’s treasure is its library of over 20,000 unique, artistically conceived samples recorded from the improvisations of Claypot’s collective of performing musicians. The core Claypot team of musicians and data scientists carefully annotates each sample and trains neural networks to identify morphological and semantic features called Temporal Semiotic Units. Claypot composers draw on the sample library to create live and sample-based compositions in collaboration with the performers who created the samples.
Claypot is currently developing an app called ConcertNotes, designed to analyze the morphosemantic features of music in both streaming and live settings:
Through ConcertNotes, users (audience members) enter responses in real time, recording their impressions of the music they are hearing. From these responses, ConcertNotes builds a detailed profile of each individual’s listening patterns, which is then rendered as a data visualization for the user.
While no personal data is collected, users can view anonymized visualizations of how the rest of the audience responded to a concert they attended. They can also use the visualizations of their own listening history to gain insight into how they perceive music and to better articulate their tastes. Finally, musicians can use the anonymized data from their concerts to understand how audiences respond to specific musical moments in their performances.