# Music Visual World Paradigm
This project adapts the Visual World Paradigm—a well-established method in psycholinguistics—to the domain of music reading. The goal is to understand how musicians process musical notation and, ultimately, to test whether alternative notational designs could improve reading efficiency.
## The experiment
Participants listen to a short musical excerpt while viewing multiple notation fragments on screen. One fragment correctly represents the audio; the others contain errors. Using eye-tracking, we measure where participants look and how quickly they identify the correct notation.
This approach mirrors spoken word recognition studies in language research, where eye movements reveal how people integrate auditory and visual information in real time.
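A common analysis in such studies is to bin gaze samples over time and compute the proportion of looks landing in each on-screen region (the correct fragment vs. the foils). The sketch below is a minimal, hypothetical version of that computation; the function name, sample format, and rectangular areas of interest are illustrative assumptions, not the project's actual pipeline.

```python
def fixation_proportions(samples, regions, bin_ms=50):
    """Bin raw gaze samples and compute, per time bin, the proportion
    of samples falling inside each area of interest (AOI).

    samples: list of (time_ms, x, y) gaze samples
    regions: dict mapping AOI name -> (x0, y0, x1, y1) bounding box
    Returns {bin_start_ms: {aoi_name: proportion}}.
    """
    bins = {}
    for t, x, y in samples:
        b = int(t // bin_ms)
        counts = bins.setdefault(b, {name: 0 for name in regions})
        counts["_total"] = counts.get("_total", 0) + 1
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
    # Convert raw counts to proportions for each time bin.
    return {
        b * bin_ms: {name: c[name] / c["_total"] for name in regions}
        for b, c in sorted(bins.items())
    }
```

Plotting these proportions over time yields the familiar "looks to target" curves used in visual world studies to show when the audio begins to drive gaze toward the matching fragment.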
## Why it matters
Music notation lacks the semantic anchors that guide language processing. By examining how musicians' gaze patterns evolve during audio-visual matching tasks, we can identify which aspects of notation design help or hinder comprehension.
The long-term aim is to validate this paradigm for music, then apply it to compare conventional notation against new designs that may be easier to learn and read.
## Technical setup
- **Eye tracker**: Tobii Pro Fusion at 250 Hz
- **Software**: Tobii Pro Lab (Advanced Screen Project)
- **Design**: Within-subject, randomised stimulus presentation
- **Variables**: Sentence density and vocal register (manipulated); response time, error rate, and gaze patterns (measured)
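For a within-subject design, each participant sees every stimulus, but in a freshly randomised order, and the position of the correct fragment should also vary so it cannot be anticipated. The sketch below shows one way to generate such a trial list; the function name, trial structure, and the choice of seeding by participant ID (for reproducibility) are assumptions for illustration, not the actual Tobii Pro Lab configuration.

```python
import random

def build_trial_order(stimuli, participant_id, n_positions=4):
    """Return a per-participant randomised trial list.

    Shuffles the stimulus order and assigns the correct (target)
    fragment a random screen position on each trial. Seeding the RNG
    with the participant ID makes each participant's sequence
    reproducible for later analysis.
    """
    rng = random.Random(participant_id)
    trials = list(stimuli)  # copy so the caller's list is untouched
    rng.shuffle(trials)
    return [
        {"stimulus": s, "target_position": rng.randrange(n_positions)}
        for s in trials
    ]
```

Because the seed is the participant ID, rerunning the generator for the same participant reproduces the identical order, which is useful when aligning the eye-tracking log with the stimulus schedule.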