
Research Interests

I am drawn to music as both a remarkable human capacity and a window into the mind. Music performance fascinates me because it demands so much of cognition all at once—perception, memory, prediction, and precisely timed action must unfold together in real time. How do musicians develop these abilities? What can studying them teach us about the mind more broadly?

Within music performance, I am especially interested in music reading—the process by which musicians decode notation and translate it into sound and action. Music notation is a cultural invention, a visual symbol system that musically literate individuals learn to use. In this way, it invites comparison with text reading, as both involve the rapid visual processing of learned symbols and their mapping onto sounds.

Yet music notation differs from text in ways that may be cognitively significant. For example, text is typically read sequentially along a horizontal line; music notation often presents multiple voices simultaneously, requiring integration across both horizontal and vertical dimensions. In text, a letter's position on the page generally does not affect its meaning; in music notation, vertical position directly encodes pitch. And while text reading is primarily oriented toward comprehension, music reading is more directly tied to action—musicians often must translate what they see into motor output in real time.

These characteristics raise broader questions: What is universal about how humans process learned visual symbols, and what varies across domains (a question of domain-generality versus domain-specificity)? How does acquiring a new symbol system reorganize visual processing (e.g., as suggested by the neuronal recycling hypothesis)? I am also interested in how symbols become linked to perceptual experience: skilled musicians often describe "hearing" notation internally, raising questions about cross-modal binding.

In my previous work, I used eye-tracking to study how pianists coordinate visual processing with motor execution during sight-reading performance. This research examined eye-hand span—the distance between where a performer looks and where they play—and suggested that skilled readers are distinguished less by how far ahead they look than by how flexibly they adjust this distance in response to musical demands. Building on this foundation, I am now expanding my methodological toolkit to include EEG and computational approaches.

Current Research

Since beginning my Ph.D. in Cognitive Science and joining the Music and Mind Lab at the Indiana University Jacobs School of Music (2024–present), I have been broadening both my theoretical perspective (drawing on visual cognition, psycholinguistics, reading research, statistical learning, and predictive processing) and my methodological skills (including EEG/ERP techniques and computational modeling).

For my first-year project, I conducted a behavioral study examining how harmonic syntax is processed through written notation and how this processing relates to sight-reading proficiency.

For my second-year project, I am conducting an EEG study investigating neural markers of sight-reading expertise, focusing on how visual processing of musical notation differs between skilled and less-skilled sight-readers, particularly during the early stages of perception.

