At the intersection of sound and sight, Mariano Salcedo, a master’s student in MIT’s Music Technology and Computation Graduate Program, is developing an AI system that visualizes music and other auditory stimuli. His research centers on neural cellular automata (NCA), which combine traditional cellular automata with machine learning techniques to create self-regenerating images that respond to sound.
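The core idea behind a neural cellular automaton can be sketched in a few lines: every cell on a grid carries a state vector, perceives its neighbors, and updates itself through a small neural network, with stochastic updates giving the system its organic, self-organizing feel. The toy below uses random, untrained weights and plain NumPy purely for illustration; it is not Salcedo’s implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W, C = 32, 32, 8          # grid size, channels per cell (visible + hidden)
state = np.zeros((H, W, C))
state[H // 2, W // 2] = 1.0  # seed a single live cell in the center

# Tiny random "update network" (hypothetical weights, untrained): maps each
# cell's perception vector to a state delta.
W1 = rng.normal(0, 0.1, (3 * C, 32))
W2 = rng.normal(0, 0.1, (32, C))

def perceive(s):
    # Approximate x/y gradients with neighbor differences on a toroidal grid.
    gx = np.roll(s, -1, axis=1) - np.roll(s, 1, axis=1)
    gy = np.roll(s, -1, axis=0) - np.roll(s, 1, axis=0)
    return np.concatenate([s, gx, gy], axis=-1)   # shape (H, W, 3*C)

def step(s):
    h = np.maximum(perceive(s) @ W1, 0)           # ReLU hidden layer
    ds = h @ W2
    mask = rng.random((H, W, 1)) < 0.5            # stochastic cell updates
    return np.clip(s + ds * mask, 0.0, 1.0)

for _ in range(20):
    state = step(state)

print(state.shape)  # → (32, 32, 8)
```

In a trained NCA, the two weight matrices would be learned so that repeated local updates grow and regenerate a target image; here they simply demonstrate the update loop’s structure.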
Salcedo’s approach allows users to generate music-driven visuals that reflect the energy of the audio input. “This approach enables anyone to create music-driven visuals while leveraging the expressive and sometimes unpredictable dynamics of self-organized systems,” he explains. Through a web interface he has designed, users can manipulate the relationship between the music’s energy and the NCA system, crafting unique visual performances from any audio stream.
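One plausible way to couple audio “energy” to such a system, as the article describes, is to measure each audio frame’s loudness and let it control how aggressively the automaton updates. The sketch below is a hypothetical illustration of that mapping (the function names and the 0.5 normalization point are assumptions, not details of Salcedo’s interface):

```python
import numpy as np

def rms_energy(frame):
    """Root-mean-square energy of one audio frame (samples in [-1, 1])."""
    return float(np.sqrt(np.mean(frame ** 2)))

def energy_to_update_rate(energy, lo=0.1, hi=0.9):
    # Map energy to the fraction of cells updated per step:
    # louder audio -> more cells fire -> more visual motion.
    return lo + (hi - lo) * min(energy / 0.5, 1.0)

# Simulated quiet and loud frames (hypothetical 1024-sample sine tones).
t = np.linspace(0, 1, 1024)
quiet = 0.05 * np.sin(2 * np.pi * 220 * t)
loud = 0.8 * np.sin(2 * np.pi * 220 * t)

print(energy_to_update_rate(rms_energy(quiet)))  # low update rate
print(energy_to_update_rate(rms_energy(loud)))   # high update rate
```

A web interface like the one described could expose `lo`, `hi`, and the normalization point as user-adjustable knobs, letting a performer tune how strongly the visuals react to the music.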
A New Academic Path
One of the inaugural students in the Music Technology and Computation Graduate Program, Salcedo holds an SB in Artificial Intelligence and Decision Making from MIT, where he focused on signal processing and its implications for understanding AI. His current work is supported by the Alex Rigopulos Fellowship, named after the co-founder of Harmonix Music Systems, who has expressed enthusiasm for the program’s potential to foster innovation in music technology.
From Mechanical Engineering to Music Technology
Salcedo’s journey to this point was not straightforward. Initially a mechanical engineering student, he shifted his focus to artificial intelligence after a transformative encounter with a language model chatbot, which ignited his passion for music technology and led him to explore how AI could enhance musical creativity. He has since pursued projects ranging from DJing at MIT to music technology research during a trip to Chile.
Research and Community Engagement
Salcedo’s research, titled “Artificial Dancing Intelligence: Neural Cellular Automata for Visual Performance of Music,” was presented at the Association for the Advancement of Artificial Intelligence conference in January 2026. He believes his work could extend beyond music visualization to improve models of self-organized systems, such as those found in nature and society.
Throughout his academic journey, Salcedo has emphasized the importance of community and collaboration. He values the relationships he has built with fellow students and faculty, who encourage diverse pursuits and the exploration of individual ideas. His commitment to understanding the broader implications of music and technology reflects his desire to make a meaningful impact in the field.
Ultimately, Salcedo aims to share the joy of music and technology with a wider audience, inviting them to engage with the creative processes that shape his work. “I want users to feel movement and explore sounds and their impact more fully,” he states, highlighting his vision for a future where music and technology converge to enrich human experience.
This article was produced by NeonPulse.today using human and AI-assisted editorial processes, based on publicly available information. Content may be edited for clarity and style.