Below is a selection of audiovisual projects by Renée Qin, created between 2023 and 2024, that merge sound, visuals, and interactivity to craft immersive experiences. Using innovative techniques, these works transform music and sound inputs into dynamic visual narratives, bridging the auditory and visual realms. From physics-based interactions to generative landscapes, each project explores themes of emotion, space, and connection, offering audiences a unique exploration of harmony between sound and sight. Through a thoughtful fusion of artistic creativity and technological experimentation, Renée reimagines how sound and visuals can interact, creating experiences that inspire, engage, and resonate deeply.
Celestial Sparks (2024): An Interactive Audiovisual Sequencer
Celestial Sparks is an immersive audiovisual sequencer in which flickering stars set off firework-like bursts, merging music composition with dynamic visual storytelling. Each star acts as a trigger, producing an ethereal blend of sound and visual effects that simulate the burst, glow, and graceful fade of fireworks. The sequencer generates a unique performance every time, with randomized particle shapes, colors, and tonal layers. Stars play notes drawn from a pentatonic sequence, creating harmonic and ambient soundscapes that are both soothing and exploratory. This interplay of light, color, and sound transforms into a celestial canvas of melodies, subtle animations, and evolving textures. Designed to be both interactive and generative, Celestial Sparks invites users to engage with music and visuals in real time, providing an experience that is equal parts performance, creation, and spectacle: a harmony of the cosmic and the creative.
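To make the note-selection logic concrete, here is a minimal Python sketch of how a star trigger could map to a pentatonic pitch and a randomized burst. It is illustrative only, not the project’s code: the C major pentatonic scale, the trigger_star helper, and the particle parameters are assumptions chosen for the example.

```python
import random

# Illustrative sketch only (not Celestial Sparks' actual implementation):
# each star trigger picks a pentatonic note and randomizes its burst visuals.

PENTATONIC_OFFSETS = [0, 2, 4, 7, 9]  # C major pentatonic intervals, in semitones
BASE_MIDI_NOTE = 60                   # middle C

def midi_to_hz(note: int) -> float:
    """Convert a MIDI note number to frequency in Hz (A4 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

def trigger_star(octave_span: int = 2) -> dict:
    """Return sound and visual parameters for one star's firework-like burst."""
    octave = random.randrange(octave_span)
    degree = random.choice(PENTATONIC_OFFSETS)
    note = BASE_MIDI_NOTE + 12 * octave + degree
    return {
        "frequency_hz": round(midi_to_hz(note), 2),
        "color": tuple(random.randint(128, 255) for _ in range(3)),  # bright RGB
        "particles": random.randint(30, 80),        # burst density
        "fade_seconds": random.uniform(1.5, 4.0),   # glow-and-fade duration
    }

if __name__ == "__main__":
    for _ in range(3):
        print(trigger_star())
```

Because every pitch is drawn from a fixed pentatonic palette, any combination of triggered stars stays consonant, which is one way a sequencer like this can remain harmonic while still generating a unique performance each time.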
Melodic Bounce (2024): A Fusion of Sound and Motion
Melodic Bounce is an interactive MIDI visualizer that transforms sound into dynamic, real-time visuals. By uploading a MIDI file, the program extracts a single track and represents its pitch and rhythm through the physics-driven motion of a ball bouncing and colliding with platforms. Each collision and rebound is synchronized with the melody, visually illustrating the music’s flow. Users can interact with the experience by changing platform colors and triggering particle effects during collisions, adding layers of visual playfulness and personalization. The smooth, natural motion of the ball bridges the realms of audio and visuals, offering a captivating and satisfying experience that transforms a melody into a tangible, visual journey. With its seamless synchronization of music, motion, and interactivity, Melodic Bounce reimagines the way audiences experience sound, making it as much fun to watch as it is to hear.
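As a rough illustration of the track-extraction and pitch-mapping steps described above, the sketch below reads note events from a single track with the open-source mido library and maps MIDI pitch to a vertical platform position. The function names, screen height, and mapping are hypothetical, and the project’s actual physics and rendering are omitted.

```python
import mido  # assumes the open-source 'mido' MIDI library is installed

def extract_note_events(path: str, track_index: int = 0) -> list[tuple[int, int]]:
    """Read one track of a MIDI file and return (time_in_ticks, pitch) events.

    Simplified: tempo changes, velocity, and note-off handling are ignored.
    """
    midi = mido.MidiFile(path)
    track = midi.tracks[track_index]
    events, ticks = [], 0
    for msg in track:
        ticks += msg.time  # delta time (ticks) since the previous message
        if msg.type == "note_on" and msg.velocity > 0:
            events.append((ticks, msg.note))
    return events

def pitch_to_platform_height(pitch: int, screen_height: int = 600) -> float:
    """Map MIDI pitch (0-127) so higher notes place the platform, and the
    ball's bounce point, higher on the screen."""
    return screen_height * (1 - pitch / 127)

if __name__ == "__main__":
    for ticks, pitch in extract_note_events("melody.mid")[:5]:  # hypothetical file
        print(ticks, pitch, round(pitch_to_platform_height(pitch), 1))
```

In a setup like this, the ball’s launch velocity can then be chosen so that it reaches each platform exactly when the corresponding note sounds, which is what keeps the collisions locked to the melody.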
Dualscape (2024): A Symbiosis of Sound and Visuals
Dualscape is an immersive audio-visual narrative that dynamically explores sound visualization using ChuGL. By integrating auditory inputs generated through ChucK, the project transforms music into visually captivating displays, featuring mirrored waveforms and spectral landscapes. These symmetrical and ethereal visuals respond directly to the audio, creating a seamless interplay between sound and sight. More than a visualizer, Dualscape serves as an interpretative layer that deepens emotional and sensory engagement, enhancing the audience’s connection to the music. The project highlights the artistic fusion of technology, sound, and visual design, offering a transformative experience where the boundaries between audio and visuals dissolve. The demonstration video features Renée Qin’s composition “Ephemeral” (2024), a piece that complements the project’s reflective and transient nature. Through this synergy of sound and visuals, Dualscape invites audiences into a space where music is not only heard but also seen, reimagining the possibilities of audiovisual storytelling.
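The mirroring idea can be sketched outside of ChucK and ChuGL as well. The NumPy snippet below is a simplified, hypothetical illustration rather than the project’s code: it downsamples an audio buffer into a fixed number of display bins and reflects the result around its center, producing the kind of symmetric waveform shape described above.

```python
import numpy as np

def mirrored_waveform(samples: np.ndarray, bins: int = 256) -> np.ndarray:
    """Reduce an audio buffer to display bins and mirror it around its center.

    Illustrative only; Dualscape itself is built with ChucK and ChuGL.
    """
    # Trim so the buffer divides evenly, then average samples into bins.
    trimmed = samples[: len(samples) - len(samples) % bins]
    reduced = trimmed.reshape(bins, -1).mean(axis=1)
    # Concatenate the reduced waveform with its reverse to get symmetry.
    return np.concatenate([reduced, reduced[::-1]])

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 44100, endpoint=False)
    buffer = 0.5 * np.sin(2 * np.pi * 220 * t)  # a 220 Hz test tone
    shape = mirrored_waveform(buffer)
    print(shape.shape)  # (512,) points for a symmetric on-screen shape
```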
Theremin (2023): A Fusion of Sound and Technology
This project focused on constructing a theremin, a unique musical instrument that produces sound without physical contact. Using basic electronic components such as NAND circuits and operational amplifiers, Renée Qin initially assembled the instrument on a breadboard at Stanford CCRMA’s Max Lab before soldering it onto a permanent board to enhance sound quality. The demonstration video features Renée Qin performing the theme music from “Doctor Who” on the self-built theremin. This project showcases the fascinating interplay between sound, electronics, and physics, while emphasizing the artistic and technical challenges of one of the earliest and most innovative electronic instruments.