Sensory Datascape Series explores how digital technology reshapes perception, centering on ‘fine-tuning’—the active calibration of our senses through technological mediation. As we navigate reality increasingly through digital interfaces rather than direct encounters, we must ask: Do we actively shape perception, or passively consume experiences curated by technology?
Begun in 2019, the project used LiDAR to translate visual landscapes into soundscapes, challenging perception and expanding spatial sensing. In 2023, Fine-Tuning Human Sense 1.0, a digital implant apparatus shaped like insect eyes, introduced a new perceptual mode: spatial structures could be perceived in motion, but details blurred and sounds dissolved into noise. To see clearly, viewers had to stop and blink repeatedly until their eyes twitched, deliberately disrupting habitual seeing.
The 2025 Version 2.0 evolves with AI and enhanced spatial sound. In this version, unseen spaces appear as darkness overlaid with AI-generated textual predictions, so that AI acts not merely as a tool but as a mediator of spatial perception. By blinking, viewers fine-tune their perception and confront the gap between AI-generated text and their own sensory experience, actively shaping perception at the intersection of mediated and human sensing.
Throughout the series, digital devices act as “Digital Implants,” reshaping our senses as we co-evolve with technology. As technology outpaces our sensory adaptation, we must ask: Can we actively fine-tune our senses like zoom lenses in an age of information overload? The answer lies in whether we passively receive or actively shape our perceptual evolution.
Sensory Datascape Series cultivates sensory literacy in the digital age, inviting viewers to recalibrate their senses through active engagement. In an era of constant information flow, it challenges us to critically navigate and refine our perception.
Director, sound algorithm, production: Hoonida Kim
AI & Vision recognition programming: Park Jae Hyeon
Structural engineering: Choi Jong Eon
Technical assistants: Moon Sung Yun, Dasol Jung
Videographers: Jeong Gil Woo, Yang Seung Wook, Taxu Lee, Yonggi Joe, Hoonida Kim
Special thanks: Shin Yeasul, Woo Heeseo, Ku Yena, Go Dam, Philip Liu
Fine-Tuning Human Senses — Supported by ZER01NE (Hyundai Motors), Coreana Museum; Selected works from Sensory Datascape Series — Commissioned by the National Museum of Modern and Contemporary Art, Korea.
Hoonida Kim’s (KR) work focuses on technology that deeply penetrates human ecology, as well as the transformations in ecology that technology triggers. Human perception and sensibility cannot keep pace with the speed of social change, nor can they sense and analyze the enormous amount of information flooding our society. Acknowledging this, Kim has been producing “environmental recognition apparatuses” to respond actively to such changes, and has made unique attempts to implant these apparatuses, directly and indirectly, into humans. Selected exhibitions include the National Museum of Modern and Contemporary Art, Korea; SongEun Art Space; Perigee Gallery; Cheongju Craft Biennale; and Nam June Paik Art Center.
In the Sensory Datascape Series, Hoonida Kim explores how digital technology not only mediates our perception, but actively reshapes it. Version 2.0 transforms the act of sensing into a deliberate practice—requiring viewers to blink, pause, and recalibrate as they confront AI-generated predictions layered over blurred vision and spatial sound. The work turns passive seeing into an active negotiation between human senses and machine interpretation. By positioning digital tools as "implants" that fine-tune rather than simply extend perception, the project offers a nuanced reflection on how we co-evolve with technology. It challenges the assumption that more information leads to clearer understanding, instead revealing how mediation can both enhance and obscure. The jury recognizes the Sensory Datascape Series for its thoughtful engagement with the complexities of sensory perception in the digital age, and for encouraging viewers to question the terms under which they see, hear, and know.