morphecore was an experimental lecture-performance probing new possibilities at the intersection of neuroscience and dance. Armed with fMRI and brain decoding technology, the project reconstructed bodily poses from visual cortex activity, generating choreography that was then manipulated to test a range of physical variables, from the strength of gravity to muscle elasticity and joint rotation, in an exploration of what modes of dance might arise free of real-world constraints. The results were presented in a video narrated by a Daito Manabe avatar, culminating in a dance performance by this digital Daito that grew increasingly abstract as it transcended the physical limits of the human body. The performance drew on 3D scan data of Daito Manabe, visual cortex activity recorded by fMRI, and motion capture data from Shingo Okamoto and ELEVENPLAY dancers.

By parsing and reconstructing dance as three constituent elements (pose, motion, and choreography), we sought to probe modes of physical expression free from the constraints of gravity and the physical limits of the body’s range of motion. By further studying the “noise” and “glitches” that arise in chaotic neural processes, the project anticipated a future in which dance might be generated by sound stimuli delivered to the brain, producing an interactive response in the body.
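
To make the three-layer decomposition concrete, the sketch below separates pose, motion, and choreography in code and shows how per-layer constraints can be loosened. It is a minimal illustration under assumed names and sizes (a 17-joint skeleton, sinusoidal test motion), not the project's actual pipeline.

```python
import numpy as np

# Minimal sketch (not the project's actual pipeline): dance as three layers.
#   pose         = per-joint rotation angles at one instant
#   motion       = a sequence of poses over time
#   choreography = an ordered arrangement of motion clips

N_JOINTS = 17  # hypothetical skeleton size
FPS = 60

def make_motion(n_frames: int) -> np.ndarray:
    """Synthesize a test motion: (n_frames, N_JOINTS) joint angles in radians."""
    t = np.linspace(0, 2 * np.pi, n_frames)[:, None]
    return 0.5 * np.sin(t + np.linspace(0, np.pi, N_JOINTS)[None, :])

def limit_joints(motion: np.ndarray, limit: float) -> np.ndarray:
    """Clamp rotations to +/-limit; a huge limit removes the body's range-of-motion constraint."""
    return np.clip(motion, -limit, limit)

def rescale_gravity(height: np.ndarray, g_scale: float) -> np.ndarray:
    """Re-time a jump's vertical arc as if gravity were scaled by g_scale.
    Ballistic flight time scales as 1/sqrt(g), so weaker gravity stretches the clip."""
    n = len(height)
    stretched = int(round(n / np.sqrt(g_scale)))
    return np.interp(np.linspace(0, n - 1, stretched), np.arange(n), height)

# Choreography: concatenate one anatomically constrained clip and one "free" clip.
constrained = limit_joints(make_motion(2 * FPS), limit=0.4)
unconstrained = limit_joints(make_motion(2 * FPS), limit=10.0)
choreography = np.concatenate([constrained, unconstrained], axis=0)

# A jump under Moon-like gravity (g/6) lasts about sqrt(6) ~ 2.4x longer.
jump = np.sin(np.linspace(0, np.pi, FPS))
moon_jump = rescale_gravity(jump, g_scale=1 / 6)
print(choreography.shape, len(jump), len(moon_jump))  # (240, 17) 60 147
```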

As the coronavirus pandemic prevented gathering new data from in-person test subjects, the work was produced as a prototype based on simulations built from prior fMRI data and procedures acquired in 2018. Although an actual future data set will inevitably yield different results, the brain decoding methodology underlying the project remains the same. It is a work in progress.

Research

Since 2014, Daito Manabe has experimented with the brain decoding technology researched by Dr. Yukiyasu Kamitani at Kyoto University. “Brain decoding” seeks to look into the mind’s eye by reconstructing the images seen by test subjects from activity in their cerebral cortex. In 2018, Manabe adapted this technique in a series of installations and live shows that generated the images imagined while listening to music, a novel departure from conventional artificial synesthesia and VJ approaches to visualization, which rely on music waveforms and spectra. In morphecore, brain decoding was used to extract poses imagined in the mind; these poses were assembled with motion data and choreography to create a dancing CG Manabe avatar.

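As a schematic illustration of the decoding step described above: brain decoding can be framed as learning a mapping from fMRI voxel patterns to a feature vector describing the stimulus (here, a body pose). The sketch below uses simulated data and plain closed-form ridge regression; the sizes, names, and regression choice are illustrative assumptions, and the Kamitani lab's published pipelines are considerably more elaborate.

```python
import numpy as np

# Schematic sketch of the decoding idea, on simulated data (assumption: plain
# ridge regression stands in for the lab's actual published methods).

rng = np.random.default_rng(0)
n_trials, n_voxels, n_pose_dims = 200, 1000, 34  # 17 joints x (x, y), hypothetical

# Training set: voxel activity X per trial and the pose features Y the subject
# saw or imagined on that trial (a real experiment records X with fMRI).
W_true = rng.normal(size=(n_voxels, n_pose_dims))
X = rng.normal(size=(n_trials, n_voxels))
Y = X @ W_true + 0.1 * rng.normal(size=(n_trials, n_pose_dims))

# Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y
lam = 10.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_voxels), X.T @ Y)

# Decode the pose imagined during a new scan and reshape to 17 (x, y) joints.
x_new = rng.normal(size=(1, n_voxels))
pose_hat = (x_new @ W).reshape(17, 2)
print(pose_hat.shape)  # (17, 2)
```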

Credit

Motion Capture Dancer and Choreographer: Shingo Okamoto
Supervisor: MIKIKO (ELEVENPLAY)
Music Co-producer: Hopebox

Editing Director: Kenichiro Shimizu (PELE)
CG Director: Kenta Katsuno (+Ring)
Effects Artist: Tetsuro Takeuchi (quino grafix)
Effects Artist: Jun Satake (TMS JINNIS)
Effects Artists: Tai Komatsu (cai), Keisuke Toyoura (cai)
Effects Artist: Mikita Arai (Freelance)
Effects Artist: Tsukasa Iwaki (+Ring)
CG Producer: Toshihiko Sakata (+Ring)
Data Processing: 2bit
Motion Capture: Crescent, inc.
3D Scan: K’s DESIGN LAB
Compositor: Naoya Kawata (PELE)
Project Manager: Naoki Ishizuka (Rhizomatiks) + Yurino Nishina (PELE)
Producer: Takao Inoue (Rhizomatiks)

Related

  • Sónar+D CCCB 2020

    Online, Barcelona, Spain
    Live, Talk/Lecture
    2020
  • ZER01NE DAY 2021

    Online, Seoul, Korea
    Screening, Talk/Lecture
    2021
  • Prix Ars Electronica, Honorary Mention in the Interactive Art + category

    Linz, Austria
    Award
    2022
  • Cairotronica 2021

    AUC Tahrir Cultural Center, Cairo, Egypt
    Exhibition
    2021
  • SEEING The INVISIBLE

    12 botanical gardens in various countries, including the Jerusalem Botanical Garden, Jerusalem, Israel
    Exhibition
    2021
  • Daito Manabe Audiovisual Performance

    VS., Osaka, Japan
    Live
    2023