Perfume x TECHNOLOGY | Changing with Tokyo 2020 | NHK Tokyo2020
Perfume×TECHNOLOGY is a program where NHK and Perfume explore the new possibilities of entertainment as they approach TOKYO 2020.
Join Perfume, the group that continues to amaze the world with their use of the latest technology, as they conduct various experiments and introduce the TECHNOLOGY behind the scenes.
Stage direction/choreography: MIKIKO
Interaction design: Rhizomatiks
Special stage performance brought to you by Perfume in collaboration with the creative team
behind the 2016 Rio Summer Olympics/Paralympics Closing Flag Handover Ceremony.
This performance was conducted under the auspices of This is NIPPON Premium Theater.
Sponsored by NHK, the program is an exciting platform for cultural expression building up to the 2020 Tokyo Olympics/Paralympics.
※ Recorded approximately 2 weeks before the Reframe performance.
Perfume’s silhouettes soar to superhuman proportions, rotating behind the trio in a mysterious dance. Where does reality end and projection begin? FUSION was crafted to blur the boundary between performer and technology, resulting in a surreal effect that surely caught many viewers by surprise.
The performance was achieved by meticulously blending CG imagery with the real-life movement of Perfume onstage. In a technique known as “dynamic projection,” imagery was projected at precisely the right moment onto a series of screens programmed to move nimbly in the background. Successful execution of the scene also relied on advanced “virtual shadow” techniques, which convincingly fused real-life shadows with the virtually generated silhouettes.
During the performance of Negai, photos uploaded by fans from around the globe glittered like a constellation of brilliant stars in the night sky. Leading up to the Reframe performance, we unveiled the Reframe Your Photo project: a system that let users submit snapshots, which were then analyzed and matched with similar stills extracted from Perfume’s music videos to date. The technology identified a range of distinguishing features in each user-submitted image, compared these features with scenes from Perfume’s music videos, and automatically determined which scene was the closest match. The matches were then displayed in an interface for users to enjoy, post on the project webpage, and share on social media.
In the opening moments of the song Mugenmirai, CG elements were synthesized into AR in real time. This was achieved by outfitting drones with cameras, then transmitting their footage to a system capable of rapidly analyzing positional information. The AR graphics were generated in real time by tracking the position of the frames manipulated by Perfume onstage. We also developed a system capable of precisely projecting an array of lasers onto multiple stipulated points. The movement of the performers’ hands was tracked in advance using motion capture technology, which provided the data necessary to project the lasers in synchronization with the choreography. Watching the completed performance, one must marvel at the seamlessness of the laser projection. This is a testament to Perfume’s precision as dancers. The performance was a success thanks to their absolute accuracy from rehearsal to concert.
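The frame-tracking step can be pictured as a simple geometric transform: once a tracker reports where a hand-held frame sits on screen and how it is rotated, each CG vertex is rotated and translated to follow it. The sketch below is purely illustrative and assumes a hypothetical 2D tracker output; it is not the production pipeline used on stage.

```python
# Minimal sketch (hypothetical) of anchoring an AR graphic to a tracked frame:
# the tracker reports the frame's on-screen position and rotation each video
# frame, and every CG vertex is rotated and translated to follow it.
import math

def anchor(cg_points, frame_pos, frame_angle_deg):
    """Transform CG points (frame-local coords) into screen coordinates."""
    a = math.radians(frame_angle_deg)
    c, s = math.cos(a), math.sin(a)
    fx, fy = frame_pos
    # Standard 2D rotation followed by translation to the tracked position.
    return [(fx + c * x - s * y, fy + s * x + c * y) for x, y in cg_points]

# A unit square drawn relative to a frame tracked at (10, 5), rotated 90 deg.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(anchor(square, (10.0, 5.0), 90.0))
```

In a real system the tracker would deliver a full 3D pose per camera frame, but the anchoring idea is the same: re-express the graphic in the tracked object’s coordinate system every frame.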
In this groundbreaking project, fans can now upload their own snapshots to see how their lives intersect with Perfume’s universe. Through the power of advanced image analysis and machine learning technologies, this system analyzes user-submitted photos to find the closest match in a database cataloguing individual frames from Perfume’s music videos.
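The matching step described above can be sketched as nearest-neighbor search over image feature vectors. The toy below uses a coarse RGB histogram as the feature and Euclidean distance for similarity; the actual system's features, model, and database are not public, so every name and choice here is an illustrative assumption.

```python
# Hypothetical sketch of the photo-to-frame matching step: each image is
# reduced to a feature vector (here a coarse RGB histogram), and a submitted
# photo is matched to the stored music-video frame whose vector is nearest.
import math

def rgb_histogram(pixels, bins=4):
    """Reduce a list of (r, g, b) pixels (0-255) to a normalized histogram."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]  # normalize so image size doesn't matter

def closest_frame(photo_pixels, frame_db):
    """Return the id of the stored frame nearest the photo in feature space."""
    query = rgb_histogram(photo_pixels)
    def dist(frame_id):
        vec = frame_db[frame_id]
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(query, vec)))
    return min(frame_db, key=dist)

# Toy database: one mostly-red frame and one mostly-blue frame.
db = {
    "frame_red":  rgb_histogram([(250, 10, 10)] * 50),
    "frame_blue": rgb_histogram([(10, 10, 250)] * 50),
}
print(closest_frame([(240, 20, 20)] * 30, db))  # → frame_red
```

A production system would swap the histogram for learned features (e.g., embeddings from a trained network) and an approximate nearest-neighbor index, but the lookup structure is the same.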
A complex array of lasers can be projected on stage by controlling the exposure timing of vertically oriented CMOS image sensors together with the timing of the lasers. By calibrating parameters such as the trajectories and positions of the lasers against an optical motion capture system collecting the cameras’ location data, the lasers can be made to appear anywhere on screen. With this system, lasers can even trace three-dimensional objects, producing an on-camera display different from the one seen inside the venue.
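The choreography-synchronized aiming described here can be sketched as interpolation over a pre-recorded motion-capture timeline: given hand positions sampled in rehearsal, the laser’s target for any moment in the song is interpolated between the two nearest samples. All names and numbers below are hypothetical, and a real rig would work in 3D with hardware-synchronized clocks.

```python
# Illustrative sketch of syncing laser targets to pre-captured choreography:
# motion capture yields a timeline of (time, x, y) hand positions, and at
# each output frame the laser aim point is linearly interpolated in time.
def aim_at(timeline, t):
    """Interpolate an (x, y) laser target from a time-sorted (t, x, y) list."""
    if t <= timeline[0][0]:
        return timeline[0][1:]          # clamp before the first sample
    if t >= timeline[-1][0]:
        return timeline[-1][1:]         # clamp after the last sample
    for (t0, x0, y0), (t1, x1, y1) in zip(timeline, timeline[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)    # fraction of the way between samples
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))

# Toy capture: the hand moves right, then up, over two seconds.
capture = [(0.0, 0.0, 1.0), (1.0, 2.0, 1.0), (2.0, 2.0, 3.0)]
print(aim_at(capture, 0.5))  # → (1.0, 1.0), halfway through the first move
```

Because the targets come from rehearsal capture rather than live tracking, the seamless result depends entirely on the dancers reproducing the choreography exactly, which is the precision the text credits to Perfume.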
From the live stream on Wednesday, March 21 of the reality-altering “Reframe” stage, made possible by a fusion of leading-edge technologies. Your submitted photos may have been used!
For “Spring of Life” (2012), Perfume appeared with wirelessly controlled LEDs in their outfits, and 3D scanning technology was used to project giant images of them performing on the screens behind them.
For “Magic of Love” (2013), the stage was transformed by projection mapping. Sensors tracked Perfume as they danced so that images could be projected onto their moving outfits.
For “Cling Cling” (2014), Perfume incorporated drones into their stage production, a rarity back then. They lit up the stage with 9 drones operating simultaneously!
For “Pick Me Up” (2015), seamless MR (Mixed Reality) technology was used to fuse the world on stage with CG in real time. The result was a performance that could not be expressed in the real world!
With technologies including motion capture and 3D scanning combining to create a dynamic VR (Virtual Reality) environment, Perfume appeared to float in mid-air during their performance of “FLASH”!
A rare uncut performance of “TOKYO GIRL” from the special program Uchimura Gorin Sengen, which aired in October 2017! Messages from around the world were assembled and appeared during the live performance in a fusion of Dynamic VR (Virtual Reality) and MR (Mixed Reality).
For “TOKYO GIRL” (2017), Perfume performed on a helipad 200 meters above Shibuya. Using Dynamic VR (Virtual Reality) and MR (Mixed Reality), all of Shibuya became their stage!