The opening ceremony for the esports finals features a real-time AR gaming character being interviewed and joining a dance troupe on stage
Real-time facial animation and live ray tracing were used for the first time during the live broadcast of Riot Games’ League of Legends Pro League (LPL) regional finals.
The broadcast saw an AR computer game character interacting live with people on stage – dancing alongside real dancers and being interviewed live.
The AR character was seamlessly integrated into the scenes, with realistic CG lighting and shadow effects ensuring that its response to the lighting in the venue matched that of the real-life presenters and dancers.
An algorithm traced rays of light from the CG light sources and simulated how that light interacted with the virtual character.
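To make the principle concrete, here is a minimal sketch in Python – not the Pixotope/RTX pipeline used on the broadcast, and with a made-up scene (one sphere, a ground plane and a single point light). It fires one primary ray per pixel, finds the nearest surface, then fires a secondary shadow ray toward the light to decide whether the hit point is lit, which is how ray tracing keeps lighting and cast shadows consistent across a scene.

```python
# Toy ray-tracing sketch (illustrative only): one ray per pixel, nearest hit
# against a sphere and a ground plane, Lambertian shading plus a shadow ray
# toward a single point light. Scene values are invented for the example.
import math

SPHERE_CENTER = [0.0, 0.0, -3.0]
SPHERE_RADIUS = 1.0
PLANE_Y = -1.0                      # ground plane y = -1
LIGHT = [3.0, 4.0, 0.0]             # point light position

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def hit_sphere(origin, direction):
    """Distance along the ray to the sphere, or None if it misses."""
    oc = [o - c for o, c in zip(origin, SPHERE_CENTER)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c                      # unit direction, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def hit_plane(origin, direction):
    """Distance along the ray to the ground plane, or None."""
    if abs(direction[1]) < 1e-9:
        return None
    t = (PLANE_Y - origin[1]) / direction[1]
    return t if t > 1e-4 else None

def shade(point, normal):
    """Lambertian term from the light; nearly black if the sphere blocks it."""
    to_light = normalize([l - p for l, p in zip(LIGHT, point)])
    if hit_sphere(point, to_light) is not None:   # shadow ray
        return 0.05
    return 0.1 + 0.9 * max(0.0, sum(n * l for n, l in zip(normal, to_light)))

def trace(origin, direction):
    """Return a grey level in [0, 1] for one camera ray."""
    ts = hit_sphere(origin, direction)
    tp = hit_plane(origin, direction)
    if ts is not None and (tp is None or ts < tp):
        hit = [o + ts * d for o, d in zip(origin, direction)]
        normal = normalize([h - c for h, c in zip(hit, SPHERE_CENTER)])
        return shade(hit, normal)
    if tp is not None:
        hit = [o + tp * d for o, d in zip(origin, direction)]
        return shade(hit, [0.0, 1.0, 0.0])
    return 0.0                                    # background

if __name__ == "__main__":
    # Render a small ASCII image: eye at the origin looking down -z.
    shades = " .:-=+*#%@"
    for y in range(30):
        row = ""
        for x in range(70):
            u = (x + 0.5) / 70 * 2.0 - 1.0
            v = 1.0 - (y + 0.5) / 30 * 2.0
            level = trace([0.0, 0.0, 0.0], normalize([u, v, -1.0]))
            row += shades[min(len(shades) - 1, int(level * (len(shades) - 1)))]
        print(row)
```

Running it prints an ASCII image of a lit sphere casting a shadow on the ground. A production renderer answers the same kinds of visibility and shading queries, but with vastly more rays, materials and light sources per frame – which is why real-time ray tracing has been out of reach until recently.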
The performance was remarkable from a technical point of view: ray tracing requires enormous amounts of rendering power, which is why it had previously only been used for non-real-time VFX in film and TV.
Riot Games utilised a range of products to enable the effect:
- tech and creative services company The Future Group’s Pixotope for the AR graphics
- Cubic Motion for the real-time, performance-driven facial animation
- Vancouver’s Animatrik for motion capture
- camera tracking hardware company Stype for all the live camera tracking on the project
The release of Nvidia’s powerful RTX series graphics cards also helped make the real-time ray tracing possible.
The Future Group CEO Halvor Vislie said: “For this project, Riot Games brought in several outstanding partners who all worked tirelessly to deliver on the client’s creative ambition, and give the audience something extra special.”