The future of live television, where the viewer becomes the director, has been unveiled in the cutting-edge surroundings of the University of Salford’s MediaCityUK campus.
University researchers were joined by partners from 10 technology companies, broadcasters, higher education institutions and other organisations across Europe for the final demonstration of the FascinatE Project – an £8m EU-funded study examining the next generation of interactive broadcast media technology.
Researchers have spent the last three-and-a-half years developing a complete future broadcast system, bringing together ultra-high definition panoramic video, 3D audio and new ways in which viewers can control the pictures and sound from their television, PC, tablet or mobile phone.
Participants and guests, including students, academics and representatives from the broadcast and creative industries, experienced the results of the project at a special live broadcast from the University’s Digital Performance Lab at its MediaCityUK building.
Visitors enjoyed a behind-the-scenes demonstration showing how a live performance can be captured for broadcast as an interactive event. The event featured the premiere of "deeper than all roses", a large-scale music composition by the University's Director of Music, Professor Steve Davismoon, incorporating the poetry of E. E. Cummings. The performance featured live dance artists Joseph Lau and Shona Roberts and music from student band Bears?BEARS! with guest vocalist Anikó Tóth.
A live feed of the performance could then be watched by guests on a range of screens and devices, including flat panel TVs, tablet PCs and the huge, high-definition video wall in the Egg reception area of the University’s MediaCityUK campus. Viewers controlled their own virtual camera using swipes on tablets or hand gestures in front of larger displays, and could even zoom in on the sounds from individual musicians or the lead singers.
Ben Shirley, the University's FascinatE team leader from the School of Computing, Science & Engineering, explained: "We captured the performance in an ultra-high definition, 180-degree panorama using an 'Omnicam', which incorporates six camera feeds.
“This was combined with footage from a broadcast camera and audio from a special 3D microphone and other mics to create an amazingly interactive experience for viewers. They could effectively become their own director, panning around the performance and zooming in on areas which interested them.
“And focusing in on one part of the scene also changed the audio – for example zooming in on the guitarist brought the instrument’s sound to the front, with the singer fading into the background.”
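The "audio follows the picture" behaviour described here can be caricatured as distance-based gain weighting: sources near the centre of the viewer's virtual camera are boosted and distant ones attenuated. The function, the gain formula and the source positions below are illustrative assumptions only, not the project's actual algorithm.

```python
# Illustrative sketch (not the FascinatE implementation): when the viewer
# zooms toward a point in the panorama, boost audio sources near that
# point and attenuate the rest.

def mix_gains(sources, focus_x, zoom):
    """Return a per-source gain based on distance from the zoom focus.

    sources: list of (name, x_position), x in [0, 1] across the panorama
    focus_x: horizontal centre of the viewer's virtual camera, in [0, 1]
    zoom:    1.0 = full panorama (flat mix); higher = tighter shot
    """
    gains = {}
    for name, x in sources:
        distance = abs(x - focus_x)
        # At zoom 1.0 every source is equal; tighter shots sharpen the falloff.
        gains[name] = 1.0 / (1.0 + (zoom - 1.0) * distance * 4.0)
    return gains

sources = [("guitar", 0.2), ("vocals", 0.5), ("drums", 0.8)]

# Wide shot: flat mix, all gains 1.0.
print(mix_gains(sources, focus_x=0.5, zoom=1.0))

# Zoomed in on the guitarist: guitar to the front, vocals and drums recede.
print(mix_gains(sources, focus_x=0.2, zoom=5.0))
```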
Guests could alternatively leave the broadcast in the hands of a virtual director – software which uses video data analysis to decide on the best camera angles and most interesting parts of the performance, with no human input.
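A virtual director of this kind can be sketched, in very simplified form, as a scoring loop over candidate framings: rate each candidate shot by how much is happening in it, then cut to the best one, with a margin against cutting too eagerly. The scoring metric, the hysteresis margin and all names below are hypothetical, not the project's software.

```python
# Hypothetical sketch of a "virtual director": score candidate camera
# framings by activity (e.g. motion energy from video analysis) and cut
# to the best, but only if it clearly beats the shot already on air.

def pick_shot(candidates, current, hysteresis=0.2):
    """candidates: dict mapping shot name -> activity score.
    current: name of the shot currently on air.
    hysteresis: margin a challenger must exceed before we cut away,
    which prevents rapid back-and-forth cutting between close scores.
    """
    best = max(candidates, key=candidates.get)
    if best != current and candidates[best] < candidates[current] + hysteresis:
        return current  # challenger not clearly better: hold the shot
    return best

# The guitarist's score clearly beats the wide shot, so we cut.
print(pick_shot({"wide": 0.4, "guitarist": 0.9, "singer": 0.5}, current="wide"))

# Here the lead is within the hysteresis margin, so we hold the wide shot.
print(pick_shot({"wide": 0.4, "guitarist": 0.55, "singer": 0.5}, current="wide"))
```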
Ben Shirley continued: "The full FascinatE system may be five or ten years away, but some elements developed by the project may well be in operation as early as 2014."