3D Production: FIFA World Cup™ 2010

The 3D OBVans from AMP and Telegenic are covering 25 games of the FIFA World Cup 2010

In December 2009, FIFA and Sony announced plans for 3D coverage of 25 FIFA World Cup matches. Integral to the selection and adoption of the technologies were Peter Angell, HBS director of production & programming, who served as FIFA special 3D project leader, and Duncan Humphreys, 3D consultant to HBS for the World Cup and partner in UK-based 3D production company Can Communicate. The aim was to give TV-viewing soccer fans a new experience of their sport and even more to cheer about at the World Cup. The tournament should help to kick off what TV makers, networks – and advertisers – hope will become a new dimension in sports viewing at home and in cinemas.


Just as high-definition TV improved sports viewing by adding a sharper, wider field of vision, 3D adds depth to the field, increasing the illusion that you are watching the event in person but closer to the action.

AMP Car8 being air-freighted in an Antonov to South Africa


For the 3D production of the 2010 World Cup, Sony has developed a 3D platform that combines processor, switcher, lenses and camera rigs. The company’s system integration facility in Basingstoke fitted a 3D layer onto the T16 HD truck from UK outside broadcast supplier Telegenic as well as onto the Car8 HD truck from the French production company AMP. Both units were air-freighted directly to South Africa in Antonov aircraft. The AMP truck handles productions in two stadiums, Ellis Park and Soccer City in Johannesburg, while the Telegenic unit covers matches in Durban, Cape Town, and Port Elizabeth.

The 3D Layer OBVans from AMP and Telegenic

The Sony MVS-8000 production switchers in the AMP Car8 and the Telegenic T16 were upgraded with a 3D software package, and 24- and 42-inch LMD-series 3D displays were installed in the monitor wall. PVM 23-inch monitors are used to view camera setup and stereo channel balance. A convergence area was implemented where eight MPE-200 multi-image processors with MPES-3D01 stereo image processing software help the convergence engineers maintain camera alignment and control the rigs. Each box takes in two video streams along with lens metadata from the left and right camera outputs and provides electronic picture correction. It can correct horizontal and vertical image shift, toe-in, tilt and rotation, zoom synchronization, color misalignment and any inversions caused by the use of mirror rigs.
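As an illustration of what such electronic picture correction involves, the sketch below applies two of the listed corrections – vertical image shift and the inversion caused by a mirror rig – to a toy right-eye frame. This is a minimal NumPy sketch of the general technique, not Sony’s MPE-200 implementation; the function name and parameters are invented for illustration.

```python
import numpy as np

def correct_right_eye(right, h_shift=0, v_shift=0, flip_vertical=False):
    """Toy electronic correction for the right-eye image of a stereo pair.

    Illustrative sketch only: the real MPE-200 also handles rotation,
    zoom matching and colour alignment using lens metadata.
    """
    frame = right.copy()
    if flip_vertical:
        # Mirror (under/thru) rigs invert one eye via the half-mirror;
        # the processor flips it back.
        frame = frame[::-1, :]
    # Horizontal/vertical image shift (np.roll wraps at the edges;
    # a real corrector would blank or crop them instead).
    frame = np.roll(frame, shift=(v_shift, h_shift), axis=(0, 1))
    return frame

# Misaligned pair: the right eye sits 3 px too high and is mirrored by the rig.
left = np.arange(36).reshape(6, 6)
right = np.roll(left, -3, axis=0)[::-1, :]
aligned = correct_right_eye(right, v_shift=3, flip_vertical=True)
```

After correction the two eyes match again, which is exactly what the convergence engineer verifies on the monitoring displays.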


3D Camera Layer with MPE-200

The MPE-200 is designed to give outside broadcasters the ability to produce quality 3D without necessarily incurring the expense and additional setup time of fully motorized rigs. The box calibrates the optical centers of the two lenses throughout the entire zoom range. After alignment, a convergence operator can set the required interaxial distance of the rig, and the software will calculate and correct for any misalignment during production. The operator can also monitor and adjust the signals to ensure they do not go beyond the depth budget boundaries.


At the FIFA World Cup the HDC-1500 cameras are working with Canon HJ22ex7.6B lenses, but Fujinon lenses are likely to be supported as well. The same is true for the rigs: at the World Cup the MPE-200 is controlling 3D rigs from Element Technica, but there is no reason why it won’t be able to control 3ality, P+S or Swiss rigs at some point. The product combines hardware based on the Sony Cell processor found in the PS3 with 3D software developed at the Sony R&D center in Basingstoke. Later versions of the software are also planned to deliver enhanced graphics manipulation and digital effects. The processor includes a histogram displaying how much convergence is being pulled and also provides a variety of 3D monitoring methods, including 50 percent mix, above/below, anaglyph, difference and side by side. A convergence puller is assigned to each camera pair, responsible for alignment, camera and rig setup, and pulling convergence live.
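The monitoring modes listed above are standard stereo checks and can be sketched on plain grayscale arrays. The snippet below is illustrative only: the `monitor_view` helper is hypothetical, and the colour-dependent anaglyph mode is omitted.

```python
import numpy as np

def monitor_view(left, right, mode="mix"):
    """Build common stereo monitoring views from a left/right frame pair."""
    if mode == "mix":            # 50 percent mix: misalignment shows as ghosting
        return (left.astype(float) + right) / 2
    if mode == "difference":     # perfectly aligned areas cancel to zero
        return left.astype(float) - right
    if mode == "side_by_side":   # both eyes next to each other
        return np.hstack([left, right])
    if mode == "above_below":    # left eye on top, right eye below
        return np.vstack([left, right])
    raise ValueError(mode)

l = np.full((4, 4), 100)
r = np.full((4, 4), 100)
diff = monitor_view(l, r, "difference")   # identical eyes cancel out
sbs = monitor_view(l, r, "side_by_side")  # doubles the frame width
```

The difference view is the quickest check: any non-zero region marks a geometric or colour mismatch between the eyes.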


The vast majority of coverage is captured in native 3D by mixing the signals from the eight HDC-1500 pairs, but certain shots are up-converted in order to deliver the best possible presentation of the action. Peter Angell explains: “Our primary goal is to tell the story of the match as well as possible, but that doesn’t mean littering the coverage with 2D shots. If there’s a particular incident which has only been captured on a 2D camera, or a 2D camera has the best angle, then, editorially, that shot is critical to the story, and we would be penalizing the viewer if that weren’t included.” After weeks of tests in the run-up to the World Cup, the team finally decided to go with JVC’s IF-2D2D1 2D-to-3D conversion box, although some specific shots are adapted for 3D using the Sony MVS-8000 vision mixer. The video effects function of the MVS-8000 switcher can be used to split the image and create what Angell terms a “pseudo-3D” image from 2D cameras.
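The general idea behind such a pseudo-3D image can be sketched simply: both eyes receive the same flat picture, horizontally offset against each other, so the whole shot sits at a single depth plane. How the MVS-8000 DVE or the JVC IF-2D2D1 actually perform their conversion is not described here; the helper below is an invented illustration of the principle.

```python
import numpy as np

def pseudo_3d(frame_2d, disparity_px):
    """Place a flat 2D shot at a single depth plane.

    Positive disparity pushes the whole (flat) shot behind the screen
    plane; there is no per-object depth, hence "pseudo" 3D.
    """
    half = disparity_px // 2
    left = np.roll(frame_2d, half, axis=1)    # shift left eye right
    right = np.roll(frame_2d, -half, axis=1)  # shift right eye left
    return left, right

frame = np.tile(np.arange(8), (4, 1))  # toy 2D frame
l, r = pseudo_3d(frame, 4)             # 4 px of total disparity
```

Because every pixel gets the same offset, the shot reads as a flat card floating at one depth, which is acceptable for a brief editorial cut-in but not for sustained coverage.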


For the inclusion of 3D live graphics, HBS is working with FIFA’s graphics supplier deltatre on the positioning of graphics on the Z-axis depending on the shot selection. In South Africa both outside broadcast vehicles house XT[2]+ servers which enable dual feeds to be recorded and played back instantly in full timecode synchronization. The combination of hardware and software (XT[2]+ and MulticamLSM) makes all existing capabilities of MulticamLSM available for live 3D productions, including instant replay, loop recording, live clipping, playlist management, live slow motion, cueing and highlight editing. Each of the two 3D OBVans houses six XT[2]+ servers under control of an LSM remote device for the production of the matches.
All 25 matches are recorded in HDCAM SR (SRW-5800) with full-bandwidth left- and right-eye signals on a single tape.

The Positioning of the 3D Cameras in the Stadiums

For coverage of all 25 3D games in South Africa, Quasar rigs from Element Technica are used with Sony’s HDC-1500 cameras and Canon HJ22ex7.6B zoom lenses. There are 16 Quasar rigs in total, eight with each of the two 3D OBVans from AMP and Telegenic.

Each game is covered with eight camera pairs: four positioned on main camera shooting platforms that are slightly lower than their 2D counterparts, and four at field level. In each stadium there are four positions for Quasar side-by-side configurations: main camera wide, main camera tight, and goal line left/right. These side-by-side camera pairs are far enough away from the action that little convergence pulling is needed: nothing comes very close to them and there are no deep background elements, in contrast to the pitch-level positions, where the action may be 5–20 meters away with background elements 50–100 meters behind. Four under/thru configurations are therefore positioned at bench left/right and behind the goal left/right. These are mirror rigs because the action gets close to them and the cameras have to be placed closer together than the side-by-side configuration allows. The under/thru rigs also give the camera operators a full viewfinder and the lens controls at the back as normal.
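The choice between side-by-side and mirror rigs follows from basic stereo geometry: the on-screen disparity an object produces depends on the interaxial distance and on how near the object is relative to the convergence distance. A rough sketch, using a standard stereo approximation with illustrative figures rather than HBS’s actual rig settings:

```python
def screen_disparity(interaxial_m, convergence_m, object_m, focal_px):
    """Approximate on-screen disparity (in pixels) of an object.

    Assumes parallel cameras converged by horizontal image translation.
    Objects at the convergence distance land on the screen plane (0 px);
    nearer objects come out of the screen (negative disparity), farther
    objects go behind it (positive). All figures here are illustrative.
    """
    return focal_px * interaxial_m * (1.0 / convergence_m - 1.0 / object_m)

# Pitch-level scenario from the text: action 5-20 m away,
# background elements 50-100 m behind (hypothetical rig values).
near = screen_disparity(0.04, 10, 5, 2000)    # negative: out of the screen
far = screen_disparity(0.04, 10, 100, 2000)   # positive: behind the screen
```

With both very near action and deep backgrounds in frame, the pitch-level pairs need a small interaxial to keep the total disparity range inside the depth budget, which is what the compact mirror rigs provide; the distant side-by-side positions see a much narrower range of object distances and can live with the wider fixed spacing.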


3D Production Camera Plan

Any camera operator can go to the back of the camera and know how it works. With the level of integration across the Quasar rigs, the Canon lenses and the Sony cameras with the MPE-200 processor box and the CCU, there is one fibre running from the truck to the rig: all image data, communications, rig control, lens control, metadata – everything including power – now travels through a single SMPTE fibre. Interocular and convergence, for instance, can be set either with the Sony box or locally at the rig. And full metadata is available to the convergence pullers and stereographers in the 3D OBVan for analysis of the two images.

The camera positions in each of the five 3D venues (Durban, Cape Town, Port Elizabeth and Johannesburg with Ellis Park and Soccer City) are nearly identical, and the similar positions make it easier for the production crews to work in different venues without having to massively adjust their production philosophy.

The ability to quickly break down and set up the 3D production gear has been almost as important as the new skills related to the production. The team quickly realized that color coding all of the equipment for a given rig made it much easier to ship and assemble. About an hour of optical adjustment is all that is required to get the cameras ready for the match.

Also helping the quality of the production is the decision to pair each camera operator with a convergence puller. HBS even made the teamed-up camera and convergence operators swap roles to understand how the other half lives and to learn how things they do in their regular position can cause problems for their partner. The convergence puller must intuitively follow objects, like the football, out of the screen and decide in a split second whether to pull convergence. It is a bit like a camera operator reacting to an event, but the key thing is to understand the impact the convergence pull will have. While some argue that convergence pullers should be replaced by automated convergence technology for financial reasons, the lesson learnt in South Africa so far is that convergence pulling needs a human brain.

The 3D Creative Aspect

While the first of the World Cup matches were focused on just using 3D to create uniform game coverage, the crews have now graduated to making the productions more sophisticated with better cutting and timing. With each match, the productions improve. Many of the veteran 2D crew members are relearning their craft at the World Cup.

For example, the crews have learned that shooting in 3D allows the production to "cross the line" more often by allowing the cutting to cameras on both sides of the field. With 2D video, if the action is moving from left to right all the camera cuts need to be from the same side of the field to prevent the viewer from getting disoriented. Not so with 3D. With 3D, it is easier for viewers to orient themselves, and there is an increased perception of where the camera is on the field.

Another lesson learnt is the need to frame the main stadium 3D cameras a bit tighter on the game action. The camera operators face a delicate balance, however, because a shot that is too tight requires more panning, and quick movement can introduce motion blur and compression artifacts into the picture. Yet even the coverage of the first game, South Africa – Mexico, presented stunning results. During breakaway and wide-angle shots, the 3D effect was more subtle, as if you were watching from the stands. But in replays and other close-ups of individual players, the 3D effect almost put you into the action. The post-goal close-ups of celebrating fans in the stands and players on the field brought home the energy of the event. Ironically, while the cameras in the stands are going a little tighter, the cameras on the field are going a little wider: wider shots from field level introduce more elements that can add depth to a scene.

The quality of game coverage remained high throughout the broadcast, with few if any transmission glitches or convergence problems. The highlights were the replays, with the replay of the South African goal, when the ball seemed destined to fly out of the screen and hit the viewer, topping the list. Most of the game coverage shot from the traditional soccer upper-level camera positions also provided enough depth, giving viewers a sense of the distance between players and the speed of the action.

The Coverage of the Games in 3D: The Director’s View

When Bruno Hullin, one of the two directors working on the 3D World Cup productions, sat down for his first 3D production on June 11 for the game between Mexico and South Africa, he knew the challenge ahead of him: how to take his skills from years of directing in 2D and apply them to 3D. It is a challenge that dozens of directors around the globe will face in the near future. “The secret in 3D is that you start wide and zoom while, with the lower pitch cameras, you need to be very wide and have players in the front of the image,” he says. Telling the story of the match is paramount, and, Hullin says, when relying primarily on cameras that are close to the action, the production team needs to learn new ways to follow the action.

3D Match Schedule

“In 2D, I always cut by looking at the men on the pitch because that tells me what is happening,” says Hullin. “But, when I am on a 3D camera on the pitch, I have to look at the eyes of the players to understand what is happening and where to cut.” One of the top tools in the 2D productions at the 2010 World Cup has been ultra-motion camera systems. Shooting at upwards of 300 frames per second has given viewers great, tightly framed views of emotional facial reactions. For the moment, however, those shots don’t work for 3D.
“With 3D,” Hullin notes, “we need layers of objects and, with low, tight shots from ultra-motion systems, we only have one layer, which is the player or the coach against the background, so there is no depth.” That said, he does believe there is a role for those shots within a 3D production because the images are so impressive.
One of the issues still to be sorted out for all 3D broadcasts is the number of cameras. Does 3D need the same number as a 2D broadcast, or can it get by with fewer? Hullin says that eight cameras are enough for the 2010 World Cup, but more cameras located at a lower level would help tell the story even better, since pitch cameras are the primary tool for telling the story of a match in 3D. “You can direct it like a 2D game, but it will not be interesting,” he says. “To be interesting, you need to find the spirit of 3D, because there are things that are possible in 3D that cannot be done in 2D. For example, you can stay with a wide shot in 3D while, in 2D, you would force the cut and force the view. But, in 3D, you allow the viewer to choose what they want to look at.”
On the creative side, 3D enables camera operators to finally leave the 4:3 “safe area” and use the full 16:9 screen area, as there is no letterboxing in 3D. In 3D, graphics can be pushed to the true edge of the picture and the full screen can be utilized for game action.

The 3D Master Control and Playout at the IBC

All the 3D productions in the stadiums were supervised by Peter Angell and Duncan Humphreys in a small 3D control environment at the IBC in Johannesburg, directly next to the 3D playout rack. “Ninety-nine percent of people will see the World Cup in 2D HD, so we can’t do anything to risk that coverage,” stresses Angell. “As long as 3D is a premium event proposition, it will face issues in the short term at any stadium which is already full of cameras and paid seating. If we were starting from scratch it would be straightforward, but the 3D element adds another eight positions to the 32 per World Cup venue already dedicated to the 2D host coverage, so it is difficult to get space.”

One rule to have emerged about 3D outside broadcasts is that coverage requires fewer cameras than 2D, with cut-aways and replays not as necessary to tell the story. Nonetheless, no specialty cameras have been included in the 3D mix for the World Cup, with HBS looking to convert occasional 2D shots from the armory of its other cameras to augment the 3D coverage. “We have to be judicious about it,” Angell insists. “The goal is to tell the story as well as possible but that doesn’t mean littering the coverage with 2D shots. Ideally we need a means of cross conversion that retains enough of the 3D image so that it makes sense in the story we tell.”

Angell and Humphreys concluded that a conservative approach to 3D would be the best option, which meant devising a depth budget that wouldn’t jar the audience’s perception. “We needed to decide exactly how to manage the depth budget and also how we decide to break it,” explains Humphreys. “Having it fixed at the beginning of production is fine but it’s important that you know how you can break that budget for effect and when it makes sense to do so.” Footballs randomly booted into the stands and toward a 3D camera would make an obviously stunning 3D shot, but decisions need to be made about what the outer limits of the convergence should be.

For the World Cup games the depth budget was broadly settled at 2–2.5% positive parallax (into the screen) and 0.5–1% negative (out of the screen).
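Expressed in pixels, that budget works out as follows (the 1920-pixel HD raster width is an assumption here; the article does not state the reference width for the percentages):

```python
def parallax_budget_px(image_width_px, positive_pct, negative_pct):
    """Convert a depth budget from percent of image width to pixels.

    Uses the budget quoted for the World Cup coverage; the image width
    is an assumed HD raster, not a figure from the production.
    """
    return (image_width_px * positive_pct / 100.0,
            image_width_px * negative_pct / 100.0)

# Upper ends of the quoted ranges on an assumed 1920 px wide image.
pos_px, neg_px = parallax_budget_px(1920, 2.5, 1.0)
```

So the convergence pullers were keeping the whole scene within roughly 48 pixels of positive and 19 pixels of negative parallax on a full-HD image, a deliberately conservative window for an audience largely new to 3D.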

The FIFA World Cup 3D coverage will be delivered to cinema screens and to 3D TV sets. The position and size of graphical elements needed consideration for both types of viewing environment. “Graphics can contribute to the overall 3D impact but if you play with them too much they become enormously distracting,” says Humphreys. “We initially put the clock and score as far into the screen corner as possible only to find it was more a problem in the cinema than it was for TV.”

“We are working out a set of values for where the graphics are best positioned on screen to give maximum effect without being completely overpowering,” says Angell. “The graphics will generally sit just in front of the screen plane, but if a player runs towards a camera we have the possibility of shifting it so we don’t end up with a situation where the graphic appears in front of the action when in 3D terms it should be behind. It’s a subtle trick to pull off.

“We are acutely aware of the enormous responsibility we have because a lot of people will see 3D through the World Cup for the first time. We want them to walk away from the experience feeling satisfied. We also need to manage expectations. You cannot compare Hollywood movies or games to live sport. With the best will in the world live sport is not going to be “Avatar”. Its producers had years to work out the 3D effects – we have milliseconds.”

Recording is on HDCAM SR dual-stream VTRs (SRW-5800) on-site and at the IBC in Johannesburg, as well as to an EVS XT[2] server controlled by IPDirector and one EVS XF[2] (removable storage) at the IBC for long-term archive.

Discrete left- and right-eye channels of 1080i50 HDSDI are sent from the 3D OBVans to the IBC over a JPEG2000 contribution network compressed to 300 Mbps. From the IBC two redundant 3D signals are sent via satellite to European theaters and homes via London, the site of FIFA’s distribution partner GlobeCast, using eight International Datacasting encoders (two at each of four venues) with integrated Sensio Technologies 3D processing.
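Some rough arithmetic puts the JPEG2000 contribution figure in context: a 1080i50 HD-SDI interface carries a nominal 1.485 Gbit/s per eye. Whether the 300 Mbit/s figure covers one eye or the stereo pair is not stated; the sketch below assumes it covers both.

```python
# Nominal 1080i50 HD-SDI interface rate per eye, in Gbit/s.
hdsdi_gbps = 1.485

# Two discrete eye channels, expressed in Mbit/s.
pair_mbps = 2 * hdsdi_gbps * 1000

# Compression ratio if the quoted 300 Mbit/s carries the whole pair
# (an assumption, not stated in the article).
ratio = pair_mbps / 300
```

Under that assumption the JPEG2000 links run at roughly 10:1 against the raw interface rate, which is modest enough to keep contribution quality high ahead of the final distribution encode.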

From London the 3D content will be distributed to signal providers such as Eutelsat for transmission to cinemas across Europe.

The 3D Cinema Experience

It is not just at the IBC FIFA HD Cinema that the FIFA World Cup is being delivered in stunning 3D. Ten broadcast networks and 400 theaters are distributing the World Cup 3D feed, including ESPN, Al Jazeera, SBS Korea, SBS Australia, SogeCable in Spain, TF1 and Canal+ in France and SKY Perfect JSAT in Japan.

The 3D/HD Cinema at the IBC

The 3D feed is being screened around the world as cinemas are being turned into impromptu ‘stadiums’ for 3D broadcasts. Cinema chains in Brazil, Mexico, the United States, Italy, Belgium, the UK, France, Spain, Korea and Japan have signed up to receive the live 3D broadcast of the 2010 World Cup. The remaining four matches from the semifinals onwards will be broadcast to large screens including Gaumont and Europalace in France; Kinepolis in Belgium; Movieplex and Cine Cite in Italy and Digital Cinema Media in the UK, which has signed a deal with SuperVision Media to show both semifinals and the final on 40 screens across the Odeon, Cineworld, Vue and Empire chains. In the United States, Sensio is working with digital cinema delivery group Cinedigm.

As for 3D television, the World Cup is the largest showcase thus far, and video crews are gaining a wealth of experience from producing the games live in 3D. HBS, the host broadcaster of the tournament and producer of the 2D and 3D coverage, is getting more and more skilled with each of the 25 games it is covering in 3D. When the games end on July 11, its crews will be among the most experienced 3D teams in the world.