As football season is about to begin here in the US, we always find ourselves wondering what technologies will be implemented next, and it looks like we've got our answer from the guys at Replay Technologies. It comes as no surprise that this new technology will be demonstrated during NBC's Sunday Night Football, a program that has long been at the forefront of broadcast technology. In past seasons, NBC has let viewers stream entire Sunday Night Football games online and watch from almost any camera angle they wished, all from the comfort of their computer screens.

Now, NBC is bringing us a new technological advancement: FreeD. Developed by the guys at Replay Technologies, the system uses twelve Teledyne DALSA 4K cameras to reconstruct a replay that isn't locked to any single perspective. Any and all viewpoints are available, which means the idea of a fixed camera angle is no longer necessary. The twelve 4K cameras, which we believe to be Teledyne DALSA's Falcon2 Color, ring the playing field, and computational videography stitches their views into a 360-degree reconstruction: the different angles are used to build volumetric shapes of the players and the ball, and the captured video is then superimposed back onto those objects.
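Replay Technologies hasn't published how FreeD actually works under the hood, but the general idea of turning multiple camera views into volumetric shapes resembles classic multi-view reconstruction. As a rough sketch only (the visual-hull carving approach, function name, and inputs below are our illustration, not Replay's pipeline), here is how a batch of candidate voxels can be tested against every camera's foreground silhouette:

```python
import numpy as np

def carve_voxels(silhouettes, projections, grid_points):
    """Keep only the voxels whose projection lands inside every
    camera's foreground silhouette (a basic visual-hull carve).

    silhouettes -- list of HxW boolean masks, one per camera
    projections -- list of 3x4 camera projection matrices
    grid_points -- Nx3 array of voxel centres in world coordinates
    """
    homogeneous = np.hstack([grid_points, np.ones((len(grid_points), 1))])
    keep = np.ones(len(grid_points), dtype=bool)

    for mask, P in zip(silhouettes, projections):
        h, w = mask.shape
        proj = homogeneous @ P.T          # project all voxels into this camera
        uv = proj[:, :2] / proj[:, 2:3]   # perspective divide to pixel coords
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros(len(grid_points), dtype=bool)
        hit[inside] = mask[v[inside], u[inside]]
        keep &= hit                       # a voxel must be foreground in every view

    return grid_points[keep]
```

The surviving voxels form the rough 3D shape, and the camera footage can then be textured onto that shape so the replay can be spun around to any viewpoint.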

Once all of the computation has been done, the result can be played back like a normal replay, which should be usable by the commentators on TV and hopefully, over time, by the referees, who are generally the ones who need it most. The primary limitation of this technology, as I see it, is that it takes about 30 seconds to generate a primetime-ready replay. While that is not bad, most replays nowadays are available almost instantaneously, and I believe the main bottleneck right now is the compute power being thrown at all of this computational videography. Perhaps they should have a chat with the guys from AMD or Nvidia about parallelizing their code for GPU acceleration, as I'm sure plenty of repetitive, parallelizable operations occur during that 30-second rendering period.
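To be clear, we have no insight into what Replay's code actually does, but the per-voxel math in a reconstruction like the sketch above is exactly the kind of workload GPUs are built for: the same projection applied to millions of independent points. As a toy illustration only (CuPy is just one drop-in option, and the function and variable names here are ours), the batched projection step can run unchanged on a GPU:

```python
import numpy as np

try:
    import cupy as xp   # GPU-backed arrays with a NumPy-compatible API
except ImportError:
    xp = np             # fall back to the CPU if CuPy isn't available

def project_voxels(grid_points, P):
    """Project every voxel centre into one camera in a single batched
    matrix multiply -- work a GPU handles in parallel rather than one
    voxel at a time."""
    pts = xp.asarray(grid_points)
    homogeneous = xp.hstack([pts, xp.ones((pts.shape[0], 1))])
    proj = homogeneous @ xp.asarray(P).T
    return proj[:, :2] / proj[:, 2:3]
```

If the bulk of that 30-second render is made up of operations like this one, repeated across twelve 4K streams, there should be plenty of room to shave the latency down.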

Below, we’ve got a video of what the final product looks like from a test of the technology at a Yankees game earlier this year. The technology is currently only deployed at the Cowboys’ stadium, but I have a feeling that once everyone sees it in use, every team will want it at their own stadium. It would be interesting to know what kind of hardware bill the Cowboys had to foot to computationally render footage from twelve 4K cameras.