Generate Content at a Fast Clip with Fan-Focused AI Highlights

AI, Emerging media, Experience, VR & Live Video Production · 4 min read

Written by
Lewis Smithingham
SVP of Innovation & Creative Solutions


With an explosion of connected technology—from VR to virtual worlds, TikTok to Instagram brands and more—the business of broadcast is now the business of content, commerce and culture delivered fit to format. Essentially, broadcast today is all about having the best content pipeline that’s able to deliver to myriad audiences across channels.

59% of Gen Z watch longer videos they discovered on short-form video apps, demonstrating the need for broadcast rights-holders to embrace ecosystem-level thinking. We’ve worked alongside brands like Meta, Hasbro, TikTok and Verizon to evolve their broadcasting approach and meet the habits of today’s viewers through experiences that are immersive, interactive, and reach audiences where they are. Now we’re developing Fan-Focused AI Highlights, an AI solution that will further revolutionize this next-generation broadcast workflow by creating more engaging, personalized content for consumers.

Fan-Focused AI Highlights clips hyper-relevant content at speed and scale.

Fan-Focused AI Highlights, currently in development, uses AI and machine learning to instantly clip highlights in live broadcasts. The AI model is capable of segmenting individual people and objects in live broadcasts and effectively eliminates the need for manual selection and editing, a typically time-intensive process.
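To make the idea of moment-based clipping concrete, here is a minimal sketch of what it can look like in practice. It is an illustration only, not Media.Monks’ actual implementation: the detect_players helper below is a hypothetical stand-in for whatever segmentation model is used, and the rest is plain OpenCV.

```python
# Illustrative sketch only: scan a broadcast for moments where a given player is
# on screen and return clip windows. detect_players() is a hypothetical stand-in
# for a person/object segmentation model; it is not part of any published API.
import cv2


def detect_players(frame):
    """Hypothetical placeholder: return the set of player IDs visible in the frame."""
    raise NotImplementedError("swap in your own detection/segmentation model")


def find_highlight_windows(video_path, target_player, samples_per_sec=2, min_gap_s=5.0):
    """Return (start, end) timestamps, in seconds, where target_player appears."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(1, int(fps / samples_per_sec))       # only sample a few frames per second
    windows, clip_start, last_seen = [], None, None

    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % step == 0:
            t = frame_idx / fps
            if target_player in detect_players(frame):
                if clip_start is None:
                    clip_start = t                   # open a new clip window
                last_seen = t
            elif clip_start is not None and t - last_seen > min_gap_s:
                windows.append((clip_start, last_seen))   # close the clip window
                clip_start = None
        frame_idx += 1

    if clip_start is not None:
        windows.append((clip_start, last_seen))
    cap.release()
    return windows
```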

The speed and volume of content unlocked by Fan-Focused AI Highlights is crucial to delivering the snackable content today’s sports viewers crave. Gen Z now consumes more highlights (50%) than live content (35%), validating the appetite for a moment-based approach to content delivery that is also more personalized.

EVP, Global Head of Experience at Media.Monks and former NCAA player Jordan Cuddy offers one example of how this trend is impacting the world of sports. “With Lionel Messi now signed onto Inter Miami, many of his fans may not care to watch American soccer,” she says. “Rather than sit through a 90-minute game, they just want to see the eight minutes where he’s touching the ball.” Her point is backed up by the fact that 80% of Gen Z fans not only follow a professional athlete online but seek to watch the events those athletes participate in, as well as follow the brands they engage with. With Fan-Focused AI Highlights, you could automatically clip together a reel of the game focused on Messi’s—or any athlete’s—best plays with ease.
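Continuing the sketch above, building a “just the eight minutes” reel could be as simple as ranking the detected windows and filling a time budget. The heuristics here are assumptions for illustration, not the product’s actual selection logic.

```python
# Hedged illustration: turn detected on-screen windows (from the sketch above)
# into a short per-athlete reel by filling a time budget with the longest moments.
def build_reel(windows, budget_s=8 * 60, pad_s=2.0):
    ranked = sorted(windows, key=lambda w: w[1] - w[0], reverse=True)
    reel, used = [], 0.0
    for start, end in ranked:
        start, end = max(0.0, start - pad_s), end + pad_s   # add breathing room
        if used + (end - start) > budget_s:
            continue
        reel.append((start, end))
        used += end - start
    return sorted(reel)                                      # back in game order
```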

Deliver on the hunger for affinity-based content.

The same approach above could apply to even more niche content and viewer interests. Imagine a basketball game that AI automatically slices into social media content focused on the footwear worn by the athletes, then pushed out to an audience of sneakerheads by an athletic apparel brand. This is easily achieved with Fan-Focused AI Highlights—helping brands and broadcast rights holders alike reach audiences in more relevant ways, while also expanding the quantity and value of their broadcast rights.
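One way to picture that routing step is a simple mapping from detected object labels to interest-based audiences. The labels, audience names and data shapes below are hypothetical; the point is that a single broadcast can feed many affinity audiences at once.

```python
# Sketch under assumptions: the label-to-audience mapping and the detection data
# shape are made up for illustration only.
AFFINITY_MAP = {
    "sneaker": "sneakerheads",
    "basketball": "hoops_fans",
    "jersey": "streetwear",
}


def route_clips(detections):
    """detections: iterable of (clip_window, labels) pairs from the highlight model."""
    routed = {}
    for window, labels in detections:
        for label in labels:
            audience = AFFINITY_MAP.get(label)
            if audience:
                routed.setdefault(audience, []).append(window)
    return routed
```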

We’re in a new era where people are no longer defined by demographics broken up by where they live; now it’s about identity groups. Rather than carve up territories on a map, broadcasters can creatively package up content for numerous subcultures simultaneously, leveraging the power of AI and machine learning to distribute custom highlight content to tailored interest-based audiences more accurately and effectively. This is a massive opportunity for rights holders, as 73% of sports viewers perceive rights owners’ use of fan data as “disappointing” (23.4%) or “below expectations, but catching up” (49.7%).

Adapt broadcast content to fit today’s viewing habits.

Fan-Focused AI Highlights is the latest solution within our software-defined production offering, which effectively eliminates the need for a large physical plant—like the large control rooms and OB trucks that cost tens of thousands to rent per day, and the dozens of crew members needed to maintain them—in favor of versatile, nimble broadcast workstreams. Single-use appliances designed for one task alone make way for NVIDIA GPUs in the cloud (or a server rack), adding efficiency and flexibility while reducing cost, and remote teams allow rights holders to hire the best talent for the job regardless of their proximity to the event.
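For a rough sense of what “GPUs in the cloud instead of appliances” can mean operationally, here is a minimal sketch of provisioning a GPU encode node on demand with AWS EC2. The AMI, instance type and tags are placeholders; this is not a description of Media.Monks’ actual infrastructure.

```python
# Minimal sketch, assuming AWS EC2: launch a GPU instance for an encode job on
# demand instead of keeping dedicated hardware. ImageId is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",          # placeholder: an AMI with the encode stack baked in
    InstanceType="g4dn.xlarge",      # an NVIDIA T4 GPU instance class
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "role", "Value": "broadcast-encoder"}],
    }],
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched encode node {instance_id}; terminate it when the event wraps.")
```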

Software-defined production has even enabled us to do what had never been done before. Working with UNC Blue Sky Innovations, we streamed the first sporting event in stereoscopic 3D at 60 frames per second and 8K resolution, directly to VR headsets. The custom-designed pipeline features a RED Digital Cinema camera; RED CPUs that decode, color correct and de-warp footage directly from that camera; a Blackmagic controller for live switching, with encoding handled by NVIDIA GPUs for a high-quality bitrate; and a 1GB network to deliver the feed to an AWS instance on its way to VR headsets.
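For a flavor of the GPU-encoding step in a pipeline like this (not the actual UNC Blue Sky build, whose details beyond the description above aren’t public), a single NVENC encode pass driven from Python might look like the following. The ingest and egress URLs and bitrate are placeholders, and ffmpeg must be built with NVENC and SRT support.

```python
# Illustrative only: GPU-accelerated (NVENC) encode of an incoming feed before
# pushing it upstream. URLs and bitrate are placeholders, not real endpoints.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "srt://0.0.0.0:9000?mode=listener",   # placeholder ingest (requires libsrt)
    "-c:v", "hevc_nvenc",                       # encode on the NVIDIA GPU via NVENC
    "-b:v", "80M",                              # placeholder bitrate for a high-quality feed
    "-f", "mpegts",
    "srt://cloud-ingest.example.com:9000",      # placeholder egress toward the cloud
]
subprocess.run(cmd, check=True)
```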

All this equipment took up the modest space of a standard foldout table—a small footprint for an innovative pipeline and history-making broadcast. Still, broadcast professionals are a traditionally superstitious bunch, and it’s easy to see why moving much of the equipment and processes to software could leave them wary: what if you run into connectivity issues or a data center goes down? The same data centers that AWS uses also host banks and other extremely sensitive operations, meaning there are multiple safeguards in place to ensure service isn’t interrupted. And if one does go down, we can spin it up on another one. With multiple redundancies in place, any technical difficulty with software is faster and easier to fix than if your truck generator went down.

A sustainable approach to innovation.

In addition to reduced risk and additional flexibility, software-defined production offers another important benefit: sustainability. Media.Monks won a Sustainability in Leadership award at NAB Show by greatly reducing the carbon footprint of broadcasts with AWS. In addition to avoiding travel-related emissions, the software-defined production workstream is powered by 95%+ renewable energy, further reducing environmental impact.

With Fan-Focused AI Highlights added to the mix, brands can continue to deliver even more personalized, relevant content designed for today’s audiences with lower emissions, risk and cost, and fewer people on the ground. As viewers crave a more moment-based approach to the media and entertainment they consume, this revolutionary broadcast model helps brands expand the value of their broadcast rights in innovative new ways.
