Achieving Zero-Delay Multi-Camera Synchronized Production in the Cloud


September 3, 2020

There’s a common misperception about producing live remote television in the cloud: latency makes multi-camera synchronization impossible.

While latency can be a serious problem, it's simply too general a statement to say that the cloud creates delays that make switching between camera sources, and doing all of the other things needed for live remote TV production, too imprecise and therefore impractical.

I’m here to tell you that not only is zero delay remote production possible, but also that it is the key to unlocking the true potential of at-home or REMI (Remote Integration Model) production. That in turn makes it possible for broadcasters and other video producers to create more content more affordably and thus compete and win as viewers evolve their video consumption habits.

Defeating Delay With Delay

It might seem like a paradox, but the way to defeat delay when producing live television in the cloud is with delay.

To be sure, transporting video IP packets across a LAN and ultimately via the internet to the cloud can introduce jitter. Network congestion, or its equivalent at router interfaces, is the likely culprit. The fact that multiple camera sources are in play for most live video productions further compounds the issue.

Not only can each live source encounter its own network jitter, but live cameras can drift in time in relation to one another, making normal switching between shots difficult or even impossible in the cloud.

But there is a rather straightforward way to defeat this delay: introduce a delay between, for instance, the action shot on the field of play and when that shot is played out for distribution that is large enough to accommodate all of the network jitter and other delays.
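To make the idea concrete, here is a minimal sketch in Python of such a fixed playout delay. It assumes a simple timestamped-frame model; the names PLAYOUT_DELAY, Frame and DelayBuffer are invented for illustration and do not describe any particular product's implementation.

```python
# A minimal sketch, assuming a simple timestamped-frame model (not any
# vendor's implementation). Every frame is released at its capture time plus
# a fixed delay chosen to exceed the worst-case network jitter, so uneven
# arrival never disturbs the playout timing.
import heapq
from dataclasses import dataclass, field

PLAYOUT_DELAY = 2.0  # seconds of intentional delay; must exceed worst-case jitter


@dataclass(order=True)
class Frame:
    capture_ts: float                       # timestamp assigned at the camera
    camera_id: str = field(compare=False)
    data: bytes = field(compare=False, default=b"")


class DelayBuffer:
    """Holds frames until their scheduled playout time (capture_ts + delay)."""

    def __init__(self) -> None:
        self._heap: list[Frame] = []

    def ingest(self, frame: Frame) -> None:
        # Frames may arrive late or out of order because of network jitter.
        heapq.heappush(self._heap, frame)

    def pop_due(self, now: float) -> list[Frame]:
        # Release every frame whose playout deadline has arrived.
        due = []
        while self._heap and self._heap[0].capture_ts + PLAYOUT_DELAY <= now:
            due.append(heapq.heappop(self._heap))
        return due
```

As long as the chosen delay exceeds the worst jitter seen on any contributing feed, every frame is present in the buffer before its playout deadline, and the output runs at a steady cadence.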

Achieving Multi-Camera Synchronization

If you’ve been involved with video production long enough, you may remember a device that was once critical to A-B roll editing called a time base corrector (TBC). I’ll spare you all of the details, but at a high level, a TBC took in a video signal from a videotape player (one TBC for each player), because those tape machines could not be relied upon to play out video with precise timing.

The TBC served as a digital buffer for incoming frames of video. Its precision internal clock enabled a buffered video frame to be read out line by line with perfect timing, so that video from the A-source and B-source tape players could be kept in perfect sync and switched by a production switcher without creating anomalies like rolling bands in the video.


Fast forward to today. Different technology, but the same strategy. To defeat the time anomalies that result from jitter, drifting cameras and other sources, it’s necessary to create a time buffer of sufficient duration to accommodate these problems. In other words, intentionally create a delay.

Further, by assigning a time stamp to each frame of video and its associated audio track, it’s possible to align the time stamps from multiple cameras inside this intentionally created delay buffer, thereby enabling in the cloud all of the video and audio production processes required for a production, like switching, slow-motion replay and rolling in pre-recorded clips.
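A similarly rough sketch, under the same assumed frame model, shows how time-stamp alignment makes switching frame-accurate: each camera's frames are filed on a common frame grid, and the switcher reads whichever camera the director selects at a shared, intentionally delayed program clock. The SyncSwitcher class and its methods are hypothetical, for illustration only.

```python
# A rough sketch, under the same assumed timestamped-frame model. Frames from
# each camera are keyed by their capture time stamp on a common frame grid, so
# the switcher can cut between sources at a shared, delayed program clock.
# SyncSwitcher and its methods are hypothetical, for illustration only.
from collections import defaultdict
from typing import Optional


class SyncSwitcher:
    def __init__(self, frame_interval: float = 1 / 50) -> None:
        self.frame_interval = frame_interval
        # camera_id -> {quantized timestamp slot -> frame payload}
        self.buffers: dict[str, dict[int, bytes]] = defaultdict(dict)

    def _slot(self, ts: float) -> int:
        # Quantize timestamps to a common frame grid so all feeds line up.
        return round(ts / self.frame_interval)

    def ingest(self, camera_id: str, capture_ts: float, data: bytes) -> None:
        self.buffers[camera_id][self._slot(capture_ts)] = data

    def take(self, program_ts: float, camera_id: str) -> Optional[bytes]:
        # The director cuts to camera_id; emit that camera's frame for the
        # delayed program clock, which every camera shares by construction.
        return self.buffers[camera_id].get(self._slot(program_ts))
```

Because every source is read at the same delayed program time stamp, a cut from one camera to another lands on frames captured at the same instant, regardless of how each feed's packets arrived.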

Viewers at home are none the wiser that a delay was intentionally introduced to accommodate the realities of producing video in the cloud. Further, when compared to all of the other delays introduced into the traditional production and distribution chains needed to deliver a live production to the home, this delay buffer for cloud production is inconsequential.

The Opportunities

What cloud-based live video production offers is the ability to take at-home (REMI) production to a new level. Rather than transporting remote camera feeds in the form of IP packets from a venue like a sports stadium via the internet to a centralized production center (thus the term “at-home”), the cloud-based methodology enables directors and technical directors, graphic artists, slow motion operators, audio engineers and everyone else involved at that centralized production facility to be at home, literally.


The benefits for broadcasters and other video producers are obvious: lower-cost production, minimal travel to remote sites, more remote events covered, more content produced and, perhaps most important of the capabilities enabled by the cloud, the ability to enlist the best production talent to produce a show regardless of where they are.

All of this is made possible by leveraging production in the cloud for live events, and that’s made possible by applying this strategy to make zero delay remotes a reality.


