Why MoQ + SCTE-35 Could Change the Future of Live Streaming

Ankush Banyal


For years, live streaming infrastructure has been split into two worlds.

One side focused on ultra-low latency. The other focused on broadcast-grade reliability and monetization.

If you wanted sub-second delivery, you usually sacrificed traditional broadcast workflows. If you wanted SCTE-35 markers, SSAI, and mature OTT tooling, you accepted higher latency.

Now that may finally be changing.

Between Media over QUIC (MoQ) and modern SCTE-35 workflows, we are starting to see the foundations of a streaming stack that can support all three:

real-time interactivity
broadcast-scale delivery
and professional monetization

all at the same time.

And honestly, that is a pretty big shift.

First, What Exactly Is MoQ?

MoQ (Media over QUIC) is an emerging transport protocol, currently being standardized in the IETF, that aims to improve how live media is delivered over the internet.

Instead of relying on older HTTP-based delivery methods that were never truly designed for real-time media, MoQ is built on top of QUIC.

That matters because QUIC gives us:

lower latency
better congestion handling
reduced buffering
faster recovery from packet loss
improved scalability for live delivery
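One reason QUIC recovers from loss faster is that its streams are independently ordered, so a lost packet only stalls its own stream. Here is a toy model (not real QUIC, and not a real transport implementation) contrasting that with a single ordered connection, where one hole stalls everything behind it:

```python
# Toy model (not real QUIC): compare head-of-line blocking on a single
# ordered byte stream (TCP-style) vs. independent streams (QUIC-style)
# when one packet is lost and awaits retransmission.

def deliverable_now(packets, lost_index, independent_streams):
    """Return the packets the receiver can hand to the application
    before the lost packet is retransmitted.

    packets: list of (stream_id, seq) tuples in send order.
    lost_index: position in `packets` of the lost packet.
    independent_streams: True models QUIC-style per-stream ordering;
    False models a single ordered connection (TCP-style).
    """
    delivered = []
    lost_stream = packets[lost_index][0]
    for i, (stream_id, seq) in enumerate(packets):
        if i == lost_index:
            continue  # this packet is missing
        if not independent_streams and i > lost_index:
            break  # TCP-style: everything after the hole stalls
        if independent_streams and stream_id == lost_stream and i > lost_index:
            continue  # QUIC-style: only the affected stream stalls
        delivered.append((stream_id, seq))
    return delivered

# Three interleaved media streams: audio=0, video=1, captions=2.
packets = [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1)]
lost = 1  # the first video packet is dropped

tcp_like = deliverable_now(packets, lost, independent_streams=False)
quic_like = deliverable_now(packets, lost, independent_streams=True)
print(len(tcp_like), len(quic_like))  # prints: 1 4
```

In this toy run, the TCP-style connection can deliver only the one packet sent before the loss, while the QUIC-style model keeps audio and captions flowing and stalls only video.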

The important thing is this:

MoQ is not just "another streaming protocol."

It is an attempt to rethink internet media delivery from the ground up.

Especially for:

live sports
auctions
betting
cloud gaming
interactive broadcasts
multi-view streaming
real-time experiences

Basically, anywhere milliseconds matter.
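Part of that rethink is MoQ's publish/subscribe data model: the moq-transport drafts organize media into named tracks, whose data is split into groups of objects, with relays forwarding objects to subscribers without touching the media itself. A conceptual in-memory sketch (illustrative only, not wire-compatible):

```python
# Conceptual sketch of MoQ's track / group / object data model.
# In-memory illustration only; not a wire-compatible implementation.
from collections import defaultdict

class Relay:
    """A minimal stand-in for a MoQ relay: it fans out published
    objects to track subscribers without decoding the media."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # track name -> callbacks

    def subscribe(self, track, callback):
        self.subscribers[track].append(callback)

    def publish(self, track, group_id, object_id, payload):
        for cb in self.subscribers[track]:
            cb(group_id, object_id, payload)

relay = Relay()
received = []
relay.subscribe("video/1080p", lambda g, o, p: received.append((g, o, p)))

# Group boundaries typically align with independently decodable units
# (e.g. a GOP starting with a keyframe).
relay.publish("video/1080p", group_id=0, object_id=0, payload=b"keyframe")
relay.publish("video/1080p", group_id=0, object_id=1, payload=b"delta")
print(received)
```

The track names and callback shape here are assumptions for illustration; the point is the structure, where relays can cache and fan out opaque objects at CDN scale while latency stays low.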

The Problem With Traditional Streaming

Traditional OTT delivery usually looks something like this:

Camera → Encoder → Packager → CDN → HLS/DASH → Viewer

This works well for scale.

But latency becomes painful.

Even with Low-Latency HLS, many workflows still operate in the 2-6 second range.

For interactive experiences, that is often unacceptable.
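The latency comes largely from segmentation: players buffer several segments before starting playback. A back-of-envelope estimate, with purely illustrative numbers (real pipelines vary widely):

```python
# Back-of-envelope glass-to-glass latency for segment-based delivery.
# All numbers are illustrative assumptions, not measurements.

def hls_latency_estimate(segment_s, buffered_segments, encode_s, cdn_s):
    """Player-side delay is dominated by how many segments the player
    buffers before playback begins, plus encode and CDN overhead."""
    return encode_s + cdn_s + segment_s * buffered_segments

# Classic HLS: 6 s segments, 3 buffered -> tens of seconds behind live.
classic = hls_latency_estimate(segment_s=6, buffered_segments=3,
                               encode_s=2, cdn_s=1)

# Low-Latency HLS: smaller chunks shrink the buffered window.
ll_hls = hls_latency_estimate(segment_s=1, buffered_segments=3,
                              encode_s=1, cdn_s=0.5)

print(classic, ll_hls)  # prints: 21 4.5
```

Even the optimistic low-latency case stays in the multi-second range, which is exactly the gap transport-level approaches like MoQ target.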

Meanwhile, WebRTC solves latency extremely well.

But historically, WebRTC environments have lacked some of the mature broadcast ecosystem features like:

SCTE-35 signaling
SSAI workflows
advanced packaging pipelines
large-scale CDN integration
traditional OTT monetization

This created a gap in the market.

MoQ is interesting because it has the potential to bridge these two worlds.

Where SCTE-35 Fits Into All This

SCTE-35 is one of the hidden building blocks of modern broadcast infrastructure.

Most viewers never hear about it.

But broadcasters rely on it constantly.

SCTE-35 markers tell downstream systems:

when ads should start
when ads should stop
where content boundaries exist
how SSAI systems should behave

Without SCTE-35, dynamic ad insertion becomes much harder.

And monetization at scale becomes messy.

This is why preserving SCTE-35 markers across workflows matters so much.

Especially when moving from traditional broadcast pipelines into internet-native streaming architectures.

Why Developers Should Care

If you are building modern streaming infrastructure today, you are probably seeing the same trend everyone else is:

Users expect television-quality reliability. But they also expect real-time responsiveness.

That combination changes everything.

For example:

Sports Betting

A 5-second delay can completely break the experience.

But sports platforms still need:

ad insertion
monetization
CDN scalability
regional feeds
metadata signaling

Interactive Events

Live polls, synchronized experiences, and audience participation only work when latency stays extremely low.

Creator Platforms

Modern creators increasingly want:

low latency
simulcasting
monetization
dynamic ad insertion
audience engagement

The infrastructure stack is evolving fast because these expectations are colliding.

MoQ Is Still Early, But Important

It is important to be realistic here.

MoQ is still emerging.

The ecosystem is not fully mature yet.

Tooling, interoperability, CDN adoption, and workflow standards are still evolving.

But the direction is very clear.

The industry is actively moving toward:

lower latency
metadata-aware delivery
more efficient transport layers
internet-native broadcasting
scalable real-time media systems

And that is exactly why developers should pay attention now.

Because the teams that understand these architectures early will be the teams building the next generation of streaming platforms.

The Bigger Picture

For a long time, streaming architecture has been fragmented.

Broadcast engineers lived in one ecosystem. Real-time developers lived in another.

Now those worlds are starting to merge.

Protocols like MoQ. Technologies like SCTE-35 passthrough. Low-latency delivery. Modern SSAI. WebRTC. QUIC.

These are no longer separate conversations.

They are becoming part of the same infrastructure stack.

And honestly, that is what makes this moment exciting.

We are finally reaching a point where streaming platforms may not have to choose between:

low latency
scalability
monetization
or broadcast-grade workflows

They may eventually be able to have all four.

Final Thoughts

MoQ will not replace every protocol overnight.

SCTE-35 is not new.

But together, they represent something important:

The industry is moving toward streaming architectures that are both real-time and production-grade.

That shift is going to affect:

OTT platforms
broadcasters
sports streaming
betting platforms
cloud gaming
creator ecosystems
enterprise streaming
and pretty much every company building live video products

The next few years in streaming infrastructure are going to be very interesting.

And this time, the internet might finally be catching up to what live media actually needs.