QoE Metrics
Quality of Experience (QoE) describes how viewers actually feel your stream: fast start, no stalls, and a stable, sharp picture. On AIOZ Stream, we track QoE with a composite view that balances perceived quality (e.g., VMAF) against stall risk (rebuffer and long startups) and looks at stability over time. These signals guide ABR, ladder tuning, and even ad decisions so sessions stay smooth on real networks.
What is QoE?
QoE is the outcome viewers notice, not a single number. People remember whether the video started quickly, whether it froze, and whether the picture held steady without distracting jumps. That’s why our approach pairs a perceptual quality estimate with penalties for interruptions and slow starts, then checks how stable the session felt.
Related: Adaptive streaming (ABR) • HLS vs WebRTC
A composite model you can act on
Think of QoE as a balance:
QoE ≈ Perceived quality (VMAF) – Rebuffer penalty – Startup penalty + Stability.
This isn’t a rigid formula; it’s a practical way to reason about trade-offs. If stalls rise, the rebuffer penalty can outweigh the benefit of a sharper picture. Conversely, if quality is steady at a high rung, stability improves the session even if the absolute bitrate is modest.
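The balance can be sketched in code. This is an illustrative scoring function, not the formula AIOZ Stream uses: the weights (`W_REBUFFER`, `W_STARTUP`, `W_STABILITY`) and the 1-second startup budget are assumptions chosen only to show how the penalties trade off.

```python
# Illustrative weights -- assumptions, not production values.
W_REBUFFER = 2.0    # penalty per percentage point of rebuffer ratio
W_STARTUP = 5.0     # penalty per second of startup beyond a 1 s budget
W_STABILITY = 10.0  # bonus scaled by fraction of time at the top sustainable rung

def qoe_score(avg_vmaf: float,
              rebuffer_ratio_pct: float,
              startup_seconds: float,
              time_at_top_rung: float) -> float:
    """Balance perceived quality against stall and startup penalties."""
    startup_over_budget = max(0.0, startup_seconds - 1.0)
    return (avg_vmaf
            - W_REBUFFER * rebuffer_ratio_pct
            - W_STARTUP * startup_over_budget
            + W_STABILITY * time_at_top_rung)

# A sharp but stall-prone session can score below a steadier, softer one:
sharp_but_stally = qoe_score(92, rebuffer_ratio_pct=2.5,
                             startup_seconds=3.0, time_at_top_rung=0.4)
steady = qoe_score(85, rebuffer_ratio_pct=0.3,
                   startup_seconds=1.2, time_at_top_rung=0.9)
```

With these weights, the steadier session wins even though its picture is softer, which is exactly the trade-off the composite view is meant to surface.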
The core metrics
- Startup time: How long from play to first frame. Viewers feel the difference between 1.2s and 3s. Faster starts reduce abandonment and make the whole session feel responsive.
- Rebuffer ratio: Total stall time divided by watch time. Even a single two-second freeze can break immersion. Keep this low and smooth out spikes by region and device.
- Average VMAF (or similar): A proxy for perceived picture quality across the segments a viewer actually watched. Look at the average and the spread; a high average with wild swings still feels rough.
- Stability: Time spent at the highest sustainable rung and how often you switch. Fewer, more purposeful switches feel better than frequent tiny jumps.
- Effective bitrate: What the viewer truly received over time. This helps you compare planned ladders with real delivery under changing networks.
- Error rate: Fetch or decode errors that interrupt playback. Treat these as first-class issues, not footnotes.
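Several of the metrics above fall out of simple arithmetic over per-segment delivery data. A minimal sketch, assuming hypothetical per-segment records (the field names `bytes`, `duration_s`, and `vmaf` are illustrative):

```python
# Hypothetical per-segment records for one viewing session.
segments = [
    {"bytes": 900_000, "duration_s": 4.0, "vmaf": 88},
    {"bytes": 1_200_000, "duration_s": 4.0, "vmaf": 93},
    {"bytes": 700_000, "duration_s": 4.0, "vmaf": 82},
]
stall_seconds = 1.5

watch_time_s = sum(s["duration_s"] for s in segments)

# Rebuffer ratio: total stall time divided by watch time.
rebuffer_ratio = stall_seconds / watch_time_s

# Effective bitrate: bits actually delivered over the time watched.
effective_bitrate_bps = 8 * sum(s["bytes"] for s in segments) / watch_time_s

# Average VMAF and its spread across segments the viewer actually watched.
vmafs = [s["vmaf"] for s in segments]
avg_vmaf = sum(vmafs) / len(vmafs)
vmaf_spread = max(vmafs) - min(vmafs)
```

The spread matters as much as the average: a high `avg_vmaf` with a wide `vmaf_spread` still feels rough to viewers.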
Measuring QoE on AIOZ Stream
The player emits events that map to the metrics above: play request, first frame shown, rendition change, stall start/end, and error. We aggregate these into session-level KPIs, then let you slice by device type, app version, ISP, and region.
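The event-to-KPI mapping can be sketched as a small reduction over the event stream. Event names mirror the ones listed above; the exact payload shape (`t`, `type`, `to`) is an assumption for illustration:

```python
# A hypothetical per-session event stream, timestamps in seconds.
events = [
    {"t": 0.0, "type": "play_request"},
    {"t": 1.4, "type": "first_frame"},
    {"t": 10.0, "type": "stall_start"},
    {"t": 11.2, "type": "stall_end"},
    {"t": 30.0, "type": "rendition_change", "to": "720p"},
]

def session_kpis(events):
    """Reduce raw player events into session-level KPIs."""
    start = next(e["t"] for e in events if e["type"] == "play_request")
    first_frame = next(e["t"] for e in events if e["type"] == "first_frame")
    stall_time = 0.0
    stall_start = None
    switches = 0
    for e in events:
        if e["type"] == "stall_start":
            stall_start = e["t"]
        elif e["type"] == "stall_end" and stall_start is not None:
            stall_time += e["t"] - stall_start
            stall_start = None
        elif e["type"] == "rendition_change":
            switches += 1
    return {"startup_s": first_frame - start,
            "stall_s": stall_time,
            "switches": switches}

kpis = session_kpis(events)
```

Aggregating these per-session dictionaries by device type, app version, ISP, and region gives the sliceable KPIs described above.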
Because AIOZ runs on a DePIN edge, we also measure how locality and node availability influence startup and stall risk.
For daily work, read the QoE dashboard like a story. If startup creeps up in a market, investigate DNS and first-segment fetch times. If rebuffer spikes on low-end Android devices, consider capping the top rung or widening your ladder steps there. When average VMAF is high but the stability chart is noisy, favor steadier switching rules.
AIOZ Stream supports signed playback telemetry from both the player and the edge, with on-chain proofs available for transparent accounting and fraud-resistant payouts. That way, QoE improvements translate into trusted business outcomes.
Starting targets
Treat these as starting points, not hard limits. Catalogs and audiences vary.
- Startup to first frame: under ~2 seconds for short-form; a little higher is acceptable for long-form/TV.
- Rebuffer ratio: under ~0.5–1.0% where feasible. Watch the 95th percentile, not just the average.
- Average VMAF: aim for high and steady rather than chasing the maximum with frequent swings.
- Stability: fewer switches and longer time on the highest sustainable rung.
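The advice to watch the 95th percentile rather than the average can be made concrete. A minimal sketch using the nearest-rank percentile method on a made-up cohort of per-session rebuffer ratios (the sample values are illustrative):

```python
import math

def p95(values):
    """Nearest-rank 95th percentile."""
    ordered = sorted(values)
    rank = math.ceil(0.95 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Per-session rebuffer ratios (%) for a hypothetical cohort.
rebuffer_ratios_pct = [0.1, 0.2, 0.0, 0.4, 0.3, 2.5, 0.2, 0.1, 0.0, 0.6]

mean = sum(rebuffer_ratios_pct) / len(rebuffer_ratios_pct)
tail = p95(rebuffer_ratios_pct)
```

Here the mean sits under the ~0.5% starting target while the tail is five times over it: the average looks healthy, but the p95 reveals a pocket of pain worth investigating.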
Driving ABR and product decisions with QoE
You use QoE to tune your ladder, cap the top rung on small screens, and enable content-aware encoding where it saves bits without harming perception. Feed QoE and engagement into the ad-selection layer so revenue and experience grow together. Because delivery is decentralized, monitor cohorts by region and ISP and adjust defaults where networks are consistently weaker.
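A cap-the-top-rung rule of the kind described above can be sketched as a simple cohort policy. The ladder heights and the rebuffer-p95 thresholds here are assumptions for illustration, not AIOZ Stream defaults:

```python
# Illustrative ladder (rung heights in pixels) and thresholds.
LADDER = [480, 720, 1080, 1440]

def top_rung_cap(cohort_rebuffer_p95_pct: float) -> int:
    """Pick the highest rung a cohort should be offered,
    trading peak sharpness for smoothness where the tail is bad."""
    if cohort_rebuffer_p95_pct > 2.0:
        return LADDER[-3]   # cap at 720p: prioritise smoothness
    if cohort_rebuffer_p95_pct > 1.0:
        return LADDER[-2]   # cap at 1080p
    return LADDER[-1]       # healthy cohort: offer the full ladder
```

In practice the thresholds would come from your own QoE dashboards per region and ISP, and the rule would be revisited as networks improve.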
Quick start
Set up a simple test: upload a short VOD, enable analytics, and watch the first 100 sessions. If startup drifts, inspect first-segment latency. If stalls cluster on a device family, try a gentler switch policy and a wider ladder gap. Iterate weekly and track how the session-level story changes.
See: Quick Start • Player analytics
FAQ
Which metric should I fix first?
Start with startup and rebuffer. Viewers feel these immediately. Once they’re healthy, refine picture quality and stability.
Is VMAF required?
No single metric is mandatory. VMAF is a good proxy that correlates with human perception, but use it alongside A/B tests and your own support data.
How often should I sample QoE?
Continuously. Read daily aggregates for trend and use percentiles to find pockets of pain. During launches, watch real-time probes in the first hours.
Can QoE help ads without hurting experience?
Yes. Use QoE and engagement as inputs into ad decisioning so the system avoids heavy placements when a session is fragile and leans in when the session is strong.
How do I compare regions fairly?
Build cohorts by ISP and device. Compare each region to its own baseline first, then to your global median. Optimize where the delta is largest.
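The baseline-first comparison can be sketched as follows. Region names and startup numbers are illustrative assumptions:

```python
# Hypothetical per-region startup metrics (seconds).
regions = {
    "sea": {"baseline_startup_s": 1.8, "current_startup_s": 2.6},
    "eu":  {"baseline_startup_s": 1.2, "current_startup_s": 1.3},
}
global_median_startup_s = 1.5

def regression_delta(r):
    """How far a region has drifted from its own baseline."""
    return r["current_startup_s"] - r["baseline_startup_s"]

# Optimise where the delta from the region's own baseline is largest,
# rather than where the absolute number is worst.
worst = max(regions, key=lambda name: regression_delta(regions[name]))
```

Comparing each region to its own baseline first avoids punishing markets whose networks are structurally slower; the global median is the second-pass check.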