Video Streaming Fundamentals


What is Video Streaming and what are the video streaming fundamentals?

Video Streaming is a way of transmitting video data over a network in a steady flow (or stream), allowing playback to begin while further data is still being received. The fundamentals below cover the core concepts you need to know before you start streaming.

What is the difference between Live Streaming and Video On Demand (VOD) streaming?

Live Streaming video features a live event as it is happening, similar to linear (classic) television. All viewers of a live event are watching the same content at the same time.

Video On Demand streaming shows pre-recorded media, like movies. Viewers can watch at any time (i.e. on demand) and can seek to any point in the video.

So let’s get started on the fundamentals of video streaming.

What are the stages of a typical Streaming Video workflow?

  1. Capture – convert analog image and sound into data
  2. Encode – compress audio and video data so it can better stream through the network
  3. Deliver – actually stream the encoded content to its destination(s)
  4. Decode – decompress the stream back into raw audio and video data
  5. Display – show media on the end-user device
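As a mental model, the five stages above chain together like a pipeline. The sketch below is purely illustrative; the function names and data are made up, not a real API:

```python
# Toy sketch of the five workflow stages; names and data are illustrative.

def capture():
    # Stage 1: analog image and sound become raw digital data
    return {"frames": ["frame0", "frame1"], "audio": [0.1, -0.2]}

def encode(raw):
    # Stage 2: compress for transmission (here we just tag the payload)
    return {"codec": "H.264", "payload": raw}

def deliver(encoded):
    # Stage 3: stream the encoded content to its destination (a no-op hand-off here)
    return encoded

def decode(stream):
    # Stage 4: decompress back to raw audio and video data
    return stream["payload"]

def display(raw):
    # Stage 5: show the media; returns how many frames were rendered
    return len(raw["frames"])

frames_shown = display(decode(deliver(encode(capture()))))
print(frames_shown)  # 2
```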

What is Bandwidth?
A bits-per-second measure of the available data-transfer capacity of a network connection.
To stream without interruptions, the total content bitrate (i.e. the combined video + audio bitrate) has to be smaller than the available bandwidth.
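A quick back-of-the-envelope check, with all figures being illustrative examples rather than recommendations:

```python
# Does the stream fit the pipe? All numbers are illustrative.
video_bitrate_kbps = 4500   # e.g. 1080p H.264 video
audio_bitrate_kbps = 128    # e.g. stereo AAC audio
bandwidth_kbps = 6000       # available network capacity

total_kbps = video_bitrate_kbps + audio_bitrate_kbps
print(total_kbps, total_kbps < bandwidth_kbps)  # 4628 True
```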

What is a Media Container?
A container is a standardized format that describes how video, audio and other assets (e.g. subtitles) are stored within a file or stream. Typical container formats in use today are MKV, MP4, TS, AVI and MOV.

What is Video Resolution?
Resolution is a simple metric that describes the number of pixels in the video picture. It can be expressed as a total pixel count (e.g. 12 megapixels), as a number of lines (e.g. 720p) or as the horizontal and vertical pixel dimensions (e.g. 1920×1080).
Typical resolutions are 4K (4096×2160 in cinema; 3840×2160, aka “UHD”, on consumer screens), 1080p (1920×1080, aka “Full HD”) and 720p (1280×720, aka “HD Ready”).
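The different ways of expressing resolution are related by simple arithmetic, for example:

```python
# Horizontal x vertical pixels -> total pixel count (megapixels)
resolutions = {"4K": (4096, 2160), "1080p": (1920, 1080), "720p": (1280, 720)}
for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height / 1_000_000:.2f} megapixels")
# 4K: 8.85, 1080p: 2.07, 720p: 0.92
```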

What is Video Framerate?
Framerate is the number of Video Frames displayed each second, most commonly measured in Frames Per Second (FPS). While traditional cinema displays video at 24 FPS, modern content is displayed at 30, 60 or 120 FPS.

What is a Video Codec?
A codec is a set of video COmpression and DECompression (hence “codec”) rules and algorithms designed to shrink the video as much as possible while preserving as much of its quality as possible.
Typical modern video codecs are AVC (aka H.264), HEVC (aka H.265) and VP9; a few of the “legacy” codecs are Theora, H.263, DivX, Xvid, WMV and MJPEG.

What is Video Bitrate?
Bitrate is a bits-per-second measure of how much data is used to encode each second of the video stream. At the same resolution and codec, a higher bitrate generally means better video quality.
A bitrate setting is usually chosen as a compromise between high quality and the expected transmission bandwidth.
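Bitrate also translates directly into storage and transfer size. A rough estimate, with illustrative figures:

```python
# How much data does one hour of video at 5 Mbit/s produce?
total_bitrate_mbps = 5      # combined video + audio bitrate (illustrative)
duration_s = 3600           # one hour
size_gb = total_bitrate_mbps * duration_s / 8 / 1000  # Mbit -> MB -> GB
print(round(size_gb, 2))  # 2.25
```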

What are the Audio Channels?
The term channels indicates how many audio tracks a given audio feed contains. Sound can be mono (1 channel), stereo (2 channels), surround (5.1 or 7.1 channels) etc.

What is the Audio Sample Rate?
It is a measure of how frequently the sound wave is sampled, expressed in samples per second (Hz). Typical values are 48,000 Hz and 44,100 Hz. The higher the sample rate, the more acoustically accurate the digitization will be.
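Sample rate, channel count and bit depth together determine the raw (uncompressed) audio data rate, which is why audio codecs are needed in the first place. A quick calculation with common settings:

```python
# Raw PCM audio data rate before any compression (common settings)
sample_rate_hz = 48000   # samples per second
channels = 2             # stereo
bits_per_sample = 16     # common bit depth
raw_kbps = sample_rate_hz * channels * bits_per_sample / 1000
print(raw_kbps)  # 1536.0 kbit/s raw, vs. roughly 128 kbit/s after AAC compression
```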

What is an Audio Codec?
Similar to the video codec, this is a set of compression and decompression rules for audio. Modern audio codecs include AAC, Opus and Vorbis; MP3 is older but still ubiquitous.

What is Audio Bitrate?
Similar to the Video Bitrate, it is a measure of how much data is required to encode the audio at a certain quality threshold. Expect lower quality audio at lower bitrates.
In a typical audio/video stream, the audio bitrate is much lower than the video bitrate, because the complexity of digitized audio (a waveform) is significantly lower than that of video (frames made of pixels).

What is a Streaming Transport Protocol?
A set of rules for sending media across a network. Typical transports are HTTP, RTP/RTSP, RTMP and SHOUTcast.

What is Adaptive Streaming (aka Adaptive Bitrate Streaming)?
A way to continuously vary the bitrate (and subsequently the quality) of the content to adapt to ever-changing bandwidth conditions.
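A heavily simplified sketch of the player-side decision is shown below. Real adaptive-bitrate players also weigh buffer level and throughput history, and the rendition ladder here is illustrative:

```python
# Pick the highest-quality rendition that fits the measured bandwidth.
ladder_kbps = [800, 1600, 3000, 4500, 8000]  # available quality levels (illustrative)

def pick_rendition(bandwidth_kbps, safety=0.8):
    # The safety factor leaves headroom for bandwidth fluctuations
    fitting = [r for r in ladder_kbps if r <= bandwidth_kbps * safety]
    return max(fitting) if fitting else min(ladder_kbps)

print(pick_rendition(6000))  # 4500
print(pick_rendition(500))   # 800 (lowest rung as a fallback)
```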

What is Transcoding?
Decoding and re-encoding the media into a different format for the purpose of making it available in different qualities (for adaptive streaming) or to different devices.

What is RTMP?
The Real-Time Messaging Protocol, a TCP-based streaming protocol. It is used in specialized applications, particularly for ingest (sending the stream from the encoder to the server), and is preferred for its reliability and relatively low latency.

What is HTTP Live Streaming (HLS)?
A high-level streaming protocol built on top of HTTP. It works by slicing the video into small segments (typically 1–10 seconds long) and transmitting them over HTTP; the player downloads a playlist that lists these segments and plays them back in order.
It is widely used in consumer-grade applications due to ease of implementation, adaptive bitrate capabilities and the inherent ubiquity of HTTP.
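A minimal HLS media playlist (an `.m3u8` file) might look like this; segment names and durations are illustrative:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXTINF:6.0,
segment2.ts
#EXT-X-ENDLIST
```

The player fetches this playlist over plain HTTP, downloads the segments in order and, for adaptive streaming, can switch to a playlist of the same content at a different bitrate between segments.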


By being aware of the video streaming fundamentals listed and explained above, you are already most of the way to getting started. All you need now is the WpStream plugin to start streaming on your WordPress site in under 3 minutes, plus a video tutorial.


Beatrice Tabultoc

Beatrice is the digital marketing go-to at WpStream. She manages all things social media, content creation, and copywriting.

Start your free trial with WpStream today and experience the ability to broadcast live events, set up Pay-Per-View videos, and diversify the way you do your business.