BASIC CONCEPTS
This section provides brief explanations of frequently used digital video terms for users who are not deeply familiar with digital video technologies.
I. Signal Types
II. Video Standards
III. Video Compression
IV. File Wrappers
V. Streaming Protocols
VI. Various other Concepts
I. Signal Types
Analog video signals
Analog signals represent time-varying quantities as a continuous signal. In other words, some physical quantity varies continuously over time to convey information. The most common form of analog signal transmission is electrical: a voltage is varied over time at a given frequency.
Analog video is a video signal transferred by an analog signal. When combined into one channel, it is called composite video. Analog video may be carried in separate channels, as in two channel S-Video and multi-channel component video formats.
Composite Video: Composite (1 channel) is an analog video transmission - without audio - that carries standard definition video typically at 480i or 576i resolution. Composite video is usually in standard formats such as NTSC, PAL and SECAM and is often designated by the CVBS initialism, meaning "Color, Video, Blanking, and Sync."
Component Video: Component video is a video signal that has been split into two or more component channels. In popular use, it refers to a type of component analog video information that is transmitted or stored as three separate signals. Like composite, component-video cables do not carry audio and are often paired with audio cables. When used without any other qualifications, the term component video usually refers to analog YPbPr component video with sync on luma.
Digital Signals
A digital signal is an electrical signal that has been converted into a pattern of bits. Unlike an analog signal (which is a continuous signal containing time-varying quantities), a digital signal has a discrete value at each sampling point. The temporal resolution of the signal is determined by how many samples are recorded per unit of time, and the precision of each sample by how many bits are used to store it.
Digitizing is the representation of an analog signal by a discrete set of its points, or samples. The result is called the digital form of the signal. To make a long story short, digitizing simply means capturing an analog signal in digital form.
[Figure: grey – analog signal, black – digital signal]
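The sampling and quantization described above can be illustrated with a small sketch. This is not from the text, just a minimal example: a continuous 1 Hz sine wave is sampled at a chosen rate and each sample is quantized to a small number of bits (the sample rate and bit depth here are arbitrary illustration values).

```python
import math

SAMPLE_RATE = 8        # samples per second (temporal resolution)
BIT_DEPTH = 3          # bits per sample (amplitude precision)
LEVELS = 2 ** BIT_DEPTH

def analog(t):
    """The continuous signal: a 1 Hz sine wave in the range [-1, 1]."""
    return math.sin(2 * math.pi * t)

def digitize(duration):
    """Sample the analog signal and quantize each sample to LEVELS steps."""
    samples = []
    for n in range(int(duration * SAMPLE_RATE)):
        t = n / SAMPLE_RATE                       # sampling instant
        value = analog(t)
        # Map [-1, 1] onto the integer codes 0 .. LEVELS-1
        code = round((value + 1) / 2 * (LEVELS - 1))
        samples.append(code)
    return samples

print(digitize(1.0))   # one second of the signal as discrete codes
```

Raising SAMPLE_RATE captures faster changes in the signal; raising BIT_DEPTH captures each sample's amplitude more precisely. Real video digitizers apply the same two ideas to brightness and color values.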
DVI: Digital Visual Interface was developed to create an industry standard for the transfer of digital video content. The interface is designed to transmit uncompressed digital video and can be configured to support multiple modes such as DVI-D (digital only), DVI-A (analog only), or DVI-I (digital and analog).
HDMI: High-Definition Multimedia Interface is the first industry-supported uncompressed, all-digital audio/video interface (single cable\single connector). HDMI provides an interface between any audio/video source, such as a set-top box, DVD player, or A/V receiver and an audio and/or video monitor, such as a digital television. HDMI supports SD or HD video, plus multi-channel digital audio on a single cable. It is able to transmit all HDTV standards and supports 8-channel digital audio with bandwidth to spare to accommodate future requirements.
SDI: Serial Digital Interface (SDI) is a video interface typically used in professional applications. It uses standard coaxial video cables with professional BNC connectors to carry video encoded as a digital data stream. Since the original SDI standard lacked the capacity to carry a full HD video signal, it has been extended to higher bit rates, first as HD-SDI and then as 3G-SDI (which can transfer data at up to 3 Gbit/s).
II. Video Standards
Standard Definition
Standard-definition television (SDTV) is a television system that uses a resolution lower than HDTV (720p and 1080p) or enhanced-definition television (EDTV – 480p). The two common SDTV signal types are 576i, derived from the European-developed PAL and SECAM systems, and 480i, based on the American NTSC system.
NTSC
NTSC (named for the National Television System Committee) is the analog TV standard used in most of North America, parts of South America and the Pacific. It was developed in the 1940s, featuring 525 lines and 60 fields per second (480i, 29.97 fps). Note that, as the market moved to digital, the vast majority of over-the-air NTSC transmissions were switched off in the USA in 2009, and in Canada and most other countries in 2011.
PAL
PAL (short for Phase Alternating Line) is the analog TV standard developed to overcome color problems of NTSC after the introduction of color TV. It was developed in the 1960s, featuring 625 lines and 50 fields per second (576i, 25 fps).
High Definition
High-Definition (HD) provides a resolution that is substantially higher than that of SD.
HDTV may be transmitted in various formats:
720p is a progressive HD signal format with 720 horizontal lines and an aspect ratio of 16:9. All major HDTV broadcasting standards include a 720p format which has a resolution of 1280×720.
1080i25 is an interlaced HD signal format with a spatial resolution of 1920 × 1080 and a temporal resolution of 50 interlaced fields per second, which corresponds to 25 frames per second (25 fps).
1080i30 is an interlaced HD signal format with a spatial resolution of 1920 × 1080 and a temporal resolution of 60 interlaced fields per second, which corresponds to 30 frames per second (30 fps).
1080p is a progressive HD signal format with a spatial resolution of 1920 × 1080, and can refer to 1080p 24 fps, 1080p 50 fps or 1080p 60 fps. The 1080p format is often marketed as full HD or native HD.
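The raw (uncompressed) data rates implied by the formats above can be checked with a little arithmetic. This is a back-of-the-envelope sketch, assuming 24 bits per pixel (8 bits per color sample, no chroma subsampling); real broadcast chains use subsampling, so actual rates are lower.

```python
def raw_rate_mbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video data rate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1_000_000

formats = {
    "720p50":  (1280, 720, 50),
    "1080i25": (1920, 1080, 25),   # 50 fields/s = 25 full frames/s
    "1080i30": (1920, 1080, 30),   # 60 fields/s = 30 full frames/s
    "1080p60": (1920, 1080, 60),
}

for name, (w, h, fps) in formats.items():
    print(f"{name}: {raw_rate_mbps(w, h, fps):.0f} Mbit/s")
```

The interlaced entries use the full-frame rate (half the field rate), since two fields together carry one complete 1920 × 1080 frame. The resulting figures, all above 1 Gbit/s, show why HD video is almost never transported uncompressed outside of studio interfaces like SDI.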
III. Video Compression
When digitized, an ordinary analog video sequence can consume as much as 165 Mbps (million bits per second), equivalent to over 20 MBs (megabytes) of data per second. As may be expected, a series of techniques –called video compression techniques – have been derived to reduce this high bit-rate. The ability to perform this task is quantified by the compression ratio. As the compression ratio gets higher, the bandwidth consumption lessens. Of course, the goal of compression is to reduce the data rate while keeping the image quality as high as possible.
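The compression-ratio arithmetic above can be sketched in a few lines. The 165 Mbit/s figure is from the text; the example ratios are illustrative assumptions, not values the text specifies.

```python
RAW_RATE_MBPS = 165          # uncompressed SD video rate, per the text

def compressed_rate(ratio):
    """Data rate (Mbit/s) after compression at the given ratio (e.g. 50 for 50:1)."""
    return RAW_RATE_MBPS / ratio

for ratio in (10, 50, 100):
    print(f"{ratio}:1 -> {compressed_rate(ratio):.2f} Mbit/s")
```

As the loop shows, a higher compression ratio divides the bandwidth requirement proportionally; the engineering trade-off is how much image quality survives at each ratio.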
Video compression has both advantages and disadvantages. Some of them can be listed as follows:
Advantages:
Occupies less disk space, results in substantially lower costs.
Reading, writing and file transferring is faster.
The order of bytes is independent.
Disadvantages:
Requires computing resources, the more complex the algorithm the more resources required.
Errors may occur while transmitting data.
Loss in video quality, and generation losses when content is re-encoded.
Video needs to be decompressed to be usable.
What is a ‘codec’ ?
Codec is short for coder-decoder: the algorithm that takes a raw data file and turns it into a compressed file (or runs the inverse algorithm to decompress it). Codecs may be implemented in hardware (such as in DV camcorders and capture cards) or in software. Since compressed files contain only some of the data found in the original file, a codec is the necessary “translator” that decides what data makes it into the compressed version and what data gets discarded. In brief, a codec compresses and later decompresses content, providing the compression needed to work with digital audio and video.
Video content that is compressed using one standard cannot be decompressed with a different standard. This is simply because one algorithm cannot correctly decode the output from another algorithm but it is possible to implement many different codecs in the same software or hardware, which would then enable multiple formats to be compressed.
DV is a codec launched in 1995 through the joint efforts of leading producers of video camcorders. DV uses lossy, intraframe (frame-by-frame) compression of video, while the audio is stored uncompressed.
Sony and Panasonic have created their own proprietary versions of DV:
DVCPRO, also known as DVCPRO25, was developed by Panasonic for use in ENG equipment. An important difference from baseline DV is that the tape is transported 80% faster, resulting in shorter recording time. DVCPRO50 was developed later; it doubles the coded video data rate to 50 Mbit/s (which, of course, cuts the total recording time of any given storage medium in half compared to DVCPRO at 25 Mbit/s).
DVCAM was developed by Sony for professional use. DVCAM can perform a frame-accurate insert tape edit, while baseline DV may vary by a few frames on each edit compared to the preview. Tape transport is also 50% faster than in baseline DV.
DVCPRO HD, also known as DVCPRO100, is an HD video codec that can be thought of as four DV codecs working in parallel. The video data rate depends on the frame rate and can be as low as 40 Mbit/s for the 24 fps mode and as high as 100 Mbit/s for the 50/60 fps modes.
MPEG2
The MPEG-2 (1995) project mainly focused on extending the compression techniques of MPEG-1 (1993) to cover larger pictures and higher quality, at the expense of higher bandwidth usage. MPEG-2 also provides more advanced techniques to enhance video quality at the same bit rate; the cost, of course, was the need for far more complex equipment. MPEG-2 is considered important because it was chosen as the compression scheme for over-the-air digital TV (ATSC, DVB and ISDB), digital satellite TV services, digital cable TV signals, SVCD, DVD-Video and Blu-ray discs.
H.264