Analog video basics

Analog video is a way of presenting visual information on various two-dimensional displays. Audio-video equipment uses various methods of modulation to encode information in an electrical signal. There is a wide variety of modulation methods and signal formats, such as VGA, NTSC, PAL, S-Video and many others. Many devices can produce video signals, such as cameras, VCRs, video game consoles, Raspberry Pis, video synthesizers, computer graphics cards, and TV studio equipment such as video mixers.

Common to all video signals are their very specific waveforms. These waveforms often carry detail at extremely high frequencies, on the order of megahertz, and therefore require extremely precise timing to generate a useful signal. In the earliest days of video technology, this need for precise timing made video equipment extremely expensive. Over time, microcontrollers were developed for encoding and decoding video signals, and video technology became an affordable form of entertainment for millions through computers, televisions and video-game consoles.
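
As a rough back-of-the-envelope illustration of the scale involved (a sketch using standard NTSC line and frame counts; the 400-points-per-line figure is an assumption chosen only for illustration):

 #include <iostream>

 int main() {
     const double lines_per_frame   = 525.0;   // total NTSC scanlines per frame
     const double frames_per_second = 29.97;   // NTSC frame rate

     const double line_rate      = lines_per_frame * frames_per_second;  // ~15,734 lines per second
     const double line_period_us = 1e6 / line_rate;                      // ~63.5 microseconds per line

     // Resolving roughly 400 distinct points across one line means each point
     // lasts well under 200 nanoseconds, i.e. detail at megahertz rates.
     const double point_period_ns = (line_period_us * 1000.0) / 400.0;

     std::cout << "line rate:      " << line_rate       << " lines/s\n"
               << "line period:    " << line_period_us  << " us\n"
               << "time per point: " << point_period_ns << " ns\n";
     return 0;
 }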

History and development

Some of the most widespread technology for creating, working with, and displaying analog video was developed for analog television, and variations on the signal formats used for analog TV broadcasts are usually what people mean by "analog video". The signal most commonly worked with, and the one most similar to an over-the-air analog TV broadcast, is composite video -- a color analog TV signal that has not been RF-modulated for radio broadcast.

Cathode ray tubes were the first widespread form of analog display. They operate by scanning an electron beam across a piece of glass (the screen) that has been chemically treated so that it lights up where electrons hit it. The analog video signal controls how many electrons the electron gun emits, that is, the current flowing through the cathode ray tube.

To display a continuously-varying analog video signal as a sequence of images, it is necessary to map a one-dimensional signal, such as voltage over time, onto a two-dimensional signal, such as the brightness of each point of the screen. This process of splitting up an analog signal over time so that it forms a sequence of horizontal lines on a display that are perceived as an image is called scanning, and each horizontal line is called a scanline. The positions of the scanlines -- where each line begins and ends, and where the first and last scanlines fall -- are determined by synchronization pulses.
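
As a simplified sketch of how this timing alone decides where a sample lands on the screen, the following uses approximate NTSC-style horizontal timing and ignores interlacing and vertical blanking; the exact durations and the locate() helper are assumptions for illustration, not part of any standard:

 #include <cmath>
 #include <iostream>
 #include <string>

 // Approximate NTSC-style horizontal timing in microseconds. Real standards
 // differ slightly; these values are only for illustration.
 const double LINE_US       = 63.5;  // one full scanline
 const double H_SYNC_US     = 4.7;   // horizontal sync pulse
 const double BACK_PORCH_US = 6.2;   // blanking after the sync pulse

 // Map a time offset (microseconds since the first sync pulse of a frame)
 // to a scanline number, a region of the line, and a horizontal position x
 // in [0, 1) across the visible part of the line.
 void locate(double t_us, int &scanline, std::string &region, double &x) {
     scanline = static_cast<int>(t_us / LINE_US);
     double in_line = std::fmod(t_us, LINE_US);
     if (in_line < H_SYNC_US) {
         region = "h-sync";       x = 0.0;
     } else if (in_line < H_SYNC_US + BACK_PORCH_US) {
         region = "blanking";     x = 0.0;
     } else {
         region = "active video";
         x = (in_line - H_SYNC_US - BACK_PORCH_US)
           / (LINE_US - H_SYNC_US - BACK_PORCH_US);
     }
 }

 int main() {
     int line; std::string region; double x;
     locate(1000.0, line, region, x);  // a point 1 ms into the frame
     std::cout << "scanline " << line << ", " << region << ", x = " << x << "\n";
 }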

Hacking and circuit bending

Throughout the history of analog video there has been a practice of glitching and experimentation, akin to the way record players were experimented with to invent turntablism in hip-hop, wherein "incorrect" use of the record player -- scratching, muting, looping, stopping the motor, et cetera -- turned the device into a completely new instrument. One simple but entertaining video technique is feeding an audio signal into a composite video device such as a CRT television. Older tubes typically lack digital error-control mechanisms, so whatever is fed into the RCA jack is used to guide the electron beam.

End of an era: the death of NTSC and the CRT

By 2012, full-power broadcast stations in the United States, Japan and Canada had stopped using the NTSC signal which defined the CRT era, and by the end of 2015 it had been dropped by US Class A stations as well. In 2021 the United States retired NTSC even for low-power broadcast stations.

While the CRT is a fairly indiscriminate display device, which lets more error-prone sources still function reasonably well, CRTs have fallen out of favor due to their bulkiness and lower resolution. A consumer CRT television displays only around 25 to 30 interlaced frames (50 to 60 fields) per second. In the West, planned obsolescence -- driven by ever-increasing technological sophistication, itself a result of the constant marketing of new products demanded by capitalism -- spelled the end for the CRT along with earlier video formats such as VHS tape and S-Video, leaving behind many who loved these older technologies and found them just as entertaining and valuable.

As the 2000s rolled on, CRT televisions began to be phased out in favor of alternatives such as plasma and LCD technologies. These technologies carried significant advantages: digital signals are much simpler to encode and decode than the composite video signal, with its carrier and color sub-carrier waveforms and its separate horizontal and vertical sync pulses all crammed into a single signal only a few megahertz wide. With the shrinking size and price of the transistors used in binary logic, the basis of computing, error correction went from a luxury to a necessity, and most displays now accept only signals they detect as "valid", making glitching and circuit bending more difficult, if not impossible, on most digital displays.
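
To give a sense of what is being crammed into that one signal, here is a deliberately simplified, NTSC-flavored model of the active (visible) part of a composite line: brightness plus a ~3.58 MHz color sub-carrier whose amplitude carries saturation and whose phase carries hue. Real NTSC uses quadrature (I/Q) modulation, a colorburst phase reference and strict level standards, so treat this strictly as a sketch:

 #include <cmath>

 const double PI      = 3.14159265358979323846;
 const double F_SC_HZ = 3.579545e6;   // NTSC color sub-carrier frequency

 // Simplified composite level for a point in the active part of a line:
 // luma sets brightness, the sub-carrier's amplitude and phase carry
 // saturation and hue. Sync and blanking levels are handled elsewhere.
 double composite_sample(double t_seconds, double luma,
                         double saturation, double hue_radians) {
     double chroma = saturation * std::sin(2.0 * PI * F_SC_HZ * t_seconds + hue_radians);
     return luma + chroma;
 }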

On the other hand, that very same affordability and accessibility of MCUs has breathed new life into the field of experimental video: with PIC and AVR micros, some creators have taken to learning assembly language and C++ to program their own video devices. It could be argued that the composite video specification was itself even more obtuse and esoteric than newer forms of video transmission.
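
A hedged sketch of the shape such homebrew firmware tends to take -- bit-bang a sync pulse, draw the line, repeat. The pin number and the gpio_write(), delay_us() and draw_active_video() helpers are hypothetical stand-ins for whatever a particular PIC or AVR toolchain actually provides, and real code has to account for every CPU cycle (or lean on hardware timers) to keep the picture stable:

 const int SYNC_PIN = 2;                         // hypothetical output pin

 void gpio_write(int /*pin*/, bool /*high*/) { /* hypothetical: drive an output pin */ }
 void delay_us(double /*microseconds*/)      { /* hypothetical: busy-wait */ }
 void draw_active_video()                    { /* roughly 52 us of pixel data for this line */ }

 // One scanline: sync pulse, back porch, then the visible picture.
 void video_line() {
     gpio_write(SYNC_PIN, false);   // horizontal sync (signal pulled low)
     delay_us(4.7);
     gpio_write(SYNC_PIN, true);    // back porch / blanking
     delay_us(5.7);
     draw_active_video();
 }

 int main() {
     for (;;) {
         // Vertical sync would go here (a short train of broad pulses),
         // followed by roughly 262 lines for one non-interlaced NTSC-style field.
         for (int line = 0; line < 262; ++line) video_line();
     }
 }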


YouTube Tutorials

Initial set of links from this scanlines post