Friday, April 29, 2011

Computer display standard 3

Professional Graphics Controller
A triple-board display adapter with a built-in processor and on-board 2D and 3D acceleration, introduced in 1984 for the 8-bit PC bus and intended for CAD applications. It displayed video at a 60 Hz frame rate. 640×480 (307k) 4:3 8 bpp
MCGA Multicolor Graphics Adapter Introduced on selected PS/2 models in 1987, with reduced cost compared to VGA. MCGA had a 320×200 256-color (from a 262,144-color palette) mode, and a 640×480 mode only in monochrome due to its 64k of video memory, compared to the 256k of VGA (a memory calculation sketch follows this table). 320×200 (64k) 16:10 8 bpp; 640×480 (307k) 4:3 1 bpp
8514
Precursor to XGA and released about the same time as VGA in 1987. 8514/A cards displayed interlaced video at 43.5 Hz. 1024×768 (786k) 4:3 8 bpp
VGA Video Graphics Array Introduced in 1987 by IBM. VGA is actually a set of different resolutions, but is most commonly used today to refer to 640×480 pixel displays with 16 colors (4 bits per pixel) and a 4:3 aspect ratio. Other display modes are also defined as VGA, such as 320×200 at 256 colors (8 bits per pixel) and a text mode with 720×400 pixels. VGA displays and adapters are generally capable of Mode X graphics, an undocumented mode allowing increased non-standard resolutions. 640×480 (307k) 4:3 4 bpp; 640×350 (224k) 64:35 4 bpp; 320×200 (64k) 16:10 4/8 bpp; 720×400 (text) 9:5 4 bpp
SVGA Super Video Graphics Array A video display standard created by VESA for IBM PC compatible personal computers. Introduced in 1989. 800×600 (480k) 4:3 4 bpp
XGA Extended Graphics Array An IBM display standard introduced in 1990. XGA-2 added 1024×768 support for high color and higher refresh rates, improved performance, and support for 1360 (1365.333)×1024 in 16 colors (4 bits per pixel), plus support for the 1056×400 [14h] text mode. 1024×768 (786k) 4:3 8 bpp; 640×480 (307k) 4:3 16 bpp
XGA+ Extended Graphics Array Plus Although not an official name, this term is now used to refer to 1152×864, which is the largest 4:3 array yielding under one million pixels. Variants of this were used by Apple Computer (at 1152×870) and Sun Microsystems (at 1152×900) for 21-inch CRT displays. 1152×864 (995k) 4:3 8 bpp; 640×480 (307k) 4:3 16 bpp
QVGA Quarter Video Graphics Array
320×240 (75k) 4:3
WQVGA Wide Quarter Video Graphics Array
480×272 (127.5k) 16:9
HQVGA Half Quarter Video Graphics Array
240×160 (38k) 3:2
QQVGA Quarter QVGA
160×120 (19k) 4:3
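
A note on the memory figures mentioned above (for example, MCGA's 64k limiting 640×480 to monochrome): the framebuffer size follows directly from resolution and color depth, roughly width × height × bpp / 8 bytes. The Python sketch below is purely illustrative and ignores anything an adapter reserves beyond the visible framebuffer (fonts, extra pages, alignment):

    # Rough framebuffer size: width * height * bits-per-pixel / 8 bytes.
    # Illustrative only; real adapters also reserve memory for fonts, pages, etc.
    def framebuffer_bytes(width, height, bpp):
        return width * height * bpp // 8

    modes = [
        ("MCGA 320x200 @ 8 bpp", 320, 200, 8),   # fits in 64 KB
        ("MCGA 640x480 @ 1 bpp", 640, 480, 1),   # fits in 64 KB
        ("VGA  640x480 @ 4 bpp", 640, 480, 4),   # needs VGA's 256 KB
        ("VGA  640x480 @ 8 bpp", 640, 480, 8),   # exceeds even 256 KB
    ]

    for name, w, h, bpp in modes:
        size = framebuffer_bytes(w, h, bpp)
        print(f"{name}: {size:,} bytes (~{size / 1024:.1f} KB)")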

Monday, April 25, 2011

Computer display standard 2

Table of computer display standards
Video standard Full name Description Display resolution (pixels) Aspect ratio Color depth (2^bpp colors)
unnamed unnamed Various Apple Inc., Atari, Commodore PCs introduced from 1977 to 1982. They used TVs for their displays and typically included a 32×40 wide border in the overscan region of the television, with a usable space of only 320×200 or 160×200 or 80×200 (approximately). They used the non-interlaced (NI) mode to provide a stable image. The non-interlaced designation was dropped after all monitors were manufactured this way. 352×240 NI (approximately) 4:3 8 or 16 colors typical.
unnamed unnamed Commodore Amiga, Atari ST and others. They used NTSC- or PAL-compliant RGB component displays or televisions. The interlaced (I) mode produced visible flickering. 704×480 I (approximately) 4:3 16 for ST or 4096 for Amiga (~256,000 for Amiga 4000).
MDA Monochrome Display Adapter The original standard on IBM PCs and IBM PC XTs with 4 KB video RAM. Introduced in 1981 by IBM. Supports text mode only. 720×350 (text) 72:35 1 bpp
CGA Color Graphics Adapter Introduced in 1981 by IBM as the first color display standard for the IBM PC. The standard CGA graphics cards were equipped with 16 KB video RAM. 640×200 (128k) 16:5 1 bpp; 320×200 (64k) 16:10 2 bpp; 160×200 (32k) 4:5 4 bpp
Hercules
A monochrome display capable of sharp text and graphics for its time of introduction. Very popular with the Lotus 1-2-3 spreadsheet, which was one of the PC's first killer apps. Introduced in 1982. 720×348 (250.5k) 60:29 1 bpp
EGA Enhanced Graphics Adapter Introduced in 1984 by IBM. It offered a resolution of 640×350 pixels with 16 different colors (4 bits per pixel, or bpp), selectable from a 64-color palette (2 bits per each of red, green, and blue). 640×350 (224k) 64:35 4 bpp
Professional Graphics Controller
A triple-board display adapter with a built-in processor and on-board 2D and 3D acceleration, introduced in 1984 for the 8-bit PC bus and intended for CAD applications. It displayed video at a 60 Hz frame rate. 640×480 (307k) 4:3 8 bpp

Monday, April 18, 2011

Computer display standard 1

Various computer display standards or display modes have been used in the history of the personal computer. They are often a combination of display resolution (specified as the width and height in pixels), color depth (measured in bits), and refresh rate (expressed in hertz). Associated with the screen resolution and refresh rate is a display adapter. Earlier display adapters were simple frame-buffers, but later display standards also specified a more extensive set of display functions and a software-controlled interface.

Until about 2003, most computer monitors had a 4:3 aspect ratio and some had 5:4. Between 2003 and 2006, monitors with 16:9 and 16:10 (8:5) aspect ratios became commonly available, first in laptops and later also in standalone monitors. Productive uses for such monitors, i.e. besides widescreen movie viewing and computer game play, include displaying two standard letter pages side by side in a word processor, as well as showing large-size CAD drawings and CAD application menus at the same time. Further, 16:10 allows viewing of 16:9 video on a computer with player controls visible, and 16:10 also comes very close to a golden rectangle, which is often considered the most aesthetically pleasing aspect ratio.
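
As a quick sanity check on that last claim, the common aspect ratios can be compared numerically with the golden ratio (about 1.618); a small illustrative Python snippet:

    # Compare common display aspect ratios with the golden ratio (~1.618).
    golden = (1 + 5 ** 0.5) / 2

    for name, w, h in [("4:3", 4, 3), ("5:4", 5, 4), ("16:10", 16, 10), ("16:9", 16, 9)]:
        ratio = w / h
        print(f"{name}: {ratio:.3f}  (off the golden ratio by {abs(ratio - golden):.3f})")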

Beyond display modes, the VESA industry organization has defined several standards related to power management and device identification, while ergonomics standards are set by the TCO.

Standards
A number of common resolutions have been used with computers descended from the original IBM PC. Some of these are now supported by other families of personal computers. These are de-facto standards, usually originated by one manufacturer and reverse-engineered by others, though the VESA group has co-ordinated the efforts of several leading video display adapter manufacturers. Video standards associated with IBM-PC-descended personal computers are shown in the diagram and table below.

[Diagram: Vector Video Standards, a chart comparing common display resolutions]


Thursday, April 14, 2011

Publishing Interchange Language

Publishing Interchange Language, or "PIL" is a public domain language that allows precise description of the layout of content on pages, groups of multiple pages or any 2-dimensional area, which it calls a "canvas." It was developed between June 1990 and June 1991 by the Professional Publishers Interchange Specification Workgroup, a committee of software and hardware vendors serving the newspaper, magazine and print advertising markets. The committee was led by Quark and Atex.

At the time, physical cut and paste of images and typeset text was still required to assemble many pages because the specialized composition, pagination, text formatting and graphic design systems that produced the content could not operate together to produce integrated output. PIL was designed to allow electronic integration of content and layout, so that one system could print complete pages or layouts with all the typeset text and composed images that came from heterogeneous subsystems. PIL describes the layout and allows the use of any combination of markup languages and image formats to encode the content. It enables any publishing workflow of either sequential or simultaneous layout and content creation. PIL was successfully used to integrate many publishing systems including systems from Agfa, Atex, Autologic, Information International, Inc., Quark, Inc. and Scitex.

Many languages and formats now exist to describe content for the World Wide Web, and to define documents by their logical structure, so the same content can be reformatted for multiple purposes. However, PIL exists to describe precisely a graphical design and the placement of all content within it. It is useful for those who want to define a specific visual presentation rather than the sort of fluid layout that a web browser allows. It does not directly provide any logical structure of elements such as headings, citations, captions and so on. It defines a (theoretically infinite) hierarchy of canvases with coordinate systems, tags, frames, and content of any type. These can be used as needed to draw any type of document.
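
The layout model described above can be pictured as a tree of canvases, each with its own coordinate system, containing frames that point at externally encoded content. The Python sketch below is only a loose illustration of that idea; the class and field names are invented for this example and are not PIL's actual syntax or object model.

    # Hypothetical illustration of a PIL-like layout model: a hierarchy of
    # canvases, each with its own coordinate system, holding frames whose
    # content can be in any external format. Names are invented, not PIL syntax.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Frame:
        x: float            # placement within the parent canvas's coordinates
        y: float
        width: float
        height: float
        content_type: str   # e.g. "text/markup" or "image/tiff"; PIL is format-agnostic
        content_ref: str    # reference to the externally encoded content

    @dataclass
    class Canvas:
        name: str
        width: float
        height: float
        tags: List[str] = field(default_factory=list)
        frames: List[Frame] = field(default_factory=list)
        children: List["Canvas"] = field(default_factory=list)   # nested canvases

    # A page canvas holding a headline frame and a nested advertisement canvas.
    page = Canvas("page-1", 595.0, 842.0, tags=["editorial"])
    page.frames.append(Frame(36, 36, 523, 60, "text/markup", "headline.txt"))
    ad = Canvas("ad-slot", 523.0, 200.0, tags=["advertising"])
    ad.frames.append(Frame(0, 0, 523, 200, "image/tiff", "ad-4711.tif"))
    page.children.append(ad)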

The complete public domain distribution of PIL includes the language specification document (including a BNF specification), example files, a programmer's guide, and C-language source code for a parser and an output engine that produces PIL. The source code is highly portable to any platform that supports C, in either its ANSI C or earlier K&R forms.

Monday, April 11, 2011

WirelessHD

WirelessHD is an industry-led effort to define a specification for the next generation wireless digital network interface for wireless high-definition signal transmission for consumer electronics products. The consortium currently has over 40 adopters; key members behind the specification include Broadcom, Intel, LG, Panasonic, NEC, Samsung, SiBEAM, Sony, Philips and Toshiba. The founders intend the technology to be used for CE devices, PCs, and portable devices alike. The specification was finalized in January 2008.

Technology
The WirelessHD specification is based on a 7 GHz channel in the 60 GHz Extremely High Frequency radio band. It allows for uncompressed digital transmission of high-definition video, audio, and data signals, essentially making it a wireless equivalent of HDMI. First-generation implementations achieve data rates of around 4 Gbit/s, but the core technology allows theoretical data rates as high as 25 Gbit/s (compared to 10.2 Gbit/s for HDMI 1.3 and 21.6 Gbit/s for DisplayPort 1.2), permitting WirelessHD to scale to higher resolutions, color depths, and ranges.
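
To see why roughly 4 Gbit/s is sufficient for uncompressed HD, a back-of-the-envelope calculation helps; the figures below assume 1080p at 60 Hz with 24-bit or 36-bit color and ignore blanking intervals, audio, and protocol overhead:

    # Uncompressed video bit rate = width * height * bits_per_pixel * frames_per_second.
    # Ignores blanking intervals, audio and protocol overhead.
    def video_gbps(width, height, bpp, fps):
        return width * height * bpp * fps / 1e9

    print(f"1080p60 at 24 bpp: {video_gbps(1920, 1080, 24, 60):.2f} Gbit/s")  # ~2.99
    print(f"1080p60 at 36 bpp: {video_gbps(1920, 1080, 36, 60):.2f} Gbit/s")  # ~4.48, deep color needs more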

The 60 GHz band usually requires line of sight between transmitter and receiver, and the WirelessHD specification ameliorates this limitation through the use of beam forming at the receiver and transmitter antennas to increase the signal's effective radiated power. The goal range for the first products will be in-room, point-to-point, non line-of-sight (NLOS) at up to 10 meters. The atmospheric absorption of 60 GHz energy by oxygen molecules limits undesired propagation over long distances and helps control intersystem interference and long distance reception, which is a concern to video copyright owners.

The WirelessHD specification has provisions for content encryption via Digital Transmission Content Protection (DTCP) as well as provisions for network management. A standard remote control allows users to control the WirelessHD devices and choose which device will act as the source for the display.

Promoters
    * Broadcom Corporation
    * Intel Corporation
    * LG Electronics Inc.
    * Panasonic Corporation
    * Philips Electronics
    * NEC Corporation
    * Samsung Electronics, Co., Ltd
    * SiBEAM, Inc.
    * Sony Corporation
    * Toshiba Corporation

Competition
    * Wireless Gigabit Alliance (WiGig) promotes a different specification for multi-gigabit wireless communications operating over the same unlicensed 60 GHz spectrum.
    * Wireless Home Digital Interface (WHDI) specification uses 20 or 40 MHz channels in the 5 GHz unlicensed band, offering lossless video and achieving equivalent video data rates of up to 1.5 or 3 Gbit/s.

Friday, April 8, 2011

Slow motion 3

A VCR may have the option of slow motion playback, sometimes at various speeds; this can be applied to any normally recorded scene. It is similar to half-speed playback: not true slow motion, but merely a longer display of each frame.

In action films
Slow motion is used widely in action films for dramatic effect, as well as the famous bullet-dodging effect, popularized by The Matrix.

Formally, this effect is referred to as speed ramping and is a process whereby the capture frame rate of the camera changes over time. For example, if in the course of 10 seconds of capture, the capture frame rate is adjusted from 60 frames per second to 24 frames per second, when played back at the standard film rate of 24 frames per second, a unique time-manipulation effect is achieved. For example, someone pushing a door open and walking out into the street would appear to start off in slow motion, but a few seconds later within the same shot the person would appear to walk in "realtime" (normal speed). The opposite speed ramping is done in The Matrix when Neo re-enters the Matrix for the first time to see the Oracle. As he comes out of the warehouse "load-point", the camera zooms into Neo at normal speed, but as it gets closer to Neo's face, time seems to slow down, perhaps visually accentuating Neo pausing and reflecting for a moment, and perhaps alluding to the manipulation of time itself within the Matrix later in the movie.
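
The arithmetic behind speed ramping is simple: the apparent speed of the action equals the playback frame rate divided by the capture frame rate at the moment each frame was shot. The sketch below assumes, purely for illustration, a linear ramp from 60 to 24 frame/s over the 10-second capture described above:

    # Speed ramping: the capture frame rate changes over time while playback
    # stays fixed at 24 fps. Apparent speed = playback_fps / capture_fps.
    PLAYBACK_FPS = 24.0

    def apparent_speed(capture_fps):
        return PLAYBACK_FPS / capture_fps

    # Illustrative assumption: a linear ramp from 60 fps down to 24 fps over 10 s.
    for t in range(0, 11, 2):
        capture_fps = 60.0 + (24.0 - 60.0) * (t / 10.0)
        print(f"t={t:2d}s  shot at {capture_fps:4.1f} fps  ->  plays at {apparent_speed(capture_fps):.2f}x speed")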

In Broadcasting
Slow motion is widely used in sports broadcasting, and its origins in this domain extend right back to the earliest days of television, one example being the European Heavyweight Title fight in 1939, in which Max Schmeling knocked out Adolf Heuser in 71 seconds.

In instant replays, slow motion reviews are now commonly used to show some action in detail (a photo finish, a football (soccer) goal, ...). Generally, they are made with video servers and special controllers. The first TV slow-motion device was the Ampex HS-100 disk recorder.

Wednesday, April 6, 2011

Slow motion 2

How slow motion works
There are two ways in which slow motion can be achieved in modern cinematography. Both involve a camera and a projector. Here a projector means a classical film projector in a movie theatre, but the same basic rules apply to a television screen and any other device that displays consecutive images at a constant frame rate.

Overcranking
Overcranking means running the camera at a higher frame rate than the rate at which the footage will be projected; film is usually projected at 24 frames per second (frame/s), so footage shot faster than that plays back stretched over a longer time. Extreme overcranking is rare, but available on professional equipment.
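
In other words, the slow-motion factor is simply the capture rate divided by the projection rate. A minimal sketch, assuming the standard 24 frame/s projection rate:

    # Overcranking: the camera runs faster than the projector.
    # Played back at the projection rate, the action is stretched by capture/projection.
    PROJECTION_FPS = 24

    for capture_fps in (48, 60, 120):
        factor = capture_fps / PROJECTION_FPS
        print(f"shot at {capture_fps} fps -> {factor:.1f}x slower; 1 s of action lasts {factor:.1f} s on screen")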

Time stretching
The second type of slow motion is achieved during post production. This is known as time-stretching or digital slow motion. This type of slow motion is achieved by inserting new frames in between frames that have actually been photographed. The effect is similar to overcranking as the actual motion occurs over a longer time.

Since the necessary frames were never photographed, new frames must be fabricated. Sometimes the new frames are simply repeats of the preceding frames, but more often they are created by interpolating between frames. (Often this interpolation is effectively a short dissolve between still frames.) Many complicated algorithms exist that can track motion between frames and generate intermediate frames that appear natural and smooth. However, these methods can never achieve the clarity or smoothness of true overcranked footage.
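
The crudest of the interpolation methods mentioned above, the short dissolve, is just a weighted blend of the two photographed frames. A small illustrative sketch using NumPy (real time-stretching tools estimate per-pixel motion instead of simply blending):

    import numpy as np

    def dissolve_frames(frame_a, frame_b, n_new):
        """Create n_new intermediate frames by cross-dissolving frame_a into frame_b."""
        frames = []
        for i in range(1, n_new + 1):
            t = i / (n_new + 1)                      # blend weight toward frame_b
            blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
            frames.append(blended.astype(frame_a.dtype))
        return frames

    # Doubling the frame count (2x digital slow motion) inserts one blended
    # frame between each pair of photographed frames.
    a = np.zeros((1080, 1920, 3), dtype=np.uint8)          # dark frame
    b = np.full((1080, 1920, 3), 255, dtype=np.uint8)      # bright frame
    middle = dissolve_frames(a, b, n_new=1)[0]             # mid-gray result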

Simple replication of the same frame twice is also sometimes called half-speed. This relatively primitive technique (as opposed to digital interpolation) is often visually detectable by the casual viewer. It was used in certain scenes in Tarzan, the Ape Man, and critics pointed it out. Sometimes lighting limitations or editorial decisions can require it. A wide-angle shot of Roy Hobbs swinging the bat, in the climactic moments of The Natural, was printed at half-speed in order to simulate slow-motion, and the closeup that immediately followed it was true overcranked slow-motion.

Monday, April 4, 2011

Slow motion 1

Typically this style is achieved when each film frame is captured at a rate much faster than it will be played back. When replayed at normal speed, time appears to be moving more slowly. The technical term for slow motion is overcranking which refers to the concept of cranking a handcranked camera at a faster rate than normal (i.e. faster than 24 frames per second). Slow motion can also be achieved by playing normally recorded footage at a slower speed. This technique is more often applied to video subjected to instant replay, than to film. High-speed photography is a more sophisticated technique that uses specialized equipment to record fast phenomena, usually for scientific applications.

Slow motion is ubiquitous in modern filmmaking. It is used by a diverse range of directors to achieve diverse effects. Some classic subjects of slow motion include:

    * Athletic activities of all kinds, to demonstrate skill and style.
    * To recapture a key moment in an athletic game, typically shown as a replay.
    * Natural phenomena, such as a drop of water hitting a glass.

Slow motion can also be used for artistic effect, to create a romantic or suspenseful aura or to stress a moment in time. Vsevolod Pudovkin, for instance, used slow motion in a suicide scene in The Deserter, in which a man jumping into a river seems sucked down by the slowly splashing waves. Another example is Face/Off, in which John Woo used the same technique in the movements of a flock of flying pigeons. The Matrix applied the effect to action scenes with distinct success through the use of multiple cameras, as well as mixing slow motion with live action in other scenes. Japanese director Akira Kurosawa was a pioneer of this technique in his 1954 movie Seven Samurai. American director Sam Peckinpah was another classic lover of the use of slow motion. The technique is especially associated with explosion effect shots and underwater footage.

The opposite of slow motion is fast motion. Cinematographers refer to fast motion as undercranking since it was originally achieved by cranking a handcranked camera slower than normal. It is often used for comic effect, time lapse or occasional stylistic effect.

The concept of slow motion may have existed before the invention of the motion picture: the Japanese theatrical form Noh employs very slow movements.

Saturday, April 2, 2011

Go motion 2

Bumping the puppet
Gently bumping or flicking the puppet before taking the frame will produce a slight blur; however, care must be taken that the puppet does not move too much and that props or set pieces are not bumped or moved.

Moving the table
Moving the table the model is standing on while the film is being exposed creates a slight, realistic blur. This technique was used by Aardman animation for the train chase in The Wrong Trousers and again during the lorry chase in A Close Shave. In both cases the cameras were moved physically during a 1-2 second exposure. The technique was revived for the full-length Wallace and Gromit: The Curse of the Were-Rabbit.

Go motion
The most sophisticated technique was originally developed for the film The Empire Strikes Back and used for some shots of the tauntauns; a more advanced version was later used on films like Dragonslayer. It is quite different from traditional stop motion. The model is essentially a rod puppet. The rods are attached to motors which are linked to a computer that can record the movements as the model is traditionally animated. When enough movements have been made, the model is reset to its original position, the camera rolls, and the model is moved across the table. Because the model is moving during shots, motion blur is created.

A variation of go motion was used in E.T. the Extra-Terrestrial to partially animate the children on their bicycles.

Go motion today
Go motion was originally planned to be used extensively for the dinosaurs in Jurassic Park, until Steven Spielberg decided to try out the swiftly developing techniques of computer-generated imagery instead.

Today, the mechanical method of achieving motion blur using go motion is rarely used, as it is more complicated, slow, and labor intensive than computer generated effects. However, the motion blurring technique still has potential in real stop motion movies where the puppet's motions are supposed to be somewhat realistic. Motion blurring can now be digitally done as a post production process using special effects software such as After Effects, Boris FX, Combustion, and other similar special effects commercial software.
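
As a rough illustration of how motion blur can be approximated digitally in post, the sketch below simply averages each frame with its neighbours; this is a crude stand-in and not a description of how any of the products named above actually work (they typically rely on estimated motion vectors):

    import numpy as np

    def average_blur(frames, window=3):
        """Approximate motion blur by averaging each frame with its neighbours."""
        blurred = []
        for i in range(len(frames)):
            lo = max(0, i - window // 2)
            hi = min(len(frames), i + window // 2 + 1)
            stack = np.stack([f.astype(np.float32) for f in frames[lo:hi]])
            blurred.append(stack.mean(axis=0).astype(frames[i].dtype))
        return blurred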