
Visual effects

In filmmaking, visual effects (abbreviated VFX) are the processes by which imagery is created or manipulated outside the context of a live-action shot. Visual effects involve the integration of live-action footage and generated imagery to create environments that look realistic but would be dangerous, expensive, impractical, or simply impossible to capture on film. Visual effects using computer-generated imagery have recently become accessible to the independent filmmaker with the introduction of affordable and easy-to-use animation and compositing software.

Visual effects are often integral to a movie's story and appeal. Although most visual effects work is completed during post-production, it usually must be carefully planned and choreographed in pre-production and production. Visual effects are primarily executed in post-production, with the use of multiple tools and technologies such as graphic design, modelling, animation, and similar software, while special effects such as explosions and car chases are made on set. A visual effects supervisor is usually involved with the production from an early stage, working closely with production and the film's director to design, guide, and lead the teams required to achieve the desired effects.

Categories


Visual effects may be divided into at least four categories:

Matte paintings and stills: digital or traditional paintings or photographs that serve as background plates for keyed or rotoscoped elements.
Live-action effects: keying actors or models through bluescreening and greenscreening.
Digital animation: modeling, computer graphics lighting, texturing, rigging, animating, and rendering computer-generated 3D characters, particle effects, digital sets, and backgrounds.
Digital effects (commonly shortened to digital FX or FX): the various processes by which imagery is created or manipulated with or from photographic assets. Digital effects often involve the integration of still photography and computer-generated imagery (CGI) to create environments that look realistic but would be dangerous, costly, or simply impossible to capture in camera. FX is usually associated with the still photography world, in contrast to visual effects, which is associated with motion picture production.

VFX can be categorized into:

Motion graphics

Motion graphics are graphics that use video footage and/or animation technology to create the illusion of motion or rotation, and are usually combined with audio for use in multimedia projects. Motion graphics are usually displayed via electronic media technology, but may be displayed via manually powered technology (e.g. thaumatrope, phenakistoscope, stroboscope, zoetrope, praxinoscope, flip book) as well. The term is useful for distinguishing still graphics from graphics with a transforming appearance over time, without over-specifying the form.

Motion graphics extend beyond the most commonly used methods of frame-by-frame footage and animation. Computers are capable of calculating and randomizing changes in imagery to create the illusion of motion and transformation. Computer animations can use less information space (computer memory) by automatically tweening, a process of rendering the key changes of an image at a specified or calculated time. These key poses or frames are commonly referred to as keyframes or low CP. Adobe Flash uses computer animation tweening as well as frame-by-frame animation and video.
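To make the idea of tweening concrete, here is a minimal sketch of linear interpolation between stored key poses. It is an illustration under simple assumptions (a single animated value, linear easing), not a description of how Flash or any particular package implements it; the names and values are hypothetical.

```python
# Minimal sketch of keyframe tweening: linear interpolation between two
# stored key poses. Real animation tools add easing curves and animate
# many properties at once; this only shows the core idea.

def tween(keyframes, frame):
    """Return an interpolated value for `frame`, given sorted (frame, value) keyframes."""
    # Clamp to the first/last keyframe outside the animated range.
    if frame <= keyframes[0][0]:
        return keyframes[0][1]
    if frame >= keyframes[-1][0]:
        return keyframes[-1][1]
    # Find the surrounding pair of keyframes and interpolate linearly.
    for (f0, v0), (f1, v1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

# Example: animate an x-position from 0 to 100 between frames 1 and 25.
keys = [(1, 0.0), (25, 100.0)]
print([round(tween(keys, f), 1) for f in (1, 13, 25)])  # [0.0, 50.0, 100.0]
```

Only the two keyframes need to be stored; every in-between frame is computed on demand, which is where the memory saving described above comes from.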

Since there is no universally accepted definition of motion graphics, the official beginning of the art form is disputed. There have been presentations that could be classified as motion graphics as early as the 1800s. Michael Betancourt wrote the first in-depth historical survey of the field, arguing for its foundations in visual music and the historical abstract films of the 1920s by Walther Ruttmann, Hans Richter, Viking Eggeling, and Oskar Fischinger.[2]

One of the first uses of the term "motion graphics" was by animator John Whitney, who in 1960 founded a company called Motion Graphics Inc.

Saul Bass is a major pioneer in the development of feature film title sequences. His work included title sequences for popular films such as The Man With The Golden Arm (1955), Vertigo (1958), Anatomy of a Murder (1959), North by Northwest (1959), Psycho (1960), and Advise & Consent (1962). His designs were simple, but effectively communicated the mood of the film.

Digital intermediate


Digital intermediate (typically abbreviated to DI) is a motion picture finishing process which classically involves digitizing a motion picture and manipulating the color and other image characteristics. It often replaces or augments the photochemical timing process and is usually the final creative adjustment to a movie before distribution in theaters. It is distinguished from the telecine process, in which film is scanned and color is manipulated early in the process to facilitate editing. However, the lines between telecine and DI are continually blurred, and the two are often executed on the same hardware by colorists of the same background. These two steps are typically part of the overall color management process in a motion picture at different points in time. A digital intermediate is also customarily done at higher resolution and with greater color fidelity than telecine transfers.


Although originally used to describe a process that started with film scanning and ended with film recording, digital intermediate is also used to describe color correction and color grading and even final mastering when a digital camera is used as the image source and/or when the final movie is not output to film. This is due to recent advances in digital cinematography and digital projection technologies that strive to match film origination and film projection.


In traditional photochemical film finishing, an intermediate is produced by exposing film to the original camera negative. The intermediate is then used to mass-produce the prints that get distributed to theaters. Color grading is done by varying the amount of red, green, and blue light used to expose the intermediate. The digital intermediate process seeks to replace or augment this photochemical approach to creating the intermediate.


The digital intermediate process uses digital tools to color grade, which allows for much finer control of individual colors and areas of the image, and allows for the adjustment of image structure (grain, sharpness, etc.). The intermediate for film reproduction can then be produced by means of a film recorder. The physical intermediate film that results from the recording process is sometimes also called a digital intermediate, and is usually recorded to internegative (IN) stock, which is inherently finer-grained than original camera negative (OCN) stock.
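To contrast this with photochemical timing, the following is a minimal sketch (using NumPy, with made-up gain values and a hypothetical `grade` helper) of the two kinds of adjustment described above: a global per-channel correction, which is the digital analogue of varying the red, green, and blue printer lights, and a correction limited to a masked area of the frame.

```python
import numpy as np

# Sketch of two adjustments a DI grading tool offers: a global per-channel
# gain (analogous to photochemical printer lights) and a correction applied
# only inside a matte/mask. Values and names here are illustrative.

def grade(image, gains, mask=None, mask_gains=None):
    """image: float array (H, W, 3) in 0..1; gains/mask_gains: per-channel multipliers."""
    out = image * np.asarray(gains, dtype=np.float32)
    if mask is not None and mask_gains is not None:
        # Apply a second correction only where the mask is set,
        # e.g. to warm up just the sky or a face.
        region = out * np.asarray(mask_gains, dtype=np.float32)
        out = np.where(mask[..., None], region, out)
    return np.clip(out, 0.0, 1.0)

# Example: cool the whole frame slightly, then brighten only the top half.
frame = np.random.rand(4, 8, 3).astype(np.float32)
sky = np.zeros((4, 8), dtype=bool)
sky[:2, :] = True
graded = grade(frame, gains=(0.95, 1.0, 1.05), mask=sky, mask_gains=(1.2, 1.2, 1.2))
```

The masked, region-by-region control is exactly what the photochemical process cannot offer, since printer lights affect the whole frame at once.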


One of the key technical achievements that made the transition to DI possible was the use of 3D look-up tables (aka "3D LUTs"), which could be used to mimic how the digital image would look once it was printed onto release print stock. This removed a large amount of skilled guesswork from the film-making process, and allowed greater freedom in the color grading process while reducing risk.
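A 3D LUT is simply a sampled grid mapping input RGB triples to output RGB triples. The sketch below illustrates the idea with NumPy; the LUT contents here are a placeholder s-curve, not a real print emulation, and the nearest-neighbour lookup stands in for the trilinear or tetrahedral interpolation that production systems actually use.

```python
import numpy as np

# Simplified sketch of applying a 3D LUT that emulates release print stock.
# The table itself is a hypothetical per-channel s-curve; real print-emulation
# LUTs are measured from the film stock.

SIZE = 17  # common LUT grid sizes are 17, 33, or 65 points per axis
grid = np.linspace(0.0, 1.0, SIZE)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
# Placeholder "print emulation": a smoothstep contrast curve per channel.
lut = np.stack([np.clip(3 * c**2 - 2 * c**3, 0, 1) for c in (r, g, b)], axis=-1)

def apply_lut(image, lut):
    """image: float array (..., 3) in 0..1; lut: (N, N, N, 3) table."""
    n = lut.shape[0]
    # Nearest-neighbour lookup for brevity; production tools interpolate.
    idx = np.clip(np.rint(image * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

preview = apply_lut(np.random.rand(4, 4, 3), lut)  # approximates the printed look
```

Because the preview is computed rather than guessed, the colorist can grade against an on-screen image that behaves like the eventual release print.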


The digital master is often used as a source for a DCI-compliant distribution of the motion picture for digital projection. For archival purposes, the digital master created during the Digital Intermediate process can still be recorded to very stable high dynamic range yellow-cyan-magenta (YCM) separations on black-and-white film with an expected 100-year or longer life. This archival format, long used in the industry prior to the invention of DI, still provides an archival medium that is independent of changes in digital data recording technologies and file formats that might otherwise render digitally archived material unreadable in the long term.

History

Telecine tools to electronically capture film images are nearly as old as broadcast television, but the resulting images were widely considered unsuitable for exposing back onto film for theatrical distribution. Film scanners and recorders with quality sufficient to produce images that could be inter-cut with regular film began appearing in the 1970s, with significant improvements in the late 1980s and early 1990s. During this time, digitally processing an entire feature-length film was impractical because the scanners and recorders were extremely slow and the image files were too large for the computing power available. Instead, individual shots or short sequences were processed for special visual effects.

In 1992, visual effects supervisor/producer Chris F. Woods broke through several "techno-barriers" in creating a digital studio to produce the visual effects for the 1993 release Super Mario Bros. It was the first feature film project to digitally scan a large number of VFX plates (over 700) at 2K resolution. It was also the first film scanned and recorded at Kodak's just-launched Cinesite facility in Hollywood. This project-based studio was also the first on a feature film to use Discreet Logic's (now Autodesk) Flame and Inferno systems, which enjoyed early dominance as high-resolution, high-performance digital compositing systems. Digital film compositing for visual effects was immediately embraced, while optical printer use for VFX declined just as quickly. Chris Watts further revolutionized the process on the 1998 feature film Pleasantville, becoming the first visual effects supervisor for New Line Cinema to scan, process, and record the majority of a feature-length, live-action Hollywood film digitally. The first Hollywood film to utilize a digital intermediate process from beginning to end was O Brother, Where Art Thou? in 2000; in Europe it was Chicken Run, released that same year.

The process rapidly caught on in the mid-2000s. Around 50% of Hollywood films went through a digital intermediate in 2005, increasing to around 70% by mid-2007.[1] This is due not only to the extra creative options the process affords filmmakers but also to the need for high-quality scanning and color adjustments to produce movies for digital cinema.


Compositing



Four images of the same subject with original backgrounds removed and placed over a new background


The Galactic Center of the Milky Way seen in a composite image with data from the Hubble Space Telescope, the Spitzer Space Telescope, and the Chandra X-ray Observatory

Compositing is the combining of visual elements from separate sources into single images, often to create the illusion that all those elements are parts of the same scene. Live-action shooting for compositing is variously called "chroma key", "blue screen", "green screen" and other names. Today, most, though not all, compositing is achieved through digital image manipulation. Pre-digital compositing techniques, however, go back as far as the trick films of Georges Méliès in the late 19th century; and some are still in use.

All compositing involves the replacement of selected parts of an image with other material, usually, but not always, from another image. In the digital method of compositing, software commands designate a narrowly defined color as the part of an image to be replaced. Then the software replaces every pixel within the designated color range with a pixel from another image, aligned to appear as part of the original. For example, one could record a television weather presenter positioned in front of a plain blue or green background, while compositing software replaces only the designated blue or green color with weather maps.
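The weather-presenter example can be sketched in a few lines of NumPy. This is a hard key under simplified assumptions; a real keyer works in a different colour space, produces a soft partial-transparency matte, and suppresses green spill on the subject. The function and parameter names are illustrative.

```python
import numpy as np

# Minimal sketch of the colour-range replacement described above: pixels close
# to the designated key colour are replaced with pixels from another image.

def chroma_key(foreground, background, key_rgb=(0.0, 1.0, 0.0), tolerance=0.3):
    """All images are float arrays (H, W, 3) in 0..1 and the same size."""
    # Distance of each pixel from the key colour (pure green by default).
    distance = np.linalg.norm(foreground - np.asarray(key_rgb), axis=-1)
    matte = distance > tolerance          # True on the presenter, False on the screen
    # Keep foreground pixels where the matte is set, background pixels elsewhere.
    return np.where(matte[..., None], foreground, background)

presenter = np.random.rand(480, 640, 3)    # stand-in for the green-screen plate
weather_map = np.random.rand(480, 640, 3)  # stand-in for the replacement graphic
composite = chroma_key(presenter, weather_map)
```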

In television studios, blue or green screens may back news-readers to allow the compositing of stories behind them, before being switched to full-screen display. In other cases, presenters may be completely within compositing backgrounds that are replaced with entire “virtual sets” executed in computer graphics programs. In sophisticated installations, subjects, cameras, or both can move about freely while the computer-generated imagery (CGI) environment changes in real time to maintain correct relationships between the camera angles, subjects, and virtual “backgrounds.”

Virtual sets are also used in motion picture filmmaking, some of which is photographed entirely in blue or green screen environments, as for example in Sky Captain and the World of Tomorrow. More commonly, composited backgrounds are combined with sets – both full-size and models – and vehicles, furniture, and other physical objects that enhance the “reality” of the composited visuals. “Sets” of almost unlimited size can be created digitally because compositing software can take the blue or green color at the edges of a backing screen and extend it to fill the rest of the frame outside it. That way, subjects recorded in modest areas can be placed in large virtual vistas. Most common of all, perhaps, are set extensions: digital additions to actual performing environments. In the film Gladiator, for example, the arena and first tier seats of the Roman Colosseum were actually built, while the upper galleries (complete with moving spectators) were computer graphics, composited onto the image above the physical set. For motion pictures originally recorded on film, high-quality video conversions called “digital intermediates” are created to enable compositing and the other operations of computerized post-production. Digital compositing is a form of matting, one of four basic compositing methods. The others are physical compositing, multiple exposure, and background projection.
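At the pixel level, digital matting comes down to blending each foreground pixel with a background pixel according to a matte (alpha) value. The sketch below shows the standard "over" mix as a general illustration, not the pipeline of any particular production; premultiplied-alpha handling is omitted for brevity.

```python
import numpy as np

# Sketch of the basic matting operation: mix foreground and background
# per pixel according to the matte (alpha), where 1 = fully foreground.

def over(foreground, alpha, background):
    """foreground/background: (H, W, 3) floats in 0..1; alpha: (H, W) matte in 0..1."""
    a = alpha[..., None]                   # broadcast the matte across the RGB channels
    return foreground * a + background * (1.0 - a)

fg = np.random.rand(2, 2, 3)    # e.g. a keyed actor element
bg = np.random.rand(2, 2, 3)    # e.g. a digital set extension
matte = np.array([[1.0, 0.5], [0.0, 1.0]])
print(over(fg, matte, bg).shape)  # (2, 2, 3)
```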

 
 
