What is rendering?

Rendering refers to the process of translating structural information into sequential information. This abstract concept is much easier to understand with an example: an Adobe After Effects project is built from structural elements such as layers, effects, and animations. To make them perceptible as a video, the program must generate the individual images and sequence them one after another, producing a video we can watch. Rendering is often associated specifically with image synthesis for videos and photos, but the concept applies to music as well. When a DJ combines music on a computer, the result may need to be rendered into an MP3 file so that it can be played in any audio player. This process, too, is referred to as rendering.
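The structural-to-sequential idea can be sketched in a few lines of Python. Everything here is invented for illustration: a scene is described structurally as a list of rectangles, and a toy `render` function flattens it into an ordered stream of pixel rows.

```python
# Toy illustration of "structural -> sequential": the scene is a list of
# rectangle specs (structural); render() flattens it into ordered rows of
# characters (sequential). All names are made up for this demo.

def render(width, height, rects):
    """Rasterize rectangles (x, y, w, h) onto a grid of '#' and '.'."""
    canvas = [["." for _ in range(width)] for _ in range(height)]
    for x, y, w, h in rects:
        for row in range(y, min(y + h, height)):
            for col in range(x, min(x + w, width)):
                canvas[row][col] = "#"
    return ["".join(row) for row in canvas]

rows = render(8, 3, [(1, 0, 3, 2)])
for line in rows:
    print(line)   # two rows containing a block of '#', one empty row
```

The structural description stays editable; only the rendered rows are fixed in order, which is exactly why a render step is needed before playback or export.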

How are photos rendered?

Objects in a photo are typically three-dimensional (3D). To create a photo of something that doesn't yet exist physically – a car that is still only a design, for example – a 3D model is needed, usually created in CAD software. This model is constructed from vectors and polygons; the more of them a model has, the more realistic complex objects appear. Next, surfaces are added to the object, such as paint for a car or brushed metal, and sometimes transparent materials like glass. To prevent the object from looking flat in the photo, it must be shown in the interplay of light sources: on a car, for example, reflections and shadows should be visible. Once the object is complete and placed in the desired environment, it is turned into a photo or a 360-degree spin. The process triggered in programs like Adobe's Substance 3D Stager is called rendering; the result is a flat image, typically a JPEG or TIFF file.

Because the 3D model can be rotated freely, photos can be rendered from any desired position. For a 360-degree spin, the object is rotated by, say, 10 degrees at a time, and a photo is rendered for each step. These photos are then assembled in separate software so that users can rotate the object themselves; a player program is required to display the result.
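The 360-degree spin described above can be sketched as a small script. The step size, filename pattern, and the render call itself are assumptions for illustration – the actual call depends on the 3D software in use.

```python
# Sketch of a 360-degree spin in 10-degree steps, as described in the text.
# The render call is a placeholder; only the angle/filename bookkeeping is real.

STEP_DEG = 10
angles = list(range(0, 360, STEP_DEG))   # 0, 10, ..., 350 -> 36 views

# Hypothetical output names, one rendered photo per rotation step.
filenames = [f"car_{angle:03d}.jpg" for angle in angles]

print(len(angles))                 # 36 photos make one full spin
print(filenames[0], filenames[-1]) # car_000.jpg car_350.jpg
```

At 10 degrees per step, a full spin needs 36 rendered photos; halving the step to 5 degrees doubles that to 72 and makes the rotation in the player correspondingly smoother.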

How are videos rendered?

In essence, rendering videos and movies is quite similar to rendering photos, with the added aspect of motion. As in a flipbook, individual images – or, depending on the video's compression method, only the differences from the previous image – are laid out one after another in the sequence of film frames. When recording gameplay, for example, much of the rendering has already happened: the game engine positions the objects and produces the changing frames in real time. When players then edit the recording – adding titles, effects, or filters, cutting the video, and adding music – these changes must be rendered into the video and audio tracks. The result is the final movie.
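The "store only the differences from the previous image" idea can be illustrated with a minimal sketch. This is a deliberately simplified stand-in: frames are plain lists of pixel values, and real codecs such as H.264 are far more elaborate.

```python
# Illustrative delta encoding between two frames: keep only the pixel
# positions that changed. Real video compression works on blocks, motion
# vectors, and transforms - this sketch shows only the core idea.

def frame_delta(prev, curr):
    """Return {index: new_value} for every pixel that changed."""
    return {i: v for i, (p, v) in enumerate(zip(prev, curr)) if p != v}

def apply_delta(prev, delta):
    """Rebuild the next frame from the previous frame plus the delta."""
    frame = list(prev)
    for i, v in delta.items():
        frame[i] = v
    return frame

frame1 = [0, 0, 5, 5, 0]
frame2 = [0, 9, 5, 5, 1]
delta = frame_delta(frame1, frame2)           # only 2 of 5 pixels changed
assert apply_delta(frame1, delta) == frame2   # the full frame is recoverable
```

Because most consecutive frames differ only slightly, storing deltas instead of full images is what keeps video files far smaller than a folder of individual photos.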

The rendering process for movies involves several stages:

  1. There are movies that consist entirely of polygons in a dynamic CAD-like environment – essentially the same technology as computer games.
  2. Some movies are created in a manner quite similar to computer games but include human characters incorporated using blue or green screen techniques.
  3. Then there's the post-production phase. The footage from the earlier stages has already been rendered; post-production adjusts exposure, colors, and image sharpness and adds further effects, text, transitions, and more.

All of these requirements make the rendering process a complex procedure that demands substantial computational resources and time.

Software for these tasks is included in Adobe Creative Cloud products; in the 3D realm, tools such as Blender, SketchUp, Autodesk products (e.g., Maya or 3ds Max), Cinema 4D, or Twinmotion are used.

How long does it take to render a movie?

The time required for rendering varies significantly, especially for movies. Determining factors include the complexity, the resolution or level of detail, the type and number of effects, the density of cuts, and anything else that affects how much individual frames differ from one another. The time required also depends on the performance of the computer used for rendering: crucial factors are the CPU, the graphics card (e.g., an NVIDIA GPU), the size and speed of the RAM, and the speed of the storage. For RAM, the rule is essentially "more is better" – 16 GB is the minimum for acceptable speeds, and a larger amount can speed up the process, provided the software and operating system make use of it. For storage, SSDs should be used as the data source during production; traditional hard drives (HDDs) can be used for archival storage.
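A back-of-the-envelope calculation shows how these factors multiply. The per-frame render cost below is an assumed number for illustration; everything else follows from frame rate and duration.

```python
# Rough render-time estimate: total time = frames x seconds per frame,
# where frames = fps x duration. The 2.0 s/frame figure is an assumption.

fps = 24                       # cinema frame rate
duration_s = 90 * 60           # a 90-minute movie
sec_per_frame = 2.0            # assumed per-frame render cost on one machine

frames = fps * duration_s                # 129,600 frames
hours = frames * sec_per_frame / 3600    # -> 72.0 hours

print(f"{frames} frames, about {hours:.0f} hours to render")
```

The linear relationship is why render farms help so much: splitting those frames across 72 machines would, in the ideal case, cut the job to about an hour.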

With appropriate hardware and a moderate resolution, rendering can be significantly faster than the final video's duration; even at high resolutions, real-time rendering is possible. With less powerful hardware, rendering can take significantly longer than the planned video duration.


How is music rendered?

Music created on a computer with music editing software must be rendered in the same way as films or videos. Programs like Magix, Steinberg's Cubase, FL Studio (formerly Fruity Loops), or Ableton work in a way comparable to film-production software: virtual instruments (driven via MIDI, for example) are used, and their output is manipulated with filters, modulators, and adjustments to volume, treble, bass, and more. Samples can also be integrated. Once a piece of music has been created in the software, it must be rendered into a sequential format such as MP3, FLAC, or WAV – for platforms like Spotify, for instance.
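The same render step can be demonstrated in miniature with nothing but the Python standard library: a tone is computed sample by sample and written out as a sequential WAV file. A real DAW does this at a vastly larger scale, mixing many sources before writing the stream, but the principle is the same.

```python
# Minimal "audio render": compute a 440 Hz sine tone sample by sample and
# write it as a sequential WAV file using only the standard library.
import math
import struct
import wave

RATE = 44100    # samples per second (CD quality)
SECONDS = 1
FREQ = 440.0    # concert pitch A

with wave.open("tone.wav", "wb") as wav:
    wav.setnchannels(1)   # mono
    wav.setsampwidth(2)   # 16-bit samples
    wav.setframerate(RATE)
    for n in range(RATE * SECONDS):
        # Scale the sine wave into the signed 16-bit range.
        sample = int(32767 * math.sin(2 * math.pi * FREQ * n / RATE))
        wav.writeframes(struct.pack("<h", sample))

print("wrote", RATE * SECONDS, "samples to tone.wav")
```

The resulting file is purely sequential – 44,100 fixed samples per second of audio – which is what makes it playable in any audio player, unlike the editable project file inside the DAW.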