Either CGI designers get their act together or our TVs will keep putting their movies on the ropes

Who would have thought? “Iron Man,” the 2008 film directed by Jon Favreau, contains CGI (computer-generated imagery) that is more convincing than that of a large portion of the latest movies released by Marvel. Some sequences in “Spider-Man: No Way Home” or “Doctor Strange in the Multiverse of Madness” have a surprisingly less polished finish than the production that laid the foundation for the Marvel Cinematic Universe.

“Iron Man” came out in theaters about fourteen years earlier than the two movies I just mentioned, yet its visual finish is much more polished. The worrying thing is that this is not an isolated case, and it doesn’t just affect Marvel. Here’s another, more revealing example: “Jurassic Park,” the production with which Steven Spielberg left us speechless in 1993, has CGI and special effects that are more believable and better executed than those of many films released nearly three decades later.

Poorly executed CGI can spoil the suspension of disbelief

It’s clear that the artists involved in CGI design for today’s movies haven’t forgotten how to do their job. So what has caused this drop in the quality of computer-generated images? One reason is that more and more films are using this technique. What’s more, CGI appears in more and more shots, which often reduces the time designers have to polish the final images.

But this is not all. There is also a technical reason behind CGI’s shortcomings: it is usually rendered at 2K (2048 x 1080 pixels). However, most movies shot with digital cameras are captured in 4K. Both elements must coexist in the same frame, and to smooth out this difference in resolution the 4K footage has to be processed and brought down to 2K.
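To picture this compositing step, here is a minimal Python sketch using the Pillow library. The file names are hypothetical and a real VFX pipeline relies on far more sophisticated tools, but the resolution juggling follows the same idea: downscale the 4K plate, then combine it with the 2K CGI element.

```python
from PIL import Image

# DCI 2K resolution mentioned in the article
DCI_2K = (2048, 1080)

# Hypothetical input files, purely for illustration
plate_4k = Image.open("camera_plate_4k.png").convert("RGBA")  # live-action footage, 4K
cgi_2k = Image.open("cgi_render_2k.png").convert("RGBA")      # CGI element rendered at 2K

# Step 1: bring the 4K camera footage down to 2K so both
# elements share the same resolution
plate_2k = plate_4k.resize(DCI_2K, Image.LANCZOS)

# Step 2: composite the CGI element over the downscaled plate
composite_2k = Image.alpha_composite(plate_2k, cgi_2k)
composite_2k.save("composite_2k.png")
```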


Then, once the digital elements have been combined with the footage at this lower resolution, even though it was originally captured in 4K, all frames are upscaled back to 4K using AI algorithms. This last step is necessary because some cinemas display images using 4K projectors and, above all, because TVs with a 4K UHD panel clearly dominate the home market.
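The upscaling step can be sketched the same way. Studios use AI-based upscalers for this stage; the plain Lanczos resize below is only a stand-in to illustrate the change in resolution, and the file names remain hypothetical.

```python
from PIL import Image

# DCI 4K resolution, the final delivery target
DCI_4K = (4096, 2160)

# The finished 2K composite from the previous sketch
composite_2k = Image.open("composite_2k.png")

# Real pipelines use AI-based upscalers here; a simple Lanczos
# resize merely stands in to show the jump from 2K to 4K
composite_4k = composite_2k.resize(DCI_4K, Image.LANCZOS)
composite_4k.save("composite_4k.png")
```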

If the CGI is not well executed, its 2K rendering and subsequent upscaling to 4K may fall short when we enjoy this content on a TV with a 4K UHD panel.

The problem is that if the CGI is not well executed, its 2K rendering and subsequent upscaling to 4K may fall short when we enjoy this content on a 4K UHD TV, especially if we use a latest-generation device capable of recovering a great deal of detail. Currently, a large portion of movie theaters project at resolutions below 4K, so if the CGI is done carefully, the fact that it was originally rendered at 2K isn’t a big deal in that context.

However, TVs with a 4K UHD panel are unforgiving. Many films whose CGI looks correct in theaters give us a less satisfying experience when we watch them on our 4K UHD TV. The digital elements of some frames are often not as believable as they should be, and when this happens the magic of cinema can be lost because the suspension of disbelief stops working.


The CGI in the first images we saw of “She-Hulk,” which arrives on Disney+ on August 17, left a lot to be desired. In the latest trailer for the series it appears to have improved, but we will only be sure when we see how it looks on 4K UHD TVs.

When we watch a movie, especially if it’s science fiction, fantasy, action, or adventure, we viewers voluntarily agree to set aside the criteria we normally use to judge the real world. We accept the rules the movie proposes because, if we don’t, it’s impossible for us to enjoy it. Of course, for this tacit agreement to work, it is essential that what we see appears coherent to us, and that it continues to do so throughout.


Poorly implemented CGI can ruin our experience no matter how much goodwill we bring on our part. It can pull us out of the movie, particularly if its presence in the footage is constant. This usually happens in the superhero movies that have crowded theater billboards for years, which is why some productions featuring Marvel and DC characters have been criticized by many fans for their poor CGI.

If we don’t want the CGI to fall apart when we watch movies that use it on a 4K UHD TV, the ideal option is for it to be rendered natively in 4K

The solution to this problem involves improving the computer-generated images without giving in to the tight deadlines that movie production companies impose. Movies like “Iron Man,” “Jurassic Park,” and many others show that it is possible to craft very believable and satisfying CGI, and the studios know it. But this is not enough. If most of us have 4K UHD TVs in our homes and we don’t want the CGI to fall apart when we watch the movies that use it, the ideal is for it to be rendered natively in 4K.

The problem is that time is a very precious resource, and it’s not clear whether production companies are willing to extend the time they invest in post-production for some of their films. The other ingredient in the recipe, 4K rendering, also demands a significant trade-off, because the computational effort required to render a frame at this resolution is much greater than what is involved in rendering it at 2K. In any case, sooner or later filmmakers will have to bite the bullet. Otherwise, our TVs will continue to expose every flaw in their work.
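A quick back-of-the-envelope calculation makes that extra effort concrete: counting pixels alone, every 4K frame carries four times as many as a 2K frame, so the rendering workload per frame grows at least fourfold.

```python
# Pixel counts for a DCI 2K frame versus a DCI 4K frame
pixels_2k = 2048 * 1080  # 2,211,840 pixels
pixels_4k = 4096 * 2160  # 8,847,360 pixels

# Each 4K frame carries exactly four times as many pixels,
# so the rendering workload per frame grows at least fourfold
print(pixels_4k / pixels_2k)  # 4.0
```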
