
How CGI Ruined Modern Cinema

Before the invention of computer-generated imagery, or CGI, moviemakers devised inventive ways to achieve the effects their scenes required. Directors often relied on camera tricks and hand-painted backdrops to accentuate a film’s visuals, and such effects had to be weighed against the cost and difficulty of working with processed film. Because of this, special effects of any kind were used sparingly or not at all, and the films that did include them were often big-budget, high-production undertakings backed by prodigious amounts of marketing.

Then, in 1973, the newest way of creating visual effects hit the big screen. Michael Crichton’s sci-fi western Westworld was the first movie to use the nascent technology, and in doing so it changed the cinematic world forever. Though the digital imaging lasted only seconds on screen, it set a precedent for future films. Three years later, Westworld’s sequel, Futureworld (1976), became the first film to use digitally generated 3D images.

Now, nearly fifty years later, CGI has become less of a rare spectacle and more of a viewing expectation, and that shift has ruined the quality of modern cinema. Moviemakers rely too heavily on special effects and not enough on crafting a moving plot, showing true character development, portraying honest human emotion, and showcasing brilliant acting.

Consider the two most recent films to hold the title of highest-grossing movie in box office history: Avatar (2009) and Marvel’s Avengers: Endgame (2019). Sixty percent of Avatar, which made $2,789,000,000 during its time in theatres, consisted of computer-generated imagery alone. On an even larger scale, Avengers: Endgame, which earned just over $2,796,000,000 during its time in theatres, had 96 percent of its scenes created with digital imaging.

Movies before CGI were made through simpler means and, consequently, were more human. Characters struggled with real-life desires and emotional ups and downs, and plots posed compelling questions that left audience members reflecting on their own lives. Breakfast at Tiffany’s (1961), for example, featuring Audrey Hepburn in her most iconic role as Holly Golightly, examines the desire for attention and the dissatisfaction that comes with material possessions.

The hit Netflix show Stranger Things is a perfect example of how CGI shifts cinematic emphasis. The first season focused on its characters and, in doing so, built an intricate, compelling plot; audiences loved it, and both the Rotten Tomatoes critic and audience scores were near perfect. In the second and third seasons, however, visual effects moved to the forefront, leaving the plot somewhat wanting. Consequently, the ratings dropped.

Some argue that the introduction of CGI has improved modern cinema by widening the divide between what appears on screen and the realities of life. The technology allows directors to build worlds so vastly different from our own that viewers can more easily escape their troubles for a few hours. However, this distracts from the fundamental purpose of the art form: getting to the heart of human emotion and examining life’s most pressing questions through story. When films do that, we are not simply escaping into visually stunning yet shallow, finite worlds; we learn to confront our problems, to improve and to change.

Moving forward, moviemakers need to cut back on CGI and return to the classic style of filmmaking to ensure that the true nature of the art form is not lost in the flames of a computer-generated explosion.

Dara Marie is a freshman at Utah State University from Washington DC, Virginia, where she had lived all her life before moving to Logan. She is majoring in English and hopes to have her creative work published soon.