In December 2013, composer Andrew Lloyd Webber announced that a film adaptation of his musical “Cats” was in development.
Almost six years later, the first trailer for the soon-to-be-released film premiered to audiences.
The word “travesty” seems like a heavy-handed descriptor, but after watching the petrifying, computer-animated, human-cat hybrids leap across the screen a multitude of times, I am at a loss for any other label.
Whether or not you are a fan of the classic musical, if you have seen the trailer, you can understand why viewers were so taken aback.
What was once a story featuring energetic actors dressed in detailed feline-esque costumes now features an off-putting computer-generated amalgamation that is somehow neither human nor cat.
In this age of technological advances, the trailer raises the question of whether CGI (computer-generated imagery) is really necessary. Could “Cats” have been brought to the big screen just as it is brought to life on the stage, using practical effects?
Yes, CGI is a fantastic development in filmmaking and can be used to enhance things that are already there. But how much is too much?
The answer: too much is when a film loses sight of the tangible qualities that audiences look for. I believe the 2019 rendition of “Cats” has left tangibility far in the rearview mirror.
This, however, is not the first time CGI has replaced good, old-fashioned practical effects in film.
John Carpenter’s “The Thing” (1982) and its 2011 prequel of the same name are good examples of how much more effective practical effects can be than their computer-animated counterparts.
While both films depict a parasitic, shapeshifting alien, the gory sci-fi effects are far superior in the original film.
Because computer graphics were far more limited then than they are today, the majority of the film’s special effects were achieved through ingenious uses of Jell-O, wax and other materials, and were groundbreaking in the world of FX makeup.
Another example is the disparity in quality between Peter Jackson’s “Lord of the Rings” and “The Hobbit” movie trilogies.
In films like “The Two Towers,” CGI was used effectively to bring Andy Serkis’ Gollum to life, but other creatures, such as the orcs, were created using intense prosthetics and special effects makeup.
Unfortunately, things took a turn in later book-to-screen adaptations, specifically, the third installment of “The Hobbit” series, “The Battle of the Five Armies.”
This movie saw its titular battle rendered almost entirely in computer animation, with very few on-screen actors.
Actor Ian McKellen, known for his role as Gandalf, stated that he was frustrated by the constant use of green screen technology and that acting with animation rather than real actors was not why he enjoyed filmmaking.
That being said, there is no denying that the development of CGI effects is an astounding and groundbreaking step in film.
However, it is the duty of filmmakers to ensure that they are using this advancement appropriately.
As seen with the new adaptation of “Cats,” irresponsible and excessive use of CGI can hinder what could be a successful film.
It is better to put in the extra work necessary for practical effects and give audiences something they will enjoy watching.
Abby Burgess, 26 Aug 2019