We Live in a CGI World
Computer Generated Imagery has come a long way in the 50 or so years it’s been around. It’s been responsible for some of the most memorable motion picture moments, the most gripping gaming sequences, and even the way products are designed and advertised.
It’s hard to imagine the world without it today. However, it has been a long evolution, and this exciting technology still has plenty of room to grow.
CGI: A Brief History
Like many technologies, CGI had a rather unassuming start. To trace its history, we need to travel back in time to the year 1968. Back then, the notion of displaying imagery that a computer had generated was completely alien. In fact, computers themselves were pretty cutting-edge! That changed when a group of leading Soviet mathematicians and physicists successfully created a mathematical model of a cat that could move across a computer monitor. The team developed this simple programme further and ran it on a state-of-the-art computer known as the BESM-4. The result was usable film material created using computers alone.
Once CGI technology had been born, it began to evolve rapidly. By the 1970s, the innovation was suitably impressing movie makers. The start of the decade saw computer scientists Nestor Burtnyk and Marceli Wein create some of the first ever software for computer animation. This was quickly followed by the first use of the technology on national TV. Later in the 1970s, the movie Westworld also showcased the fledgling technology, using digitally processed 2D imagery to depict an android’s point of view.
With work on CGI progressing, it wasn’t long before movie makers began looking towards 3D CGI to further illustrate their creations. Futureworld, released in 1976, was one early example of 3D computer generated imagery on the big screen.
Perhaps the best-known early uses of CGI came later in the 1970s. Star Wars, Alien, and Superman: The Movie all featured breathtaking (for the day) examples of the technology in action. These offerings amazed viewers at the time, even if they do look a little dated now!
As the 1980s and 90s rolled around, computing was becoming more popular, as well as ever more powerful and affordable. CGI technology benefited hugely. CGI human characters, 3D shaded imagery, 3D water effects, digital composites, and alien landscapes were now all possible. The likes of TRON, Star Trek II, The Abyss, Jurassic Park, and Toy Story all demonstrated how far the technology had come in just a couple of decades.
Today, CGI is used for a lot more than just movies, even within the entertainment industry. Cinematic efforts like Life of Pi showcase the very peak of the technology on the big screen, but video games also build grandiose CGI worlds for their incredibly lifelike characters to explore. Most games are 3D these days, and the likes of Grand Theft Auto, Skyrim, and Call of Duty are particularly impressive offerings that incorporate modern CGI to portray the most realistic of worlds.
Of course, CGI makes these video games and movies look amazing. They’re fictional too, so the use of computer generated imagery shouldn’t cause too much of a stir. However, what about supposedly real footage?
Plenty of scenes in documentaries are touched up, or entirely fabricated, to drive the message home to the viewer. In fact, even the legendary David Attenborough has criticised the industry for excessive use of CGI in various documentaries!
However, the entertainment industry doesn’t hold a monopoly on the technology! Other sectors are keen to embrace CGI too. Architects, for one, experiment with building design using CGI. This allows them to test various features prior to starting any expensive building work.
CGI modelling is also used in various scientific fields. Skeletal animation, space exploration, creating medical and cosmetic implants, and even CT scans are just some of the ways scientists and medical professionals now make use of the technology.
Even car commercials these days don’t always use real cars! It’s much cheaper for an advertiser to render a CGI car and show exactly what it can do on a computer than to ship one of their models out to a location. Also, next time you flick through a catalogue, ask yourself if it looks real. It probably does. That doesn’t mean it is, though! Many catalogues actually use CGI to hawk their products to potential buyers.
The Future of CGI
All this is pretty impressive. However, CGI still has a way to go. A recent movie highlighted the shortcomings of the technology: in Star Wars: Rogue One, many fans were left a little disconcerted by the portrayals of Grand Moff Tarkin and Princess Leia (originally played by the late Peter Cushing and Carrie Fisher respectively). Despite a largely convincing use of CGI throughout the movie, these two legendary figures of the franchise were quite clearly not real humans. Their presence left viewers in no doubt that they were watching computer generated images.
However, we must remember that we’re still just 50 years from the birth of CGI technology. Think of the immense progress that has been made in just half a century. We’ve gone from the brightest computer scientists on the planet painstakingly creating a simple moving image of a cat to huge virtual worlds that gamers are free to explore in full. The 1960s digital representation of the feline would trick few into thinking they were looking at a living, breathing creature. By contrast, the lifeforms in movies such as Avatar, or the stunningly designed architecture in video games such as Assassin’s Creed, look like you could step right into the screen and interact directly with them.
With the development of such graphics and a network (the internet) capable of global communication, it seems the once far-fetched plot of The Matrix could become a possibility in less time than CGI technology has already existed. In fact, efforts to produce something similar are already well underway. Take simulators such as Second Life, which allow players to interact in a fully CGI world. As the technology becomes more convincing, it’s likely that similar CGI “realities” will emerge. In just a few years, we might all be able to engineer, on a computer system, exactly the life we would be most comfortable living. What will become of our real-world forms? My guess is they’ll wither away in test tubes filled with embryonic fluid whilst we frolic around in some digital paradise. What could possibly go wrong?