
Are films turning into video games?


The Last of Us

 

Since before the first arcade cabinet choked out an attempt at the speech-synthesized words “Operation Wolf!” in the late 80s, video games have been trying very hard to be films. There was even a point in the mid-90s when the genres crossed over almost entirely: by its third iteration, Wing Commander was including big chunks of live-action video. While this arguably represented a brief infatuation of the gaming industry with the technical possibilities of CD-ROM storage, and it's a slightly gimmicky approach that hasn't really survived, Wing Commander III was enough of a film that it almost makes sense when viewed as one. You can do so via any of a thousand illicit YouTube videos which assemble all of its full-motion video. These days, we tend to let the machine render those sorts of non-interactive cutscenes in the same way it renders the rest of the game, mainly because the graphics now look a lot smoother and because it makes for a less jarring transition between the realtime and offline-rendered material. Modern titles may also use motion-captured animation with synchronous audio to keep the characters moving around convincingly, and recorded audio tracks have been common for years, but the desire is still the same: video games are, to at least some extent, still trying to be movies, and have started to staff themselves to suit.

Ellen Page performs motion capture and dialogue for Beyond: Two Souls

 

Interactive entertainment

The objection to this in the 90s was more or less the same as the objection now: a game is a piece of interactive entertainment, whereas a motion picture, by design, is not. Even now, there is often a difference in visual quality between the non-interactive cutscenes and the playable material, with a sliding scale of interactivity running from the almost completely passive full-motion video, perhaps limited to a few decisions at branching points, to the fully interactive world of, usually, a realtime 3D rendering engine. Making the game look good pushed designers toward the less interactive route, but gameplay, particularly the replay value of a title, was often compromised. Even very recent games, such as the well-reviewed The Last of Us (possibly one of the last great titles for the now-superseded PlayStation 3), have something of this problem: the opening of the game is a masterpiece, but interactivity is limited to opening doors and kicking out a windscreen, each of which requires a single button to be pressed a few times. It's completely scripted, and therefore it is probably only a masterpiece once. The same problem would have existed had it been a simple piece of full-motion video, and the illusion of interactivity provided by allowing the player to take a slightly different path down a hallway is suspect at best.

The Last of Us opening scene (15'04”, somewhat graphic, spoilers abound)

 


320x200 displays

But the thing is, the relationship between games and films isn't always, or even often, about the fineness of the graphical representation of reality. As we saw above, people have been trying to do movie-style things in games since a time when the technology was laughably inadequate to the task. The animated intro to Operation Wolf is basic in the extreme by modern standards, but it clearly apes action movies of the period, particularly the Rambo sequel, First Blood Part II (and, in a nice bit of circularity, it is almost directly re-enacted by the opening of the 1996 Schwarzenegger action movie Eraser, made nearly a decade later). The fact that the designers of Wolf only had 320x200 displays capable of 16 simultaneous colours didn't dissuade them from directly referencing the almost infinitely more richly-depicted world of the film. Whether you want to call something that basic production design is a semantic choice. Regardless, the fact that contemporary games from the likes of Sega and Nintendo existed in a pastel-coloured world of child-friendly happiness, whereas Operation Wolf is in military greens and browns, makes me think the argument can at least be made.

Operation Wolf arcade opening

And so the requirement for production design in games, in exactly the same sense as it exists in filmmaking, has advanced in sophistication alongside the graphical fidelity. With modern games beginning to rival offline-rendered CG films, there's a direct parallel. The desire, and the ability, to have video games represent more photorealistic, real-world environments has made the decision-making process for design, as separate from implementation, effectively the same as for a live-action feature film. Concept artists regularly work in both disciplines (perhaps particularly because games, for some reason, do a lot of sci-fi, where their talents are very necessary). I recently encountered a designer in the games industry on a camera forum, looking for information on how the film industry approaches the ways in which camera technology and technique interact with production design. There seems to be a continuing desire to bring all of the techniques available to the filmmaker into games, right down to filters, lenses, and even film stocks – which barely exist in the real world anymore, never mind in simulation inside a video game, especially one that probably runs at 60fps.
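To make that last point a little more concrete: in practice, a "film stock" look in a game is typically faked as a per-frame post-process on the rendered image. Here's a minimal sketch of the idea in Python, with NumPy standing in for what a real engine would do in a GPU shader; the curve shape and grain strength are illustrative assumptions, not any particular engine's implementation.

```python
# Toy "film look" post-process: tone curve plus grain, applied per frame.
# Purely illustrative -- real engines do this on the GPU, per pixel.
import numpy as np

def film_look(frame, grain_strength=0.03, seed=None):
    """Apply a crude film emulation to a float RGB frame in [0, 1]."""
    rng = np.random.default_rng(seed)
    # Highlight roll-off: a Reinhard-style curve that compresses values
    # near 1.0, loosely imitating the shoulder of a film response curve.
    curved = frame * (1.0 + 0.15) / (frame + 0.15)
    # Monochrome grain, regenerated per frame so it "boils" at 60fps.
    grain = rng.normal(0.0, grain_strength, frame.shape[:2])
    return np.clip(curved + grain[..., None], 0.0, 1.0)

frame = np.full((720, 1280, 3), 0.5)   # one 720p frame of mid-grey
graded = film_look(frame, seed=42)     # at 60fps, this runs every frame
```

The point of the sketch is simply that the "stock" becomes a handful of parameters applied sixty times a second, which is a very different proposition from the photochemical original.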

Impressive

Either way, this is a pretty impressive degree of dedication, and it's probably being done because nice-looking screenshots sell games. On the face of it, video games should (and generally do) look pretty good from a formal production design, lighting and camera standpoint, because the environment is intrinsically very controllable, from the fact that the sun doesn't move unless you want it to, through the perfectly-smooth interpolated camera moves, right down to the fact that every environment and object the player will see must be built from scratch and can therefore be ideally suited to the production designer's intent. Upscale feature films pay huge amounts of money to achieve, in essence, ever greater control over the environments in which they're shot, and this is something that games get by default. With the next generation of games consoles just released at the time of writing (although they're really only moderately powerful compared to current PCs), we can probably look forward to ever more elaborate virtual cinematography in future.

But there is, ultimately, a problem. The better the hardware gets, the more detail is required to keep the world looking reasonable, and the more work it takes to create it. Given that game designers lack the option of simply going somewhere that looks right, where an environment correct right down to the atomic level comes for free, one has to wonder if the limit on photorealism in games is not hardware, but the ability of designers to keep modelling every particle of dust. Laser-scanning real-world environments might be part of the answer, and perhaps a certain amount of procedurally-generated content is another part, but it was certainly suggested, after the release of the incredibly elaborate and good-looking PC game Crysis, that something on that level would rarely if ever be done again, simply because of the work involved in creating the environments.
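For readers unfamiliar with the term, procedurally-generated content just means deriving detail from a deterministic function of a seed rather than from an artist's hand. Below is a minimal, assumption-laden sketch in Python of one classic ingredient, fractal value noise, of the sort often used for terrain heightmaps; it is not Crysis's method or any specific engine's, just the general shape of the idea.

```python
# Toy fractal value noise: sum several smoothly-interpolated random grids
# at decreasing scales. The same seed always yields the same "world".
import numpy as np

def value_noise(width, height, cell=16, octaves=4, seed=0):
    """Return a (height, width) heightmap with values in roughly [0, 1]."""
    rng = np.random.default_rng(seed)
    out = np.zeros((height, width))
    amplitude, total = 1.0, 0.0
    for _ in range(octaves):
        # Random lattice, one value per cell, bilinearly interpolated.
        gh, gw = height // cell + 2, width // cell + 2
        grid = rng.random((gh, gw))
        ys, xs = np.arange(height) / cell, np.arange(width) / cell
        y0, x0 = ys.astype(int), xs.astype(int)
        fy, fx = (ys - y0)[:, None], (xs - x0)[None, :]
        top = grid[y0][:, x0] * (1 - fx) + grid[y0][:, x0 + 1] * fx
        bot = grid[y0 + 1][:, x0] * (1 - fx) + grid[y0 + 1][:, x0 + 1] * fx
        out += amplitude * (top * (1 - fy) + bot * fy)
        total += amplitude
        amplitude *= 0.5          # each octave is finer and fainter
        cell = max(1, cell // 2)
    return out / total            # normalise back to [0, 1]

terrain = value_noise(256, 256, seed=7)  # same seed, same terrain, no artist
```

A few lines of code can conjure endless hillsides, which is exactly the appeal; the catch, as the Crysis example suggests, is that the particular dust the designer wanted still has to be placed by hand.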

 

Tags: Production
