Thursday 12 January 2012

a history of computer games, part three: 2000s

Since the introduction of 3D graphics, the level of detail in games has improved enormously. Between the PS1 and PS2, the technical capabilities leapt forward: polygon counts in games produced for the platform increased roughly fifty-fold, while screen resolution increased by a factor of 2.5, vastly increasing the amount of information that must be displayed at any moment during play. Comparing recent releases to the very first video games, by far the most obvious difference is in the graphics. Graphics are visible even without playing the game, and many casual gamers will purchase games based on this first impression.

As the baseline for graphical quality continues to be pushed higher, the technology it is built on must also improve. Development frameworks such as the Unreal Engine streamline production by providing toolsets for game developers to work from, but even clever programming is ultimately limited by the hardware.

A more recent development is companies such as OnLive, which provide “cloud gaming”: remote servers run the game and stream the output to whatever device the player is using, which sends controller input back.
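To make that round trip concrete, here is a minimal sketch in which the “network” is replaced by in-process queues and threads. All the names are invented for illustration; a real service like OnLive would compress frames to video and stream them over the internet rather than pass structs around in memory.

```cpp
#include <atomic>
#include <chrono>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

template <typename T>
class Channel {                       // stand-in for a network connection
public:
    void send(T value) {
        std::lock_guard<std::mutex> lock(mutex_);
        items_.push(std::move(value));
    }
    bool receive(T& out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (items_.empty()) return false;
        out = std::move(items_.front());
        items_.pop();
        return true;
    }
private:
    std::mutex mutex_;
    std::queue<T> items_;
};

struct Input { int button; };
struct Frame { int id; };             // a real frame would be compressed video

int main() {
    Channel<Input> upstream;          // client -> server: controller input
    Channel<Frame> downstream;        // server -> client: rendered frames
    std::atomic<bool> running{true};

    // Server side: run the simulation remotely, stream frames back.
    std::thread server([&] {
        int frame_id = 0;
        while (running) {
            Input in;
            while (upstream.receive(in)) { /* apply input to game state */ }
            downstream.send(Frame{frame_id++});   // render, encode, send
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    });

    // Client side: send input, display whatever frames arrive.
    for (int tick = 0; tick < 10; ++tick) {
        upstream.send(Input{tick % 4});
        Frame f;
        while (downstream.receive(f))
            std::cout << "displaying frame " << f.id << '\n';
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    running = false;
    server.join();
}
```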

At present, motion-control systems such as the Wii and Microsoft's Kinect are the exception. Their marketing is based on the novelty of their controls rather than piggybacking on established game series, but that may change as motion controls become more common and future games are designed to accommodate them.

Games have always pushed the limits of current technology, but now they have the pressure of huge companies behind them, and an entire industry resting on their financial success.

In order to maximise profits, most games publishers find it necessary to make games available to as large an audience as possible, which means porting new releases to multiple consoles. Making a game work on every machine is not straightforward; console manufacturers have their own in-house studios, or at least specific companies they outsource to, and therefore have little incentive to make their systems compatible with their competitors', leaving game developers to bridge that gap. Unsurprisingly, several companies have risen to fill this niche, providing middleware such as Criterion's RenderWare that makes the task easier for producers.
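The idea behind such middleware is roughly this: the game is written once, against a single platform-neutral interface, and a separate backend implements that interface for each console. The sketch below is a hypothetical illustration of the pattern, not RenderWare's actual API.

```cpp
#include <iostream>
#include <memory>

class Renderer {                       // the platform-neutral interface
public:
    virtual ~Renderer() = default;
    virtual void drawTriangles(int count) = 0;
};

class Ps2Renderer : public Renderer {  // one backend per console
public:
    void drawTriangles(int count) override {
        std::cout << "PS2: pushing " << count << " triangles to the vector units\n";
    }
};

class XboxRenderer : public Renderer {
public:
    void drawTriangles(int count) override {
        std::cout << "Xbox: drawing " << count << " triangles via Direct3D\n";
    }
};

// The game itself is written once, against the interface.
void renderScene(Renderer& r) { r.drawTriangles(5000); }

int main() {
    std::unique_ptr<Renderer> r;
#if defined(PLATFORM_PS2)              // hypothetical build flag per console
    r = std::make_unique<Ps2Renderer>();
#else
    r = std::make_unique<XboxRenderer>();
#endif
    renderScene(*r);
}
```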

Increasing complexity in games means more demand on the hardware, but processor speeds are no longer improving fast enough to keep up with the demands of the market. At the moment, the only foreseeable answer to the increased demand for processing power is using multiple CPU cores, but this comes with its own technical problems. Games must be designed to distribute their workload across those cores, which means a different approach to how a game is built, and requires skills that are still in short supply.
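As a rough illustration of what distributing the workload means, the sketch below splits one frame's entity updates across however many cores the machine has. It assumes the entities are fully independent; real game systems rarely are, which is exactly why the required skills are scarce.

```cpp
#include <algorithm>
#include <iostream>
#include <thread>
#include <vector>

struct Entity { float x = 0, vx = 1; };

// Update one contiguous slice of the entity list.
void updateRange(std::vector<Entity>& es, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i)
        es[i].x += es[i].vx * dt;      // physics step for one entity
}

int main() {
    std::vector<Entity> entities(100000);
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    // Divide the entity list into one slice per core, update in parallel.
    std::vector<std::thread> workers;
    size_t chunk = entities.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        size_t begin = c * chunk;
        size_t end = (c == cores - 1) ? entities.size() : begin + chunk;
        workers.emplace_back(updateRange, std::ref(entities), begin, end, 0.016f);
    }
    for (auto& w : workers) w.join();

    std::cout << "updated " << entities.size() << " entities on "
              << cores << " cores\n";
}
```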

Lastly, development times are now so long that by the time a project is finished, the technology it was based on at the start will no longer be the most up to date. With any new game, developers must anticipate and accommodate the changes that will take place over the timescale of the project.

As the technology and skill needed to produce a game increase, so do the time and the cost. In 1980, Pac-Man cost a small team the grand total of $100,000 to produce. Halo 2, released in 2004, cost 400 times that: around $40 million. True, the profits these days are greater, but not on the same scale. It takes an enormous amount of time and money to produce a game in line with popular titles; the start-up costs are so high that it is generally impossible for individuals to produce games the way they once did. Expectations of games are extremely high, but an increase in expense does not guarantee a perceptible increase in quality, at least to the general public.

With this monetary pressure hanging over their heads, companies often go for the safe option: sequels and games based on successful films are often produced in preference to new ideas. However, with graphical realism struggling to live up to expectations, innovative gameplay and storytelling may become the deciding factors in future games' success.

It takes a lot of people to produce a game that lives up to today's expectations; the sheer amount and diversity of content call for many different skills: artists, engineers, writers and programmers, and people to make sure they all cooperate.

The final consideration for game producers is making their product available to the global market, and making sure it isn't outcompeted by the many other games all fighting for the same customers. One way of ensuring a game isn't overshadowed is to stagger its release against those of similar titles. Producers also have to consider that what works in one country might not work in another, whether because of language differences or the expectations of consumers. For instance, when the Xbox was released in Japan, the controller was deemed too bulky, and a more streamlined version was produced for the Japanese market. These are not obvious problems, but as the games industry continues to develop, the need to think globally will only become more important.
