Monday, 21 November 2011

User interface design in video games


By Anthony Stonehouse



User interface design in games differs from other UI design because it involves an additional element — fiction. The fiction involves an avatar of the actual user, or in this case, player. The player becomes an invisible, but key element to the story, much like a narrator in a novel or film.
This fiction can be linked to the UI directly, partly, or not at all. There is a growing debate about which approach is more immersive for the player. One side argues that UI elements that sit within the game world, and are therefore ‘viewable’ by both the player and the avatar, are more immersive and provide a more seamless experience. However, these elements are often clunky and carry many restrictions. They need to adhere strictly to the fictional narrative and adopt the visual design of the game’s art direction. They also sit within the geometry of the game’s perspective, often rendered on a 3D plane, which can reduce their legibility.
Erik Fagerholt and Magnus Lorentzon explore these theories in their thesis for Chalmers University of Technology titled: Beyond the HUD — User Interfaces for Increased Player Immersion in FPS Games. They introduce the term diegetic interfaces for UI elements that exist within the game world that the player and avatar can interact with through visual, audible or haptic means.
I believe well executed diegetic UI elements do benefit the player by enhancing their experience of the narrative. However, they are more difficult to implement because of the inherent restrictions.
Metro 2033 uses a completely diegetic UI with no HUD elements to help support the game’s narrative. It does seem to run the risk of frustrating the player through slow response times, but perhaps this forms part of the game mechanic; I have yet to play the game.
Metro 2033's diegetic interface
A well executed example that I am familiar with, and one that is often used, is the act of interacting with the phone in Grand Theft Auto 4. It mimics the real world interaction — you hear the ringing and there is a delay before you answer it, the act of answering itself is often an awkward process in reality. The game’s fiction adopts real world fiction so perhaps it makes diegetic elements easier to integrate.
Grand Theft Auto IV mobile phone interface element
UI elements can also exist in the game world without being viewable by the player’s avatar, thereby avoiding the clunkiness of having to interact through the avatar (while also avoiding jumping out to menus) and the need to strictly conform to the game world’s art direction. They still need to follow the rules of the game’s fiction, so they help immerse the player and provide a seamless experience.
These elements can either exist within the geometry of the world, or on the camera/screen for the player. Fagerholt and Lorentzon use the term Spatial to define elements of the UI that sit in the geometry and Meta to define elements that exist on the camera.
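In engine terms the Spatial/Meta distinction is concrete: a Spatial element is positioned in world coordinates and projected through the game camera every frame, while a Meta element is drawn at fixed screen coordinates and ignores the camera entirely. A minimal sketch of that difference, using a simple pinhole camera model (the function and names here are illustrative, not taken from any particular engine):

```python
import math

def project_to_screen(point, cam_pos, fov_deg, screen_w, screen_h):
    """Project a world-space point through a simple pinhole camera looking
    down +z; returns pixel coordinates, or None if the point is behind it."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z <= 0:
        return None  # behind the camera: a Spatial element would be culled
    f = (screen_h / 2) / math.tan(math.radians(fov_deg) / 2)
    return (screen_w / 2 + f * x / z, screen_h / 2 - f * y / z)

# A Spatial element (e.g. a guide-trail waypoint) moves on screen as the
# camera moves, because it is re-projected each frame:
waypoint = (2.0, 0.0, 10.0)
print(project_to_screen(waypoint, (0, 0, 0), 90, 1280, 720))

# A Meta element (e.g. a blood splatter) is just fixed screen coordinates:
splatter_pos = (1280 // 2, 720 // 2)
```

This is also why Spatial and diegetic elements can suffer legibility problems that Meta elements never do: their on-screen size and position depend on the camera.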
Fable 2’s dotted line that guides you to locations marked on your map is an example of a Spatial element; it exists in the game world but the avatar doesn’t interact with it. It ensures a more seamless experience by avoiding jumping out to a map screen as often.
Fable 2's glowing trail
Splinter Cell: Conviction also adopts some nice Spatial elements in the form of projections that illustrate objectives within the game world. Their scale does seem to challenge the fiction slightly more than other examples.
Splinter Cell Conviction's objectives projected in the game world
A common example of a Meta UI element is the blood that splatters on the screen as a form of health bar, as in the recently launched Modern Warfare 2.

Lastly there are the traditional non-diegetic UI elements. They have the freedom to be removed completely from the game’s fiction and geometry and can adopt their own visual treatment, though often influenced by the game’s art direction.
I think these elements are best used when the diegetic, meta and spatial forms impose restrictions that break the seamlessness, consistency or legibility of the UI element. World of Warcraft has a purely non-diegetic UI that adds nothing to the fiction. It does allow the user to customise it completely, though, hopefully ensuring a familiar experience.

Here are some visual representations of these concepts that have been adapted from Fagerholt and Lorentzon’s thesis.

For further reading: Marcus Andrews from DICE examines recent games using these theories from Fagerholt and Lorentzon in this article posted on Gamasutra. Michael Grattan, a senior at the University of Southern California, writes a response to the article on his blog. Dave Russell has also written an article on his blog.
New physical interaction methods make their way to the PS3 and 360 this year, called Move and Natal respectively. These technologies promise to challenge current best practice in game UI design and generate even more innovation in the area. As a designer currently working in web design I’m almost jealous of the levels of interaction, plus the audio and haptic elements, available in game UI, while I’m still confined to users with an outdated keyboard and mouse. Web does offer other challenges and more variety in content delivery though.


Saturday, 19 November 2011

Video game user interface design: Diegesis theory


By Dave Russell on 2 February



Interface design is often one of the most challenging aspects of game development. There is a lot of information to convey to the player and little screen space with which to do it. When the interface is poorly designed, a good game concept can be reduced to a frustrating user experience.
There are several theories that can be used by designers to analyse a user interface and help them break down choices. The theory we will look at here is called diegesis theory. It is adapted from diegesis theory used in literature, film and theatre. Diegesis refers to the world in which the story is set, and hence it focuses on games as stories.

There are two concepts core to this theory: narrative and the fourth wall.

Narrative

Narrative is a message that conveys the particulars of an act or occurrence or course of events. In simple terms, it is the story the designer wishes to convey; be it the story of blocks falling from the sky which need to land in the right place (Tetris), or a journey through a strange land (Machinarium).
Not all elements of a game are part of the narration: the game menus and the HUD, for example, because the game’s characters are not aware of these elements. This does not mean these components do not support the narrative. For example, a futuristic game typically has GUI elements that also appear futuristic.

The fourth wall

The fourth wall is the imaginary divide between the player and the world of the game. In order to immerse themselves in the game world, the player needs to move through the fourth wall. The ease with which the player moves between the real world and the game world depends on the way the interface designer delivers information to the player.
Posting your latest game accomplishments on Facebook is an example of how a game extends beyond the fourth wall. To further delve into this concept, one should read Steven Conway’s interesting discussion of the fourth wall in games: A Circular Wall? Reformulating the Fourth Wall for Video Games.

Interface components

We can now ask ourselves two questions about any interface component:
  • Is the component part of the game story? (Is it part of the narrative?)
  • Is the component part of the game space? (Is it behind the fourth wall?)
Depending on the answers, we can classify the component into one of four classes: diegetic; non-diegetic; spatial; or meta.
The diagram below shows how the questions relate to the classes.
Diagram adapted from Gamasutra.
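The two questions form a simple decision table, and the four classes can be expressed directly as a function (a sketch using Fagerholt and Lorentzon's terminology, with examples drawn from later in this article):

```python
def classify_ui_component(in_story: bool, in_game_space: bool) -> str:
    """Classify a UI component using the two diegesis-theory questions."""
    if in_story and in_game_space:
        return "diegetic"      # e.g. Dead Space's suit-mounted health meter
    if in_story and not in_game_space:
        return "meta"          # e.g. blood splatter rendered on the screen
    if not in_story and in_game_space:
        return "spatial"       # e.g. selection brackets around RTS units
    return "non-diegetic"      # e.g. a traditional HUD ammo counter

print(classify_ui_component(in_story=True, in_game_space=True))   # diegetic
print(classify_ui_component(in_story=False, in_game_space=False)) # non-diegetic
```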

Diegetic Components

For diegetic components, we answer our two questions as follows:
  • Is the interface component in the game story? YES
  • Is the component in the game space? YES
Diegetic components provide the player with cues and information without distracting them from the narration of the world. These cues are something that the player’s avatar and other characters in the game world are aware of, and can interact with. This makes the experience more immersive and cinematic.
In Far Cry 2 an attempt is made to make the experience as diegetic as possible – there is no HUD. The use of numerous in-game gadgets and items allows the player to get information without referring to elements outside of, or superimposed over the reality of the game world.
While this is great for the immersion of the game, if it is not done correctly, it can have the opposite effect. For example, in the adventure game Grim Fandango the player is forced to search through their inventory one item at a time. This frustrating process breaks the player’s suspension of disbelief, and they pop back into reality.
Depending on the type of game you are designing, a completely immersive and realistic world may not be what you are looking to achieve, and this may in fact break the narrative of your story.
In Grim Fandango, the character’s head turns towards objects to indicate that they are interactive. Although it is perhaps a more realistic way to deliver information, the movement is awkward and unnatural. It is distracting and not as helpful as the glowing objects and mouse cursor changes traditional to adventure games.

Examples of diegetic interface

Designing diegetic interface components to replace common HUD elements requires a clever approach. Some examples follow:
In real time strategy games, diegetic components are elements such as the visual damage to units and buildings.
In Far Cry 2 the player can use a compass to help them navigate through the game world – far more immersive compared to the non-diegetic compasses that appear in HUDs of many other games.
In Dead Space, instead of providing a typical health bar overlay, the player’s health is indicated by the high-tech meter on the avatar’s suit.

Non-Diegetic Components

For non-diegetic components, we answer our two questions as follows:
  • Is the interface component in the game story? NO
  • Is the component in the game space? NO
We have all become very comfortable with the use of a heads-up display (HUD) in games. This system provides us with key information in a fairly simple manner. If done correctly, the player doesn’t even know it is there.
Some games, such as Gears of War, have a minimalist approach which limits the number of HUD items, while others, such as World of Warcraft, provide extensive HUD information.
An example of a HUD being used poorly is the weapon-selection widget in Gears of War, which appears when the player selects a new weapon. This widget breaks the flow of the game, distracting the player from the world in which they have spent the last few minutes immersing themselves.
There are less intrusive user interface mechanisms one could use for a simple action such as selecting weapons. If the player is able to see the actual weapon in the game world there is little or no need to show a non-diegetic cue for swapping weapons.
World of Warcraft has a complicated but flexible and customisable user interface that allows players to optimise it to their personal play experience requirements. They can choose how much clutter fills the screen depending on their needs.
Although the game is often criticized for its complicated interface (by pointing out that the game cannot be played competitively without customizing it), it must be remembered that World of Warcraft is a complicated game that allows for many different types of play. The complexity of the interface is a result of the complexity of the game.
World of Warcraft has a rich user interface to support the vast amount of information given to the player. The interface is complex, because the game is complex.
It is not always clear whether a component is non-diegetic. Is the speedometer in the HUD of a racing game really a non-diegetic component? The speedometer is just a conveniently placed clone of the actual diegetic speedometer which is presumably inside the car.
The interface on the left is diegetic; the interface on the right is non-diegetic.

Spatial Components

For spatial components, we answer our two questions as follows:
  • Is the interface component in the game story? NO
  • Is the component in the game space? YES
These are components that are visualised within the game space but are not part of the game’s story. The game’s characters are also unaware of these spatial components. For example, the auras and selection brackets around units in real time strategy games: they provide extra information on a component in the world, although that information is not part of the narrative. The information is provided at the location on which the player is focused, reducing clutter in the HUD.
A good example of this are the auras in Warcraft 3. These indicate the gameplay effect that is currently in place and the range within which units will be affected. Another example is the icons that appear above the heads of characters in The Sims.
The selection brackets in Warcraft 3 immediately make it clear which units the player has control of. The brackets’ location in space makes selecting the appropriate units much easier. Think of how difficult it would be to select them from a list in the HUD: it would be very hard to see which units are closest to the action taking place in the game world.

Meta Components

For meta components, we answer our two questions as follows:
  • Is the interface component in the game story? YES
  • Is the component in the game space? NO
Meta representations are components that are expressed as part of the narrative, but not as part of the game space. These can be effects that are rendered onto the screen, such as cracked glass and blood splatters; effects that interact with the fourth wall are the most common examples.
These components aim to draw the user into the reality of the game by applying cues to the screen as if the game were directly interacting with the player. An example of this is the blood splatter on the screen used in Killzone 2. Note that this interface component also affects gameplay by reducing visibility.

Diegesis Theory in 2D Games

So far, we have only covered 3D games, but the concepts we have discussed apply equally to 2D games.
Think of a 2D game as a flattened representation of a 3D world. Take Pac-Man as an example. There is still the concept of a world, or narrative component, and components that are outside the narrative. The pills, walls and ghosts are all diegetic components of Pac-Man’s world. The scores and details around the game world are non-diegetic.
An example of a purely diegetic 2D game is Limbo. All user cues are presented within the world that the character perceives. The game leverages this pure in-world experience to help the user become immersed in the game. From the background to the foreground, the entire world is part of your experience. There is nothing with which players interact that is not part of the narrative.
Spatial representations in 2D games are the auras or indicators, such as arrows, that are present in the game world, although the characters are unaware of them, and objects are unaffected by them. Examples of this are the paths and warning icons in Flight Control.
Although it is easy to imagine screen cracks and blood spatters against the screen in a 2D game, this is rarely done, and no examples of meta interface components come to mind. (If you think of anything, let us know in the comments!)


Thursday, 10 November 2011

Diegetic Interface

Link: http://tvtropes.org/pmwiki/pmwiki.php/Main/DiegeticInterface




Game interface elements that are a part of the game universe. "Diegetic" is a term meaning "within the narrative", usually used in reference to music. Diegetic music is heard by the characters because it's actually being played in the scene (as opposed to non-diegetic background music). Diegetic menus are the same — they actually exist in the game world, rather than simply appearing for the benefit of the player.
This is generally handled in one of two ways. Sometimes the normal player HUD is explained as being part of the character's equipment — common if he's robotic, a cyborg, or wearing Powered Armor, and justified if the game is in first-person perspective. Other times, the game simply uses in-game indications of things that a HUD would normally tell you; a wounded character will limp instead of having a Life Meter, for example.
Vehicle simulators call this a virtual cockpit, and it tends to be the most detailed and realistic interface mode short of a hardware sim with actual panels. In these cases, a 2-D control panel laid out for easier reading without scrolling is usually included as an easier-to-program option.
Generally, this is done in order to increase immersion; it's much easier to believe that your character is a real person in a real situation when the screen isn't cluttered with inexplicable icons representing health and ammo. In some cases, this means that actions done via menus are actually happening in real time — browsing your inventory may leave the player open to attack, so you can't pause the action midbattle to grab a handy medkit.
A diegetic interface often averts Menu Time Lockout. Robo Cam is when one is applied to a character's view outside of video games. Justified Save Point is related. May justify Interface Screw as well. See also Painting the Fourth Wall, where this is temporarily invoked for the sake of Post Modernism, and usually Played for Laughs.
Examples:
Diegetic HUDs
  • Halo, a First-Person Shooter in which you play as a Space Marine with Powered Armor. Your HUD is projected on the inside of your helmet's visor, and some weapons have readouts as well: small LCDs for human weapons, and holograms for the aliens. During Noble Six's Last Stand in Halo: Reach, the visor, and thus the interface, shows damage.
  • Half-Life and its sequels and add-ons, also with Powered Armor. Strangely, Gordon is never depicted wearing a helmet.
  • Crysis, which overlaps with Interface Screw when certain enemies and weapons (like EMPs) cause your HUD to go fuzzy or fail entirely.
    • Crysis 2 also adds a gorgeous new bobbing effect for the HUD when you move, and makes it look more realistic (like a fighter plane HUD). All of this is a wee bit strange when you consider that the Crysis: Legion book calls it a "BUD" (Brain-Up Display), like a neural interface.
  • Metroid Prime, which also has some Interface Screw elements similar to Crysis.
  • Azrael's Tear, again with Powered Armor.
  • System Shock, as part of the cyber-interface implanted in the beginning of the game. The player can even improve the interface by finding hardware, such as targeting aids, health monitors, infrared, a widened field of vision, and a multimedia data reader (a CD drive?).
  • Strange Journey has all the interface elements as part of the PC's Demonica Suit.
    • Likewise, the contents of the upper screen of the DS is identical to the display of the characters' Demon Summoning Program in Devil Survivor 2: this is where they find out the names of the Cosmic Horrors they're being attacked by and they directly refer to the skills shown there later in the game.
  • In Deus Ex Human Revolution your HUD doesn't even exist until Adam Jensen gets cybernetic implants (including his eyes) after a brutal beatdown.
  • In Bulletstorm, the player character has no HUD until he puts on the Leash, which then injects him with nanomachines.
  • The Journeyman Project.
  • Star Wars Republic Commando, once again with Powered Armor. As with Crysis, there are certain areas and weapons that cause your HUD to go crazy.
  • Battlefield 2142 has the HUD projected onto everyone's Net Bat Helmet visor from within. Like the above example, the HUD disappears and the entire screen appears washed-out with flickering static if an EMP weapon goes off nearby. Interestingly, this disruption also disables the networked battlefield system that displays the positions of hostiles spotted by one soldier to everyone on the team.
  • FEAR 2's HUD is projected on the character's glasses and goes missing when they are briefly removed.
  • The Project Eden interface appears to be projected on some kind of contact lens as it is seen booting up when the character does something with their eye.
  • In Borderlands, the cute Claptrap robot gives you the device displaying your HUD before you can even move.
  • Starsiege: Tribes and its sequels outfit everyone with Powered Armor as a rule, but the HUD doesn't look particularly diegetic until Tribes Ascend.
  • I Miss the Sunrise is a Turn Based Strategy example. Yes, really. The main character is a commander of a fleet who has a unique protein in their body that, when combined with a chemical, greatly augments their mental capabilities, allowing them to take as long as they want to formulate an order without taking any time from an outside perspective. The images displayed on the screen are what the character literally sees from their cockpit (probably not the menus, though).
  • In Perfect Dark, Joanna is equipped with a headset that deploys a small screen over her field of vision which acts as the game's menu, similar to James Bond's wristwatch computer from GoldenEye.

Virtual Cockpits
  • Many Racing Games replace the HUD with the car's dashboard when using a first-person viewpoint.
  • Gran Turismo 5 has realistic simulated interiors for its cars, a first for the series.
  • The HUD in Ace Combat, especially in Cockpit View. The same applies for Tom Clancy's H.A.W.X..
  • Descent and its sequels provide an optional cockpit view to increase the sense that the player is the Material Defender in the Pyro-GX and successor ships.
  • The MechWarrior series. Given its superimposed nature, though, the HUD is presumably a function of the neurohelmet that every MechWarrior wears to keep the 'Mech balanced.
  • Steel Battalion, in spades. Every VT generation has its own cockpit, and Line of Contact adds even more cockpits with support/indirect-fire and Jaralaccs VTs now having their own. The VT Operations Manual goes into extensive detail on what all of the cockpit lights and gauges mean, and not a single one is there for mere decoration.
  • Orbiter has one for the most popular built-in spacecraft, and the space shuttle. Unfortunately (or maybe not) most of the switches on the shuttle panel are dummies, and most of the special functions aren't usable from the mouse interface. A lot of the better realized add-ons have them, but most skip the extra work and only include a 2D panel.
  • Microsoft Flight Simulator has had active virtual cockpits since the 2004 edition, and a basic implementation back in '95.
  • Too many combat flight simulators to count. Fully realistic settings in most of them will even enforce cockpit view, with no non-diegetic gauges to rely on. In extreme cases such as Falcon 4.0, DCS: Ka-50 Black Shark and DCS: A-10C Warthog, the cockpits are fully clickable to the point where the player can even go through a cold start procedure using the virtual cockpit!

Other Examples
  • Jurassic Park Trespasser was probably the first first-person game to have absolutely no HUD. Instead, the player character would verbally call out the amount of ammo left in a gun ("five shots left," "feels half full," etc.) and a tattoo on her chest (which could be viewed by looking down) indicated the amount of health the character had.
  • In Dead Space, everything is diegetic. Health levels and power-up charge are given via displays on Isaac's suit, menus are Holographic Terminals projected by either his suit or the machinery he's interacting with, and even the "go here next" hints are glowing lines on the floor generated by a projector in Isaac's glove, presumably in conjunction with the ship's computer (which would, naturally, know how to get you where you want to go).
  • Metro 2033 handles almost everything diegetically. Damage causes your vision to narrow and red out, while low air causes blurry vision and labored breathing. One button lets you look at your watch (which shows the time until your air runs out and the ambient light level, for sneaking) while another brings up your notes (listing your next objective, with a compass pointing the way). The only non-diegetic part of the interface is your weapon selection and ammo count, though for some weapons even the ammo count is visible on the weapon. Even better, the hardest difficulties turn off the non-diegetic parts. Better count those bullets!
  • The Director in Crackdown refers to columns of light and other things visible in the game as being part of a graphical interface attached to the player character's eyes.
  • Fallout 3 and Fallout: New Vegas have diegetic menus in the form of a wrist-mounted computer called the Pip-boy 3000. It somehow lets you do things like use items, manage your equipment and other inventory, and physically examine yourself for wounds. It also runs on an equivalent of MS-DOS. Note, however, that the HUD itself is mostly non-diegetic, as the ammo counter, HP and AP meters, and compass pips aren't justified in-game.
  • The Starcraft 2 research and armory screens are set up something like this. The entire ship may also count as well.
  • In the Sly Cooper game series, the sparkles that mark areas Sly and the others can use (climbing, crawling under, etc.) are noted as being visible in-universe, to Sly at least, and represent "thieving opportunities". In the 2nd and 3rd games, the starting locations of missions and the locations of objectives are made visible with holographic markers that are also explicitly said to be visible to the characters.
  • Pretty much everything in the Assassin's Creed series. The HUD, highlighted targets, and even things like the pause menu are explained as being part of the Game Within A Game that is the Animus. Indeed, during the segments outside the Animus, the game goes out of its way to avoid having any sort of HUD at all, except when essential.
  • Escape from Monkey Island displays your items in a circle orbiting around you when you open your inventory. Other characters can apparently see this and make comments like "you better clear up that clutter when you're done".
  • The Getaway doesn't have any kind of HUD, to make the game more cinematic and immersive. Rather than a health bar, your character develops bloodstains and a limp the more they get hurt. Rather than floating health kits, leaning against a wall recovers your health (and removes bloodstains). And rather than a minimap or GPS arrow pointing you to your destination, your car's indicators will blink when you should turn, and both will flash when you reach your destination. The game did come with an actual map of London to help you find your way around, though.
  • James Bond in GoldenEye 007 (1997) could switch weapons using the readout on his laser watch. The bad guys would kindly stop shooting and wait for him to finish what he was up to before resuming the firefight.
  • Splinter Cell mixes it with Product Placement in the form of a Palm OPSAT or a Sony Ericsson phone as pause menus.
  • In Hammerfight, the tutorial mentions that flying machines are controlled with a mouse.
  • In Nier, Grimoire Weiss functions as your menu/journal/inventory. When he's not in your party (such as before you meet him), your menu/journal/inventory is extremely limited.
  • Digital: A Love Story is a Visual Novel that is presented entirely in a GUI reminiscent of Amiga Workbench.
  • Quite a few DS games use this with whatever's on the bottom screen.
  • In Minecraft, you have to craft your maps, and they only update if you're holding them.
  • The interface menus in Silent Hill: Shattered Memories take the form of a somewhat shoehorned mobile phone. While this results in funny things like a messaging menu that can only receive messages, it does add to the immersion, particularly when you are being chased by monsters and need to look at the map.
  • The video-game adaptation of Peter Jackson's King Kong remake deserves some mention because its Diegetic Interface is a total lack of HUD. Despite taking the exact opposite tactic, it still increases immersion by forcing you to pay attention to how many shots you've fired, your character's movespeed and labored breathing, the ambient noise of the game environment, and so on.
  • Silent Hill: Downpour is a minor example; there's no HUD, but there are context commands and an inventory screen. The easiest way to tell how much damage Murphy has taken is his appearance: whether he can still run or just barely drag himself around, and the extent of the bruising visible on his body, are the telltale signs of damage.

Wednesday, 2 November 2011

Designing an MMO UI


By Prasad Silva


Scaleform principal engineer Prasad Silva offers a guide on designing MMO UIs
The user interface can sometimes make or break a game. With MMOs, this is especially true for a number of reasons.
First, there is a significant amount of information that must be presented to the user, making the UI graphically intensive. A large number of interface elements and a lot of animation at once may slow down the overall game experience. The many different components of the MMO UI can also easily gobble up screen real estate, if not carefully managed.
Second, gamers spend hundreds of hours playing MMOs, so usability and personalisation are extremely important. Third, while most UIs remain constant after a game is released, UIs for MMOs can continue to be developed by both the developers and mod communities.
These factors create specific challenges for MMO UIs. Designers and artists need to determine how to present the game data with the best visual fidelity and user interactivity, while trying to avoid affecting overall game performance. Customisable and moddable UIs require frameworks and tools for creating new and editing existing content. To achieve this, such UIs require complete decoupling from the game client using an efficient bi-directional communication layer.
Because of these unique challenges, there are certain things that should be considered when developing a UI for an MMO.


Enabling Rapid Iteration

The design of an MMO UI must take into account look and feel, usability, and performance. For look and feel, and usability, a great WYSIWYG tool for layout and animation can provide significant value. It is quite difficult for artists to iterate and optimise UI design without visual tools. The ability to add and test UI behaviors independently from the client increases overall productivity of both the UI and game programming teams. Also, having a robust framework for reusable components and widgets provides a great foundation for developing and testing user interaction before integrating the UI into the client.
Autodesk Scaleform provides a tight iteration loop with a full suite of UI development tools and components: artists can design in Flash Studio, quickly launch assets in the Scaleform Player to see the final result, and use the Scaleform AMP profiler to keep a tight grip on performance and memory. Flash ActionScript provides a flexible scripting language for rapid prototyping and iteration, and can be used to develop extensive component frameworks such as Scaleform CLIK (Common Lightweight Interface Kit). CLIK provides over 15 different customisable controls that are easy to ‘skin’ and simple to extend to introduce new behavior.

Decoupling the UI and the Client

Separating the UI from the backend code is a good practice which supports parallel development of the UI and game client, and is also necessary for providing comprehensive modding support. Scripting languages allow the modding community to create custom UI behavior and layout. The UI communicates with the client through public scripting APIs and data binding mechanisms, a process for efficiently and automatically updating UI elements to reflect the current state of the client.
Scaleform offers good separation between the client state and the UI because it is a standalone runtime which is integrated on top of the underlying game engine. This runtime is used throughout the life cycle of the game, from UI design to development to customising and modding. Several mechanisms exist in the Flash standard for enabling communication between the backend and the UI. In addition, Scaleform provides the Direct Access API which allows direct control over UI objects in C++ and allows developers to create high performance public APIs.
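The data-binding mechanism described above, where UI elements update automatically from client state, is essentially the observer pattern. A minimal sketch with invented names follows; this is not Scaleform's actual Direct Access API, just an illustration of the decoupling:

```python
class BoundValue:
    """A piece of client state that notifies subscribed UI widgets on change."""
    def __init__(self, value):
        self._value = value
        self._listeners = []

    def bind(self, listener):
        self._listeners.append(listener)
        listener(self._value)  # push the current state to the widget immediately

    def set(self, value):
        if value != self._value:
            self._value = value
            for listener in self._listeners:
                listener(value)

# The game client owns the state; the UI only subscribes to it and never
# reaches into client internals.
player_health = BoundValue(100)
hud_log = []
player_health.bind(lambda hp: hud_log.append(f"HP bar: {hp}"))

player_health.set(80)   # a client-side change propagates to the UI
print(hud_log)          # ['HP bar: 100', 'HP bar: 80']
```

Because the UI only sees `BoundValue` objects, the same interface can be driven by the live client, a test harness, or a modder's script, which is the point of the decoupling.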

Optimising Performance

A graphics renderer with an efficient resource batching system is key to reducing the number of draw primitives sent to the GPU. Images can be automatically packed into a texture atlas as a pre-process or at runtime, to maximize batching. This approach is a definite plus, as it allows an optimised data format to be generated for the rendering engine without actively disrupting the artists’ pipeline. To reduce overdraw, care must be taken when using transparency and blending effects in order to minimize the depth complexity at each pixel.
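The atlas idea itself is simple to sketch: pack many small images into one large texture so that sprites sharing it can be submitted in a single batch. Here is a toy 'shelf' packer; production packers (including Scaleform's) are considerably more sophisticated:

```python
def shelf_pack(images, atlas_w):
    """Pack (width, height) rectangles into rows ('shelves') of a fixed-width
    atlas. Returns {index: (x, y)} placements and the atlas height used."""
    placements = {}
    x = y = shelf_h = 0
    for i, (w, h) in enumerate(images):
        if x + w > atlas_w:          # row is full: start a new shelf below
            y += shelf_h
            x = shelf_h = 0
        placements[i] = (x, y)
        x += w
        shelf_h = max(shelf_h, h)    # shelf is as tall as its tallest image
    return placements, y + shelf_h

# Four sprite images packed into a 256-pixel-wide atlas:
sprites = [(128, 64), (64, 64), (128, 32), (100, 100)]
places, height = shelf_pack(sprites, 256)
print(places)   # {0: (0, 0), 1: (128, 0), 2: (0, 64), 3: (128, 64)}
print(height)   # 164
```

All four sprites now live in one texture, so a renderer can draw them without switching textures between draw calls, which is exactly the batching win described above.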
The Scaleform 4.0 renderer provides a novel batching solution and the Scaleform AMP profiling tool makes it easier for artists and programmers to optimize content, and identify inefficiencies and hot spots. AMP monitors CPU usage, graphics rendering, ActionScript code execution and memory allocation in real-time. Using its frame-based history graphs, developers can quickly spot problem areas, then drill down to determine their exact cause.

Proof of Concept

The Autodesk Scaleform team works closely with our customers to overcome the challenges they face in UI development, including those outlined above. As a proof of concept we created a sample MMO kit, now available with Autodesk Scaleform 4.0. Developers can use the kit as a game-ready solution with minimal customisation, or simply as a best-practices guide when designing an MMO UI from the ground up.