Game Developers Conference 2022

Amebous Labs
13 min read · Apr 19, 2022

--

At the end of March, several members of our game studio had the pleasure of attending the Game Developers Conference (GDC) in San Francisco, California. The annual event unites the gaming industry for a week of educational discussions, inspiring games, hardware demos, and networking opportunities, and Annie, Amy, Chan, and Pierce each had a great time experiencing everything the event had to offer. Looking back at GDC 2022, here are our favorite moments from the first four days of the event.

Day One

Filled with insightful summit discussions and presentations tackling a variety of game-related topics, day one of GDC ’22 offered something for everyone.

Annie’s Picks:

Future Realities Summit: How NASA Has Translated Aerospace Research into Biofeedback Game Experiences

John E. Muñoz (Game Designer & Postdoctoral Fellow, Personal) and Alan Pope (Distinguished Research Associate, NASA) explore gaming’s adoption of NASA’s biofeedback research.

It was fascinating to see how NASA’s advanced biofeedback device research has been integrated into games. As the technology advanced, biofeedback came to be used for psychophysiological modeling in virtual reality. Tracking a trainee’s heart rate and frontal alpha activity has enabled more effective conflict de-escalation training through biocybernetic adaptation. Applications utilizing this technology have also helped researchers further understand the brain functioning of individuals with autism and ADHD.
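
To make the idea of biocybernetic adaptation a bit more concrete, here is a toy sketch of a training scenario that eases off or pushes harder based on how far the trainee’s heart rate drifts from baseline. This is our own illustration, not NASA’s or the speakers’ code, and every name and threshold is an assumption:

```python
# Toy biocybernetic adaptation loop: scenario intensity reacts to heart rate.
# Purely illustrative; thresholds and names are assumptions, not values
# discussed in the session.
def adapt_scenario(heart_rate_bpm: float, baseline_bpm: float, intensity: float) -> float:
    """Return the next scenario intensity on a 0..1 scale."""
    arousal = (heart_rate_bpm - baseline_bpm) / baseline_bpm
    if arousal > 0.25:      # trainee is escalating: ease the scenario off
        intensity -= 0.1
    elif arousal < 0.05:    # trainee is calm and in control: push a little harder
        intensity += 0.1
    return max(0.0, min(1.0, intensity))

print(adapt_scenario(heart_rate_bpm=95, baseline_bpm=70, intensity=0.6))  # -> 0.5
```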

Animation Summit: Animation and Customization in ‘The Sims 4’

Yusun Chung (Lead Animator, Electronic Arts) analyzes aesthetic considerations and implementation workflows necessary to create animations for customizable characters, clothing, and interactable objects.

Pierce and I attended a great session covering the animation and customization in ‘The Sims 4’. This was of special interest due to our future goal of implementing customization in Loam Sandbox. It was helpful to learn about the importance of object variety when an asset type must feature animations or interactivity. By setting parameters for artists, the team keeps an asset’s core interactivity the same across variants while the outer shell can be as creative as the artist wants.
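
As a rough sketch of how we pictured that separation working (purely illustrative, not EA’s pipeline; every name below is made up), the shared interaction data can live in one spec while each variant only swaps the mesh and material:

```python
from dataclasses import dataclass

# Illustration of "same core interactivity, different outer shell".
# All names and values are hypothetical.
@dataclass(frozen=True)
class InteractionSpec:
    sit_point: tuple      # where a character snaps to when sitting
    sit_animation: str    # the shared animation clip
    footprint: tuple      # grid space the object occupies

@dataclass
class ObjectVariant:
    mesh: str             # the artist-authored shell, free to vary
    material: str
    spec: InteractionSpec # identical for every variant

ARMCHAIR_SPEC = InteractionSpec(sit_point=(0.0, 0.45, 0.1),
                                sit_animation="sit_relaxed",
                                footprint=(1, 1))

variants = [
    ObjectVariant("armchair_modern.fbx", "leather_black", ARMCHAIR_SPEC),
    ObjectVariant("armchair_victorian.fbx", "velvet_red", ARMCHAIR_SPEC),
]
```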

Chan’s Picks:

Future Realities Summit: Up Close & Virtual — The Power of Live VR Actors

Alex Coulombe (Creative Director, Agile Lens: Immersive Design) details the techniques, tools, and technology he uses to create compelling VR shows.

I attended a panel called Up Close & Virtual presented by the CEO of Heavenue, Alex Coulombe. He discussed his company’s experience creating a live VR theater performance. In his performance of A Christmas Carol, various techniques and technologies combine to create a one-man show for audiences to enjoy in VR. I was absolutely impressed by the idea, and I wouldn’t mind exploring more of it in the future. We could be seeing the creation of a new type of theater that is unique to VR!

Honestly, talking to fellow developers throughout the day was one of the real treats of GDC. I met several VR developers, and it was interesting to hear their opinions and thoughts on game development and on existing experiences for VR platforms such as PSVR, Oculus/Meta, and Vive.

Pierce’s Picks:

Future Realities Summit: Hands Best: How AR & VR Devs Can Make the Most of Hand Tracking

Brian Schwab (Director of Interaction & Creative Play Lab, LEGO Group) and John Blau (Game Designer, Schell Games) share their philosophies for designing hand tracking in VR.

VR game development forces you to reframe the way you think about your hands and the many ways you use them to interact. It was interesting to hear John and Brian refer to hands as systems through which we attain information about the things around us by reaching out and touching or gauging distance.

There are so many different factors to keep in mind when designing and implementing hand tracking for your VR apps, such as user comfort, cultural differences, and the tracking complexity of your experience. However, as a general rule of thumb, developers should first identify the type of experience they want to create before they begin designing any hand tracking for it. Once you’ve identified that, choosing a combination of interaction models to match your goals becomes a less overwhelming process.

Amy’s Picks:

Visual Effects Summit: How to (Not) Create Textures for VFX

Simon Trümpler emphasizes how using photos and third-party assets saves time when making textures.

Many artists feel like creating new textures for visual effects using existing photos and third-party assets is cheating, but Simon dispelled these common beliefs and showed us how we can take a base asset and customize it in a way that makes it our own. His advice was to focus on getting assets to the “good enough” stage because, if you find additional time later on in the game development process, then you can fine-tune them to have increasingly customized looks.

One of my favorite tricks he demonstrated was taking a picture of a forest, flipping it upside down, bumping up the contrast, then combining it with other water effects he had already created. The result was a waterfall that appeared to have ripples and movement. Had he painted each line by hand, it would have taken significantly longer to create. He wrapped up the talk by encouraging artists to share their expertise and learnings on social media using #VFXTexture to streamline the texture creation process for fellow artists.
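
For fun, here is roughly how that forest-to-waterfall trick could be scripted with Pillow. This is just our sketch of the idea; the file names and blend settings are placeholders, not Simon’s actual workflow:

```python
# Flip a forest photo, crush the contrast into vertical streaks, then
# screen-blend it over an existing water texture to suggest a waterfall.
# File names and settings are placeholders.
from PIL import Image, ImageChops, ImageEnhance, ImageOps

forest = Image.open("forest_photo.jpg").convert("L")    # grayscale source photo
flipped = ImageOps.flip(forest)                         # upside down: trunks become falling streaks
streaks = ImageEnhance.Contrast(flipped).enhance(3.0)   # boost contrast into bold lines

water = Image.open("water_ripples.png").convert("L")    # a water texture you already have
water = water.resize(streaks.size)

waterfall = ImageChops.screen(streaks, water)           # combine the two layers
waterfall.save("waterfall_texture.png")
```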

Day Two

Day two of GDC ’22 presented our team with opportunities to demo new hardware and learn tips and tricks for mitigating VR motion sickness and designing in-game environments. Here are our favorite moments from the second day of GDC 2022.

Chan’s Picks:

Future Realities Summit: How NOT to Build a VR Arcade Game

Michael Bridgman (Co-founder & CTO, MajorMega) shares development tips for mitigating user motion sickness in VR experiences.

On my second day of GDC, I went to an awesome panel covering VR arcade games: the dos, don’ts, trials, and tribulations. Michael Bridgman, the developer hosting the presentation, has made incredible VR experiences, and he discussed a few tricks to help mitigate major causes of motion sickness in virtual reality. While some of it isn’t applicable to the kind of development that we do at Amebous Labs, I think he shared many interesting things worth looking into.

Day two also presented me with a chance to use the HTC Flow and the HTC Focus. The XR industry often focuses on the Oculus Quest and, to a lesser extent, the Valve Index, so it was lovely to use some alternative headsets. I want to give more attention to the HTC Flow, a lightweight VR system that takes advantage of your phone’s processing power to reduce the overall size of the headset. Five minutes with it on and I was already trying to formulate ideas for how we might use it (and how I would convince Annie to buy it…).

Pierce’s Picks:

Machine Learning Summit: Walk Lizzie, Walk! Emergent Physics-Based Animation through Reinforcement Learning

Jorge del Val Santos (Senior Research Engineer, Embark Studios) highlights a new method for animating digital assets — machine learning!

The traditional way of bringing digital assets to life consists of first rigging and then animating them, but machine learning agents (ML-Agents) present a new, physics-based option. Using a reinforcement learning approach to automatic animation, developers can bring life to creatures and critters with the click of a button — no animators involved. With that said, a fair few challenges accompany the benefits of this technique, so it may not always be the best fit for your game.

One of the greatest challenges with this technique is reward design: because an agent is rewarded for covering ground no matter how it moves, it might end up learning to walk in unintended ways. For example, gaps in the reward structure can cause a creature to walk using an arm and a leg because there was nothing in place disfavoring that gait. Another weakness of this physics-based model of animation is authorship or, in other words, your ability to bring character to your creations. At the moment, there’s no clear way of training your monsters to be scary or to move intimidatingly.
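
A tiny, hypothetical reward function makes the problem easy to see; every field name below is our assumption, not Embark’s code. Without the shaping terms, any gait that covers ground gets rewarded, including the arm-and-a-leg shuffle:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical reward shaping for a physics-based walker; illustrative only.
@dataclass
class WalkerState:
    forward_velocity: float            # metres per second toward the goal
    hand_ground_contacts: List[float]  # 1.0 for each hand touching the floor
    torso_tilt: float                  # radians away from upright
    energy_used: float                 # total joint effort this step

def walking_reward(s: WalkerState) -> float:
    # Base term: cover ground. Alone, this also rewards dragging yourself
    # forward on an arm and a leg, because nothing disfavors that gait.
    reward = s.forward_velocity
    # Shaping terms that push the agent toward the gait we actually want.
    reward -= 0.5 * sum(s.hand_ground_contacts)  # penalize walking on hands
    reward -= 0.1 * abs(s.torso_tilt)            # penalize slumping over
    reward -= 0.05 * s.energy_used               # penalize flailing
    return reward

# Example step: upright, hands off the floor, modest effort.
print(walking_reward(WalkerState(1.2, [0.0, 0.0], 0.05, 2.0)))
```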

Amy’s Picks:

Art Direction Summit: Building Night City

Kacper Niepokólczycki (Lead Environment Artist, CD Projekt Red) explains the importance of emotion when designing environments for your game.

My favorite session of day two was presented by the lead environment artist at CD Projekt Red, Kacper Niepokólczycki; his talk centered on the creation of Cyberpunk 2077’s environments. My main takeaway was to think, from the very start, about how you can evoke emotion through the design of the environment.

Although it may lack color and feature very limited detail, the earliest stage of your environment design should still make you feel something. The following steps of the design process should continue to strengthen the feeling you want the player to experience while playing your game. If they don’t, you will need to iterate until you hit upon an environment better aligned with your game’s core pillars. When you are making design decisions, you should always stop and ask yourself (or your team) “WHY?” If the answer is vague or unclear, then the concept needs more work.

Annie’s Picks:

Meeting HTC Vive’s Developer Community

Annie tries out the HTC Focus 3’s new wrist trackers.

It was great to meet more people in the HTC Vive developer community! On day two, I got to see the new Wrist Tracker device for the Vive Focus 3. It was great to see how this hardware has enhanced hand tracking and reduced issues with occlusion.

Day Three

Throughout day three of GDC ’22, we celebrated Atlanta’s game development community, attended several more insightful presentations, and explored the new games and tech featured on the GDC Expo floor. Here are our favorite moments from day three.

Pierce’s Picks:

The Fine Line Between Difficult and Impossible: Adventures in VR Development

Devin Reimer (CEOwl, Owlchemy Labs) and Andrew Eiche (CTOwl & Cable Slinger, Owlchemy Labs) describe the difference between ‘impossible’ development problems and ‘very difficult’ ones.

Problem-solving and critical thinking are two major aspects of game development. Getting your game to look and run how you envisioned on various hardware systems is difficult and, at times, borderline impossible. In this presentation led by Owlchemy Labs, the game development studio behind the hit VR game Job Simulator, Devin Reimer and Andrew Eiche argue that the best development rides the fine line between an impossible problem and a daunting challenge.

The Owlchemy team explained how something impossible feels the same as something very difficult, but impossibility can only be factored in when looking at the overall project holistically. Each individual issue may be difficult, but when you have too many difficult problems, then the entire project becomes impossible. During the development of Cosmonious High, Owlchemy Labs realized the importance of being flexible with your game’s requirements — the size and scope of Owlchemy’s original vision for the game had to be scaled back in order to effectively optimize the game.

Amy’s Picks:

Three Opportunities to Improve Character Design in the Games Industry

Jessica Tompkins (Senior Researcher, Electronic Arts) shares the historical trends of game character design.

The expo show floor opened up today with so much to see and do, but I made it a point to attend Jessica Tompkins’ talk on character design. She shared her findings and ideas regarding how developers can be agents of change at their game studios and improve representation in video games.

She suggested focusing on motivation-driven design rather than designing for demographic parameters; going beyond being an ally by actively pushing for change and advocating; and, finally, opening up the conversation while future game industry professionals are still in school, since it is not a conversation that higher education should be timid about having. Taking these steps will help push the industry forward and make it more welcoming of underrepresented gamers and game developers.

Annie’s Picks:

Hardware Demo: Speech Graphics’ Facial Animation Tool

Chan and I got to demo Speech Graphics’ audio-driven facial animation tool. It was fantastic seeing how their product creates extraordinarily realistic face and mouth movements from just a single audio file. Tools like these will be monumental in expediting production pipelines, so I can’t wait to try this for some of our future VR projects.

Chan’s Picks:

Serious Games Roundtable

Matthew Lee (Chair, IGDA Serious Games SIG) chats with developers about various styles of game design in a roundtable discussion.

I had the absolute privilege of attending the Serious Games Roundtable on the third day of GDC. I admit I was not familiar with the roundtable format when I walked into the room, so I was very surprised to see a bunch of tables lined up in a circle. Each developer took a seat at the table and a live mic was passed around so that everyone who wanted to talk had an opportunity to do so.

As a lifelong admirer of both serious and educational game design, it was interesting to hear insights and thoughts from fellow developers. While I recognized a few terms tossed around during the discussion, I learned several new phrases and ideas that I absolutely adore and would love to start using regularly.

Game Economies: An Economist’s Perspective

I also attended a talk regarding in-game economies presented by Christopher Smith, the chief economist at Metanomic Ltd. He explained economic theories and concepts that should influence your game design. While most of it went over my head, I thought the talk was incredibly informative, especially considering our studio is currently supporting Loam Sandbox with post-launch content.

After the conference today, we went on a whirlwind of networking. We headed to a venue where we were able to connect with other Atlanta developers, and I had the awesome opportunity to hang out with many talented, skilled game developers.

Day Four

GDC ’22 neared its end, but things did not slow down—day four was packed with inspiring hardware demos and game-changing presentations from some of gaming’s most creative minds. Here are our favorite parts of the conference’s fourth day.

Amy’s Picks:

Understanding Developer Challenges with Localization, Player Support, and More

Annie, Alexander, Xavier, and Ninel share actionable advice on how to best localize your game or application in order to create an experience that many can enjoy.

My pick isn’t from day four but rather the end of day three. Annie, the executive producer at Amebous Labs, joined the TransPerfect panel with host Alexander Fletcher (Global Director, TransPerfect Gaming Solutions) and co-panelists Xavier Marot (Director of Production, Focus Home Interactive) and Ninel Anderson (CEO, Devoted Studios).

Together, they shared an insightful discussion regarding how developers can create a seamless experience for players from all over the world. Ninel suggested tapping into the community of players that speak the language you’ve translated your game into and trading them game keys for feedback. Taking this step will not only help the developers, but it will make the players feel heard as they offer advice on how to appropriately translate content for their audience.

Annie shared ways to avoid using text in virtual reality, as it is inherently difficult to read large amounts of text in a headset. She suggested using iconography and visual diagrams to help the player maneuver through a VR game successfully.

Annie’s Picks:

Hardware Demo: Tilt Five

We got to have a private demo of Tilt Five, an augmented reality wearable for tabletop games. The device is lighter than other HMDs, and its field of view is impressive (especially for the weight). I think we’ll be putting in an order for a couple of these so that we can experiment with creating content for this device!

Chan’s Picks:

Building Next-Gen Games for PlayStation VR2 with Unity

On day four, I went to an interesting talk presented by Unity regarding its support for PlayStation VR2 games. It highlighted new PSVR2 features like eye tracking and foveated rendering while detailing how Unity will implement them. While much of the talk focused on PSVR2’s exclusive features, it also explored Unity’s broader features and what we can expect regarding game development on the system moving forward.

While URP and Single-Pass rendering have been talked about ad nauseam, I find explanations for some of their advanced uses to be fascinating. For example, I know that you can make a custom rendering pipeline for URP, but I didn’t consider that I could have a pipeline focused solely on UI which can be run separately from everything else.

Tilt Five uses AR-enabled glasses and a unique controller to breathe life into our favorite table-top board games.

I also walked around the GDC Expo Hall and saw innovative games with alternative control schemes. The artistry and creativity on display really encouraged me to find new and novel ways to play.

Pierce’s Picks:

The MAW: Safely Multithreading the Deterministic Gameplay of ‘Age of Empires IV’

Joel Pritchett (Technical Director of Age of Empires Franchise, Microsoft) details the multithreading, debugging, and verification tools built into the game’s engine.

Age of Empires is a large-scale game featuring many, many different units and entities. Whether the game is running on a single-core CPU or a massive gaming PC, the simulation must always produce the same results, so a lot of thought went into designing the game’s engine to meet this requirement. The game’s developers parallelize units by grouping them into islands: groups of units that need to know about one another but not about other islands.

RTS games have traditionally used deterministic input syncing, and it usually works fine: peers agree on the inputs, and each machine computes the same result. The issue with multithreading, however, is that threads can start or stop at different times on different machines, which makes the simulation potentially non-deterministic. The developers resolved this by recording read/write records on data in the engine; if any conflicting accesses occurred, those errors could be detected. With this in place, developers could just write code, and whenever they wrote something that would create desyncs, it was automatically and immediately caught.
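
As a toy illustration of that idea (ours, not Relic’s engine code), imagine a tracker that records which thread reads or writes each piece of sim data during a tick and flags any cross-thread pair where at least one access is a write:

```python
# Toy cross-thread read/write conflict detector. Order-dependent access to
# shared sim data is exactly what can make a lockstep simulation desync.
# All names here are invented for illustration.
import threading
from collections import defaultdict

class AccessTracker:
    def __init__(self):
        self._lock = threading.Lock()
        self._accesses = defaultdict(list)  # field -> [(thread_name, access_type)]
        self.conflicts = []

    def record(self, field, access_type):
        me = threading.current_thread().name
        with self._lock:
            for other_thread, other_type in self._accesses[field]:
                # Two threads touching the same field, with at least one write,
                # is order-dependent and therefore a desync risk.
                if other_thread != me and "write" in (access_type, other_type):
                    self.conflicts.append((field, other_thread, other_type, me, access_type))
            self._accesses[field].append((me, access_type))

    def end_of_tick(self):
        # Report this tick's conflicts and reset for the next one.
        with self._lock:
            found, self.conflicts = self.conflicts, []
            self._accesses.clear()
        return found

tracker = AccessTracker()

def island_update(my_island, other_island):
    # Writing your own island's data is fine; reading another island's data
    # while someone else writes it is the pattern the tracker should catch.
    tracker.record(f"island{my_island}.health", "write")
    tracker.record(f"island{other_island}.health", "read")

threads = [
    threading.Thread(target=island_update, args=(0, 1), name="worker-0"),
    threading.Thread(target=island_update, args=(1, 0), name="worker-1"),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(tracker.end_of_tick())  # lists the cross-island read/write conflicts
```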

--

Amebous Labs

Amebous Labs unites innovative developers, artists, and writers to create groundbreaking, immersive games and entertainment.