Wednesday, December 31, 2014

Photon Mapping

With my latest game The Little Plane That Could launched, I had a good opportunity to start some experimental coding. I would like to crack the challenge of real-time indirect lighting. My idea is to use voxel geometry only, and run a photon mapper on this geometry. Because the geometry is simple, the intersection tests (ray vs AABB) may be fast enough for real-time use.
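A voxel photon mapper stands or falls with that intersection test. As a rough sketch (not the code from this project), the classic slab method handles a ray vs AABB test with just a few operations per axis:

```c
#include <math.h>
#include <stdbool.h>

/* Slab-method ray vs AABB intersection. Returns true on a hit and
   writes the entry distance to *tnear. Zero direction components work
   out via IEEE infinities for this formulation, as long as the ray
   origin is not exactly on a slab boundary. */
static bool ray_aabb(const float orig[3], const float dir[3],
                     const float bmin[3], const float bmax[3],
                     float *tnear)
{
    float t0 = 0.0f, t1 = INFINITY;
    for (int i = 0; i < 3; ++i) {
        const float inv = 1.0f / dir[i];
        float ta = (bmin[i] - orig[i]) * inv;
        float tb = (bmax[i] - orig[i]) * inv;
        if (ta > tb) { float tmp = ta; ta = tb; tb = tmp; }
        if (ta > t0) t0 = ta;   /* latest entry over all slabs */
        if (tb < t1) t1 = tb;   /* earliest exit over all slabs */
        if (t0 > t1) return false;
    }
    *tnear = t0;
    return true;
}
```

Per photon this is three divisions (or three multiplies, if the inverse direction is precomputed once per ray) and a handful of compares, which is what makes the voxel-only idea plausible for real time.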

Below is an early result, where I shoot photons for direct light only, with no bounces yet. It shows the raw photon hits, and a low-pass smoothed version of them. The light source is not visible, but sits not far above the pillar.

I also tried flat shading the voxel faces based on the number of photons that hit them. The discontinuity between faces is too distracting though. So the next step is smooth shading the quads. A first attempt with Gouraud-shaded quads looked pretty horrible, so I need to rethink that.

Tuesday, December 23, 2014

The Little Plane That Could

This is just a heads-up that my new game The Little Plane That Could has been released for iOS, Android and GNU/Linux (64-bit).

It was made available for the Oculus Rift (Win64) earlier, and a Mac OSX version is still in the pipeline.

Its highlights are the great physics, the convincing AI pilots, and the immersive 3D audio via OpenAL. To run the GNU/Linux variant (free to try, pay-what-you-want) you need to install its dependencies using:

$ sudo apt-get install libopenal1
$ sudo apt-get install libalut0
$ sudo apt-get install libsdl2-2.0-0

I've tested it with Ubuntu 14.04 LTS. Note that it requires a gamepad to run. If your gamepad is not supported by SDL2, you may have luck adding it to the gamecontrollerdb.txt file.
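For reference, a gamecontrollerdb.txt entry is a single line: a 32-character hex GUID, a human-readable name, a comma-separated list of button/axis bindings, and a platform tag. The GUID below is a made-up placeholder; SDL2 can report your pad's real GUID via SDL_JoystickGetGUIDString().

# guid,name,mappings...,platform
03000000deadbeef0000000000000000,My Gamepad,a:b0,b:b1,x:b2,y:b3,back:b8,start:b9,leftx:a0,lefty:a1,rightx:a2,righty:a3,dpup:h0.1,dpdown:h0.4,dpleft:h0.8,dpright:h0.2,platform:Linux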

Wednesday, November 26, 2014

Using the OpenAL library in Xcode for iOS.

The last two days I have been porting The Little Plane That Could to iOS. It was already running on Oculus Rift (Windows), Linux, Android and Mac OSX. When porting it, I struggled with the poor OpenAL implementation that comes with the iOS SDK.

I knew that there was a very low limit of 32 sound sources on iOS. I figured that this limit only applied to sound sources in the 'AL_PLAYING' state. And sure enough: you can create many more than 32 sources without getting errors back from the OpenAL implementation.

In my game, I have 48 planes flying around, each with engine sounds. If we forget about gun sounds and explosion sounds for a while, these 48 engine sounds already exceed what the iOS implementation of OpenAL can handle. This is why I adopted a scheme of only playing the four closest engine sounds. If an engine sound was further away, I would pause it. This worked well on all platforms, except iOS.

The problem I experienced was the following: alGetError() would start returning a nonsensical value of -1, and the sounds would no longer get started. A return value of -1 from alGetError() violates the OpenAL specification though. If you study the header files of the SDK, you see that the only acceptable return values are 0 (AL_NO_ERROR) or one of:


/** 
 * Invalid Name paramater passed to AL call.
 */
#define AL_INVALID_NAME                           0xA001
/** 
 * Invalid parameter passed to AL call.
 */
#define AL_INVALID_ENUM                           0xA002
/** 
 * Invalid enum parameter value.
 */
#define AL_INVALID_VALUE                          0xA003
/** 
 * Illegal call.
 */
#define AL_INVALID_OPERATION                      0xA004
/**
 * No mojo.
 */
#define AL_OUT_OF_MEMORY                          0xA005

After struggling for half a day to find the cause of the problem, I finally solved the issue. It turns out that the 32-source limit applies to BOTH playing and paused sources. Only sources that have not started playing yet (AL_INITIAL) or have been stopped (AL_STOPPED) do not count against the limit. So the fix was relatively easy: sources that are not nearby should not be paused; they should be stopped instead. This does not take away the fact that the alGetError() return value is not according to spec. Apple should fix this.
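The scheme boils down to a small selection pass each frame. This sketch (not the game's actual code) marks the four closest engines; the real code would then alSourcePlay() those, and alSourceStop() — never alSourcePause() — the rest:

```c
#include <float.h>

#define NUM_ENGINES 48
#define MAX_AUDIBLE 4

/* Mark the MAX_AUDIBLE engines with the smallest squared distance as
   audible (1), all others as inaudible (0). On iOS, the inaudible ones
   must be stopped, not paused, so they return to AL_STOPPED and no
   longer count against the 32-source limit. */
static void select_audible(const float dist2[NUM_ENGINES],
                           int audible[NUM_ENGINES])
{
    for (int i = 0; i < NUM_ENGINES; ++i) audible[i] = 0;
    for (int n = 0; n < MAX_AUDIBLE; ++n) {
        int best = -1;
        float bestd = FLT_MAX;
        for (int i = 0; i < NUM_ENGINES; ++i)
            if (!audible[i] && dist2[i] < bestd) {
                bestd = dist2[i];
                best = i;
            }
        if (best >= 0) audible[best] = 1;
    }
}
```

An O(N*M) selection like this is perfectly fine for 48 sources once per frame; no need for sorting.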

Wednesday, November 19, 2014

What has Bram been up to?

So, it's been quiet for too long on this blog. It is about time to report on what I have been up to. Why not start with a small snippet of video?

Since July 2013 I have been mulling over the next step for my crane sim game. As the holy grail, I have put forward an ambitious development target: deformable terrain. Frankly, I have not yet seen a single game get this right. I most likely will not achieve completely realistic, sandbox-style, unlimited deformable terrain either. But I will surely try. The video above shows a bulldozer at work in my terrain simulator.

Currently I have achieved the following:

  • Proper 3D terrain, not those crummy height-fields. So you can have overhangs.
  • Universally deformable terrain: you can dig anywhere and not just at designated spots.
  • Tunnelling! Yes - you can dig your own tunnel.
  • Soil mixing. You can dig up grey coloured dirt in the west, and deposit it on a brown coloured sediment in the east.
  • Near-infinite size. I bound the height and depth (Z) of the terrain, but in the long/lat (X/Y) directions you can drive nearly infinitely far, and deform the terrain at any place. Distances are only limited by MAX_INT and the size of your disk, because any modifications you make to the terrain are stored to disk.
  • Procedural definition of the virgin grounds. I only store disturbances of the virgin ground. So you can dig a hole, then drive 8 hrs going in any direction, and drive 8 hrs back to the original location. Your hole in the ground will still be there! (This is a big thing!)
  • Tracing back your steps, by the way, is easy: while driving, you leave permanent depressions in the soil. They will still be there, even after hundreds of hours of play. Everything persists.
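The store-only-disturbances idea can be sketched in a few lines of C. This is an illustrative toy, with a made-up hash and a fixed-size edit table standing in for a real disk-backed chunk store: virgin ground is a pure function of the cell coordinate, and lookups fall through to it unless a stored edit matches.

```c
#include <stdint.h>

/* Virgin ground is purely procedural: derived from the cell coordinate
   by an integer hash, so it needs no storage and is identical every
   time you revisit a location. (Hash constants chosen arbitrarily.) */
static uint32_t virgin_density(int32_t x, int32_t y, int32_t z)
{
    uint32_t h = (uint32_t)x * 0x9E3779B1u
               ^ (uint32_t)y * 0x85EBCA77u
               ^ (uint32_t)z * 0xC2B2AE3Du;
    h ^= h >> 15; h *= 0x2C1B3C6Du; h ^= h >> 12;
    return h;
}

/* Only disturbances are stored; everything else falls through to the
   procedural definition. */
#define MAX_EDITS 1024
static struct { int32_t x, y, z; uint32_t v; } edits[MAX_EDITS];
static int num_edits = 0;

static void dig(int32_t x, int32_t y, int32_t z, uint32_t v)
{
    if (num_edits < MAX_EDITS) {
        edits[num_edits].x = x; edits[num_edits].y = y;
        edits[num_edits].z = z; edits[num_edits].v = v;
        ++num_edits;
    }
}

static uint32_t density(int32_t x, int32_t y, int32_t z)
{
    /* newest edit wins */
    for (int i = num_edits - 1; i >= 0; --i)
        if (edits[i].x == x && edits[i].y == y && edits[i].z == z)
            return edits[i].v;
    return virgin_density(x, y, z);
}
```

Because virgin_density() is deterministic, a hole you dig stays a hole no matter how far you drive in between: only the edit record needs to persist.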
I think this technology is pretty unique and not found in any other game. The basic tech is all implemented, and nearly good enough. What is not there at all is game-play. Wonderful game technology is one thing; making an interesting, fun-to-play game is another. There are some routes I could take w.r.t. the game-play:
  • Make it into an arcade style game with a lot of KATCHING! noises, and bright graphics every time you dig up a gem from the earth.
  • Do it like the original Little Crane game: break up the gameplay in a number of puzzle-like levels with simple goals. Digging for treasure, unearthing a temple, burying an item, digging a tunnel, building a dam and such, could all be simple objectives in such a game.
  • Make a full blown gold-mining simulator. Process earth by moving it into a trommel.
  • Make a full blown gold-mining simulator plus economic sim: borrow money from bank, buy claims, buy diesel, etc.
  • Make a full blown gold-mining simulator MMO. Buy and sell claims online. Bid for scarce resources (land) against other players.
The smaller the scope, the higher the chance of succeeding, of course, so I should avoid the latter options.

Things I am doing at this moment include adding more vehicles, like a dump truck and an excavator, to the simulation. I also have to trace a bug where the soil simulation goes haywire at negative world coordinates. As my world is procedurally defined, I could cheat and spawn all action at large positive coordinates, but that feels wrong.

Monday, August 18, 2014

Developing for Oculus Rift DK2

I've purchased an Oculus Rift Development Kit 2. In a sense it feels like going back to my thirties. Much of my professional career was in Virtual Reality. I can't remember for exactly how long, but I think it was for 8 years or so that I developed software and ran VR projects for SARA's CAVE. See below for a picture of me, mining genomics data that we visualized for Johnson & Johnson.

Since 2007, my career has been in video games. Even though I had been making games since 1982, it took me 25 years of hobby game development before I decided to do it for a living. And since 2010 I have done so independently, as a private enterprise. Back to 2014: my two careers have crossed, and I find myself actually developing games for Virtual Reality. A younger me would have labeled this heaven, but the years at SARA have made me cautious about the adoption rate of VR. This current VR revival could be another one that is destined to go out like a candle.

This web page will function as a development log. I intend to document peculiarities, problems and solutions that are bound to pop up during DK2 development. If you just want to play my VR game, download The Little Plane That Could.

Windows Driver issues

  • When using the 'Direct' Rift display mode, I get extreme chromatic shifts. The R/G/B channels are rendered all over the place, not converging whatsoever. The 'Extended Desktop' display mode works better.
  • The precompiled SDK example runs badly; the jitter is horrible, and it is only alleviated a bit if I toggle multisampling. It's hard to say which mode (ON or OFF) reduces jitter, because both look exactly the same, with the same level of aliasing.

Development under Windows

  • After installing the Windows runtime and SDK, the latest firmware can be flashed to the device.
  • The SDK example 'Oculus Room Tiny' is intended as the minimal code to base your first VR app on. It comes with a Visual Studio 2013 solution. Unfortunately this does not build out of the box. But what is worse: it is a D3D app, not an OpenGL-based app. The SDK seems to support both, but unfortunately the sample is D3D only. Bummer!
  • Building Oculus Room Tiny yields a: Cannot open include file: 'AtlBase.h'
  • Linking the sample also fails: fatal error LNK1104: cannot open file 'atls.lib'
  • To fix these compiler and linker errors, you need to install a Windows Driver Kit, as described on Stack Overflow.
  • If you want to link your own app against libOVR, you need to add these to the linker input: libovr.lib (libovrd.lib for Debug), ws2_32.lib and winmm.lib.
  • The Rift display mode (Extended Desktop vs Direct) seems to mess with OpenGL context creation. When 'Extended Desktop' is enabled, SDL_GL_CreateContext() can create a 3.2 Core profile context. If I ask for a 3.2 Core profile when the mode is set to 'Direct', then SDL_GL_CreateContext() crashes with an access violation at address 0.

Development under GNU/Linux

  • Let's start with some good news: Oculus intends to release DK2 SDKs for Windows, OSX and Linux. The bad news: so far no Linux SDK for the DK2 has appeared. It was labeled as 'soon' but has been delayed quite a bit now.

General Development issues

  • You can create the projection matrix for each eye with a call to ovrMatrix4f_Projection(), but you need to transpose the result before using it.
  • You can build the view matrix for each eye using the eyePose from ovrHmd_GetEyePose(). Strangely enough, you still need to shift the position of the eye by the inter-ocular distance. Like the projection matrix, the view matrix needs transposing as well.
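The transposing is needed because libOVR hands back row-major matrices, while OpenGL expects column-major data in glUniformMatrix4fv() when its transpose argument is GL_FALSE (the only value OpenGL ES accepts). The helper itself is trivial:

```c
/* Convert a row-major 4x4 matrix (as libOVR returns) to the
   column-major layout OpenGL expects. in and out must not alias. */
static void transpose4x4(const float in[16], float out[16])
{
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            out[c * 4 + r] = in[r * 4 + c];
}
```

On desktop GL you can alternatively pass the matrix as-is with transpose set to GL_TRUE, but an explicit transpose keeps the code portable to ES.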

Tuesday, August 12, 2014

Routers with restrictive NAT spoiling online gameplay.

I have published an online tank simulator for both iOS and Android. On the App Store it is called Buggy Bang! Bang! and it was renamed Six Wheels and a Gun on Google Play. Players on iOS and Android can even play each other, as they are all pooled on the same lobby server. I wrote the lobby server code myself, in Python.

A neat thing about my networking code (C for the clients, Python for the lobby server) is that it is 100% UDP based. There is not a single TCP connection used, just UDP packets that get exchanged with both the server and the opponent's client. Two players of my game are typically each on a LAN behind a router that does NAT (Network Address Translation), and this complicates things. It means that the only way they can connect to each other is to have them both connect to a server on the internet with a public IP. This server will then see their datagrams coming in on seemingly random ports, and can tell both parties what these ports are. This is the concept behind UDP hole punching.
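The heart of that rendezvous step is that the server learns each client's public endpoint from recvfrom(), rather than from anything the client says about itself. This self-contained loopback sketch (my own illustration, no real NAT involved) shows the mechanism:

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

/* A 'lobby server' and a 'client' on loopback. The server does not
   trust any port the client claims: it reads the source address of an
   incoming datagram with recvfrom() -- behind a NAT, that address is
   the NAT's public mapping, which the server relays to the peer.
   Returns the port the server observed. */
static int observed_port(void)
{
    struct sockaddr_in any = {0}, srv = {0}, from = {0};
    socklen_t srvlen = sizeof(srv), fromlen = sizeof(from);
    char buf[8];

    any.sin_family = AF_INET;
    any.sin_addr.s_addr = htonl(INADDR_LOOPBACK); /* port 0: ephemeral */

    int server = socket(AF_INET, SOCK_DGRAM, 0);
    int client = socket(AF_INET, SOCK_DGRAM, 0);
    bind(server, (struct sockaddr *)&any, sizeof(any));
    getsockname(server, (struct sockaddr *)&srv, &srvlen);

    sendto(client, "hi", 2, 0, (struct sockaddr *)&srv, sizeof(srv));
    recvfrom(server, buf, sizeof(buf), 0,
             (struct sockaddr *)&from, &fromlen);

    close(server);
    close(client);
    return ntohs(from.sin_port); /* what the server would relay */
}
```

Once both clients know each other's observed endpoint, they fire datagrams at each other; the outgoing packets open the NAT mappings that let the incoming ones through.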

Things go wrong if your router is too restrictive. It may do symmetric NAT, or restricted-cone NAT. If this happens, the players will not find each other on the ports reported by their common server. In my game, this results in two buggies that refuse to move; after 12 seconds the match times out, and is recorded as a forfeit by the other party. Moral of the story: if you play my game and the HUD controls don't show up, with the buggies just sitting there, then your router or your opponent's router will not play ball. So far my approach to this is: too bad it happened, better luck next time. If you want, read the manual of your router and change the NAT type.

UPDATE

I've been adding a new feature to S.W.a.a.G. that detects restrictive NATs. Instead of pairing up a user with a bad NAT, we nominate two players for a session. If one of them fails to punch a hole in their firewall, the match will not be started. The players are informed of the situation with either of these two messages:

  • YOUR NAT SEEMS TOO RESTRICTIVE.
  • OPPONENT NAT SEEMS TOO RESTRICTIVE.
So at least the players will know what is going on. If it is your opponent, you can just retry, and hopefully get paired with another user.

Tuesday, July 22, 2014

OpenAL with Android NDK.

This posting is for future reference, if I ever have to get OpenAL working again under Android's NDK. It will also be useful to my peer game developers who intend to do the same.

Step 1

Get an Android port of openal-soft.

Step 2

I like to build static libraries for Android game dev, instead of dynamic ones. So I replaced the jni/Android.mk with my own:

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

OPENAL_DIR := OpenAL

AL_SOURCES := \
  $(OPENAL_DIR)/Alc/android.c              \
  $(OPENAL_DIR)/OpenAL32/alAuxEffectSlot.c \
  $(OPENAL_DIR)/OpenAL32/alBuffer.c        \
  $(OPENAL_DIR)/OpenAL32/alDatabuffer.c    \
  $(OPENAL_DIR)/OpenAL32/alEffect.c        \
  $(OPENAL_DIR)/OpenAL32/alError.c         \
  $(OPENAL_DIR)/OpenAL32/alExtension.c     \
  $(OPENAL_DIR)/OpenAL32/alFilter.c        \
  $(OPENAL_DIR)/OpenAL32/alListener.c      \
  $(OPENAL_DIR)/OpenAL32/alSource.c        \
  $(OPENAL_DIR)/OpenAL32/alState.c         \
  $(OPENAL_DIR)/OpenAL32/alThunk.c         \
  $(OPENAL_DIR)/Alc/ALc.c                  \
  $(OPENAL_DIR)/Alc/alcConfig.c            \
  $(OPENAL_DIR)/Alc/alcEcho.c              \
  $(OPENAL_DIR)/Alc/alcModulator.c         \
  $(OPENAL_DIR)/Alc/alcReverb.c            \
  $(OPENAL_DIR)/Alc/alcRing.c              \
  $(OPENAL_DIR)/Alc/alcThread.c            \
  $(OPENAL_DIR)/Alc/ALu.c                  \
  $(OPENAL_DIR)/Alc/bs2b.c                 \
  $(OPENAL_DIR)/Alc/null.c                 \
  $(OPENAL_DIR)/Alc/panning.c              \
  $(OPENAL_DIR)/Alc/mixer.c                \
  $(OPENAL_DIR)/Alc/audiotrack.c           \
  $(OPENAL_DIR)/Alc/opensles.c


LOCAL_MODULE    := openal
LOCAL_SRC_FILES := $(AL_SOURCES)

LOCAL_C_INCLUDES := \
  $(HOME)/src/openal-soft/jni/OpenAL \
  $(HOME)/src/openal-soft/jni/OpenAL/include \
  $(HOME)/src/openal-soft/jni/OpenAL/OpenAL32/Include \


LOCAL_CFLAGS += \
  -DAL_ALEXT_PROTOTYPES

MAX_SOURCES_LOW ?= 4
MAX_SOURCES_START ?= 8
MAX_SOURCES_HIGH ?= 64

LOCAL_CFLAGS += -DMAX_SOURCES_LOW=$(MAX_SOURCES_LOW) -DMAX_SOURCES_START=$(MAX_SOURCES_START) -DMAX_SOURCES_HIGH=$(MAX_SOURCES_HIGH)
LOCAL_CFLAGS += -DPOST_FROYO

include $(BUILD_STATIC_LIBRARY)

Step 3

Get an Android port of freealut.

Step 4

Add an Android.mk file to the src/ directory, containing:

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE := alut

LOCAL_SRC_FILES := \
  alutBufferData.c \
  alutCodec.c \
  alutError.c \
  alutInit.c \
  alutInputStream.c \
  alutLoader.c \
  alutOutputStream.c \
  alutUtil.c \
  alutVersion.c \
  alutWaveform.c

LOCAL_C_INCLUDES := \
  $(HOME)/src/freealut/include \
  $(HOME)/src/openal-soft/jni/OpenAL/include

#LOCAL_CFLAGS +=

include $(BUILD_STATIC_LIBRARY)

Step 5

Some of the config.h settings do not properly make it into the build. I am not sure why, so I manually edited alutInputStream.c, alutInternal.h and alutUtil.c to work around this.

Step 6

Now you can pull these static libraries into your build by adding something like this to your project's Android.mk file:

LOCAL_STATIC_LIBRARIES += alut openal
...
$(call import-module,openal-soft/jni)
$(call import-module,freealut/src)

UPDATE #1

When porting to MSWindows, I found that OpenAL is pretty tricky there as well. Some info for future reference:
  • You need to link against libOpenAL32.dll.a
  • You need to supply soft_oal.dll with your game, renamed as OpenAL32.dll
  • As with Android and OSX, the CMake setup of freealut is horribly broken, so you need to pull the sources into your project and hack around the platform-specific stuff.

UPDATE #2

When building for 64-bit ARM, or arm64-v8a as it is known, I get crashes (SIGSEGV) in alSourcePlay. Maybe openal-soft is not 64-bit safe? The Android port is pretty old; the last change is from 2012.

Wednesday, July 16, 2014

Flight Simulation on touch screens.

My current project is a biplane combat simulator. I have been developing it under GNU/Linux, and porting it to Android. It needs a commercially viable platform to run on, and I am pretty sure GNU/Linux games are not making money at the moment. This may change with the Steam box, but for now I have to look at iOS, Android or maybe a console platform.

After porting my biplane game to Android, I learned a surprising fact: when it comes to flight simulators, there is absolutely no substitute for a physical analog joystick. You really need a self-centering stick with plenty of instant feedback if you want to control an airplane.

The first substitute I tried was touch control, with a virtual joystick on the screen. I added snapping to center when you release your touch, so that there is some sort of self-centering going on. But this was not enough to make the plane controllable. The lack of tactile feedback means that you have little awareness of the absolute position of the virtual joystick without looking at the control graphics.

I may need to redo this touch-control experiment on iOS, though. Android is notoriously bad when it comes to responsive touch control. Try this for fun: swipe some content (a web page, for example) up and down at roughly 5 Hz. You will see that the motion of the content is opposite to the motion of your finger: when your finger goes up, the content still goes down from your swipe of 100 ms ago.

I thought that tilt control (changing the pitch and roll of the phone to directly control the virtual flight stick) would fix the awareness issue. But it comes with a loss of the snap-to-center functionality: to center the control, you actually have to carefully move the device orientation back to its neutral position. Android now has sensor fusion, and can use both the gyroscope and the accelerometer to accurately determine the gravity vector. But even using this (via the Android sensor type 'Gravity') was not enough for a usable control system.
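For reference, the snap-to-center part of such a tilt scheme can be approximated with a deadzone around the neutral orientation. A sketch of that mapping, with made-up angles rather than anything tuned for a real plane:

```c
#include <math.h>

/* Map a tilt angle (radians from the neutral orientation) to a virtual
   stick deflection in [-1, 1]. Inside the deadzone the stick snaps to
   center; beyond it, deflection ramps linearly up to max_rad.
   The dead_rad/max_rad values a caller picks are pure tuning. */
static float tilt_to_stick(float tilt_rad, float dead_rad, float max_rad)
{
    float mag = fabsf(tilt_rad);
    if (mag <= dead_rad)
        return 0.0f;                              /* snap to center */
    float t = (mag - dead_rad) / (max_rad - dead_rad);
    if (t > 1.0f)
        t = 1.0f;                                 /* saturate */
    return copysignf(t, tilt_rad);
}
```

Even with a generous deadzone, this still leaves the core problem the post describes: nothing physically pulls the device back to neutral.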

So for now I will make a bold statement: Android and flight simulators do not mix. I guess it means I need another platform to release this on. Maybe PC, but so far I've not been able to get a project greenlit on Steam, and PC gaming seems to be synonymous with Steam gaming these days.

Wednesday, June 11, 2014

Recording Game Play Videos

This is mainly a note to my future self, and to others who need to record a video of game play under Ubuntu. First off, there is no point in using non-standard resolutions when uploading to YouTube. I recorded some videos with a vertical resolution of 608 lines, which ended up encoded as 480p on YouTube. So it's best to use a window size of 1280x720 to get a quality YouTube video.

My tool of choice for recording video under Ubuntu is 'RecordMyDesktop'. I thought I had a very fast (Haswell) CPU, but when using 'encode on the fly' the output video is full of faults (typically frozen video, or the image lagging behind the sound). So under the Advanced tab, it is essential to disable 'encode on the fly'. It turns out that my SSD is fast enough to store the raw 1280x720 stream, so that's fine. You do need to wait for encoding after you stop recording, though. Also in the advanced settings, make sure 'Record Window Decorations' is not set, and that the sound device is 'Pulse'.

There is an annoying bug in RecordMyDesktop that makes it save the video under the wrong file name. When you select 'Save As' and choose a file name, it will actually save the video under the previously used file name. And the next time you save a recording and pick another name, it will use the old name again.

Thursday, April 10, 2014

Reoccurring programming bugs

Ugh, how I hate it: that feeling that you are struggling to fix an issue in your code, and you know you have battled it before on a different project. Of course you can't remember what the fix was, and the two code bases are too different for an easy compare. In the Netherlands we have a saying: "A donkey does not stumble over the same stone twice." Meaning that if you do, you are dumber than a donkey. Here is the measure I take to make sure I do not get hit a third time: I blog about it for future reference.

My issue involves closing the window of an OpenGL app under Mac OSX by clicking the red window-close button. My app would then crash in the OpenGL drawing (triggered by a CVDisplayLink). Stopping a display link is a bit of a mess, as I always seem to be too late with it. Let's see the hooks where I can close down the display link.

My AppDelegate gets callbacks about OSX's intention to close down. There are quite a few of these callbacks:

-(void)applicationWillResignActive:(NSNotification *)notification
-(void)applicationWillTerminate:(NSNotification *)notification
-(NSApplicationTerminateReply)applicationShouldTerminate:(NSApplication*)sender
-(BOOL)applicationShouldTerminateAfterLastWindowClosed:(NSApplication *)sender
Astoundingly, none of these is early enough to close down the display link before resources get torn down. It turns out only this one is called early enough:
-(BOOL)windowShouldClose:(id)sender
However, my AppDelegate was not receiving this callback. It turns out I had not made my app delegate the delegate for my window:
[ window setDelegate:self ];
This requires the AppDelegate to be an NSWindowDelegate:
@interface DigAppDelegate : NSObject <NSApplicationDelegate, NSWindowDelegate>

Friday, March 21, 2014

Car engine

I like car racing simulations. I spent a lot of time playing Gran Turismo Prologue on my PS3 years ago. So I thought it would be time for me to try making my own racing game. After coding a hover bike and all sorts of heavy machinery, that should be easy, no?

Well, it turns out to be a lot more difficult than I expected. Just some high-school physics will not cut it. The image above is my prototype in action. I'm pleased with the suspension, the steering and the traction models, most of which you get for free by using the OpenDE physics engine. But how can I create a simulation model for the engine?

OpenDE lets me speed up or slow down the wheels by simply specifying:

  • Desired wheel velocity.
  • Torque available to achieve this wheel velocity.
  • Mass of the wheel.
  • Moment of inertia of the wheel.
So how do I go about this? How do I determine what speed the wheel wants to turn, and how much torque is available for this?

In real life, things are more complicated. The engine primarily speeds up the flywheel, not the car wheels. Depending on the state of the clutch and the gearbox, this results in wheel acceleration. Do I need to model a flywheel in my simulation? And somehow transfer kinetic energy between the flywheel and the car wheels, depending on clutch and gearbox? How does the accelerator fit in here? And what about the torque curve of the engine, and the engine's RPM? I am beginning to think a few simple equations are not going to suffice.

How does the accelerator affect the engine RPM? Well, that depends on engine load. How do I calculate that? It seems that there is also a circular dependency in here somewhere:

With an engaged clutch, you can only increase wheel velocity by increasing engine RPM. But engine RPM can only go up if wheel velocity goes up. I think I need to study up on what engine load actually is, and how I can measure it in my sim.

Saturday, March 15, 2014

Postscript printing

I recently bought a laser printer for home office use. I went with the Ricoh Aficio SP 3510DN. What scares me is that some people sell smart chips: apparently you need to replace a chip if you refill the toner with a non-Ricoh product. It is disgusting how printer manufacturers do not let the consumer decide when and how to replace the toner. Somehow pages need to be counted by a secure chip. So who owns the f-ing printer, me or an evil printer-ink corp?

On a happier note: this printer can do PostScript. I absolutely adore PostScript. It's much more than a printer data format; it is a very nifty programming language. Because it is stack-based, you really need to wrap your head around the problem, and unlearn a lot of imperative programming habits.

I started programming PostScript when my father purchased an Everex Laserscript LX, made by the Abaton division of Everex. That was an awesome printer, despite its small 2.5MB of memory, and it was a clone of the more expensive Apple LaserWriter. I believe we upgraded that memory at some point. In 1992, laser printers were roughly 10 times more expensive than today. But at least you downright owned your printer, and were not a slave to the ink pushers.

I look forward to adding Postscript features to my software. For instance this Art project I've been working on.

Thursday, February 13, 2014

Crawling to the top.

Freshly released is v6.10 of The Little Crane That Could for Linux/64-bit [update: and Win64]. You can get it at bram.itch.io.

It comes with a tracked vehicle and a hill-climbing level. You can see it in action on YouTube (although the game at 60fps looks much better than YouTube's 30fps can do it justice).

iOS and Android releases will follow soon. I was pleasantly surprised by the iPhone 5S, which can easily do this simulation at 60fps. My iPad 3 struggles to get 30fps though. I guess the iPad 3's CPU is not all that great.

Friday, January 17, 2014

Beta Testers Wanted for GNU/Linux buggy combat simulator game.

So... I've done a lot of work to port my game Six Wheels and a Gun from Android to asm.js JavaScript in the browser. Just when everything was working, and only the networking was left, I found out that JavaScript cannot do UDP communication. This means that online play in a browser is not going to work: TCP has too high a latency, and is too slow for running a distributed physics simulation.

I don't want my porting effort to go to waste, so I made a derivative port to the 64-bit Linux platform. (The JavaScript version is built on SDL 1.2 and the Linux version on SDL2.) Everything works just fine as far as I can tell, and I am preparing a release on the itch.io portal. Before I release it, I could use some testing and feedback. Grab a copy of the game here: swaag-1.5-linux64.tar.gz or swaag-1.5-linux32.tar.gz

No installation necessary, just run ./swaag on the command line. It is dynamically linked and requires libSDL2 to be installed on your machine (apt-get install libsdl2-2.0-0 on Ubuntu 13.10). In case of problems, please send me the console output.

Tuesday, January 14, 2014

Rendering flags for online users.

So I've ported my Buggy Bang! Bang! game to Android, and released it under the name Six Wheels and a Gun, or SWAAG for short. While doing this port I wrote a multiplayer networking system from scratch, including the lobby server. For online multiplayer, I thought it would be nice to render flags for the players, based on the country associated with the IP number. I've open-sourced this piece of technology as ip2ensign, which is available on GitHub.