Using Computer Vision to Identify Player Inputs on Street Fighter 5
C++, OpenCV
Individual Thesis Project
This project won the SMU Research Day Poster Session.
Machine learning has become increasingly popular among game researchers, but researchers rely heavily on developer APIs, memory hacking, or reverse engineering to gain access to game state. Blizzard Entertainment released the StarCraft II API in 2017, providing full access to game input/output, but many games offer no such access to source code or development/research APIs, and reverse engineering or memory hacking is strictly prohibited by their end-user license agreements. This limited access prevents researchers from extracting the game information required for training-data generation and standard data-science methodologies. This poster introduces an approach that lets researchers recover client data by analyzing prerecorded game footage, in either video or image format, using computer vision techniques provided by OpenCV. The system extracts the timeline of user inputs required to generate the character poses seen in replay footage.
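To give a flavor of the approach, here is a minimal OpenCV (C++) sketch of the per-frame detection loop, assuming a cascade classifier pre-trained for a single character pose (the file names and the pose label are hypothetical):

// Sketch: scan replay footage frame by frame and log when a trained
// classifier detects a given character pose. Assumes OpenCV 3+;
// replay.mp4 and pose_hadouken.xml are hypothetical file names.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main()
{
    cv::VideoCapture replay("replay.mp4");              // prerecorded footage
    cv::CascadeClassifier poseClassifier("pose_hadouken.xml");
    if (!replay.isOpened() || poseClassifier.empty())
        return -1;

    cv::Mat frame, gray;
    while (replay.read(frame))
    {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> hits;
        poseClassifier.detectMultiScale(gray, hits);
        if (!hits.empty())
        {
            // The timestamp marks where this pose -- and therefore the
            // input that produces it -- sits on the input timeline.
            double ms = replay.get(cv::CAP_PROP_POS_MSEC);
            std::printf("pose detected at %.0f ms\n", ms);
        }
    }
    return 0;
}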
Unity-chan Stage
C++, DirectX 11
Individual Work
A simple stage, built in my own engine, where Unity-chan dances "UNITE IN THE SKY" just like the official video, complete with ribbon particles and audio-spectrum effects.
Main Features
FBX Importing
Imports mesh, skeleton, blend shape, and animation data.
The engine replays animations using the imported data.
Particle System
Uses data-driven parameters to create particles and ribbons.
Spectrum Dancing Floor
Analyzes the music's spectrum and reflects it on the stage floor.
Timed event system
Since the official project uses animation events to change Unity-chan's facial expression, I created a timed event system to replicate that behavior. The engine uses a similar timeline system for the lifetime parameters of particle effects.
Particle System
Data-driven Parameters
The particle system reads its parameters from XML files. It also introduces a timeline for parameters evaluated over a particle's lifetime.
<Effect name="ribbon">
  <Emitter name="ribbon" type="ribbon">
    <Location module="box">
      <Mins>0.0,0.0,0.0</Mins>
      <Maxs>0.0,0.0,0.0</Maxs>
    </Location>
    <Velocity module="line">
      <MaxVelocity>0.0,0.0,0.0</MaxVelocity>
      <MinVelocity>0.0,0.0,0.0</MinVelocity>
    </Velocity>
    <LifeTime>1</LifeTime>
    <!-- Time before emitter stops spawning particles -->
    <SpawnTime>5</SpawnTime>
    <!-- Initial burst spawning -->
    <Burst>0</Burst>
    <!-- Particles spawned per second -->
    <Spawn>60</Spawn>
    <ConstantAcceleration>0.0,0.0,0.0</ConstantAcceleration>
    <ColorByLife>
      <KeyFrame fraction="0" value="0,209,237,255"/>
      <KeyFrame fraction="1" value="0,209,237,0"/>
    </ColorByLife>
    <SizeByLife>
      <KeyFrame fraction="0" value="0"/>
      <KeyFrame fraction="0.1" value="0.1"/>
      <KeyFrame fraction="1" value="0"/>
    </SizeByLife>
    <Prewarm>false</Prewarm>
    <Material>Data/Material/Ribbon.xml</Material>
  </Emitter>
</Effect>
Ribbon Particles
After implementing the particle system for my SD class, I extended it to support ribbon particles.
Vertex generation for ribbon particles is similar to billboards:
For each particle except the first, let vector i be the direction from the particle to the camera, vector j the direction from the particle to the previous particle, and vector k the cross product of i and j.
Then add a pair of control points along the positive and negative k axis (for the first particle, simply add one control point at its position).
Finally, connect the new pair with the previous pair of control points to form a quad.
bool ParticleEmitter::GetRibbonVertices(std::vector<Vertex3>& vertices, const Vector3& cameraPosition) const
{
    if (m_particles.empty())
        return false;
    // 2 front-facing + 2 back-facing triangles per segment = 12 vertices
    vertices.reserve((uint)(m_particles.size() * 12));
    Vertex3 prevTop;
    Vertex3 prevBottom;
    Vector3 prevPos;
    for (size_t particleIndex = 0; particleIndex < m_particles.size(); particleIndex++)
    {
        if (particleIndex == 0)
        {
            // the first particle contributes a single (degenerate) control pair
            Particle* prev = m_particles[0];
            prevPos = prev->m_status.m_position;
            Rgba prevColor = m_definition->GetColorByTime(m_definition->m_lifeTime);
            prevTop = Vertex3(prevPos, prevColor);
            prevBottom = Vertex3(prevPos, prevColor);
            continue;
        }
        Particle* current = m_particles[particleIndex];
        Vector3 currentPos = current->m_status.m_position;
        // create basis: i = to camera, j = to previous, k = i x j
        Vector3 toCamera = cameraPosition - currentPos;
        Vector3 toPrev = prevPos - currentPos;
        Vector3 planeVertical = CrossProduct(toCamera, toPrev);
        planeVertical.Normalize();
        float currentSize = m_definition->GetSizeByTime(current->m_lifeSpan);
        Rgba currentColor = m_definition->GetColorByTime(current->m_lifeSpan);
        // new pair of control points along +/- k
        Vertex3 currentTop(currentPos + planeVertical * currentSize * 0.5f, currentColor, Vector2(1.f, 0.f));
        Vertex3 currentBottom(currentPos - planeVertical * currentSize * 0.5f, currentColor, Vector2(1.f, 1.f));
        // triangulate the quad (both windings, so the ribbon is double-sided)
        vertices.emplace_back(prevBottom);
        vertices.emplace_back(currentBottom);
        vertices.emplace_back(currentTop);
        vertices.emplace_back(prevBottom);
        vertices.emplace_back(currentTop);
        vertices.emplace_back(prevTop);
        vertices.emplace_back(prevBottom);
        vertices.emplace_back(currentTop);
        vertices.emplace_back(currentBottom);
        vertices.emplace_back(prevBottom);
        vertices.emplace_back(prevTop);
        vertices.emplace_back(currentTop);
        currentTop.texCoords = Vector2(0.f, 0.f);
        currentBottom.texCoords = Vector2(0.f, 1.f);
        prevPos = currentPos;
        prevTop = currentTop;
        prevBottom = currentBottom;
    }
    return true;
}
Spectrum Dancing Floor
To make the stage more dynamic, I built a grid-based floor and displace each vertex upward based on the music's spectrum.
The engine extracts spectrum data with FMOD, and the vertex shader pops up vertices within a radius of the character.
struct vertex_in_t
{
    float3 position : POSITION;
    float2 uv : UV;
    float4 color : COLOR;
    float3 normal : NORMAL;
    float3 tangent : TANGENT;
    uint4 boneIndices : BONE_INDICES;
    float4 boneWeights : BONE_WEIGHTS;
};

struct vertex_to_fragment_t
{
    float4 position : SV_Position;
    float2 uv : UV;
    float4 color : COLOR;
    float3 normal : NORMAL;
    float3 tangent : TANGENT;
    float3 world_position : WORLD_POSITION;
};

cbuffer model_position_cb : register(b10)
{
    float3 MODEL_POSITION;
    float padding;
};

cbuffer matrix_cb : register(b0)
{
    float4x4 MODEL;
    float4x4 VIEW;
    float4x4 PROJECTION;
    float4 EYE_POSITION;
};

StructuredBuffer<float4x4> tSkinMatrices : register(t3);
StructuredBuffer<float> tMusicSpectrum : register(t10);

vertex_to_fragment_t VertexFunction(vertex_in_t vertex)
{
    vertex_to_fragment_t out_data = (vertex_to_fragment_t) 0;
    float4 model_position = float4(vertex.position, 1.0f);
    float4 world_position = mul(model_position, MODEL);

    // horizontal distance from this vertex to the character
    float3 meshDistance = world_position.xyz - MODEL_POSITION;
    float len = length(float2(meshDistance.x, meshDistance.z));
    float radius = 3.f;
    if (len < radius)
    {
        // map distance to a spectrum bin (nearer vertices sample higher bins)
        float fraction = 1 - len / radius;
        fraction = fraction * fraction;
        int index = min(int(1024.f * fraction), 1023); // clamp to stay inside the buffer
        float offset = tMusicSpectrum[index] * 2;
        offset = clamp(offset, 0.f, 2.f);
        world_position.y += offset;
    }

    float4 view_position = mul(world_position, VIEW);
    float4 clip_position = mul(view_position, PROJECTION);
    out_data.position = clip_position;
    out_data.uv = vertex.uv;
    out_data.color = vertex.color;
    out_data.normal = mul(float4(vertex.normal, 0.f), MODEL).xyz;
    out_data.tangent = mul(float4(vertex.tangent, 0.f), MODEL).xyz;
    out_data.world_position = world_position.xyz;
    return out_data;
}
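On the CPU side, the data bound to tMusicSpectrum can be captured with FMOD's FFT DSP. A minimal sketch, assuming the FMOD Core API and an already-playing music channel; fmodSystem, musicChannel, and the buffer upload come from the rest of the engine, and error handling is omitted:

// Sketch: attach an FFT DSP to the music channel, then copy its
// spectrum into the CPU array that backs the tMusicSpectrum
// structured buffer each frame (upload not shown).
#include <fmod.hpp>
#include <cstring>

FMOD::DSP* CreateSpectrumDsp(FMOD::System* fmodSystem, FMOD::Channel* musicChannel)
{
    FMOD::DSP* fftDsp = nullptr;
    fmodSystem->createDSPByType(FMOD_DSP_TYPE_FFT, &fftDsp);
    fftDsp->setParameterInt(FMOD_DSP_FFT_WINDOWSIZE, 2048);
    musicChannel->addDSP(FMOD_CHANNELCONTROL_DSP_TAIL, fftDsp);
    return fftDsp;
}

void ReadSpectrum(FMOD::DSP* fftDsp, float* outBins, int maxBins)
{
    FMOD_DSP_PARAMETER_FFT* fft = nullptr;
    fftDsp->getParameterData(FMOD_DSP_FFT_SPECTRUMDATA, (void**)&fft, nullptr, nullptr, 0);
    if (fft && fft->numchannels > 0)
    {
        // copy channel 0 only; the shader indexes a single channel
        int count = (fft->length < maxBins) ? fft->length : maxBins;
        std::memcpy(outBins, fft->spectrum[0], count * sizeof(float));
    }
}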
Timed Event System
The timed event system fires events at specific times. In this project, I use it to change Unity-chan's facial expression, just as the official project does with animation events.
The system holds an array of keyframes; each keyframe stores the time at which it triggers and an event callback. During an update it tracks the elapsed time before the update (A) and after the update (B), then fires every event whose timestamp falls between A and B.
template <typename DataType>
struct KeyFrame
{
    float m_time;
    DataType m_value;
    KeyFrame();
    KeyFrame(float time, const DataType& data);
};

//---------------------------------------------------------
class TimeEvent
{
    std::vector<KeyFrame<CallbackFunc>> m_keys;
    float m_prevTime;
    float m_currentTime;
public:
    TimeEvent();
    void Reset();
    void Update(float deltaSeconds);
    void AddEvent(float time, CallbackFunc func);
};

void TimeEvent::Update(float deltaSeconds)
{
    m_currentTime += deltaSeconds;
    // half-open interval [prev, current) so a key that lands exactly on
    // a frame boundary cannot fire twice
    for (KeyFrame<CallbackFunc>& key : m_keys)
        if (m_prevTime <= key.m_time && key.m_time < m_currentTime)
            key.m_value("");
    m_prevTime = m_currentTime;
}
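A hypothetical usage, assuming CallbackFunc is a std::function-style callable taking a string argument (ChangeExpression is a stand-in helper):

// Hypothetical usage: change Unity-chan's expression 10 seconds into the dance.
TimeEvent faceEvents;
faceEvents.AddEvent(10.f, [](const std::string&) { ChangeExpression("smile"); });

// each frame:
faceEvents.Update(deltaSeconds);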
Post Mortem
What Went Well
The ribbon particles looked good.
The spectrum dance floor made the stage look much less static.
Unity-chan kawaii!
What Went Wrong
This project is missing one planned feature: a spring-bone system that lets unanimated joints follow movement naturally. It was never finished and tested, so the hair still looks stiff.
Detailed, well-organized technical documentation is critical for getting started quickly and getting things done; the FBX documentation was definitely not that.
I had a hard time getting skinning and animation curves working before I figured out the proper matrix calculations.
I spent too much time just replaying Unity-chan's cute dance instead of developing.
Action Items
Do serious research before doing real math.
Moe-anthropomorphize every technology.
Miner Survival
C++, DirectX 11
Individual Work
A survival shooting game set in a Minecraft-like world.
Main Features
Procedurally Generated World
The world is generated with Perlin noise.
Boss Enemy
Bosses that spawn zombies and fire explosive missiles appear at certain waves.
Zombie Enemy
Zombies can dig through blocks and climb out of missile craters.
Explosive Missiles
Missiles are fired by boss enemies and explode on impact with the player or terrain.
Procedurally Generated World
The world is generated with Perlin noise, and block types are assigned by a set of rules.
//Build Chunk with Perlin Noise
void Chunk::BuildChunk()
{
    const float scale = 100.f;
    const int numOfOctaves = 8;
    const float octavePersistence = 0.3f;
    const float octaveScale = 2.f;
    const float maxAmplitude = CHUNK_SIZE_Z * 0.25f;
    for (int x = 0; x < CHUNK_SIZE_X; x++)
    {
        for (int y = 0; y < CHUNK_SIZE_Y; y++)
        {
            float coordXInWorld = static_cast<float>(x + m_mapCoords.x * CHUNK_SIZE_X);
            float coordYInWorld = static_cast<float>(y + m_mapCoords.y * CHUNK_SIZE_Y);
            int deltaColumnHeight = static_cast<int>(
                Compute2dPerlinNoise(
                    coordXInWorld, coordYInWorld, scale, numOfOctaves, octavePersistence, octaveScale)
                * maxAmplitude);
            int columnHeight = SEA_LEVEL + deltaColumnHeight;
            for (int z = 0; z < CHUNK_SIZE_Z; z++)
            {
                int blockIndex = GetIndex(x, y, z);
                BlockType blockType;
                if (z < columnHeight - DIRT_HEIGHT)
                    blockType = BLOCK_STONE;
                else if (z < columnHeight)
                {
                    if (z <= SEA_LEVEL)
                        blockType = BLOCK_SAND;
                    else
                        blockType = BLOCK_DIRT;
                }
                else if (z == columnHeight)
                {
                    if (IsPercentChance(0.001f))
                        blockType = BLOCK_GLOW_STONE;
                    else if (columnHeight <= SEA_LEVEL)
                        blockType = BLOCK_SAND;
                    else
                        blockType = BLOCK_GRASS;
                }
                else
                {
                    if (z <= SEA_LEVEL)
                    {
                        if (IsPercentChance(0.001f))
                            blockType = BLOCK_LAVA;
                        else
                            blockType = BLOCK_SAND; //BLOCK_WATER;
                    }
                    else
                        blockType = BLOCK_AIR;
                }
                m_blocks[blockIndex].SetBlockType(blockType);
            }
        }
    }
}
Chunk Management
Because chunk management operations (creating/loading/saving a chunk of blocks) are expensive, running several of them in a single frame tanks the frame rate.
The solution is to prioritize the operations and run at most one of them per frame, as sketched below:
- Deactivate the farthest chunk when the number of active chunks reaches the maximum
- Activate the nearest missing chunk
- Deactivate a chunk that is out of range
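A minimal sketch of that per-frame prioritization; the class and helper names are hypothetical:

// Sketch: run at most one expensive chunk operation per frame,
// in priority order. World, MAX_ACTIVE_CHUNKS, and the Find*/
// Activate/Deactivate helpers are hypothetical names.
void World::UpdateChunks()
{
    // Highest priority: stay under the active-chunk budget.
    if (m_activeChunks.size() >= MAX_ACTIVE_CHUNKS)
    {
        DeactivateChunk(FindFarthestActiveChunk()); // save & unload
        return; // at most one expensive op this frame
    }
    // Next: bring in the nearest chunk the player is missing.
    IntVector2 missingCoords;
    if (FindNearestMissingChunkCoords(missingCoords))
    {
        ActivateChunk(missingCoords); // load from disk or generate
        return;
    }
    // Lowest priority: retire a chunk that drifted out of range.
    if (Chunk* chunk = FindActiveChunkOutOfRange())
        DeactivateChunk(chunk);
}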
Zombie AI
After I implemented the boss enemy, an issue arose: craters created by boss missiles trapped zombies, forcing the player to search every crater for leftover zombies before the next wave could begin.
So I made zombies able to climb up out of holes and dig their way toward the player.
void Zombie::Update(float deltaSeconds)
{
    // death "animation": spin and fall over
    if (m_hp <= 0.f)
    {
        m_rotationDegrees.z += TURN_SPEED * deltaSeconds;
        TurnToward(m_rotationDegrees.x, 90.f, TURN_SPEED * deltaSeconds);
        Entity::Update(deltaSeconds);
        return;
    }
    float horizontalAcc = HORIZONTAL_ACC;
    m_jumpCoolDown -= deltaSeconds;
    if (!m_isLanded)
        horizontalAcc *= 0.1f; // less steering control while airborne
    // steer toward the player on the XY plane
    Vector3 planeDelta = g_theGame->m_map->m_player->m_position - m_position;
    planeDelta.z = 0.f;
    m_velocity += planeDelta.GetNormalize() * horizontalAcc * deltaSeconds;
    Vector2 velocityProjectionInXYPlane(m_velocity);
    TurnToward(m_rotationDegrees.z, velocityProjectionInXYPlane.CalcHeadingDegrees(), TURN_SPEED * deltaSeconds);
    // dig blocks to pass if the wall ahead is 2 or more blocks high
    if (m_isLanded)
    {
        float velDegree = velocityProjectionInXYPlane.CalcHeadingDegrees();
        BlockInfo facingBottomBlock;
        if (-135.f < velDegree && velDegree <= -45.f) // south
        {
            facingBottomBlock = g_theGame->m_map->GetBlockInfoAtWorldPos(m_position).GetSouth();
        }
        else if (-45.f < velDegree && velDegree <= 45.f) // east
        {
            facingBottomBlock = g_theGame->m_map->GetBlockInfoAtWorldPos(m_position).GetEast();
        }
        else if (45.f < velDegree && velDegree <= 135.f) // north
        {
            facingBottomBlock = g_theGame->m_map->GetBlockInfoAtWorldPos(m_position).GetNorth();
        }
        else // west
        {
            facingBottomBlock = g_theGame->m_map->GetBlockInfoAtWorldPos(m_position).GetWest();
        }
        BlockInfo facingTopBlock = facingBottomBlock.GetTop();
        BlockInfo facingTopTopBlock = facingTopBlock.GetTop();
        BlockInfo topTopBlock = g_theGame->m_map->GetBlockInfoAtWorldPos(m_position).GetTop().GetTop();
        if (/*facingBottomBlock.IsSolid() &&*/ (facingTopBlock.IsSolid() || facingTopTopBlock.IsSolid()))
        {
            g_theGame->m_map->DigBlock(facingTopBlock);
            g_theGame->m_map->DigBlock(facingTopTopBlock);
            g_theGame->m_map->DigBlock(topTopBlock);
            m_jumpCoolDown = JUMP_COOL_DOWN;
        }
        if (topTopBlock.IsSolid()) // buried: dig upward
        {
            g_theGame->m_map->DigBlock(topTopBlock);
            m_jumpCoolDown = JUMP_COOL_DOWN;
        }
    }
    // hop when landed and off cooldown
    if (m_isLanded && m_jumpCoolDown <= 0.f)
    {
        m_velocity += Vector3(0.f, 0.f, JUMP_ACC_ONE_SHOT);
        m_jumpCoolDown = JUMP_COOL_DOWN;
    }
    m_light = g_theGame->m_map->GetBlockInfoAtWorldPos(m_position).GetLight();
    Entity::Update(deltaSeconds);
}
Self-made Assets
I created the enemy, player, and projectile models, as well as the BGM.
Post Mortem
What Went Well
Zombies can reach the player wherever he is.
The game runs at 60 FPS with hundreds of zombies.
What Went Wrong
The map is initialized as soon as the game starts, so the random seed (and the terrain generated from it) is always the same for the first round.
The gameplay may be fun enough to distract me from developing.
Block-versus-entity collision is not true cube-versus-cylinder collision, so huge entities (boss enemies) do not land perfectly on the terrain.
Action Items
Do serious research before doing real math.
Daydream Night Mode
Chrome Extension
Individual Work
Get rid of the sun-like screen and have a sweet Daydream...
Main Features
Invert webpage brightness
Inverts webpage brightness with adjustable contrast while keeping most media elements in their original colors.
Exception option
The user can exclude certain domains from the extension, for sites that already have their own dark theme.
Cross-device settings
Settings are synced between devices under the same Google account.
PSO2 Event Schedule
Universal Windows Platform Application (JavaScript)
Individual Work
Click a button once a week after maintenance to get the Phantasy Star Online 2 event schedule and reminders.
(After SEGA abandoned scheduled events with the Phantasy Star Online 2: New Genesis update, the application is no longer usable.)
Main Features
Scrape event schedule info
The application scrapes the event schedule from the official Phantasy Star Online 2 website.
Windows Action Center notifications
It uses Windows Action Center to notify the user when an in-game event is about to happen.
Time zone support
The application supports different time zones, so users get the correct times wherever they are.
About
Up in the Air is an open-world amusement park for the player to roam and have fun.
Play as a balloon dog: explore the world, complete the mini-games and attractions scattered everywhere, and play with "friendly" children.
Collect tickets to unlock skins and abilities that let you enjoy the world in different ways.
Contributions
Designed the data architecture related to characters
Implemented mini-games around the world
Managed characters' state machines and physics assets
Prototyped and developed visual effects
Optimized Blueprints, raising frame rates from 20+ to 40+ in the development environment
Info
Role: Programmer
Team Size: 13 (4 programmers)
Platform: PC
Genre: Open-world
Engine: Unreal 4
Development Period: June - December 2017 (7 months)
State Management
In this game, the player has multiple abilities, one of which is Inflation. When the player uses Inflation, the mesh "inflates" and turns into a round, fat balloon mesh. The inflated player is no longer a character but a physics actor. Using Inflation again "deflates" the player back to the normal state.
Switching between a pawn actor and a physics actor follows the process below (a condensed code sketch follows the lists):
Inflation
- Change the character mesh.
- Remove the animation instance. (A physics actor doesn't animate.)
- Change the relative transform, since the two meshes may have different pivots or initial rotations.
- Set the new mesh to simulate physics. (This detaches the mesh component from its root, so it must be re-attached when switching back.)
- Disable collision on the root collider. (Since the mesh is detached and no longer moves with the root collider, leaving collision on would create a "ghost collider" on the ground.)
- Enable collision on the character mesh; the mesh component now handles collision itself. (A physics-simulated mesh also needs collision enabled to move.)
- Apply collision settings (overlap events, collision channels, and so on).
- "Copy" the velocity from the character movement component to the physics linear velocity to prevent a sudden stop when switching states.
- Deactivate the character movement component.
Deflation
- Move the root to the mesh position (applying an offset so it fits well).
- Activate the character movement component.
- "Copy" the velocity back from the physics linear velocity.
- Enable collision on the root collider.
- Apply collision settings.
- Reattach the mesh to the root collider.
- Restore the relative transform.
- Set the animation instance.
- Change the character mesh back.
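For reference, a condensed C++ sketch of the Inflation half using stock UE4 component APIs; the shipped game drives this through Blueprints, and ABalloonCharacter, BalloonMesh, and InflatedOffset are hypothetical names:

// Condensed sketch of the Inflation switch using stock UE4 APIs.
void ABalloonCharacter::Inflate()
{
    USkeletalMeshComponent* Mesh = GetMesh();
    Mesh->SetSkeletalMesh(BalloonMesh);         // 1. change character mesh
    Mesh->SetAnimInstanceClass(nullptr);        // 2. physics actors don't animate
    Mesh->SetRelativeTransform(InflatedOffset); // 3. align the differing pivots
    Mesh->SetSimulatePhysics(true);             // 4. NOTE: detaches mesh from root
    GetCapsuleComponent()->SetCollisionEnabled(ECollisionEnabled::NoCollision); // 5. no ghost collider
    Mesh->SetCollisionEnabled(ECollisionEnabled::QueryAndPhysics); // 6. mesh collides itself
    // 7. overlap events / collision channels configured here as needed
    Mesh->SetPhysicsLinearVelocity(GetCharacterMovement()->Velocity); // 8. carry momentum over
    GetCharacterMovement()->Deactivate();       // 9. hand control to physics
}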
Physics-based Animation
To sell the image of a funny, floppy balloon, I use physics-based animation to blend keyframe animation with physics simulation.
The physics blend weight becomes larger while the character is in the air, as sketched below.
The same mechanism also applies to the children when the player uses Air Burst to blow them into the air.
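In UE4 terms, that blend can be driven per frame through the skeletal mesh's physics blend weight. A sketch, assuming the bodies below the (hypothetical) spine_01 bone already simulate physics, e.g. via SetAllBodiesBelowSimulatePhysics; the weights are made up for illustration:

// Sketch: more physics, less animation while airborne.
void ABalloonCharacter::UpdatePhysicsBlend(float DeltaSeconds)
{
    const float Target = GetCharacterMovement()->IsFalling() ? 0.8f : 0.2f;
    // ease toward the target so the blend doesn't pop on landing
    PhysicsBlendWeight = FMath::FInterpTo(PhysicsBlendWeight, Target, DeltaSeconds, 5.f);
    GetMesh()->SetAllBodiesBelowPhysicsBlendWeight(TEXT("spine_01"), PhysicsBlendWeight);
}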
Visual Effects
Click the image cards to view more details and Blueprints.
A distortion effect to give the impression of air compression.
A jiggling effect to give the impression of a water balloon bouncing on the ground.
When an object bounces, it jiggles along the direction of impact.
A Worley noise effect to give the impression of static floating around the player.
Rounded up with Fresnel to give a clear visual of the effect radius.
The static-ball effect is cut in the vertical direction, showing its radius only horizontally and leaving more space for what's behind it.
Post Mortem
What Went Well
The game is really fun to play.
All the mini-games were well polished and fun.
The game runs at good frame rates despite how much stuff is in the world.
What Went Wrong
Physics objects and actors everywhere were overwhelming and crashed frame rates in the development environment.
The overwhelming number of actors and components also made testing and debugging very hard.
We made changes based on feedback in the middle of a milestone, which caused unexpected trouble.
Last-minute changes caused a lot of distraction.
There were still crash bugs remaining at the end of Beta.
Action Items
Playtest early.
Think about optimization early. There would have been nothing left to do during final refinement if everything had been right at the early stage.
Keep good communication with composers.
About
Auxilium is a 4v4 capture-the-flag first-person shooter. Two teams of up to 4 players each fight one another, capturing the flag from the opponent's base and returning it to their own base to earn points and win.
There are 4 unique classes in the game, each with a different weapon and ability, filling a different role in the match.
There are also 4 maps, each with a unique gimmick enabling different tactical approaches.
Contributions
Designed the data architecture related to characters
Prototyped the weapon mechanism and the Rogue class ability
Developed the weapon mechanism and all class abilities
Prototyped and developed particle effects for weapons and abilities
Monitored and solved networking issues related to gameplay
Info
Role: Programmer
Team Size: 50 (13 programmers)
Platform: PC
Genre: First Person Shooter, Competitive Multiplayer
Engine: Unreal 4
Development Period: January - May 2017 (4 months)
Weapon Recoil and Recovery Mechanism
In this game, we decided to implement weapon recoil by applying an offset to the player's camera.
The system stores a rotation offset and keeps it in sync through recoil and recovery, so that the original camera rotation always equals the current camera rotation plus the rotation offset.
Weapon recoil recovers over time; recovery stops, and the rotation offset resets, as soon as the player actively moves the camera.
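A minimal, engine-agnostic sketch of that bookkeeping; Camera, AddPitch, and RECOVER_SPEED are hypothetical stand-ins:

// Sketch of the recoil/recovery bookkeeping described above.
struct RecoilState
{
    float m_pitchOffset = 0.f; // invariant: original == current + offset

    void ApplyRecoil(float kickDegrees, Camera& camera)
    {
        camera.AddPitch(kickDegrees); // kick the view upward
        m_pitchOffset -= kickDegrees; // remember how far we are from the original aim
    }

    void Update(float deltaSeconds, Camera& camera, bool playerMovedCamera)
    {
        if (playerMovedCamera)
        {
            m_pitchOffset = 0.f; // player took over: current aim becomes the new original
            return;
        }
        if (m_pitchOffset < 0.f) // still recovering
        {
            float step = fminf(RECOVER_SPEED * deltaSeconds, -m_pitchOffset);
            camera.AddPitch(-step); // ease back toward the original aim
            m_pitchOffset += step;
        }
    }
};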
Particle Effects
I was responsible for prototyping particle effects for weapons and abilities.
I focused on motion and found that vector fields added a lot of dynamism to the effects.
Post Mortem
What Went Well
The shooting and class abilities felt good in play.
Dividing the 50-person team into 4 sub-teams, each making one map, helped us focus. In the end, all the maps turned out fun and unique.
I learned a lot about the details of FPS design.
What Went Wrong
None of our programmers had network programming experience, which cost us almost an entire milestone of delay. Programmers were always rushing to catch up with the other disciplines.
The sound team was not very familiar with the DTM pipeline; sound effects constantly peaked above 0 dB, which made the game sound overwhelming.
There were still disconnect bugs remaining at the end of Beta.
Action Items
Spend most of the time refining the characters, which the player spends most of the time interacting with.
Prioritize debugging tasks based on how critical they are and how easily the player would notice them.
Notify the lead whenever I have updates.
About
The Terrible Trap is a series of two-dimensional (2D) puzzle mazes built in the Unity development platform specifically for Android devices.
The player must use the Android tablet's tilt function to either dart around or tiptoe past traps and obstacles to reach the "Exit", advancing to the next stage and challenging their critical-thinking skills in the process.
Contributions
Implemented all game features and systems individually using C#
Built level components for level designers (gravity tilting and breakable walls)
Prototyped visual effects (particles and breakable pieces)
Collaborated with an artist to make the credits video in Adobe After Effects
Managed version control and helped teammates who had difficulties with it
Info
Role: Programmer
Team Size: 5 (1 programmer)
Platform: Android Tablet
Genre: Maze, Physics
Engine: Unity 5
Development Period: October - December 2016 (2 months)
Skills
C++, C#, Java
Spring, jQuery, Bootstrap
JavaScript, HTML, CSS
MySQL, Oracle, SQL Server
PS4 SDK
DirectX 11
Unreal 4
Unity 5
Perforce, Git
JIRA
Adobe After Effects
FL Studio
Experience
Software Engineer II Aug. 2018 - Present
Armature Studio, LLC. Austin, Texas, United States
- Ongoing VR Project (Oculus Quest 2):
- Integrated original game systems into Unreal Engine, including blend shapes, shadows, and IO.
- Implemented our own shadow pipeline to reproduce the original look and meet performance requirements.
- Optimized game systems to meet performance requirements for shadows and animations.
- Made tools to export binary packages of the original game and pipelines to replace data.
- Bayonetta and Vanquish (PlayStation4 and Xbox One):
- Contributed to graphics ports and optimizations for both games, mainly on PlayStation 4.
- Delivered platform-specific systems for both games on Xbox One and PlayStation 4, including leaderboards, IO, and package streaming.
- Integrated newer versions of middleware that run on both platforms.
- Made tools to export binary packages of the original games and pipelines to replace data.
- Sports Scramble (Oculus Quest):
- Developed game features including VR pause and trophies.
- Modified Unreal 4 source code to fix bugs and meet performance requirements for Oculus Quest.
- Ported tools from 3ds Max MaxScript to Maya Python without prior experience in either language, optimizing asset pipelines and reducing artists' workload.
Software Engineer Intern Jun. 2015 - Dec. 2015
Shanghai Tongzhen Information Technology Co., Ltd. Shanghai, China
- Developed and maintained website features for business logic using Spring MVC, MySQL, and jQuery.
- Developed internal tools in Java to automate the data upload process, eliminating manual work.
Education
Master of Interactive Technology | SMU Guildhall | Dallas, Texas, United States MAY 2018
Bachelor of Software Engineering | Donghua University | Shanghai, China JUNE 2016
Projects
Daydream Night Mode (Accessibility Chrome Extension) 2019
Chrome Extension | Personal Project
- Used CSS alone to invert webpage colors with adjustable contrast while keeping most media elements in their original colors.
- Implemented extension settings that let the user exclude certain domains and sync across devices.
Using Computer Vision to Identify Player Inputs on Street Fighter 5 2018
OpenCV | School Project | Personal Thesis
- Used OpenCV to train classifiers that identify character motions in replay image sequences.
- Mapped identified motions to a player-input timeline that can be reproduced in the game.
- Extensible to other games/characters given proper samples.
Up in the Air 6 months, 2017
Open-world game
Unreal 4 | Team of 13 (4 programmers)
- Prototyped the core gameplay: the player navigates the world in a deflated form (normal character movement) or an inflated form (a physics-simulated actor moved by forces).
- Programmed the management system for the player's physics properties and movement modes between inflated and deflated forms.
- Designed component structure for base player class.
- Programmed Boosting and Inflation abilities for the player class.
- Prototyped particle effects with vector fields to make them more dynamic.
- Implemented mechanisms for mini-games according to level designers' documents.
Unity-chan Stage 2017
Animation Stage
Own Engine | Personal
- Implemented FBX importer for meshes, animations, skeletons, and blend shapes.
- Programmed an animation system that supports additive animations.
- Implemented a particle system that supports data-driven parameters and ribbon particles.
Auxilium 4 months, 2017
Class-Based Capture-the-Flag Shooter
Unreal 4 | Team of 50 (12 programmers)
- Programmed the dashing ability with a bursting initial thrust and a gentle slowdown.
- Programmed weapon projectile mechanisms (shotgun, assault rifle, sniper rifle, throwing knife).
- Implemented weapon recoil mechanism.
- Prototyped particle effects for all character weapons and abilities with vector fields.
- Made physics assets for imported skeletal meshes.
The Terrible Trap 2 months, 2016
Tablet Balanced Ball Maze
Unity 5 | Team of 5 (1 programmer)
- Implemented all game features and systems individually using C#.
- Built level components for level designers (gravity tilting and breakable walls).
- Prototyped visual effects (particles and breakable pieces).
- Collaborated with an artist to make the credits video in Adobe After Effects.
Miner Survival 2017
Sandbox, Shooter
Own Engine | Personal
- Implemented Minecraft-like procedural world generation.
- Designed enemy AI that chases the player and digs through blocks to escape from deep pits.
- Programmed huge boss enemies that fire missiles to destroy blocks.
- Composed the background music for the game.
PSO2 Event Schedule 2016
Universal Windows Platform Application | Personal Project
- Built the application with HTML, JavaScript (jQuery), and CSS.
- Used jQuery to scrape event schedule info from the official Phantasy Star Online 2 website.
- Implemented Windows Action Center Notifications for in-game events.
- Converted event times to the user's local time zone.
Hi! I am Wenzheng Huang and I am a game programmer.
I believe that fun core gameplay and an easy-to-learn but deep system are the keys to success with players.
I love programming fun and addictive gameplay logic. I dream of making challenging and encouraging games.
My taste in games is wide-ranging, but I like combat games most, like the Dark Souls series and the Devil May Cry series.
Recently Played Games
Phantasy Star Online 2: New Genesis
Monster Hunter: Rise
Monster Train
Risk of Rain
StarCraft 2 (Co-op)
Nioh 2
Opus Magnum
Endless Space 2
Toukiden 2
Favorite Games
Monster Hunter: World/Rise
Phantasy Star Online 2: New Genesis
The Legend of Heroes: "Trails" Series
Dark Souls 3
Devil May Cry 4
Puzzle & Dragons
Borderlands Series
I am also a fan of electronic music, and I do some composition research in my free time.
Before I became a game programmer, I was a web developer, and I can finish small-scale web projects entirely on my own.
I coded this site based on the Materialize CSS framework and Spring Boot.
Below are some effect prototypes I wrote for this site.
JSFiddle embeds do not support small-screen displays.