Master's Thesis · Stealth Action Game

Ascend

PCVR · PC · Unreal Engine 5 · Blueprint Visual Scripting · Solo Project · Darmstadt University of Applied Sciences

Overview

Ascend started as a VR traversal prototype and grew into a full stealth-action traversal game for my Master's thesis at Hochschule Darmstadt. The central challenge was building a game that ran on both VR and PC from a single codebase, where the two platforms have fundamentally different interaction models, UI conventions, and locomotion requirements.

By the end it had a complete blockout level, a multi-state stealth AI, a wide variety of traversal mechanics, narrative dialogue, and platform-specific UI, all sharing the same underlying systems. The research behind the project looked at how different platforms affect player presence and experience in stealth-traversal games.

Narrative and Setting

You play as North, a trained operative navigating dystopian rooftops in a city where an AI built to support urban populations has turned hostile. Five mission objectives are spread across the level: injecting malware, retrieving a prototype, disabling communication towers, shutting down robot manufacturing, and hacking the main system.

The rooftop setting was a deliberate choice. Height creates tension naturally and raises the stakes of traversal. Getting caught by a robot forces a reload from the nearest checkpoint, so the elevated environment and the stealth system push against each other in a way that keeps the player alert throughout. Voice-overs and sound effects were produced using ElevenLabs, directed to match the narrative tone at each objective transition.

Traversal Mechanics

The problem was providing a wide enough variety of traversal types to make the research comparison meaningful, while keeping each mechanic implementable across both platforms without one version feeling worse. Walking, climbing ladders and ropes, crossing balance beams, riding elevators, riding ziplines, and traversing ramps between buildings at varying heights all made it into the final build.

Each mechanic works differently across platforms. Ladder climbing in VR means physically moving your arms. On PC you press a button and use directional keys to move. Ziplines carry you automatically once attached. Elevators require pulling a lever and pressing a button in sequence. The traversal system was built modularly to support iterative design throughout development.
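The project implements this in Blueprint, but the modular pattern can be sketched in plain C++: each traversal type implements a shared interface, and the platform-specific input mapping lives inside the mechanic rather than in the level code. All names here are illustrative, not from the actual codebase.

```cpp
#include <memory>
#include <string>
#include <vector>

// Hypothetical sketch of the modular traversal pattern.
enum class Platform { VR, PC };

// Shared interface: every traversal mechanic plugs into the level the same way.
struct TraversalMechanic {
    virtual ~TraversalMechanic() = default;
    virtual std::string Name() const = 0;
    // Each mechanic decides how input maps to movement per platform.
    virtual std::string DescribeInput(Platform p) const = 0;
};

struct LadderClimb : TraversalMechanic {
    std::string Name() const override { return "Ladder"; }
    std::string DescribeInput(Platform p) const override {
        return p == Platform::VR ? "hand-over-hand motion controllers"
                                 : "interact button + directional keys";
    }
};

struct Zipline : TraversalMechanic {
    std::string Name() const override { return "Zipline"; }
    std::string DescribeInput(Platform p) const override {
        // Identical on both platforms: attach once, then carried automatically.
        return "attach, then carried automatically";
    }
};

// The level registers mechanics without knowing their concrete types,
// which is what makes swapping or adding traversal types cheap.
std::vector<std::unique_ptr<TraversalMechanic>> BuildMechanics() {
    std::vector<std::unique_ptr<TraversalMechanic>> m;
    m.push_back(std::make_unique<LadderClimb>());
    m.push_back(std::make_unique<Zipline>());
    return m;
}
```

The useful property is that a mechanic whose input is identical on both platforms (the zipline) and one that diverges completely (the ladder) sit behind the same interface.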

Stealth AI

The design problem for the AI was creating tension without making failure feel unfair. The solution was a gradual detection model with a visible progress bar, giving the player time to react before things escalate.

At half fill the robot switches to investigation mode and moves to where it last saw you. At full fill it enters full alert, chasing and shooting the player, which forces a reload from the nearest checkpoint. Crouching reduces both your visual profile and the noise you generate. An EMP device temporarily deactivates nearby robots when things get too close.
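The detection model above can be sketched as a clamped meter driving a three-state machine. The actual implementation is in Blueprint; the fill rates and thresholds below are illustrative placeholders, not the tuned values from the project.

```cpp
#include <algorithm>

// Sketch of the gradual detection model.
enum class AIState { Patrol, Investigate, Alert };

class DetectionMeter {
public:
    // Crouching lowers the fill rate, modelling a smaller visual/noise profile.
    // The meter drains when the player is out of sight.
    void Update(bool playerVisible, bool crouching, float dt) {
        const float rate = playerVisible ? (crouching ? 0.15f : 0.5f) : -0.25f;
        fill_ = std::clamp(fill_ + rate * dt, 0.0f, 1.0f);
    }
    AIState State() const {
        if (fill_ >= 1.0f) return AIState::Alert;        // chase and shoot
        if (fill_ >= 0.5f) return AIState::Investigate;  // move to last-seen spot
        return AIState::Patrol;
    }
    float Fill() const { return fill_; }  // drives the visible progress bar
private:
    float fill_ = 0.0f;
};
```

Exposing the rates as editable variables, as the project does, means difficulty tuning never touches this logic.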

All difficulty variables are exposed and easily editable, so tuning required no code changes. For the research comparison the robots were tuned to a forgiving difficulty on both platforms, keeping the focus on traversal experience rather than combat challenge.

Sonar System

The sonar scanning system adds a secondary awareness layer. Activating it highlights enemies in red and objectives in green for about five seconds before resetting. The temporary nature was a deliberate design decision. Permanent threat indicators remove the need for players to actively manage their awareness. The sonar nudges them to build and maintain a mental model of enemy positions instead.
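The timed highlight reduces to a small countdown that gates the rendering state. A minimal sketch, assuming a fixed five-second duration (the project drives this with a Blueprint timer rather than manual accumulation):

```cpp
// Sketch of the timed sonar highlight.
class Sonar {
public:
    void Activate() { remaining_ = kDuration; }
    // Called once per frame with the elapsed time.
    void Tick(float dt) { remaining_ = remaining_ > dt ? remaining_ - dt : 0.0f; }
    // While active, enemies render red and objectives green.
    bool HighlightActive() const { return remaining_ > 0.0f; }
private:
    static constexpr float kDuration = 5.0f;  // seconds before the scan resets
    float remaining_ = 0.0f;
};
```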

Platform-Specific Implementation

The core problem was that VR and PC have completely different interaction conventions, but the research required both versions to offer the same experience as closely as possible. The solution was a single codebase with platform-specific branches only where interaction logic, input handling, or UI display genuinely differed.

The VR version uses an HMD with head-tracking and motion controllers. Movement is thumbstick-based with snap-rotation to reduce cybersickness. All interactions are physical. The UI is diegetic: a panel attached to your left hand that can be toggled whenever you need it.

The PC version supports keyboard/mouse and gamepad. Interactions trigger on proximity and a button press. The UI is a standard HUD following established first-person conventions.

Technical Notes

Extra care was taken to profile tick costs and replace per-frame Tick logic with timer handles where possible. The codebase was written with open-source release in mind from the start: extensively commented and structured so future contributors can pick it up without needing to ask how anything works.
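The tick-to-timer replacement mirrors Unreal's FTimerHandle pattern: instead of every object re-checking its state each frame, work is scheduled and runs only when a timer fires. A self-contained sketch of that idea (illustrative, not engine code):

```cpp
#include <functional>
#include <vector>

// Minimal timer manager modelling the "schedule instead of poll" pattern.
class TimerManager {
public:
    // Schedule fn to run once after `delay` seconds; returns a handle.
    int SetTimer(float delay, std::function<void()> fn) {
        timers_.push_back({delay, std::move(fn), true});
        return static_cast<int>(timers_.size()) - 1;
    }
    void ClearTimer(int handle) { timers_[handle].active = false; }
    // Driven once per frame; callbacks only execute when a timer expires,
    // so idle objects cost almost nothing compared to per-object Tick work.
    void Advance(float dt) {
        for (auto& t : timers_) {
            if (!t.active) continue;
            t.remaining -= dt;
            if (t.remaining <= 0.0f) { t.active = false; t.fn(); }
        }
    }
private:
    struct Timer { float remaining; std::function<void()> fn; bool active; };
    std::vector<Timer> timers_;
};
```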

Switching between platforms requires only simple configuration changes. The Game Instance is one of the few Unreal Engine classes that persists across level transitions, so a platform flag set there remains available in every world and can be referenced anywhere platform-specific logic is needed. The whole platform switch takes only three setting changes in the project, which makes iteration and bug fixing significantly faster.
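In the project this flag is a Blueprint Game Instance variable; the pattern can be sketched in plain C++ as a single long-lived object that every system queries. Names and the UI strings are illustrative only.

```cpp
#include <string>

// Sketch of the Game Instance pattern: one object outlives level loads,
// so a platform flag set once at startup is visible everywhere.
enum class Platform { VR, PC };

class GameInstance {
public:
    static GameInstance& Get() {   // persists across "worlds" in this sketch
        static GameInstance instance;
        return instance;
    }
    void SetPlatform(Platform p) { platform_ = p; }
    Platform GetPlatform() const { return platform_; }
private:
    Platform platform_ = Platform::PC;
};

// Any system can branch on the flag without it being passed around,
// e.g. choosing which UI to spawn after a level transition.
std::string SpawnUIFor() {
    return GameInstance::Get().GetPlatform() == Platform::VR
               ? "diegetic hand panel"
               : "screen-space HUD";
}
```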

Visual Design and Navigation

A consistent color language solved the problem of communicating interactivity without cluttering either the VR view or the flat screen with text prompts. Interactive elements are yellow throughout. Enemies appear red and objectives appear green during sonar scans. The first building block serves as a tutorial space with control instructions written on the walls, keeping the teaching method consistent and fair across both platforms.

What I Learned

Building the same game for two fundamentally different interaction models taught me more about input design and player presence than any single-platform project could have. The moments where the platforms diverged most (the ladder and rope climb, the lever interaction) were where the design decisions became most interesting.

Gameplay Video (PC)

Gameplay Video (VR)