Capstone · Grand Canyon University
Brain Game is an open-source middleware platform that bridges Emotiv EEG hardware to the Unity game engine — giving people with physical disabilities a way to play games, browse the web, and control a computer using only their thoughts.
A full end-to-end demonstration of the Brain Game pipeline from Emotiv headset to Unity game output in real time.
Team Lead
CS senior and team lead driving the vision behind this Capstone project. Backed by research experience at Canyon AI Research Lab and enterprise IT work at Grand Canyon Education, I lead this team with a focus on building solutions that are as practical as they are innovative.
Developer & Security
He is a senior majoring in Cybersecurity who played a key role in developing the WebSocket integration within the Brain Game pipeline, including its implementation with Unity. He contributed to the creation of the UPM package, which primarily handled opening and maintaining the WebSocket connection. He also supported the team during both the security review process and the development of security guidelines for the pipeline.
Unity Integration Developer
Computer Science major focused on developing the Unity Plugin for this project. Backed by experience at Grand Canyon University as well as personal projects using the Unity Game Engine, I focus on making this project easily accessible to game developers using Unity.
QA & Security
Senior specializing in cybersecurity and quality assurance. Led the security review and developed security guidelines for the Brain Game pipeline, ensuring the codebase met production-grade standards. Owned technical documentation across the full project lifecycle, built and designed the project website, and contributed directly to the OSC Wrapper development. Strengthened hands-on skills in secure software validation, end-to-end system testing, and translating complex technical work into accessible documentation.
Traditional assistive technology has come a long way, but for people living with ALS, paralysis, cerebral palsy, or other conditions that limit physical movement, the options remain limited, expensive, and often out of reach. We built Brain Game because we believe interacting with technology shouldn't require hands. It should require only a thought.
People living with ALS, paralysis, or limited motor control who want to play video games without a controller or keyboard. Brain Game translates mental commands directly into game inputs, no hands required.
Individuals with conditions like cerebral palsy or spinal cord injuries who struggle with traditional input devices. Brain Game opens a path to browsing the web and controlling a computer through thought alone.
Teams building the next generation of assistive BCI applications who need a reliable, open, developer-friendly middleware layer to build on top of — without starting from scratch.
Brain Game is a complete middleware system that reads brain activity through an Emotiv EEG headset, translates those signals into reliable commands, and delivers them to a Unity game or computer application in real time. It is the missing layer between raw neural data and meaningful digital interaction — built to be accessible, open source, and easy to deploy.
The Emotiv EPOC X headset reads electrical brain activity and streams mental command signals via the EmotivBCI OSC protocol.
The Brain Game OSC Wrapper receives the raw signal and runs it through a four-stage filter pipeline that removes noise, false triggers, and redundant data.
Clean, reliable commands are broadcast over WebSocket to the Unity plugin, which maps them to in-game actions in real time.
The game responds to your thoughts — move forward, turn left, interact, all without touching a single button.
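The four steps above can be sketched as a single pass through the pipeline. This is an illustrative Python model, not the project's actual C# API: `parse_message`, the space-separated payload format, and the `accept`/`broadcast` callbacks are all assumptions standing in for the OSC parser, the filter stage, and the WebSocket send.

```python
def parse_message(raw: str):
    """Parse a simplified 'command confidence' payload.
    (Illustrative format; real EmotivBCI OSC messages are binary OSC packets.)"""
    name, value = raw.split()
    return name, float(value)

def pipeline_tick(raw, accept, mapping, broadcast):
    """One pass through the pipeline: parse -> filter -> map -> broadcast.
    `accept` stands in for the filter stage; `broadcast` for the WebSocket send."""
    command, confidence = parse_message(raw)
    if accept(command, confidence) and command in mapping:
        broadcast(mapping[command])
        return mapping[command]
    return None
```

A high-confidence "push" would flow through to a `moveForward` broadcast, while a low-confidence reading is dropped by the filter stage before it ever reaches Unity.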
A look at the Brain Game team presenting our project to faculty and peers at Grand Canyon University.
Brain Game is built on a clean, modular C# architecture: six namespaced modules that take a raw EEG signal and turn it into a reliable game command in under 200 ms.
appsettings.json parsing and validation. 12-Factor App compliant: all configuration externalized, no hardcoded values.
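A hypothetical appsettings.json sketch showing how the values described on this page (UDP port 7400, WebSocket port 8080, the 0.6/0.5 hysteresis thresholds, 150 ms debounce, 15 Hz cap) could be externalized. The key names and nesting are assumptions for illustration, not the project's actual schema.

```json
{
  "Osc": { "UdpPort": 7400 },
  "WebSocket": { "Port": 8080 },
  "Filter": {
    "OnThreshold": 0.6,
    "OffThreshold": 0.5,
    "DebounceMs": 150,
    "RateLimitHz": 15
  }
}
```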
OSC message DTOs and action mappings. Clean data transfer objects decoupled from processing logic.
UdpReceiver on port 7400 and WebSocketServer on port 8080 with exponential backoff reconnect logic.
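The exponential-backoff reconnect logic mentioned above can be modeled in a few lines. This is a hedged Python sketch of the general technique, not the project's C# implementation; the retry count, base delay, and jitter factor are illustrative values.

```python
import random
import time

def reconnect_with_backoff(connect, max_retries=6, base_delay=0.5, cap=30.0):
    """Retry `connect` with exponential backoff plus jitter.
    Delays grow base_delay, 2x, 4x, ... capped at `cap` seconds.
    Parameter names and defaults are illustrative, not the project's config."""
    for attempt in range(max_retries):
        try:
            return connect()
        except ConnectionError:
            delay = min(cap, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, delay * 0.1))  # jitter avoids thundering herd
    raise ConnectionError("reconnect attempts exhausted")
```

Doubling the wait between attempts keeps a dropped headset or Unity client from being hammered with reconnect traffic while still recovering quickly when the peer returns.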
SignalFilter pipeline with hysteresis band, 150ms debounce window, and 15Hz rate limiter. The core reliability layer.
ActionMapper and keyboard emulation via Win32 SendInput API. Decoupled portability layer for any downstream consumer.
Structured logging with Serilog integration. Full observability into pipeline state and signal decisions.
Raw EEG mental command signals are unreliable by nature. The pipeline applies progressively stricter filters to deliver stable game-ready events to Unity.
Hysteresis band with 0.6 ON threshold and 0.5 OFF threshold prevents flickering from borderline confidence values.
150ms stability window eliminates false spike activations and noise-induced command triggers.
15Hz cap drops redundant frames, reducing WebSocket bandwidth while maintaining responsive gameplay.
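The three filter stages above compose into one state machine per command. The sketch below is an illustrative Python model of that logic using the thresholds stated on this page (0.6 on / 0.5 off, 150 ms debounce, 15 Hz cap); the actual pipeline is C#, and this class name and method shape are assumptions.

```python
class SignalFilter:
    """Model of the three filter stages: hysteresis -> debounce -> rate limit.
    Illustrative only; the real SignalFilter is part of the C# pipeline."""

    def __init__(self, on=0.6, off=0.5, debounce_ms=150, max_hz=15):
        self.on, self.off = on, off
        self.debounce = debounce_ms / 1000.0
        self.min_interval = 1.0 / max_hz
        self.active = False              # hysteresis state
        self.candidate_since = None      # debounce timer start
        self.last_emit = float("-inf")   # rate-limiter timestamp

    def step(self, confidence, now):
        """Feed one confidence sample at time `now` (seconds); return True
        when a command event should be forwarded to Unity."""
        # Stage 1: hysteresis -- must cross `on` to arm, fall below `off` to drop.
        if not self.active and confidence >= self.on:
            if self.candidate_since is None:
                self.candidate_since = now
            # Stage 2: debounce -- must hold above `on` for the full window.
            if now - self.candidate_since >= self.debounce:
                self.active = True
        elif confidence < self.off:
            self.active, self.candidate_since = False, None
        elif confidence < self.on and not self.active:
            self.candidate_since = None  # dipped before the debounce window elapsed
        # Stage 3: rate limit -- at most one event per 1/15 s while active.
        if self.active and now - self.last_emit >= self.min_interval:
            self.last_emit = now
            return True
        return False
```

Note how a borderline reading of 0.55 keeps an already-active command alive (inside the hysteresis band) but can never activate a new one, while a momentary 0.9 spike that fades within 150 ms never fires at all.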
Translates mental commands to game actions: push → moveForward, pull → moveBackward, left → turnLeft, right → turnRight.
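The mapping above, plus the decoupled keystroke layer, can be sketched as two tables and one function. This Python model is illustrative: the real pipeline emits keystrokes through the Win32 SendInput API, and the WASD key bindings here are assumptions, not the project's actual bindings.

```python
# Command -> action mapping as documented on this page.
COMMAND_TO_ACTION = {"push": "moveForward", "pull": "moveBackward",
                     "left": "turnLeft", "right": "turnRight"}
# Action -> key bindings are hypothetical; the real layer uses Win32 SendInput.
ACTION_TO_KEY = {"moveForward": "W", "moveBackward": "S",
                 "turnLeft": "A", "turnRight": "D"}

def emit(command, press_key):
    """Map a mental command to a game action and emit its keystroke via the
    pluggable `press_key` callback (the decoupled portability layer)."""
    action = COMMAND_TO_ACTION.get(command)
    if action is not None:
        press_key(ACTION_TO_KEY[action])
    return action
```

Keeping the keystroke emitter behind a callback is what makes the layer portable: any downstream consumer (a game, a browser shim, a test harness) can supply its own `press_key`.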
IEEE 830-compliant documentation, source code, test suites, and appendices from the full project lifecycle.
Individuals outside the core team who made meaningful contributions to the Brain Game project.