Daily Silicon Valley

Daily Magazine For Entrepreneurs


GoldenBoy Technologies Creates First Controllerless Video Game World Using Brainwave Input, Simulating Pain and Taste Through Mind-Mapped Feedback

By Evelyn Matthews | Boston, Massachusetts | July 3, 2025

In a groundbreaking demonstration that blends neuroscience, machine learning, quantum computing, and immersive game design, GoldenBoy Technologies, formerly Goldenboy Studios, has developed the first controllerless video game world: a Roblox environment operated entirely by brainwaves. The system, built around the company’s proprietary EEG brain-computer interface (BCI) called NeuroKey, doesn’t just interpret a user’s intentions; it simulates real neural responses, including pain and taste, within the virtual world.

“We didn’t just control a game with thoughts,” said Dr. Marcus Lee, lead neural interface scientist. “We reproduced actual sensory experiences—by mimicking the same brainwaves that occur in the real world.”

How the System Works: NeuroKey and Partial Neural Duplication

The NeuroKey BCI headset is a non-invasive, high-resolution EEG device that reads the brain’s electrical activity from the surface of the scalp. While current neuroscience — even with quantum computing support — still lacks the capacity to copy a full human mind, GoldenBoy engineers have succeeded in isolating and duplicating neural substructures associated with pain perception and gustatory recognition.

This approach allowed the team to partially “copy” a volunteer’s mind, replicating the neuron clusters and firing patterns involved in specific sensations. According to Dr. Lee, the EEG system doesn’t read thoughts directly, but it detects intent, motor planning, and sensory expectations with enough fidelity to classify and respond to them in a game environment.

Signal Processing: From Brainwaves to Game Commands

Translating raw, noisy brain activity into smooth, playable game control isn’t a plug-and-play task; it requires a sophisticated bridge between biology and software. To achieve this, GoldenBoy Technologies engineered a custom Python-based signal interpretation layer, a middleware system that acted as the translator between the tester’s EEG signals and her virtual Roblox environment.

At the heart of the pipeline were two leading neuroinformatics libraries: MNE-Python and BrainFlow.

  • MNE-Python, an open-source library designed for processing EEG, MEG, and other neurophysiological signals, was used to preprocess the raw brainwave data. This included:
    • Artifact rejection (removing interference from eye blinks, facial movements, or ambient noise)
    • Signal filtering (isolating frequency bands like alpha, beta, and gamma relevant to intention recognition)
    • Epoching and time-locking the data to specific mental events
  • BrainFlow, a lightweight cross-platform tool for streaming and interpreting biosignals, handled real-time data acquisition from the NeuroKey EEG headset, enabling live updates at sub-second latency (a sketch of this acquisition-and-preprocessing stage follows the list).
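
GoldenBoy hasn’t published its source, but both libraries named above are open source, so the acquisition-and-preprocessing stage can be sketched. The snippet below is a minimal illustration rather than the company’s actual code: BrainFlow’s built-in synthetic board stands in for the proprietary NeuroKey headset, and the filter settings are generic EEG defaults, not NeuroKey-specific values.

```python
# Minimal sketch: stream a window of EEG with BrainFlow, then clean it
# with MNE-Python. BrainFlow's synthetic board substitutes for the
# proprietary NeuroKey headset (an assumption, not GoldenBoy's code).
import time

import mne
from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

BOARD_ID = BoardIds.SYNTHETIC_BOARD          # stand-in for NeuroKey
board = BoardShim(BOARD_ID, BrainFlowInputParams())
board.prepare_session()
board.start_stream()
time.sleep(1)                                # let the ring buffer fill

sfreq = BoardShim.get_sampling_rate(BOARD_ID)
eeg_channels = BoardShim.get_eeg_channels(BOARD_ID)

# Pull the most recent ~1 s of samples from BrainFlow's ring buffer.
window = board.get_current_board_data(int(sfreq))
eeg = window[eeg_channels] * 1e-6            # BrainFlow gives microvolts; MNE expects volts

info = mne.create_info(
    ch_names=[f"EEG{i}" for i in range(len(eeg_channels))],
    sfreq=sfreq,
    ch_types="eeg",
)
raw = mne.io.RawArray(eeg, info, verbose=False)
raw.filter(l_freq=1.0, h_freq=45.0, verbose=False)   # broadband cleanup
raw.notch_filter(freqs=60.0, verbose=False)          # mains interference
# (Teardown omitted so later sketches can keep reading from `board`.)
```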

The processed brainwave data was then passed through a custom machine learning classifier, trained on a personalized dataset built specifically for the tester. This model used supervised learning techniques, primarily random forests and convolutional neural networks, to distinguish patterns in the tester’s EEG that corresponded to high-level commands like the following (a training sketch appears after the list):

  • “Walk forward” – characterized by sensorimotor activation and beta-band desynchronization in the central electrodes
  • “Interact” – preceded by specific motor imagery and attention-based event-related potentials (ERPs)
  • “Consume object” – matched to gamma oscillation bursts and reward anticipation patterns
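
GoldenBoy hasn’t disclosed its feature set or model weights, but a plausible minimal version of the random-forest path can be sketched: per-channel band power in the alpha, beta, and gamma bands named above, fed to scikit-learn. The calibration epochs (X_calib) and labels (y_calib) below are random placeholders standing in for the tester’s personalized dataset.

```python
# Sketch of the intent classifier: band-power features from labeled
# calibration epochs feed a random forest. All data here is synthetic;
# the real model was trained on the tester's personalized recordings.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(epoch, sfreq):
    """epoch: (n_channels, n_samples) -> concatenated per-channel band powers."""
    freqs, psd = welch(epoch, fs=sfreq, nperseg=min(256, epoch.shape[-1]))
    return np.concatenate([
        psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
        for lo, hi in BANDS.values()
    ])

# Placeholder calibration set: 60 one-second, 8-channel epochs at 250 Hz.
rng = np.random.default_rng(0)
X_calib = [rng.standard_normal((8, 250)) for _ in range(60)]
y_calib = rng.choice(["walk_forward", "interact", "consume_object"], size=60)

X = np.array([band_power_features(e, sfreq=250.0) for e in X_calib])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, y_calib)
```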

Each classified intention was converted into a JSON-based command object and served over HTTP to a custom Roblox game instance built for the trial. Roblox’s built-in HttpService API only makes outbound requests, so the game polled the Python side on a short loop, acting as the lightweight real-time bridge between Python and Roblox Studio: it retrieved each instruction and triggered events like avatar movement, object manipulation, and even scripted taste/pain simulations.
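
Because HttpService can only issue outbound requests, the Python side has to host the endpoint the game polls. A minimal stand-in using only the standard library might look like this; the port and JSON shape are illustrative, not GoldenBoy’s actual protocol.

```python
# Hypothetical Python end of the bridge: holds the latest decoded command
# and serves it as JSON to the Roblox game, which polls via HttpService.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

latest_command = {"action": "idle"}      # overwritten by the classifier loop
lock = threading.Lock()

class CommandHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with lock:
            body = json.dumps(latest_command).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def publish(action, **details):
    """Called by the classifier whenever a new intention is decoded."""
    global latest_command
    with lock:
        latest_command = {"action": action, **details}

server = HTTPServer(("0.0.0.0", 8080), CommandHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

On the Roblox side, a server Script would call HttpService:GetAsync on a short loop, decode the payload with HttpService:JSONDecode, and dispatch the resulting action to the avatar.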

This system allowed the tester, Kara, a 28-year-old software engineer with no prior neuroscience training, to operate her in-game avatar seamlessly without touching a mouse, keyboard, or controller. The moment her brain registered an intention, the system read it, translated it, and acted on it.

“When I wanted to walk, my avatar walked,” Kara explained. “When I focused on picking up the cheesecake, it just… happened. It was like telekinesis, but digital.”

The feedback loop was not only accurate but adaptive. With more training data gathered live during play, the system gradually improved its precision and reduced latency — giving Kara a greater sense of immersion and embodiment.
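
The article doesn’t specify the adaptation mechanism. One plausible scheme, continuing the training sketch above, is self-training: windows the model already classifies with high confidence get pseudo-labeled, folded back into the calibration set, and the forest is refit on the enlarged data.

```python
# Hypothetical self-training loop (an assumption; the article does not
# describe how the live adaptation actually worked).
CONFIDENCE = 0.90
buffer_X, buffer_y = list(X), list(y_calib)   # start from the calibration set

def adapt(clf, feats):
    """Fold a confidently classified feature vector back into training."""
    proba = clf.predict_proba(feats.reshape(1, -1))[0]
    if proba.max() >= CONFIDENCE:
        buffer_X.append(feats)
        buffer_y.append(clf.classes_[int(np.argmax(proba))])
        clf.fit(np.array(buffer_X), np.array(buffer_y))  # refit on enlarged set
    return clf
```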

“The signal interpretation layer was the unsung hero of this entire experiment,” said GoldenBoy machine learning lead, Dr. Alina Shah. “It made the impossible—controlling a complex environment with only your brain—feel like second nature.”

Ultimately, this brain-to-Roblox control pipeline established a new standard for how non-invasive BCI systems can be integrated with consumer-grade platforms like Roblox. It didn’t just prove technical feasibility; it demonstrated a future where thought alone is enough to explore, interact with, and even shape digital worlds.

Meet the Tester: Kara Blake, 28, Software Engineer and Gamer

Kara Blake has always been fascinated by the future.

A 28-year-old software engineer from Springfield, Massachusetts, she grew up tinkering with code, dreaming of neural implants, and spending weekends in VR headsets. When she was offered the chance to test GoldenBoy’s brain-controlled Roblox prototype, her answer was immediate.

“I thought it was sci-fi at first,” Kara said, laughing. “Then I sat down, put on the NeuroKey, and everything changed.”

What followed was a deeply personal—and scientific—journey into the boundary between thought and reality.

Testing Neural Feedback: Pain Simulation in a Game World

To evaluate whether the system could go beyond control and into simulation, GoldenBoy engineers conducted a sensory test involving pain feedback.

In the first trial, the subject was instructed to walk her avatar across a virtual firepit. Upon doing so, NeuroKey recorded a surge of nociceptive signal activity in the somatosensory cortex, nearly identical to what the brain would produce when experiencing actual pain from heat.

[Image: a video game avatar standing in a firepit]

The data matched her earlier real-world test, where she lightly touched a hot object. The Roblox simulation, combined with visual immersion and closed-loop EEG feedback, effectively triggered the same pain-related brainwave response.

In a follow-up test, engineers intervened: using neurofeedback algorithms, they modified the output loop to block the pain response. This time, when the subject repeated the firepit trial, no pain-related activity was detected, and the subject confirmed she felt no discomfort — despite visually perceiving the same scenario.

Simulating Taste: Cheesecake in the Digital World

The second major test involved gustatory simulation. First, the subject consumed a slice of her favorite real-world dessert: cheesecake. NeuroKey mapped her brain’s taste-processing activity and neural reward signatures.

Then, a digital replica of the cheesecake was introduced into the Roblox world.

Upon interacting with the virtual cheesecake — without any real consumption — the subject’s brain reproduced nearly identical neural firing patterns, particularly in the insula and orbitofrontal cortex, areas associated with flavor perception and reward response.

No Touch. No Typing. Only Thought.

Throughout the entire trial, Kara never touched a single controller, keyboard, or headset button. There were no joysticks, no eye-tracking tricks, and no muscle movement at all. The entire system was operated via non-invasive BCI control, using only the real-time electrical signals generated by her brain.

From the moment the NeuroKey EEG headset was placed gently around her scalp, it began to monitor Kara’s brainwaves with sub-millisecond precision. Every motor intention, focused visual cue, and imagined action triggered a cascade of neuronal activity that GoldenBoy’s software had been trained to recognize and translate.

“It was like my thoughts were typing commands,” Kara said. “At first, I was trying hard to concentrate. But after a few minutes, it felt natural. Like the game knew me.”

The process wasn’t simple under the hood. The EEG signals, noisy, overlapping, and faint, were cleaned with bandpass filtering and adaptive artifact-removal algorithms, then parsed by trained machine learning classifiers. These models had been personalized during a baseline calibration phase, in which Kara imagined various movements (like walking, grabbing, and eating) while her brain patterns were recorded and labeled.

Once calibration was complete, no more input was needed except Kara’s mental focus.

When she thought about walking forward, the classifier detected premotor cortex activation, recognized the intention, and published a move_forward command for the Roblox game to fetch over HTTP. When she imagined reaching toward the cheesecake, her sensorimotor rhythms changed in a specific, trained way, triggering an in-game interaction.
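
Putting the earlier sketches together, the runtime loop implied by this description is roughly the following. It reuses board, sfreq, eeg_channels, band_power_features, clf, and publish from the sketches above; move_forward is the only command name the article actually gives, so the other action strings are illustrative.

```python
# Hypothetical end-to-end loop: read a 1-second EEG window, extract
# band-power features, decode the intention, publish it for Roblox.
import time

ACTIONS = {
    "walk_forward": "move_forward",    # command name quoted in the article
    "interact": "interact",            # illustrative
    "consume_object": "consume",       # illustrative
}

while True:
    window = board.get_current_board_data(int(sfreq))
    if window.shape[1] < int(sfreq):
        time.sleep(0.05)               # ring buffer not full yet
        continue
    feats = band_power_features(window[eeg_channels] * 1e-6, sfreq)
    intent = clf.predict(feats.reshape(1, -1))[0]
    publish(ACTIONS.get(intent, "idle"))
    time.sleep(0.1)                    # ~10 decoded commands per second
```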

Even subtle mental shifts, like hesitation or surprise, began to correlate with in-game delays or reaction animations. GoldenBoy engineers noted that as Kara grew more comfortable, the system began anticipating actions based on her thought cadence and intensity, not just binary commands.

“The real breakthrough wasn’t just recognizing what she wanted to do,” said GoldenBoy CTO Jason Lin. “It was recognizing how she wanted to feel while doing it. We tapped into more than just motion. We tapped into her intent.”

This experiment marked the first time a player controlled not only their avatar’s movement and interaction—but also their internal, emotional, and sensory feedback—purely through thought.

Kara described the feeling as “almost out-of-body,” but not in the dissociative sense. Instead, she said, “It was like the game met me halfway. I didn’t have to leave myself behind—I just brought my mind into the world.”

GoldenBoy founder Keith Simpson summed it up best:

“The goal wasn’t just to control a game,” he said. “It was to simulate a complete sensory interaction—without a body. And we did it.”

Why It Matters: Building a Safe Secondary Reality

GoldenBoy Technologies sees this breakthrough not just as an innovation in gaming, but as a preview of an entirely new kind of reality—one where people can live mentally in synthetic worlds, free of the limitations and dangers of the physical body.

The long-term vision is to create a secondary digital reality where:

  • A person’s complete mind can eventually be copied or mirrored
  • Sensory experiences like touch, taste, and emotion can be programmed and controlled
  • Negative signals like pain, trauma, or fear can be selectively blocked
  • Positive feedback — joy, flavor, stimulation — can be amplified and made persistent

This future would offer therapeutic potential for chronic pain patients, trauma survivors, or those with degenerative conditions, while opening entirely new frontiers in education, entertainment, and human interaction.

“We’re not just making games,” Dr. Lee said. “We’re making a new form of existence. And for the first time, the brain is both the player and the controller.”

What’s Next?

GoldenBoy Technologies plans to expand testing to include emotional simulation, spatial memory recall, and eventually full cognitive environment mapping. The team is also working on deeper integration with real-time fMRI data and more refined EEG decoding using larger-scale quantum-assisted models.

Beta applications for limited trials are expected to open in early 2026, with a focus on therapeutic use cases and exploratory research in synthetic consciousness.

Until then, this test stands as the first public demonstration of a mind-controlled game that feels real — not because of its graphics, but because the brain believes it is real.
