VUB and CY Collaborate on AI‑Powered Brain‑Computer Interface Research

The Vrije Universiteit Brussel (VUB) and CY Cergy Paris University (CYU) are deepening their collaboration in artificial intelligence through a joint research initiative on Brain‑Computer Interfaces (BCIs). The project brings together leading expertise from both institutions to develop advanced deep‑learning methods capable of interpreting low‑quality EEG signals from portable, commercially available hardware.

The research is led by Principal Investigators Kevin De Pauw and Arnau Dillen (VUB), in partnership with the Artificial Intelligence research group at VUB and the Équipes Traitement de l’Information et Systèmes (ETIS) lab at CY Cergy Paris Université. Their shared objective is to create AI‑driven algorithms that can reliably extract user‑intent commands from noisy EEG data. These insights will help transform BCIs into practical controllers for robotic devices, a significant step toward more accessible neurotechnology.
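As a rough illustration of what such a pipeline involves, the sketch below simulates noisy single‑channel EEG epochs and decodes a binary "intent" from alpha‑band power using a tiny logistic‑regression classifier. Everything here (the simulated data, the 10 Hz marker, the single band‑power feature) is an illustrative assumption, not the project's actual method, which the article describes only as advanced deep learning.

```python
import numpy as np

# Hypothetical sketch: decoding binary user intent from noisy EEG epochs.
# The data and model are invented for illustration; they only show the
# general shape of a BCI pipeline (feature extraction + classifier).

rng = np.random.default_rng(0)
fs, n_epochs, n_samples = 128, 200, 256  # sampling rate (Hz), epochs, samples/epoch

t = np.arange(n_samples) / fs
labels = rng.integers(0, 2, n_epochs)
# "Intent" epochs (label 1) carry a 10 Hz alpha-band oscillation buried in noise.
epochs = rng.normal(0.0, 1.0, (n_epochs, n_samples))
epochs += labels[:, None] * 0.8 * np.sin(2 * np.pi * 10 * t)

def band_power(x, fs, lo, hi):
    """Mean spectral power of x in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(x.shape[-1], 1 / fs)
    psd = np.abs(np.fft.rfft(x, axis=-1)) ** 2
    return psd[..., (freqs >= lo) & (freqs <= hi)].mean(axis=-1)

# One feature per epoch: standardised alpha-band (8-12 Hz) power.
feats = band_power(epochs, fs, 8, 12)
feats = (feats - feats.mean()) / feats.std()

# Minimal logistic-regression classifier trained by gradient descent.
w, b = 0.0, 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(w * feats + b)))
    grad = p - labels
    w -= 0.1 * (grad * feats).mean()
    b -= 0.1 * grad.mean()

pred = (1 / (1 + np.exp(-(w * feats + b)))) > 0.5
accuracy = (pred == labels).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In practice, real portable-hardware EEG is far noisier and multi‑channel, which is precisely why the project turns to deep learning rather than a hand‑picked feature like this one.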

The collaboration highlights the longstanding AI strength at VUB, home to one of Europe’s oldest and most interdisciplinary AI labs, and the complementary signal‑processing expertise at CYU. Combined, the teams aim to push the boundaries of real‑life human‑machine interaction using robust machine‑learning methods.

For more information, visit the project page:
Brain‑Computer Interface for Real‑Life Applications (VUB)

How AI‑Driven Robotics Is Quietly Transforming Your Everyday Life

From vacuuming the house to navigating unpredictable environments, the next generation of intelligent robots is set to change the rhythm of daily life — and according to VUB robotics professor Bram Vanderborght, that future is closer than we might think.

“The future robot will be as affordable as the average car.”

– Bram Vanderborght, VUB Robotics Professor

As companies like Tesla, Meta, and Dyson invest heavily in AI‑powered humanoid machines, the boundary between science fiction and the household is rapidly shrinking. Imagine robots that unload the dishwasher, mow the lawn, or help with renovation tasks: chores that, as Vanderborght wryly notes, still strain our backs today when machines refuse to cooperate.

Humans move effortlessly through the world thanks to embodied intelligence — the blend of sensory perception, muscle coordination, and intuitive decision‑making honed over millions of years. Robots must replicate the same abilities using sensors, processors, and motors. That is where artificial intelligence plays a growing role.

According to Vanderborght, future robots will merge model‑based control — the physics‑driven precision that keeps them safe — with AI‑based learning, which helps them adapt and perform new tasks. This combination allows robots to move among people without posing danger while still learning from the world around them.
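The hybrid idea described above can be sketched in a few lines: a physics‑based controller supplies a safe, predictable baseline command, a learned residual term adapts it, and a hard limit keeps the combined output bounded. The gains, limits, and the stand‑in "learned" function below are all illustrative assumptions, not any specific robot's control stack.

```python
import numpy as np

# Hypothetical sketch of hybrid control: model-based baseline + learned residual.
# All numbers are illustrative; a real system would replace learned_residual
# with a trained policy network.

def model_based_pd(pos, vel, target, kp=8.0, kd=2.0):
    """Physics-driven PD law: predictable and easy to bound for safety."""
    return kp * (target - pos) - kd * vel

def learned_residual(pos, vel):
    """Stand-in for a learned correction compensating unmodelled effects
    (e.g. friction). Here it is a fixed function, purely for illustration."""
    return 0.3 * np.tanh(-vel)

def hybrid_command(pos, vel, target, u_max=5.0):
    """Combine both terms, then clamp so the safety limit always holds."""
    u = model_based_pd(pos, vel, target) + learned_residual(pos, vel)
    return float(np.clip(u, -u_max, u_max))

# Simulate a 1 kg point mass being driven to target position 1.0.
pos, vel, dt = 0.0, 0.0, 0.01
for _ in range(2000):
    u = hybrid_command(pos, vel, target=1.0)
    vel += u * dt  # acceleration = force / mass, with mass = 1
    pos += vel * dt

print(f"final position: {pos:.3f}")
```

The key design point mirrors the text: the learned term can only nudge the command, while the model‑based law and the clamp guarantee the robot never exceeds its safe operating envelope.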

AI, but not always the way you expect

Not all robotics breakthroughs rely on AI. Take the well‑known dancing robots from Boston Dynamics: they run on physics models rather than machine learning. But in the messy, unpredictable real world of households, workplaces, and cities, robots need AI to fill in the gaps — exactly where model‑based control falls short. The result: machines that can learn from experience and eventually support people in a way that feels natural and intuitive.

A future of collaboration, not replacement

Despite rapid advances, Vanderborght stresses that most technical and craft‑based tasks remain far beyond what robots can do. Human dexterity, adaptability, and creativity will stay essential. Intelligent robots will complement people rather than replace them — quietly taking over the heavy, repetitive, or physically demanding tasks in daily life.

Source:
Interview with Prof. Bram Vanderborght: “VUB prof Bram Vanderborght on the Future of Robots”
