Simulated Touch, Real Presence: Rethinking AI Through Sensorimotor Coupling
- Krystelle Papaux
- Jun 5
- 3 min read
Updated: Jul 7

In the quest to make AI more “embodied,” researchers often focus on giving machines physical bodies and sensory-motor systems—limbs that grasp, eyes that see, ears that listen. But embodiment isn’t just about having a body; it’s about having experience through a body. That distinction matters, especially when you start thinking about how humans sense, feel, and emotionally co-regulate through their spaces—especially digital ones.
One of the most underexplored modalities in human-machine interaction is tactile feedback. Not just in the mechanical sense of pressure sensors or force feedback, but in the subjective, social, and emotional nature of touch. This is where studies in pseudo-haptics and mediated tactile communication start offering something that typical AI frameworks may miss: a model of embodiment that accounts for perception, affect, and social presence—not just action.
Embodied Perception Is Not Just Sensing
Traditional AI systems often treat perception as a passive data acquisition task—camera input, audio signal, accelerometer. But embodied cognition theories, especially those drawing from enactivism and sensorimotor contingencies, argue otherwise: perception is active, relational, and shaped by the agent’s own potential to act in the environment.
When we simulate touch—using latency manipulation, control/display ratios, vibrotactile or force feedback, and resistance illusions—we’re not just “fooling” the senses. We’re creating new perceptual spaces where meaning arises through co-regulated, anticipatory interaction. That principle is deeply aligned with how embodied agents ought to interpret the world: not through static mappings, but through dynamic coupling with their environment.
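To make the control/display-ratio idea concrete, here is a minimal sketch of the classic pseudo-haptic manipulation: the displayed cursor motion is scaled relative to the physical input motion, so a ratio below 1 inside a designated region makes that region feel resistive with no force hardware at all. All names and values here are illustrative assumptions, not taken from any particular system.

```python
# Pseudo-haptic control/display (C/D) ratio sketch (illustrative values).

def cd_ratio(cursor_x: float, region_start: float, region_end: float,
             base_ratio: float = 1.0, sticky_ratio: float = 0.4) -> float:
    """Return the C/D ratio to apply at the current cursor position."""
    if region_start <= cursor_x <= region_end:
        return sticky_ratio  # displayed motion lags input -> feels "heavy"
    return base_ratio

def update_cursor(cursor_x: float, input_dx: float,
                  region=(100.0, 160.0)) -> float:
    """Advance the displayed cursor by the scaled input displacement."""
    ratio = cd_ratio(cursor_x, *region)
    return cursor_x + input_dx * ratio

# Dragging through the "sticky" region: the same physical motion
# produces less on-screen motion, which users report as resistance.
x = 90.0
for _ in range(10):
    x = update_cursor(x, input_dx=10.0)
```

The interesting part is that the resistance is entirely perceptual: the mismatch between expected and displayed motion is what creates the felt weight.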
Touch Is Not a Signal—It’s a Negotiation
In human interaction, touch is rarely unidirectional. It’s not a broadcast; it’s a feedback loop. And in digital contexts, it becomes even more interesting. When one user feels the simulated pressure of an object or the blockage of a movement that corresponds to another person’s intent, a shared perceptual horizon is created—a joint space of affective meaning.
This ties into the idea that embodied AI should not just recognize emotional signals but participate in emotional exchanges. That’s a powerful proposition for social robots, telepresence systems, or AI companions: they might not need human skin, but they need to understand what touching and being touched means, and how to respond to the emotional, perceptual dance of touch.
Temporal Coupling, Affective Synchrony, and the Architecture of Felt Presence
Another layer is temporal coordination. Humans rely on micro-timings to feel present with others—synchrony in breathing, delays in response, smoothness in turn-taking. Our research in pseudo-haptics and mediated tactile systems offers a sandbox to explore how fine-tuned timing mechanisms (e.g., latency manipulations) can influence perceived emotional closeness or tension. For embodied AI, this raises the bar: it’s not just about real-time reaction—it’s about rhythm and resonance.
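The kind of latency manipulation described above can be sketched very simply: inject a controlled delay into a mediated touch channel and measure how far the two streams fall out of co-timing. The class, the tick-based latency, and the crude synchrony score below are all assumptions for illustration, not part of any specific experimental setup.

```python
# Hedged sketch: controlled latency in a mediated touch channel.
from collections import deque

class DelayedChannel:
    """Delivers touch samples after a fixed number of ticks of latency."""
    def __init__(self, latency_ticks: int):
        self.buffer = deque([0.0] * latency_ticks)

    def send(self, sample: float) -> float:
        """Push the newest sample; pop the one delayed by latency_ticks."""
        self.buffer.append(sample)
        return self.buffer.popleft()

def synchrony(sent, received) -> float:
    """Crude co-timing score: 1.0 when streams align sample-for-sample."""
    matches = sum(1 for s, r in zip(sent, received) if abs(s - r) < 1e-9)
    return matches / len(sent)

# With 3 ticks of latency, a partner's pressure pulse arrives late,
# and the co-timing score drops accordingly.
pulse = [0, 0, 1, 1, 0, 0, 0, 0]
ch = DelayedChannel(latency_ticks=3)
received = [ch.send(p) for p in pulse]
```

In a real study, of course, the dependent variable is not a correlation score but the participant’s *felt* closeness; the point of the sketch is only that latency is a single, precisely controllable knob.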
This opens up new paradigms for how embodied agents could simulate presence—not merely by “being there,” but by modulating how they are felt to be there. It’s not hard to imagine AI systems that modulate haptic output to match emotional tone or social context, enabling subtle shifts in mood, intent, or engagement—what some might call computational empathy.
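One way to picture the modulation described above is a simple mapping from an estimated emotional tone to haptic output parameters, so the agent adjusts *how it is felt* rather than just what it does. The valence/arousal inputs and all numeric ranges below are assumptions chosen for illustration only.

```python
# Illustrative affect-to-haptics mapping (all parameter values assumed).

def haptic_profile(valence: float, arousal: float) -> dict:
    """Map affect in [-1, 1]^2 to vibration amplitude (0..1) and
    frequency in Hz. Higher arousal -> stronger pulses; more negative
    valence -> rougher (higher-frequency) texture."""
    amplitude = 0.2 + 0.6 * (arousal + 1) / 2   # 0.2 .. 0.8
    frequency = 80 + 120 * (1 - valence) / 2    # 80 .. 200 Hz
    return {"amplitude": round(amplitude, 2),
            "frequency_hz": round(frequency, 1)}

calm_positive = haptic_profile(valence=0.8, arousal=-0.5)
tense_negative = haptic_profile(valence=-0.6, arousal=0.7)
```

A gentle, soft pulse for calm contexts and a sharper, stronger one for tense ones: the mapping itself is trivial, but choosing it well is exactly where the perceptual research matters.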
Toward a New Kind of Body Schema for AI
Finally, there’s the idea of the body schema—a pre-reflective, action-oriented sense of one’s own body in space. While humans build these schemas through multisensory integration, AI systems have no such internal model (yet) that dynamically links action, sensation, and social context. Our work on pseudo-haptics and mediated touch points toward a lightweight, flexible method for embedding a kind of “virtual body” into digital interaction, one that is inherently social and emotional.
Such a schema doesn’t just help an agent navigate the world; it helps it inhabit it—emotionally, temporally, relationally.