CES 2026 | Las Vegas, USA

Booth #60445

CES 2026 will be held January 6–9, 2026

Event Schedule

Tue, Jan 6

Inside Pseudo-Reality: The Science of a New Sense

10:00 – 10:45 AM

A science-led introduction to Pseudo-Reality explaining how multisensory orchestration interacts with the human nervous system. Includes a demonstration.

Wed, Jan 7

From Pixels to Presence: Get to Know ArvyaX

10:00 – 10:45 AM

A media briefing unveiling ArvyaX and the shift from screen-based computing to programmable physical reality.

Wed, Jan 7

Reclaiming Reality: Nature Finds Its Way Home

3:30 – 4:30 PM

An experiential demonstration where natural environments are physically felt indoors — without screens, headsets, or simulation.

Thu, Jan 8

Reality Reimagined: Life With ArvyaX

10:00 AM – 6:00 PM

A live demonstration showing adaptive environments for reading, work, music, rest, and creative flow.


CES Floor Map

ArvyaX is pioneering Pseudo-Reality, a world beyond AR/VR where imagination becomes tangible. Powered by AI, our screen-free device, SoulStrip, awakens your senses so you can live, work, and dream inside real ambience — yoga in the mountains, work in a forest, reading by the ocean, even the magical moments you've dreamt of. Everything is possible with ArvyaX.

Why Choose ArvyaX SoulStrip?

ArvyaX SoulStrip is an AI-powered sensory device that generates real-world ambience.

Ambience AI

Predicts the ideal environment for your activity and adjusts air, light, scent, temperature, and sound.

Scenario Detection AI

Understands what you’re watching, playing, or listening to and syncs ambience with the digital content in real time.

SoundMorph AI

Decomposes environmental audio into layers and generates adaptive 3D spatial sound.

ScentSync AI

Creates scent transitions that match emotional context and scene progression.

VisionMorph AI

Generates soft 3D atmospheric light and projection depth to match mood and movement.

FlowCore (The Conductor)

Orchestrates all senses together, ensuring the ambience feels natural, fluid, and emotionally coherent.

Spatial Interaction Layer

Hand tracking and edge detection let your body interact with virtual objects through air, light, and force feedback.

Touch-to-Air Response Layer

Detects when your hand approaches a virtual surface and generates directional airflow to simulate “touch.”