Introduction of OAC, a project use case partner

As part of the user-requirements work, we conducted interviews with the use case partners in the SUN project to better understand their needs and to uncover insights and opportunities for designing better XR project pilots and applications. In this post, we...
Multimodal data fusion

When more than one modality (sensors, cameras, etc.) is deployed in a system for the same purpose, it is essential to find robust and efficient ways to combine their data to improve the system’s performance. This is the role of multimodal fusion, a task researched by...
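As a minimal sketch of one common fusion strategy, score-level (late) fusion combines per-class confidence scores from separately trained modality models with a weighted average. The weights and class scores below are purely illustrative, not values from the project:

```python
import numpy as np

def late_fusion(scores_a, scores_b, w_a=0.6, w_b=0.4):
    """Weighted-average (late) fusion of per-class scores from two modalities.

    scores_a, scores_b: per-class confidence scores from two modality models.
    Returns the fused class index and the fused score vector.
    """
    fused = w_a * np.asarray(scores_a) + w_b * np.asarray(scores_b)
    return int(np.argmax(fused)), fused

# Illustrative example: camera-based vs. wearable-sensor classifier
# outputs over three classes.
label, fused = late_fusion([0.2, 0.7, 0.1], [0.5, 0.3, 0.2])
# fused = [0.32, 0.54, 0.14], so the fused prediction is class 1
```

Other strategies fuse earlier in the pipeline, e.g. concatenating raw features before a single model (early fusion), trading robustness to a failing modality for the ability to learn cross-modal correlations.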
Rendering sense of touch in virtual manipulation

Rendering the sense of touch in virtual manipulation is a challenging objective: the acceptable dimensions and weight of wearable devices worn at the fingertips are quite limited, while the interaction forces in physical manipulation are relatively high, requiring bigger...