
Paper in progress: hand gesture detection with machine learning in Unity VR (Oct 2024)


XSENSOR Technology - 2023

Developed medical visualization software in Unity for XSensor, integrating with their .NET application to process data from a mattress embedded with 2,600 sensors. The software used inverse kinematics for real-time image processing, combining video frames, ML position data, and pressure data delivered over three socket pipelines.
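As an illustration of the pipeline side, here is a minimal C# sketch of reading length-prefixed frames of pressure values from one of the socket connections. The class name, framing (4-byte length prefix followed by a float payload), and error handling are assumptions for illustration, not XSensor's actual wire protocol.

```csharp
using System;
using System.Net.Sockets;

// Minimal sketch: one of the three socket pipelines, reading length-prefixed
// frames of float pressure values. The framing is an illustrative assumption.
public sealed class PressureSocketClient : IDisposable
{
    readonly TcpClient client;
    readonly NetworkStream stream;

    public PressureSocketClient(string host, int port)
    {
        client = new TcpClient(host, port);
        stream = client.GetStream();
    }

    // Blocks until a full frame has arrived, then returns it as pressure values.
    public float[] ReadFrame()
    {
        int payloadLength = BitConverter.ToInt32(ReadExactly(4), 0);
        byte[] payload = ReadExactly(payloadLength);

        var pressures = new float[payloadLength / sizeof(float)];
        Buffer.BlockCopy(payload, 0, pressures, 0, pressures.Length * sizeof(float));
        return pressures;
    }

    // Reads exactly 'count' bytes, looping because a socket read may return fewer.
    byte[] ReadExactly(int count)
    {
        var buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0) throw new InvalidOperationException("Socket closed mid-frame.");
            offset += read;
        }
        return buffer;
    }

    public void Dispose()
    {
        stream.Dispose();
        client.Dispose();
    }
}
```

In practice, three such clients (video frames, ML position data, pressure data) would run on background threads, with the latest frame from each combined on Unity's main thread.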


Working within a cross-functional team of 10 (software developers, ML specialists, a technical lead, a product manager, and QA engineers), I facilitated communication between the ML and technical teams and integrated Unity into XSensor's platform. I also resolved real-time data challenges by implementing data cleaning and validation, keeping the visualization accurate and responsive under live sensor input.
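A minimal sketch of the kind of cleaning and validation applied to incoming frames. The expected sensor count comes from the project description; the pressure range, class name, and fallback-to-previous-frame strategy are illustrative assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of per-frame cleaning for incoming pressure data.
public static class PressureFrameValidator
{
    const int ExpectedSensorCount = 2600;   // mattress sensor grid
    const float MinPressure = 0f;
    const float MaxPressure = 200f;         // assumed sensor ceiling, in device units

    // Returns a cleaned copy of the frame, or null if the frame is unusable.
    public static float[] Clean(float[] frame, float[] previousFrame)
    {
        if (frame == null || frame.Length != ExpectedSensorCount)
        {
            Debug.LogWarning("Dropping malformed pressure frame.");
            return null;
        }

        var cleaned = new float[frame.Length];
        for (int i = 0; i < frame.Length; i++)
        {
            float value = frame[i];

            // Replace NaN or out-of-range readings with the previous frame's value.
            if (float.IsNaN(value) || value < MinPressure || value > MaxPressure)
                value = previousFrame != null ? previousFrame[i] : 0f;

            cleaned[i] = Mathf.Clamp(value, MinPressure, MaxPressure);
        }
        return cleaned;
    }
}
```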

University Research Assistant - VR Sculpting in Unity C# (2021-2023)

In pursuit of VR sculpting with natural hand interaction, I tackled significant memory challenges by diving deeply into mesh deformation. I developed a custom deformation system in pure C#, applying movement to each vertex of an object upon interaction. This involved simulating spring-like motion at the point of interaction, with each vertex’s response based on its distance from the action point over time.
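A simplified sketch of that per-vertex spring approach as a Unity MonoBehaviour. The component name, force falloff, parameter values, and the AddDeformingForce entry point are illustrative assumptions, not the exact system built for the research.

```csharp
using UnityEngine;

// Sketch: spring-like displacement of mesh vertices near an interaction point.
[RequireComponent(typeof(MeshFilter))]
public class SpringDeformer : MonoBehaviour
{
    public float springForce = 20f;   // pull back toward the original position
    public float damping = 5f;        // velocity decay over time
    public float radius = 0.25f;      // influence radius around the touch point

    Mesh mesh;
    Vector3[] originalVertices;
    Vector3[] displacedVertices;
    Vector3[] vertexVelocities;

    void Start()
    {
        // Work on a mesh instance so the shared asset is not modified.
        mesh = GetComponent<MeshFilter>().mesh;
        originalVertices = mesh.vertices;
        displacedVertices = (Vector3[])originalVertices.Clone();
        vertexVelocities = new Vector3[originalVertices.Length];
    }

    // Called by the interaction code with a world-space contact point and force.
    public void AddDeformingForce(Vector3 worldPoint, float force)
    {
        Vector3 localPoint = transform.InverseTransformPoint(worldPoint);
        for (int i = 0; i < displacedVertices.Length; i++)
        {
            Vector3 toVertex = displacedVertices[i] - localPoint;
            // Attenuate the push with distance from the action point.
            float attenuated = force / (1f + toVertex.sqrMagnitude / (radius * radius));
            vertexVelocities[i] += toVertex.normalized * attenuated * Time.deltaTime;
        }
    }

    void Update()
    {
        for (int i = 0; i < displacedVertices.Length; i++)
        {
            // Spring each vertex back toward its rest position, with damping.
            Vector3 displacement = displacedVertices[i] - originalVertices[i];
            vertexVelocities[i] -= displacement * springForce * Time.deltaTime;
            vertexVelocities[i] *= 1f - damping * Time.deltaTime;
            displacedVertices[i] += vertexVelocities[i] * Time.deltaTime;
        }
        mesh.vertices = displacedVertices;
        mesh.RecalculateNormals();
    }
}
```

The hand-interaction code (for example, a trigger on the hand or controller collider) would call AddDeformingForce with the contact point each frame while the hand touches the surface.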

Initially, this approach was memory-intensive. To optimize, I experimented with Unity's Entity Component System (ECS), spawning 10,000 small objects to create a responsive environment, and later explored fluid-like simulation for more realistic sculpting behavior.
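A minimal sketch of what one ECS update step could look like, assuming Unity's Entities 1.0 API (ISystem, LocalTransform, SystemAPI.Query); the SpringPoint component and SpringRelaxSystem names are illustrative.

```csharp
using Unity.Burst;
using Unity.Entities;
using Unity.Mathematics;
using Unity.Transforms;

// Per-entity spring state so each of the ~10,000 small objects responds independently.
public struct SpringPoint : IComponentData
{
    public float3 RestPosition;
    public float3 Velocity;
    public float Stiffness;
    public float Damping;
}

// Burst-compiled system that relaxes every entity back toward its rest position each frame.
[BurstCompile]
public partial struct SpringRelaxSystem : ISystem
{
    [BurstCompile]
    public void OnUpdate(ref SystemState state)
    {
        float dt = SystemAPI.Time.DeltaTime;

        foreach (var (transform, spring) in
                 SystemAPI.Query<RefRW<LocalTransform>, RefRW<SpringPoint>>())
        {
            float3 offset = transform.ValueRO.Position - spring.ValueRO.RestPosition;
            spring.ValueRW.Velocity -= offset * spring.ValueRO.Stiffness * dt;
            spring.ValueRW.Velocity *= math.exp(-spring.ValueRO.Damping * dt);
            transform.ValueRW.Position += spring.ValueRW.Velocity * dt;
        }
    }
}
```

Because component data is stored in tightly packed arrays and the loop is Burst-compiled, this pattern scales to thousands of objects far better than per-GameObject MonoBehaviour updates.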

ECS performance testing: all objects have rigid bodies and colliders.