Simulated Human Eye Movement in the Metaverse

Details
  • Agency: Duke University
  • Place: USA
Advertisers:
  • Duke University is a private research university in Durham, North Carolina, founded in 1838.
Objective / insight
  • Computer engineers at the university have developed a program that uses virtual eye movements to help developers create applications for the fast-expanding metaverse while protecting user data.
Implemented strategy
  • “If you’re interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that,” said Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke. The program, called EyeSyn, produces virtual eyes that simulate the way humans look at the world accurately enough that companies can train virtual and augmented reality programs that can then be integrated into the metaverse; a toy sketch of such a gaze-based classifier follows below.
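To make the quoted example concrete, here is a minimal sketch, not Duke’s EyeSyn code, of how a classifier might distinguish reading styles from gaze alone. The feature set, the toy recording generator, and every numeric parameter are illustrative assumptions.

```python
# Illustrative sketch only: a gaze-based activity classifier in the
# spirit of the quote above. Features, toy data, and parameters are
# assumptions, not EyeSyn internals.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def gaze_features(fix):
    """Summarize a sequence of (x, y, duration_ms) fixations."""
    f = np.asarray(fix, dtype=float)
    saccades = np.linalg.norm(np.diff(f[:, :2], axis=0), axis=1)
    return [f[:, 2].mean(), f[:, 2].std(),    # dwell-time statistics
            saccades.mean(), saccades.std()]  # scan-path statistics

def toy_recording(dense_text):
    """Stand-in for an eye-tracking recording (assumed statistics):
    dense text -> longer fixations, shorter and more regular saccades."""
    n = 60
    dur = rng.normal(260 if dense_text else 180, 40, n).clip(min=50)
    steps = rng.normal(0, 20 if dense_text else 80, (n, 2))
    return np.column_stack([np.cumsum(steps, axis=0), dur])

labels = rng.integers(0, 2, 200)              # 0 = comic, 1 = dense text
X = [gaze_features(toy_recording(y)) for y in labels]
clf = RandomForestClassifier(random_state=0).fit(X, labels)
```

The intuition behind the assumed statistics is that dense text tends to produce long, regular fixations and short saccades, while scanning a comic page produces larger, more erratic jumps.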
Technology implemented
  • Eye movements are recreated virtually by analyzing the tiny movements a person’s eyes make and how much their pupils dilate. Pupil dilation reveals how we are feeling: bored, excited, angry, or at ease in a conversation. And when you talk to someone, your eyes naturally alternate between that person’s eyes, mouth and nose, lingering on each for varying amounts of time. Based on these observations, the Duke researchers created a model that extracts where those features are on a speaker’s face and programmed their virtual eyes to statistically emulate the time spent focusing on each one (a toy version of this sampling appears below). “If you give EyeSyn a lot of different inputs and run it enough times, you’ll create a data set of synthetic eye movements that is large enough to train a (machine learning) classifier for a new program,” said Maria Gorlatova.
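As a rough illustration of that statistical emulation, the sketch below samples fixations over assumed facial landmarks with assumed dwell-time distributions; none of the positions, weights, or timings come from Duke’s work.

```python
# Illustrative sketch of statistically emulated face-viewing gaze.
# Landmark positions, fixation probabilities, and dwell times are
# assumptions chosen for the example, not measured values.
import numpy as np

rng = np.random.default_rng(42)

# (normalized x, y) of each landmark and an assumed mean dwell time (ms).
REGIONS = {
    "left_eye":  ((0.35, 0.40), 400),
    "right_eye": ((0.65, 0.40), 400),
    "nose":      ((0.50, 0.55), 150),
    "mouth":     ((0.50, 0.75), 300),
}
# Assumed probability of fixating each region while watching a speaker.
WEIGHTS = [0.3, 0.3, 0.1, 0.3]

def synth_gaze(n_fixations=100):
    """Return synthetic fixations as rows of (x, y, duration_ms)."""
    names = list(REGIONS)
    out = []
    for name in rng.choice(names, size=n_fixations, p=WEIGHTS):
        (cx, cy), mean_dwell = REGIONS[name]
        x, y = rng.normal((cx, cy), 0.02)   # jitter around the landmark
        out.append((x, y, rng.exponential(mean_dwell)))
    return np.array(out)

# Many runs over varied inputs build a large synthetic training corpus.
dataset = [synth_gaze() for _ in range(1000)]
```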
Results
  • In testing, EyeSyn proved able to closely reproduce the distinct patterns of real-world gaze signals and the varied eye reactions of different people; its output was comparable to recordings from human eyes. This performance was judged good enough for companies to use as a baseline to train new metaverse platforms and software. In addition, EyeSyn’s algorithms can be customized after interacting with specific users, bringing the simulation closer to each person’s real behavior and helping commercial software achieve better results. And because this personalization can run on local systems, as sketched below, people don’t have to worry about their private eye movement data becoming part of a large database.
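Below is a minimal sketch of how such on-device personalization could work, assuming an incremental learner: a model pre-trained on synthetic gaze features is refined with a few of the user’s own samples via scikit-learn’s partial_fit, so the raw recordings never leave the device. All data here is random placeholder input.

```python
# Illustrative sketch, not EyeSyn code: pre-train on synthetic gaze
# features, then personalize on-device with incremental updates so the
# user's raw eye-movement data never leaves the local system.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
classes = np.array([0, 1])                 # e.g. two activity labels

# "Server side": pre-train on a large synthetic corpus (placeholder data).
X_synth = rng.normal(size=(5000, 4))
y_synth = rng.integers(0, 2, 5000)
model = SGDClassifier(random_state=0)
model.partial_fit(X_synth, y_synth, classes=classes)

# "Device side": refine with a handful of the user's own gaze samples.
X_user = rng.normal(size=(20, 4))
y_user = rng.integers(0, 2, 20)
model.partial_fit(X_user, y_user)          # this update stays local
```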
