
PUBLICATIONS

02 Jul 2017

MetaLimbs: metamorphosis for multiple arms interaction using artificial limbs

Los Angeles, California


If technology gave us the capability to edit or customize our body schema, could our abilities and activities be enhanced? This research proposes a novel interaction that alters the body schema through metamorphosis by artificial limb substitution. In this work, two additional robotic arms are attached to the user's body and are manipulated by leg movement. Limb control is achieved with two tracking systems: global motion tracking of the legs using an optical tracker, and local motion tracking for manipulation using a sock-type device. These data are mapped to the motion of the artificial arms, hands, and fingers. Finally, force feedback from the manipulators' touch sensors is provided to the feet.
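
The mapping described above can be pictured with a short sketch. This is a hypothetical illustration in Python, not the authors' implementation: the function names, the amplification factor, and the calibration offset are all assumed for the example.

```python
# Hypothetical sketch of the leg-to-arm mapping described above (not the
# authors' code): foot poses drive the extra arms' end effectors, toe
# flexion from the sock-type sensor drives the grippers, and the grippers'
# touch sensors are echoed back to the feet as haptic feedback.
import numpy as np

def map_leg_to_arm(foot_pos, foot_rot, calib_offset, scale=1.5):
    """Map a tracked foot pose to an end-effector target for one robotic arm.

    foot_pos: (3,) foot position from the optical tracker [m]
    foot_rot: (3, 3) foot orientation as a rotation matrix
    calib_offset: (3,) offset from the feet workspace to the arm workspace [m]
    scale: amplification so small leg motions cover the arm's reach (assumed)
    """
    target_pos = np.asarray(calib_offset) + scale * np.asarray(foot_pos)
    target_rot = np.asarray(foot_rot)          # orientation passed through 1:1
    return target_pos, target_rot

def map_toes_to_gripper(toe_flexion):
    """Map normalized toe flexion (0..1) from the sock sensor to a grip command."""
    return float(np.clip(toe_flexion, 0.0, 1.0))

def map_touch_to_feet(touch_pressure, max_pressure=5.0):
    """Map gripper touch-sensor pressure [N] to foot haptic intensity (0..1)."""
    return float(np.clip(touch_pressure / max_pressure, 0.0, 1.0))

# Example: one control tick for one of the additional arms.
pos, rot = map_leg_to_arm([0.1, -0.4, 0.2], np.eye(3), calib_offset=[0.0, 0.8, 0.3])
grip = map_toes_to_gripper(0.6)
feedback = map_touch_to_feet(2.3)
```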

18 Mar 2017

Observation of mirror reflection and voluntary self-touch enhance self-recognition for a telexistence robot

Los Angeles, California

2017 IEEE Virtual Reality (VR)

Poster: Yasuyuki Inoue, Fumihiro Kato, Mhd Yamen Saraiji, Charith Lasantha Fernando, Susumu Tachi

In this paper, we analyze the subjective feelings about the body of the operator of a telexistence system. We investigate whether a mirror reflection and self-touch affect body ownership and agency for a surrogate robot avatar in a virtual reality experiment. Results showed that the presence of tactile sensations synchronized with the view of self-touch events enhanced mirror self-recognition.

09 Aug 2015

Twech: a mobile platform to search and share visuo-tactile experiences

Los Angeles, California

SIGGRAPH 2015

Poster: Nobuhisa Hanamitsu, Kanata Nakamura, Mhd Yamen Saraiji, Kouta Minamizawa, Susumu Tachi

Twech is a mobile platform that enables users to share visuo-tactile experiences and to search other experiences by their tactile data. Using a visuo-tactile recording and display attachment for a smartphone, users can instantly record and share visuo-tactile experiences, much like posting a tweet, and re-experience shared data through visuo-motor coupling. Twech's search engine, based on a deep learning model extended to recognize tactile materials, finds experiences similar to the uploaded tactile data, such as scratched material surfaces or interactions with animals. Twech thus provides a way to share and discover haptic experiences, and users can re-experience uploaded visuo-tactile data from a cloud server.
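
As an illustration of the kind of tactile search the abstract describes, here is a hedged Python sketch: a placeholder spectral feature stands in for the deep-learning embedding, and stored experiences are ranked by cosine similarity. All names and the feature choice are assumptions, not Twech's actual pipeline.

```python
# Illustrative sketch (not Twech's implementation) of tactile similarity
# search: each recording is reduced to a feature vector (a stand-in for a
# learned embedding), and uploads are matched by cosine similarity.
import numpy as np

def tactile_embedding(accel_signal, n_bins=64):
    """Placeholder feature extractor: a normalized magnitude spectrum of the
    vibrotactile (acceleration) signal, standing in for a deep embedding."""
    spectrum = np.abs(np.fft.rfft(accel_signal, n=2 * n_bins))[:n_bins]
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

def search(query_vec, database, top_k=3):
    """Return the top_k stored experiences most similar to the query embedding."""
    scores = [(name, float(np.dot(query_vec, vec))) for name, vec in database.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

# Toy database of previously shared experiences (names are made up).
rng = np.random.default_rng(0)
db = {name: tactile_embedding(rng.normal(size=1024))
      for name in ["scratched_wood", "cat_fur", "denim", "glass"]}
query = tactile_embedding(rng.normal(size=1024))
print(search(query, db))
```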

23 Mar 2016

Changing body ownership using visual metamorphosis

Laval, France

Laval Virtual 2016

Poster: Tomoya Sasaki, MHD Yamen Saraiji, Kouta Minamizawa, Michiteru Kitazaki, Masahiko Inami

This paper presents a study of the experience of supernumerary arms in virtual reality applications. We developed a system that alters the user's body schema and motion mapping in real time as the user interacts with virtual content. The user's arms and hands are tracked and mapped onto several virtual arm instances rendered from the user's first-person view (FPV) and deviated from the physical arm positions by different angles. Participants reported a strong sense of body ownership toward the extra arms after using them to interact with and hold virtual content. Our finding is that the perception of body ownership can be altered depending on the condition used. One interesting finding of this preliminary experiment is that participants also reported strong ownership toward an arm that was not actually holding the virtual object. This study contributes to the fields of augmented bodies, multi-limb applications, and prosthetic limbs.
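
The motion mapping described above can be sketched roughly as follows; this is an assumed illustration in Python (the angular offsets and rotation axis are examples), not the study's code.

```python
# A minimal sketch of the motion-mapping idea: a single tracked hand
# position is replicated into several virtual arm instances, each rotated
# by a fixed angular offset about the user's first-person viewpoint.
import numpy as np

def rotation_about_up(angle_deg):
    """Rotation matrix about the vertical (up) axis."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), 0.0, np.sin(a)],
                     [0.0,       1.0, 0.0      ],
                     [-np.sin(a), 0.0, np.cos(a)]])

def virtual_arm_targets(hand_pos, head_pos, offsets_deg=(-30.0, 0.0, 30.0)):
    """Generate one target position per virtual arm by rotating the tracked
    hand position around the head (FPV origin) by each angular offset."""
    rel = np.asarray(hand_pos) - np.asarray(head_pos)
    return [np.asarray(head_pos) + rotation_about_up(a) @ rel for a in offsets_deg]

# Example: three virtual right arms deviated by -30, 0, and +30 degrees.
targets = virtual_arm_targets(hand_pos=[0.3, 1.2, 0.5], head_pos=[0.0, 1.6, 0.0])
```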

01 Jul 2016

Layered Telepresence: Simultaneous Multi Presence Experience using Eye Gaze based Perceptual Awareness Blending

Anaheim, California

ACM SIGGRAPH 2016 Emerging Technologies, Anaheim, USA, Article No. n, July 2016

Demonstrations, Poster, Selected: Mhd Yamen Saraiji, Shota Sugimoto, Charith Lasantha Fernando, Kouta Minamizawa, Susumu Tachi

We propose “Layered Telepresence”, a novel method of experiencing simultaneous multi-presence. The user's eye gaze and perceptual awareness are blended with real-time audio-visual information received from multiple telepresence robots. The system arranges the audio-visual information received through the robots into a priority-driven layered stack. A weighted feature map is created for each layer from the objects recognized using image-processing techniques, and the most heavily weighted layer around the user's gaze is pushed into the foreground. All other layers are pushed to the background, providing an artificial depth-of-field effect. The proposed method is not limited to robots: each layer can represent any audio-visual content, such as a video see-through HMD, a television screen, or even a PC screen, enabling true multitasking.

SIGGRAPH 2016 ETech page
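
The gaze-driven layer selection can be pictured with a small sketch. This is a simplified, assumed illustration in Python of the priority-driven stack described above; the window size, the dimming stand-in for blur, and the data shapes are not from the paper.

```python
# A hedged sketch of the layer-selection logic: each robot's video feed is
# a layer with a feature map; the layer whose features carry the most
# weight around the current gaze point is brought to the foreground, and
# the rest are pushed back (here simply dimmed as a stand-in for blur).
import numpy as np

def gaze_weight(feature_map, gaze_xy, radius=40):
    """Sum feature-map weight inside a window around the gaze point."""
    h, w = feature_map.shape
    x, y = gaze_xy
    x0, x1 = max(0, x - radius), min(w, x + radius)
    y0, y1 = max(0, y - radius), min(h, y + radius)
    return float(feature_map[y0:y1, x0:x1].sum())

def blend_layers(frames, feature_maps, gaze_xy, push_back):
    """Pick the most-weighted layer as foreground; push back all others."""
    weights = [gaze_weight(fm, gaze_xy) for fm in feature_maps]
    fg = int(np.argmax(weights))
    return [frames[i] if i == fg else push_back(frames[i]) for i in range(len(frames))], fg

# Example with random data; push_back could be a Gaussian blur in practice.
frames = [np.random.rand(480, 640, 3) for _ in range(3)]
fmaps = [np.random.rand(480, 640) for _ in range(3)]
layers, foreground = blend_layers(frames, fmaps, gaze_xy=(320, 240),
                                  push_back=lambda f: f * 0.5)
```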

01 Oct 2015

Effectiveness of Spatial Coherent Remote Drive Experience with a Telexistence Backhoe for Construction Sites

Kyoto, Japan

ICAT-EGVE 2015 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, pp. 69-75, Oct 2015

Conferences: Charith Lasantha Fernando, Mhd Yamen Saraiji, Kouta Minamizawa, Susumu Tachi

In this paper, a spatially coherent remote driving system was designed and implemented to operate a telexistence backhoe over a wireless network. We developed a 6 degrees of freedom (DOF) slave robot that mimics human upper-body movement; a cockpit with a motion tracking system and a Head Mounted Display (HMD), through which the operator receives HD720p ultra-low-latency video and audio feedback from the remote backhoe; and a controller to manipulate the remote backhoe. Spatially coherent driving could allow operators to manipulate heavy machinery without prior training and to perform operations as if they were operating the machinery locally. Moreover, construction work could proceed uninterrupted (24/7) with operators remotely logging in from all over the world. This paper describes the design requirements of the telexistence backhoe, followed by several field experiments carried out to verify the effectiveness of the spatially coherent remote driving experience on construction sites.
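
A teleoperation cockpit of this kind needs a mapping from the operator's tracked motion to the slave robot's joints; the sketch below illustrates the idea in Python with assumed joint names and limits, and is not the system's actual control code.

```python
# Illustrative sketch: the operator's tracked head orientation is clamped
# to the slave robot's joint limits and sent each control tick so the
# remote camera head mimics the operator's movement.
JOINT_LIMITS = {"yaw": (-90, 90), "pitch": (-40, 40), "roll": (-30, 30)}  # assumed

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def head_pose_to_command(yaw_deg, pitch_deg, roll_deg):
    """Convert tracked head angles into a joint command within robot limits."""
    return {
        "yaw": clamp(yaw_deg, *JOINT_LIMITS["yaw"]),
        "pitch": clamp(pitch_deg, *JOINT_LIMITS["pitch"]),
        "roll": clamp(roll_deg, *JOINT_LIMITS["roll"]),
    }

# One control tick: this command would be streamed over the wireless link
# alongside the low-latency video/audio coming back from the backhoe.
cmd = head_pose_to_command(35.0, -50.0, 5.0)   # pitch gets clamped to -40
```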

01 Oct 2015

Development of Mutual Telexistence System using Virtual Projection of Operator’s Egocentric Body Images

Kyoto, Japan

ICAT-EGVE 2015 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, Kyoto, Japan. pp. 125-132 (Honorable Mention & Best Demo Award)

Conferences, Demonstrations, Selected: Mhd Yamen Saraiji, Charith Lasantha Fernando, Kouta Minamizawa, Susumu Tachi

In this paper, a mobile telexistence system that provides mutual embodiment of the user's body in a remote place is discussed. A fully mobile slave robot was designed and developed to deliver visual and motion mapping with the user's head and body. The user accesses the robot remotely using a Head Mounted Display (HMD) and a set of head trackers. This system addresses three main points: the representation of the user's body in the remote physical environment, the preservation of body ownership during teleoperation, and the presentation of the user's body interactions and visuals to the remote side. These points were addressed by virtually projecting the user's body into the egocentric local view and by projecting the body visuals remotely. The system is intended for teleconferencing and remote social activities where no physical manipulation is required.

01 Mar 2015

Transparent Cockpit using Telexistence

Arles, France

2015 IEEE Virtual Reality (VR), Arles, France. pp. 311-312

Poster: Takura Yanagi, Charith Lasantha Fernando, MHD Yamen Saraiji, Kouta Minamizawa, Susumu Tachi, Norimasa Kishi

We propose an indirect-vision, video-see-through augmented reality (AR) cockpit that uses telexistence technology to provide an AR-enriched, virtually transparent view of the surroundings through monitors instead of windows. Such a virtual view has the potential to improve driving performance and experience beyond conventional glass and head-up-display-equipped cockpits by combining AR overlays with images obtained by future image sensors that are superior to human eyes. As a proof of concept, we replaced the front windshield of an experimental car with a large stereoscopic monitor. A robotic stereo camera pair that mimics the driver's head motions provides stereoscopic images with seamless motion parallax to the monitor. Initial driving tests at moderate speeds on roads within our research facility confirmed the illusion of transparency. We will conduct human factors evaluations after implementing AR functions to determine whether an overall benefit over conventional cockpits can be achieved despite possible conceptual issues such as latency, viewpoint shift, and the short distance between driver and display.

01 Mar 2015

Telexistence drone: design of a flight telexistence system for immersive aerial sports experience

Singapore

Proceedings of the 6th Augmented Human International Conference, Singapore. pp. 171-172

Poster: Hirohiko Hayakawa, Charith Lasantha Fernando, MHD Yamen Saraiji, Kouta Minamizawa, and Susumu Tachi

In this paper, a new sports genre, “Aerial Sports”, is introduced, in which humans and robots collaborate to enjoy the air as a whole new field. By coupling a flight unit with the user's voluntary motion, anyone can enjoy going beyond physical limitations such as height and physique. The user dives into the drone by wearing an HMD and experiences binocular stereoscopic visuals and the sensation of flight while using his or her limbs effectively. This paper explains the requirements and design steps for synchronizing visual information and physical motion in a flight system, mainly for the aerial sports experience. The requirements can also be adapted to purposes such as search and rescue or entertainment, where coupled body motion is advantageous.

27 Mar 2015

Virtual Embodied Telexistence: Telecommunication using Sensory Feedback and Virtual Body Representation

Japan

My Master's Thesis in Media Design

Theses: MHD Yamen Saraiji

With the advancement of Internet-based services, the demand for better communication tools that bridge the distance between people has grown, especially in our daily activities when we travel abroad. Traditional video and audio communication tools have become readily available thanks to advances in smartphones. Internet-based conferencing such as “Google Hangouts” made them even more accessible through web-based tools, allowing two or more participants to engage in a social meeting. However, there remained a demand for social tools that allow participants to move freely in the remote place. On one hand, telepresence robots stepped in to fill that need and opened new services to consumers, such as remotely visiting museums or attending exhibitions and conferences. On the other hand, telexistence systems provide more sophisticated capabilities by replicating several functions of the user's body in the remote place using mechanical parts. “Virtual Embodied Telexistence” is a system that combines a low-cost telexistence system with virtually embodied human functions, allowing the user to have immersive visual feedback of the remote place while remaining aware of his or her body through a visual representation of it.
“Virtual Embodiment”, as proposed in this thesis, defines the essential elements for replacing the physical representation of the human body with a virtual representation, as a way to increase the sense of presence in a different place. Increasing the sense of body presence in telecommunication helps the user act more naturally and intuitively using body motion.
This thesis explores the proposed concept of Virtual Embodiment for the human body and its tight link with presence. The design flow for realizing a virtually embodied telexistence system is discussed, as well as the implementation procedure. We show the efficiency of the system through objective user evaluation and how intuitive it is for any user.

01 Mar 2015

Mutual hand representation for telexistence robots using projected virtual hands

Singapore

Proceedings of the 6th Augmented Human International Conference, Singapore. pp. 221-222 (Best Demo Award)

Demonstrations, Selected: MHD Yamen Saraiji, Charith Lasantha Fernando, Kouta Minamizawa, and Susumu Tachi

In this paper, a mutual body representation for telexistence robots that do not have physical arms is discussed. We propose a method of superimposing the user's hands virtually so that they are visible not only to the user through an HMD, but also to remote participants, by projecting the virtual hand images into the remote environment with a small projector aligned with the robot's eyes. These virtual hands are produced by capturing the user's hands from the first-person view (FPV) and segmenting them from the background. This method expands the physical body representation of the user and allows mutual body communication between the user and remote participants, while providing a better understanding of the user's hand motion and intended interactions in the remote place.
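
The capture-segment-superimpose pipeline described above can be sketched as follows. This is a rough Python illustration under stated assumptions: the brightness threshold is only a stand-in for the real hand segmentation, and the blending is a plain alpha composite.

```python
# A minimal sketch of the pipeline: the user's hands are captured from a
# first-person camera, separated from the background (simple threshold as
# a placeholder), and the resulting layer is alpha-blended over the HMD
# view and sent to the robot-side projector.
import numpy as np

def segment_hands(fpv_frame, threshold=0.6):
    """Return an alpha mask (1 where hands, 0 elsewhere). The threshold rule
    is only a placeholder for the actual background-segmentation step."""
    brightness = fpv_frame.mean(axis=2)
    return (brightness > threshold).astype(np.float32)

def superimpose(base_view, hand_frame, alpha_mask):
    """Alpha-blend the segmented hand layer over the robot's camera view."""
    a = alpha_mask[..., None]
    return (1.0 - a) * base_view + a * hand_frame

fpv = np.random.rand(480, 640, 3)          # user's first-person hand camera
robot_view = np.random.rand(480, 640, 3)   # video from the robot's eyes
mask = segment_hands(fpv)
hmd_image = superimpose(robot_view, fpv, mask)   # shown to the user
projector_image = fpv * mask[..., None]          # projected at the remote side
```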

01 Dec 2014

Enforced telexistence: teleoperating using photorealistic virtual body and haptic feedback

Shenzhen, China

SIGGRAPH Asia 2014 Emerging Technologies, Shenzhen, China, Article 7, pp. 2.

Demonstrations: Mhd Yamen Saraiji, Charith Lasantha Fernando, Yusuke Mizushina, Youichi Kamiyama, Kouta Minamizawa, Susumu Tachi

Telexistence systems require physical limbs for remote object manipulation. Having arms and hands synchronized with voluntary movements allows the user to feel the robot's body as his or her own through visual and haptic sensation. Here we introduce a novel technique that provides virtual arms for existing telexistence systems that do not have physical arms. Previous works [Mine et al. 1997; Poupyrev et al. 1998; Nedel et al. 2003] studied the use of virtual representations of the user's hands for interaction in virtual environments. In this work, the virtual arms serve several interactions in a physical remote environment and, most importantly, provide the user with a sense of existence in that remote environment. The superimposed virtual arms follow the user's arm movements in real time and react to the dynamic lighting of the real environment, providing photorealistic rendering that adapts to the remote place's lighting. Thus, the user can experience embodied enforcement toward the remote environment. Furthermore, the virtual arms can be extended to touch and feel unreachable remote objects, and to grab a functional virtual copy of a physical instance where device control is possible. This method not only allows the user to experience a non-existing arm in telexistence, but also gives the ability to enforce the remote environment in various ways.
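
One way to picture the lighting adaptation mentioned above is the following rough Python sketch; the ambient-light estimate and shading are simplifications assumed for illustration, not the authors' photorealistic renderer.

```python
# A rough sketch of lighting-adaptive virtual arms: estimate an ambient
# light color from the incoming remote camera frame and use it to modulate
# the virtual arm's base color before compositing it over the video.
import numpy as np

def estimate_ambient(frame, top_fraction=0.25):
    """Estimate ambient light color from the upper part of the remote frame
    (a crude stand-in for a proper light probe or image-based estimate)."""
    h = frame.shape[0]
    region = frame[: int(h * top_fraction)]
    return region.reshape(-1, 3).mean(axis=0)        # RGB in 0..1

def shade_virtual_arm(arm_albedo, ambient_rgb, exposure=1.2):
    """Tint the virtual arm's base color by the estimated remote lighting."""
    return np.clip(arm_albedo * ambient_rgb * exposure, 0.0, 1.0)

remote_frame = np.random.rand(720, 1280, 3)          # frame from the robot
skin_albedo = np.array([0.85, 0.65, 0.55])           # hypothetical arm color
lit_color = shade_virtual_arm(skin_albedo, estimate_ambient(remote_frame))
```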

01 Aug 2014

Enforced telexistence

Vancouver, Canada

ACM SIGGRAPH 2014 Posters, Vancouver, Canada, Article No. 49.

Poster: MHD Yamen Saraiji, Yusuke Mizushina, Charith Lasantha Fernando, Masahiro Furukawa, Youichi Kamiyama, Kouta Minamizawa, and Susumu Tachi

Telexistence systems require physical limbs for remote object manipulation. Having arms and hands synchronized with voluntary movements allows the user to feel the robot's body as his or her own through visual and haptic sensation. Here we introduce a novel technique that provides virtual arms for existing telexistence systems that do not have physical arms. Previous works [Mine et al. 1997; Poupyrev et al. 1998; Nedel et al. 2003] studied the use of virtual representations of the user's hands for interaction in virtual environments. In this work, the virtual arms serve several interactions in a physical remote environment and, most importantly, provide the user with a sense of existence in that remote environment. The superimposed virtual arms follow the user's arm movements in real time and react to the dynamic lighting of the real environment, providing photorealistic rendering that adapts to the remote place's lighting. Thus, the user can experience embodied enforcement toward the remote environment. Furthermore, the virtual arms can be extended to touch and feel unreachable remote objects, and to grab a functional virtual copy of a physical instance where device control is possible. This method not only allows the user to experience a non-existing arm in telexistence, but also gives the ability to enforce the remote environment in various ways.

01 Dec 2013

Real-time egocentric superimposition of operator’s own body on telexistence avatar in virtual environment

Tokyo, Japan

23rd International Conference on Artificial Reality and Telexistence (ICAT) 2013

Conferences: MHD Yamen Saraiji, Charith Lasantha Fernando, Masahiro Furukawa, Kouta Minamizawa, Susumu Tachi


01 Dec 2012

Virtual Telesar – Designing and Implementation of a Modular Based Immersive Virtual Telexistence Platform

Fukuoka, Japan

IEEE/SICE International Symposium on System Integration (SII) 2012, pp. 595-598.

Conferences: MHD Yamen Saraiji, Charith Lasantha Fernando, Masahiro Furukawa, Kouta Minamizawa, Susumu Tachi

In this paper, we focus on designing a customizable, modular virtual platform for modeling, simulating, and testing telexistence applications, where the physical parameters are preserved in the virtual environment, both for motor control of the robot's physical characteristics and for sensory feedback of vision, audition, and haptics. We propose “Virtual Telesar”, which allows telexistence engineers to model a prototype system before manufacturing it and to experience how the final model will look and how it will perform manipulations in the real world without building it. The platform offers three features: first, the user can define a robot using predefined modular components; second, the user can customize and tune parameters; third, the user can have an immersive experience of operating the robot with visual, auditory, and haptic sensation. In this paper, we describe the design concept of the Virtual Telesar platform and report modeling results based on a physical robot, as well as the results of an immersive experience with it.
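
The modular workflow described above can be sketched as a small data model. The sketch below is an assumed illustration in Python; the module catalog, names, and parameters are invented for the example and are not the platform's actual API.

```python
# A minimal sketch of "define a robot from predefined modular components":
# modules carry their degrees of freedom and tunable physical parameters,
# and a prototype robot is assembled and tuned before any manufacturing.
from dataclasses import dataclass, field

@dataclass
class Module:
    name: str                                     # e.g. "head", "arm", "torso"
    dof: int                                      # degrees of freedom
    params: dict = field(default_factory=dict)    # tunable physical parameters

@dataclass
class VirtualRobot:
    name: str
    modules: list

    def total_dof(self) -> int:
        return sum(m.dof for m in self.modules)

# Assemble a prototype from a hypothetical module catalog, then tune it.
catalog = {
    "torso_6dof": Module("torso", 6, {"height_m": 0.6}),
    "head_3dof": Module("head", 3, {"ipd_mm": 65}),
    "arm_7dof": Module("arm", 7, {"reach_m": 0.75, "payload_kg": 1.0}),
}
prototype = VirtualRobot("telesar_prototype",
                         [catalog["torso_6dof"], catalog["head_3dof"],
                          catalog["arm_7dof"], catalog["arm_7dof"]])
prototype.modules[1].params["ipd_mm"] = 63     # tune a parameter
print(prototype.total_dof())                   # 23 DOF in this configuration
```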