MYamenS PERSONAL SITE
.01

ABOUT

PERSONAL DETAILS
2-2-A804 Aomi, Koto-ku, Tokyo 135-0064, Japan
yamen@kmd.keio.ac.jp
+81 80 3442 5775
Hello! I am a PhD student at the Graduate School of Media Design, Keio University, Japan. I am passionate about interactive media and Computer-Human Interaction applications. Feel free to get in touch with me!

BIO

ABOUT ME

After receiving my bachelor's degree in Computer Science in 2010, I worked for two years as a senior programmer, first at a game development company and then at a major telecommunications company in Syria (SyriaTel). In 2012, I continued my education at Keio University in Japan, receiving my master's degree in Media Design in 2015, and I am now pursuing a PhD at the same graduate school. I have many interests, but my research currently focuses on topics related to Computer-Human Interaction, telepresence/telexistence, and mutual communication experiences across remote locations.

HOBBIES

INTERESTS

I love photography, hiking, and traveling whenever I have time!

FACTS

FACTS ABOUT ME

.02

RESUME

EDUCATION
  • 2005
    2010
    SYRIA

    BACHELOR DEGREE IN COMPUTER SCIENCE

    DAMASCUS UNIVERSITY

    Received my degree in Computer Science, majoring in Artificial Intelligence, from Damascus University.
  • 2013
    2015
    JAPAN

    MASTER DEGREE IN MEDIA DESIGN

    KEIO UNIVERSITY

    I received my master's degree in Media Design from the Graduate School of Media Design, Keio University, with a GPA equivalent to 3.98/4.33, and was selected as one of the Dean's List students upon graduation.
  • 2015
    2018 (expected)
    JAPAN

    PHD STUDENT

    KEIO UNIVERSITY

    Majoring in Media Design at Keio University; my current research activities involve haptics, telexistence, telepresence, and virtual/augmented reality.
ACADEMIC AND PROFESSIONAL POSITIONS
  • 2012
    JAPAN

    GRADUATE STUDENT RESEARCHER

    GRADUATE SCHOOL OF MEDIA DESIGN - KEIO UNIVERSITY

    Worked at Tachi-lab as a research student, mainly tackling projects related to telexistence.
  • 2013
    NOW
    JAPAN

    RESEARCH ASSISTANT

    GRADUATE SCHOOL OF MEDIA DESIGN - KEIO UNIVERSITY

  • 2009
    2011
    SYRIA

    SENIOR PROGRAMMER, SYSTEM ANALYST

    JOY BOX

    System designer, analyst, and programmer at Joy Box, a game development company. Worked with professionals, including administrators, 2D/3D designers, and programmers, on various projects related to entertainment, education, and media.
  • 2011
    2012
    SYRIA

    SENIOR PROGRAMMER, SYSTEM ANALYST

    SYRIATEL

    - Communicating with clients and stakeholders to understand their requirements.
    - Database development and data migration.
    - Developing both the front end and back end of the software.
    - Developing reporting tools for the stakeholders.
    - Communicating with upper management.
HONORS AND AWARDS
  • 2012
    2018
    JAPAN

    MEXT SCHOLARSHIP

    MINISTRY OF EDUCATION IN JAPAN

    Fully paid scholarship covering master's and PhD degrees at Japanese universities.
  • 2015
    SINGAPORE

    BEST DEMO AWARD

    AUGMENTED HUMAN 2015

    Received the Best Demo Award at the International Conference on Augmented Human 2015 for the project titled "Mutual Hand Representation for Telexistence Robots Using Projected Virtual Hands".
  • 2015
    JAPAN

    DEAN'S LIST

    KEIO MEDIA DESIGN

    Honoured to be selected for the Dean's List of my graduate school.
  • 2015
    KYOTO - JAPAN

    BEST DEMO AWARD & HONORABLE MENTION AT ICAT'15

    EUROGRAPHICS - ICAT2015

    Won two awards at the 25th International Conference on Artificial Reality and Telexistence, held in Kyoto, Japan.
    - Honorable Mention: "Development of Mutual Telexistence System using Virtual Projection of Operator's Egocentric Body Images", Mhd Yamen Saraiji, Charith Lasantha Fernando, Kouta Minamizawa and Susumu Tachi.
    - Best Demo: "Development of Mutual Telexistence System using Virtual Projection of Operator's Egocentric Body Images", Mhd Yamen Saraiji, Charith Lasantha Fernando, Kouta Minamizawa and Susumu Tachi.
  • 2015
    TOKYO - JAPAN

    PEPPER APP CHALLENGE - BEST AWARD

    SOFTBANK ROBOTICS

    Received the Best Award and the Best Care and Welfare Prize for the HUG Project.
  • 2016
    ANAHEIM - USA

    STUDENT RESEARCH COMPETITION - GOLD PRIZE

    ACM SIGGRAPH 2016

    Received the first prize in ACM's Student Research Competition, Graduate category, at SIGGRAPH 2016.
.03

PUBLICATIONS

PUBLICATIONS LIST
01 Jul 2016

Layered Telepresence: Simultaneous Multi Presence Experience using Eye Gaze based Perceptual Awareness Blending

ANAHEIM - USA

ACM SIGGRAPH 2016 Emerging Technologies, Anaheim, USA, Article No. n, July 2016 (to appear)

Demonstrations, Poster (Selected): Mhd Yamen Saraiji, Shota Sugimoto, Charith Lasantha Fernando, Kouta Minamizawa, Susumu Tachi

We propose “Layered Telepresence”, a novel method of experiencing simultaneous multi-presence. The user's eye gaze and perceptual awareness are blended with real-time audio-visual information received from multiple telepresence robots. The system arranges the audio-visual information received through the robots into a priority-driven layered stack. A weighted feature map is created for each layer, based on the objects recognized using image-processing techniques, and the most heavily weighted layer around the user's gaze is pushed into the foreground. All other layers are pushed back to the background, providing an artificial depth-of-field effect. The proposed method not only works with robots; each layer could also represent any audio-visual content, such as a video see-through HMD, a television screen, or even your PC screen, enabling true multitasking.

SIGGRAPH 2016 ETech page

01 Oct 2015

Effectiveness of Spatial Coherent Remote Drive Experience with a Telexistence Backhoe for Construction Sites

KYOTO-JAPAN

ICAT-EGVE 2015 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, pp. 69-75, Oct 2015

Conferences: Charith Lasantha Fernando, Mhd Yamen Saraiji, Kouta Minamizawa, Susumu Tachi

In this paper, a spatial coherent remote driving system was designed and implemented to operate a telexistence backhoe over a wireless network. Accordingly, we developed a 6 degrees of freedom (DOF) slave robot that can mimic the human upper body movement; a cockpit with a motion tracking system and a Head Mounted Display (HMD), where the operator was provided with HD720p ultra-low-latency video and audio feedback from the remote backhoe; and a controller to manipulate the remote backhoe. Spatial coherent driving could help operators manipulate heavy machinery without any prior training and perform operations as if they were operating the machinery locally. Moreover, construction work could be performed uninterrupted (24/7) by operators logging in remotely from all over the world. This paper describes the design requirements of the telexistence backhoe, followed by several field experiments carried out to verify the effectiveness of the spatial coherent remote driving experience in construction sites.

01 Oct 2015

Development of Mutual Telexistence System using Virtual Projection of Operator’s Egocentric Body Images

KYOTO-JAPAN

ICAT-EGVE 2015 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, Kyoto, Japan. pp. 125-132 (Honorable Mention & Best Demo Award)

Conferences, Demonstrations (Selected): Mhd Yamen Saraiji, Charith Lasantha Fernando, Kouta Minamizawa, Susumu Tachi

In this paper, a mobile telexistence system that provides mutual embodiment of the user's body in a remote place is discussed. A fully mobile slave robot was designed and developed to deliver visual and motion mapping with the user's head and body. The user can access the robot remotely using a Head Mounted Display (HMD) and a set of head trackers. This system addresses three main points: the user's body representation in a remote physical environment, preserving body ownership toward the user during teleoperation, and presenting the user's body interactions and visuals on the remote side. These three points were addressed using virtual projection of the user's body into the egocentric local view, and projecting body visuals remotely. This system is intended for teleconferencing and remote social activities where no physical manipulation is required.

01 Mar 2015

Transparent Cockpit using Telexistence

ARLES - FRANCE

2015 IEEE Virtual Reality (VR), Arles, France. pp. 311-312

Poster: Takura Yanagi, Charith Lasantha Fernando, MHD Yamen Saraiji, Kouta Minamizawa, Susumu Tachi, Norimasa Kishi

We propose an indirect-vision, video-see-through augmented reality (AR) cockpit that uses telexistence technology to provide an AR-enriched, virtually transparent view of the surroundings through monitors instead of windows. Such a virtual view has the potential to enhance driving performance and experience beyond conventional glass and head-up-display-equipped cockpits by combining AR overlays with images obtained by future image sensors that are superior to human eyes. As a proof of concept, we replaced the front windshield of an experimental car with a large stereoscopic monitor. A robotic stereo camera pair that mimics the driver's head motions provides stereoscopic images with seamless motion parallax to the monitor. Initial driving tests at moderate speeds on roads within our research facility confirmed the illusion of transparency. We will conduct human-factors evaluations after implementing AR functions in order to show whether it is possible to achieve an overall benefit over conventional cockpits in spite of possible conceptual issues such as latency, shift of viewpoint, and the short distance between driver and display.

01 Mar 2015

Telexistence drone: design of a flight telexistence system for immersive aerial sports experience

SINGAPORE

Proceedings of the 6th Augmented Human International Conference, Singapore. pp. 171-172

Poster: Hirohiko Hayakawa, Charith Lasantha Fernando, MHD Yamen Saraiji, Kouta Minamizawa, and Susumu Tachi

In this paper, a new sports genre, “Aerial Sports”, is introduced, in which humans and robots collaborate to enjoy space as a whole new field. By integrating a flight unit with the user's voluntary motion, everyone can enjoy crossing physical limitations such as height and physique. The user can dive into the drone by wearing an HMD and experience the provided binocular stereoscopic visuals and the sensation of flight, using his limbs effectively. The requirements and design steps for synchronizing visual information and physical motion in a flight system are explained, mainly for the aerial sports experience. These requirements can also be adapted to purposes such as search and rescue or entertainment, where the coupled body motion has advantages.

27 Mar 2015

Virtual Embodied Telexistence: Telecommunication using Sensory Feedback and Virtual Body Representation

JAPAN

My Master's thesis in Media Design

Theses: MHD Yamen Saraiji

With the advancements in Internet-based services, the demand for better communication tools has grown in order to close the distance gaps between people, especially in our daily activities when we are traveling abroad. Traditional video/audio communication tools became readily available thanks to smartphone advancements. Internet-based conferencing such as “Google Hangouts” made them even more accessible via web-based tools, allowing two or more participants to engage in a social meeting. However, there remained a demand for social tools that allow participants to move freely in the remote place. On one hand, telepresence robots stepped into the scene to fill that need, and also opened new services to consumers, such as remotely visiting museums or attending exhibitions and conferences. On the other hand, telexistence systems provided more sophisticated capabilities by replicating several functions of the user's body in the remote place using mechanical parts. “Virtual Embodied Telexistence” is a system that combines a low-cost telexistence system with virtually embodied human functions, allowing the user to have immersive visual feedback of the remote place while remaining aware of his body via a visual representation of it.
“Virtual Embodiment”, proposed in this thesis, defines the important elements for replacing the physical representation of the human body with a virtual representation, as a way to increase the sense of presence in a different place. Increasing the sense of body presence in telecommunication helps the user act more naturally and intuitively using his body motion.
This thesis explores the proposed concept of Virtual Embodiment for the human body and its tight link with presence. The design flow to realize a virtually embodied telexistence system is discussed, as well as the implementation procedure. We show the efficiency of the system through objective user evaluation, and how intuitive it is for any user.

01 Mar 2015

Mutual hand representation for telexistence robots using projected virtual hands

SINGAPORE

Proceedings of the 6th Augmented Human International Conference, Singapore. pp. 221-222 (Best Demo Award)

Demonstrations (Selected): MHD Yamen Saraiji, Charith Lasantha Fernando, Kouta Minamizawa, and Susumu Tachi

In this paper, a mutual body representation for telexistence robots that do not have physical arms is discussed. We propose a method of projecting the user's hands as a virtual superimposition that is visible not only to the user through an HMD, but also to remote participants, by projecting virtual hand images into the remote environment with a small projector aligned with the robot's eyes. These virtual hands are produced by capturing the user's hands from the first-person view (FPV) and segmenting them from the background. This method expands the physical body representation of the user and allows mutual body communication between the user and remote participants, while providing a better understanding of the user's hand motion and intended interactions in the remote place.

01 Dec 2014

Enforced telexistence: teleoperating using photorealistic virtual body and haptic feedback

SHENZHEN - CHINA

SIGGRAPH Asia 2014 Emerging Technologies, Shenzhen, China, Article 7, pp. 2.

Demonstrations: Mhd Yamen Saraiji, Charith Lasantha Fernando, Yusuke Mizushina, Youichi Kamiyama, Kouta Minamizawa, Susumu Tachi

Telexistence systems require physical limbs for remote object manipulation. Having arms and hands synchronized with voluntary movements allows the user to feel the robot's body as his own through visual and haptic sensation. Here, we introduce a novel technique that provides virtual arms for existing telexistence systems that do not have physical arms. Previous works [Mine et al. 1997; Poupyrev et al. 1998; Nedel et al. 2003] studied the use of virtual representations of the user's hands for interaction in virtual environments. In this work, the virtual arms serve several interactions in a physical remote environment and, most importantly, provide the user a sense of existence in that remote environment. These superimposed virtual arms follow the user's real-time arm movements and react to the dynamic lighting of the real environment, providing photorealistic rendering that adapts to the remote place's lighting. Thus, the user can experience embodied enforcement toward the remote environment. Furthermore, these virtual arms can be extended to touch and feel unreachable remote objects, and to grab a functional virtual copy of a physical instance where device control is possible. This method not only allows the user to experience a non-existing arm in telexistence, but also gives the ability to enforce the remote environment in various ways.

01 Aug 2014

Enforced telexistence

VANCOUVER - CANADA

ACM SIGGRAPH 2014 Posters, Vancouver, Canada, Article No 49.

Poster: MHD Yamen Saraiji, Yusuke Mizushina, Charith Lasantha Fernando, Masahiro Furukawa, Youichi Kamiyama, Kouta Minamizawa, and Susumu Tachi

Telexistence systems require physical limbs for remote object manipulation. Having arms and hands synchronized with voluntary movements allows the user to feel the robot's body as his own through visual and haptic sensation. Here, we introduce a novel technique that provides virtual arms for existing telexistence systems that do not have physical arms. Previous works [Mine et al. 1997; Poupyrev et al. 1998; Nedel et al. 2003] studied the use of virtual representations of the user's hands for interaction in virtual environments. In this work, the virtual arms serve several interactions in a physical remote environment and, most importantly, provide the user a sense of existence in that remote environment. These superimposed virtual arms follow the user's real-time arm movements and react to the dynamic lighting of the real environment, providing photorealistic rendering that adapts to the remote place's lighting. Thus, the user can experience embodied enforcement toward the remote environment. Furthermore, these virtual arms can be extended to touch and feel unreachable remote objects, and to grab a functional virtual copy of a physical instance where device control is possible. This method not only allows the user to experience a non-existing arm in telexistence, but also gives the ability to enforce the remote environment in various ways.

01 Dec 2013

Real-time egocentric superimposition of operator’s own body on telexistence avatar in virtual environment

TOKYO - JAPAN

23rd International Conference on Artificial Reality and Telexistence (ICAT) 2013

Conferences: MHD Yamen Saraiji, Charith Lasantha Fernando, Masahiro Furukawa, Kouta Minamizawa, Susumu Tachi


01 Dec 2012

Virtual Telesar – Designing and Implementation of a Modular Based Immersive Virtual Telexistence Platform

FUKUOKA - JAPAN

IEEE Virtual Reality (VR) 2013, pp. 595-598.

Conferences: MHD Yamen Saraiji, Charith Lasantha Fernando, Masahiro Furukawa, Kouta Minamizawa, Susumu Tachi

In this paper, we focus on designing a customizable, modular virtual platform for modeling, simulating, and testing telexistence applications, where the physical parameters are preserved in the virtual environment, for motor control of the robot's physical characteristics as well as sensory feedback of vision, audition, and haptics. We propose “Virtual Telesar”, which allows telexistence engineers to model a prototype system before manufacturing it and to experience how the final model will look, perform manipulations, etc., in the real world without building it. The platform provides three features: first, the user can define a robot using predefined modular components; second, the user can customize and tune parameters; third, the user can have an immersive experience of operating the robot with visual, auditory, and haptic sensation. We describe the design concept of the Virtual Telesar platform and report modeling results based on a physical robot, along with the results of an immersive experience with it.

.05

TEACHING

CURRENT
  • 2015
    Japan

    Teaching Assistant

    Graduate School of Media Design, Keio University

    Worked as a TA for the Embodied Media class, supporting students in their class projects as a mentor.
HISTORY
  • 2012
    2012
    Syria - Damascus

    Teacher

    Syrian Computer Society (SCS)

    - Introduced young students to programming languages.
    - Provided them with the skills required to analyze real-life problems and propose algorithmic solutions.
    - Prepared them to participate in international programming competitions.
.06

SKILLS

PROGRAMMING SKILLS
System Engineering >
LEVEL : ADVANCED EXPERIENCE : >10 YEARS
C++ C# Python
Game Development >
LEVEL : INTERMEDIATE EXPERIENCE : 4 YEARS
Unity 3D
DESIGN SKILLS
3D/2D Design >
LEVEL : INTERMEDIATE EXPERIENCE : 5 YEARS
Photoshop Maya 3D Studio Max
.07

PROJECTS

MY PORTFOLIO
Number of projects: 10
PhD / Research

Layered Presence

“Layered Telepresence” is a novel method of experiencing simultaneous multi-presence. The user's eye gaze and perceptual awareness are blended with real-time audio-visual information received from multiple telepresence robots. The system arranges the audio-visual information received through the robots into a priority-driven layered stack. A weighted feature map is created for each layer, based on the objects recognized using image-processing techniques, and the most heavily weighted layer around the user's gaze is pushed into the foreground. All other layers are pushed back to the background, providing an artificial depth-of-field effect. The proposed method not only works with robots; each layer could also represent any audio-visual content, such as a video see-through HMD, a television screen, or even your PC screen, enabling true multitasking.
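As a rough illustration only (the actual system builds its weighted feature maps with image-processing techniques; the layer names, weights, and gaze-window radius below are invented), the gaze-driven layer selection could be sketched like this:

```python
# Hypothetical sketch of the priority-driven layer selection described above.
# All data here is illustrative, not the real system's implementation.

def select_foreground(layers, gaze_xy, radius=50):
    """Pick the layer whose weighted feature map scores highest around
    the user's gaze point; all other layers become background."""
    best, best_score = None, float("-inf")
    gx, gy = gaze_xy
    for layer in layers:
        # Sum feature-map weights inside a window around the gaze point.
        score = sum(
            w
            for (x, y), w in layer["feature_map"].items()
            if abs(x - gx) <= radius and abs(y - gy) <= radius
        )
        if score > best_score:
            best, best_score = layer, score
    return best

layers = [
    {"name": "robot_A", "feature_map": {(100, 120): 0.9, (400, 300): 0.2}},
    {"name": "robot_B", "feature_map": {(105, 118): 0.4}},
]
fg = select_foreground(layers, gaze_xy=(102, 119))
print(fg["name"])  # robot_A: weight 0.9 vs 0.4 near this gaze point
```

In the real system, the selected layer would then be composited in front while the rest are blurred into the background for the depth-of-field effect.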

Gold Prize – ACM Student Research Competition at SIGGRAPH 2016, Anaheim
Best Aural Presentation – Virtual Reality Society of Japan, September 2016 (Japanese)

SIGGRAPH 2016 ETech page

Research

Telexistence Surveillance Project

Spatial coherent remote driving experience for disaster sites under hazardous conditions.
Collaborative work with Obayashi Corporation; presented at ICAT 2015, ROBOMECH 2015, and VRSJ 2015 OS.

Research

Telexistence Marathon Runner

In collaboration with ASICS Corporation, we demonstrated real-time 360° video and audio streaming from a marathon runner on February 28th at the ASICS Store Tokyo. The project was initiated together with a former KMD Master's student, Yusuke Mizushina, who currently works on ASICS Corporation's Wearable Device Business Operation Team, Corporate Strategy Department.

Delivering ultra-low-latency 360° video and audio content over a conventional LTE mobile network was powered by WebRTC technologies from NTT Communications' SkyWay service.

For more details, please refer to the Press Release (Japanese only).

DESIGNPhD

HUG – Connecting those you love over the distance

The idea emerged from Mr. Norikazu Takagi (from Ducklings Japan), who wanted to let his grandmother, a resident of Nagoya, join an important personal event: his wedding ceremony, held in Tokyo. However, due to her medical condition, she could not attend the ceremony physically.


So, in a cross-collaboration between our research group at Keio Media Design, CREATIVE ORCA, and FOVE, we designed a user-friendly system to let her “virtually” attend the ceremony. To realize this idea, we used the concept of telepresence, with a Head Mounted Display providing immersive stereo visuals of the event from the remote site. At the ceremony, we used a Pepper robot as the platform to build our system on. A custom 60 FPS stereo camera set was added to the robot's head, and a low-latency stereo video stream was sent to Nagoya (<180 ms). The user (the grandmother) controls the robot's head using her eye movements, and can make simple hand gestures using a joystick.


During the event, the grandmother's reactions were priceless! She almost cried when she saw the bride and her grandson at the ceremony, and she expressed her feelings, saying, “I felt as if I were there with them.”
This type of feedback is what keeps me going and doing what I do!

The event was covered by NHK TV and broadcast across Japan.

Best Award and Best Care and Welfare Prize at SoftBank's Pepper App Challenge 2015 Winter (Japanese)


DESIGN

Microsoft Design Challenge: Hacking Mars

“The routine of being stuck on Mars can get dull quickly, especially in the situation our character finds himself in. However, with the “NOVA” cards (a set of devices that he can scrap together from available equipment), he can connect to his loved ones by capturing and sharing single moments of his emotional state, which include an image and a haptic recording of his heartbeat. In return, his loved ones can stay connected with him by doing the same, sharing their moments and experiences to reassure him that they are waiting for his return.
The tangibility of the “NOVA” cards lets our character hang up a dedicated card for each of his loved ones, much like a traditional picture, and carry them around on his missions. The tactile message lets him feel their emotional state through the simulated heartbeat. We believe that the “NOVA” cards' one-message-per-person, per-day function will present a new challenge for our character each day and keep him occupied creating new positive moments until he finds his way back home.”


A challenge proposed by the Microsoft Inclusive Design Team, Hacking Mars attracted many designers around the globe to come up with ideas to help an isolated astronaut survive. My team members (Jimi Okelana and Roshan Peiris) and I addressed this topic by looking at it from a global perspective: how to maintain the emotional attachment between this astronaut and the people he loves. We began with an ideation session around this topic.


This project was submitted to Microsoft Design Challenge, #HackingMars.

Microsoft Design Challenge 2015 #HackingMars top three finalists

Research

Telexistence Drone

This drone is a type of telexistence system whereby the user can experience the feeling of flight. Telexistence is a technology that enables users to synchronize their motions and emotions with robots, such that they can be at a place other than where they actually exist, while interacting with a faraway remote environment.

In the case of the drone, a camera is attached to it, and whatever the camera captures is synchronized with the user's Head Mounted Display (HMD), such that the user is able to experience the drone's flight. By integrating the flight unit with the user, and thus crossing physical limitations such as height and physique, everyone can enjoy a whole new concept of ‘space.’

DESIGN

TEDxTokyo 2014 – Connecting the Unconnected

“Do you see the connections?”

Inspired by the TEDxTokyo 2014 theme, “Connecting the Unconnected”, this year the TEDxTokyo design team aimed to create interactive visual graphics for the event.
After several meetings, we decided to make use of the real-time tweets written by participants of the event and display them as dots connected to the TED speakers. Each dot holds the picture of the Twitter user, thus creating visual information. Also, when other users reply to a tweet, those users get *connected* with each other, and this connection is visualized on the screen.


In order to decide which tweets to use and display, the application filters for tweets with the tag “#TEDxTokyo”, so it knows the tweet relates to the event. Then a classification step identifies which speaker each tweet relates to: the application searches each tweet for a speaker's name and, if found, connects the tweet to that speaker.
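A minimal sketch of that filter-and-classify step (the speaker names and tweets below are invented for illustration; the real installation consumed the live Twitter stream):

```python
# Illustrative sketch of the two-step tweet handling described above:
# 1) keep only tweets tagged #TEDxTokyo,
# 2) link each remaining tweet to the speaker whose name appears in it.
# Speaker names and tweet texts are made-up examples.

SPEAKERS = ["Ken Mogi", "Sputniko"]

def classify_tweets(tweets):
    """Return {speaker: [tweets]} for event-tagged tweets mentioning a speaker."""
    by_speaker = {name: [] for name in SPEAKERS}
    for tweet in tweets:
        if "#TEDxTokyo" not in tweet:          # filter step
            continue
        for name in SPEAKERS:                  # classification step
            if name.lower() in tweet.lower():
                by_speaker[name].append(tweet)
    return by_speaker

tweets = [
    "Loving the talk by Ken Mogi! #TEDxTokyo",
    "Sputniko is on stage now #TEDxTokyo",
    "Unrelated tweet about lunch",
]
print(classify_tweets(tweets))  # only the two tagged tweets are kept
```

Each classified tweet would then be drawn as a dot linked to its speaker, and replies between users drawn as connections between dots.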


For interaction, we used a Leap Motion controller placed in front of the screen (inspired by the sci-fi-movie style of interaction); users can hover over the dots to select tweets to read, or they can change the speaker by swiping over it.


You can see our installation in the TEDxTokyo video stream here:

On our design team website:
http://eatcreative.jp/en/connecting-the-unconnected/
http://eatcreative.jp/en/case_study/tedxtokyo/

Master

TELUBEE 2.0 (TELexistence platform for Ubiquitous Embodied Experience)

The successor of TELUBEE 1.0: a human-size telexistence robot with a 3-axis rotatable head (tilt, yaw, roll) and a movable base (Roomba). In this version, the user can access the robot over the internet and, using an Oculus VR headset, see real-time wide-field-of-view stereo images from the robot side, along with real-time audio feedback. The robot is fully portable, currently using a Wi-Fi network connection to stream the video and receive control commands. TELUBEE 2.0 was exhibited at the International Conference on Artificial Reality and Telexistence (ICAT 2013) in Tokyo, Japan, while connected with France.

Master

TELUBEE 1.0 (TELexistence platform for Ubiquitous Embodied Experience)

We introduce a novel interface through which a user can connect to an avatar-like robot among multiple robots distributed ubiquitously around the world, and experience the distant world as if he were existing there remotely.

The system, “TELUBee”, includes distributed small-scale telexistence robots and a graphical user interface with a world map for selecting the geolocation you want to experience. By wearing an HMD, the user can see a 3D stereoscopic remote view and interact with remote participants using binaural audio communication. The remote robot's head motion is synchronized with the user's head, and the combined audio/visual sensation lets the user feel the same kinesthetic sensation as if physically there. The robots are small, low-cost, portable, and battery-powered, so they can be used anywhere an Internet connection is present.

The “TELUBee” user interface can be a gateway for traveling between many places around the world without spending much time, and will be useful for sightseeing, attending remote meetings, having face-to-face conversations, etc.

 

Publications

International Conferences

  • Laval Virtual Revolution 2013

Domestic Conferences

  • The 17th Annual Conference of the Virtual Reality Society of Japan
  • The Robotics and Mechatronics Conference (ROBOMECH) of the Japan Society of Mechanical Engineers

Awards

  • Invited Demonstration at Laval Virtual Revolution 2013
Research

Virtual Telesar

A virtual telexistence platform in which designers and engineers can define their prototypes and experience them.

This work was published at System Integration 2012 in Kyushu, Japan.

ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6426950

.08

CONTACT

Drop me a line

GET IN TOUCH

Have a question, some feedback, or a suggestion? Please don't hesitate to get in touch! Just drop me a message here and I will get back to you as soon as possible!