MHD Yamen Saraiji – Personal Site

PROJECTS

12 projects
DESIGN / PhD / Research

Telexistence Toolkit

Our world is becoming a connected village of information and social experiences, yet we still experience it as spectators, from a single personal point of view. The Telexistence Toolkit, or TxKit, is a compact, novel communication device that virtually transports you from one physical location to another, allowing you to experience the remote world from your own point of view. Designed around essential sensory feedback in a humanoid form factor, TxKit provides stereoscopic vision, binaural audio, speech, and head motion. With this design, both the user and remote participants can communicate naturally and mutually, as if both were in the same location.

MHD Yamen Saraiji, Charith Fernando, Kouta Minamizawa, Yasuyuki Inoue, Susumu Tachi

Designed in collaboration with Karakuri Products, Inc.

PhD / Research

MetaLimbs

“MetaLimbs” is a novel system that expands the number of arms through limb substitution. In this system, the user’s legs are mapped to artificial robotic arms mounted on the user’s back and are used to control the motion of the arms and hands. The system provides immediate, intuitive control over the new limbs, and users can adapt to operating them without any training.

Tomoya Sasaki, MHD Yamen Saraiji, Charith Fernando, Kouta Minamizawa, Michiteru Kitazaki, Masahiko Inami

SIGGRAPH 2017 ETech page
Reuters
KMD Reference Page

PhD / Research

Layered Presence

“Layered Telepresence” is a novel method of experiencing simultaneous multi-presence. The user’s eye gaze and perceptual awareness are blended with real-time audio-visual information received from multiple telepresence robots. The system arranges the audio-visual information received through the robots into a priority-driven layered stack. A weighted feature map is created for each layer, based on the objects recognized in it using image-processing techniques, and the most heavily weighted layer around the user’s gaze is pushed into the foreground. All other layers are pushed to the background, providing an artificial depth-of-field effect. The proposed method is not limited to robots: each layer can represent any audio-visual content, such as a video see-through HMD, a television screen, or even your PC screen, enabling true multitasking.
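The layer-selection logic described above can be sketched roughly as follows. This is an illustrative assumption of how the priority-driven stack might work, not the published implementation; the layer fields, weight-map functions, and opacity/blur values are all hypothetical.

```python
# Hypothetical sketch of the priority-driven layer stack: the layer whose
# weighted feature map scores highest at the gaze point is foregrounded,
# and all other layers are blurred and dimmed to fake depth-of-field.

def blend_layers(layers, gaze):
    """Foreground the highest-weighted layer at the user's gaze point."""
    def score(layer):
        # weight_map(x, y) -> saliency of recognized objects at that point
        return layer["weight_map"](gaze[0], gaze[1])

    foreground = max(layers, key=score)
    for layer in layers:
        # The foreground layer is shown sharp and opaque; the rest are
        # pushed back with blur and reduced opacity.
        layer["opacity"] = 1.0 if layer is foreground else 0.3
        layer["blur"] = 0.0 if layer is foreground else 4.0
    return foreground
```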

Gold Prize Award – ACM Student Research Competition at SIGGRAPH 2016, Anaheim
Best Aural Presentation – Virtual Reality Society of Japan, September 2016 (Japanese)

SIGGRAPH 2016 ETech page

Research

Telexistence Surveillance Project

Spatial Coherent Remote Driving Experience for Disaster Sites under Hazardous Conditions
Collaborative work with Obayashi Corporation; presented at ICAT 2015, ROBOMECH 2015, and VRSJ 2015 OS

Research

Telexistence Marathon Runner

In collaboration with ASICS Corporation, we demonstrated real-time 360° video and audio streaming from a marathon runner on February 28th at the ASICS Store Tokyo. The project was initiated together with former KMD Master’s student Yusuke Mizushina, who currently works in ASICS Corporation’s Wearable Device Business Operation Team, Corporate Strategy Department.

Delivering ultra-low-latency 360° video and audio content over a conventional LTE mobile network was powered by WebRTC technologies developed by NTT Communications’ SkyWay service.

For more details, please refer to the press release (Japanese only).

DESIGN / PhD

HUG – Connecting those who you love over the distance

The idea emerged from Mr. Norikazu Takagi (of Ducklings Japan), who wanted his grandmother, a resident of Nagoya prefecture, to join an important personal event: his wedding ceremony, held in Tokyo. However, due to her medical condition, she could not attend the ceremony physically.

So, in a collaboration between our research group at Keio Media Design, CREATIVE ORCA, and FOVE, we designed a user-friendly system to let her “virtually” attend the ceremony. To realize this idea, we applied the concept of telepresence, using a head-mounted display to provide immersive stereo visuals of the event from the remote site. At the ceremony, we used the Pepper robot as the platform to build our system on. A custom 60 FPS stereo camera rig was added to the robot’s head, and a low-latency stereo video stream (<180 ms) was sent to Nagoya. The user (the grandmother) controlled the robot’s head with her eye movements and could make simple hand gestures using a joystick.

During the event, the grandmother’s reactions were priceless! She almost cried when she saw the bride and her grandson at the ceremony, and she expressed her feelings, saying, “I felt as if I was there with them.”
This kind of feedback is what keeps me going and doing what I do!

The event was covered by NHK TV and broadcast across Japan.

Best Award and Best Care and Welfare Prize at SoftBank’s Pepper App Challenge 2015 Winter (Japanese)

DESIGN

Microsoft Design Challenge: Hacking Mars

“The routine of being stuck on Mars can get dull quickly, especially in the situation our character finds himself in. However, with the “NOVA” cards (a set of devices that he can scrap together from available equipment), he can connect with his loved ones by capturing and sharing single moments of his emotional state, each including an image and a haptic recording of his heartbeat. In return, his loved ones can stay connected with him by doing the same, sharing their moments and experiences to reassure him that they are waiting for his return.
The tangibility of the “NOVA” cards lets our character hang up a dedicated card for each of his loved ones, much like a traditional picture, and carry them around on his missions. The tactile message lets him feel their emotional state through the simulated heartbeat. We believe the “NOVA” cards’ one-message-per-person-per-day function will present our character with a new challenge each day and keep him occupied creating new positive moments until he finds his way back home.”

A challenge proposed by the Microsoft Inclusive Design Team, Hacking Mars attracted many designers around the globe to come up with ideas to help an isolated astronaut survive. My team members (Jimi Okelana & Roshan Peiris) and I addressed the topic from a global perspective: how to maintain the emotional attachment between this astronaut and the people he loves. We began with an ideation session to generate ideas around this topic.

This project was submitted to Microsoft Design Challenge, #HackingMars.

Microsoft Design Challenge 2015 #HackingMars top three finalists

Research

Telexistence Drone

This drone is a type of telexistence system whereby the user can experience the feeling of flight. Telexistence is a technology that enables users to synchronize their motions and emotions with robots, such that they can be at a place other than where they actually exist, while interacting with a faraway remote environment.

In the case of the drone, a camera is attached to it, and whatever the camera captures is synchronized with the user’s head-mounted display (HMD), so that the user experiences the drone’s flight. By integrating the flight unit with the user and thus crossing physical limitations such as height and physique, everyone can now enjoy a whole new concept of ‘space.’

DESIGN

TEDxTokyo 2014 – Connecting the Unconnected

“Do you see the connections?”

Inspired by the TEDxTokyo 2014 theme, “Connecting the Unconnected”, this year the TEDxTokyo design team aimed to create interactive visual graphics for the event.
After several meetings, we decided to make use of the real-time tweets written by the event’s participants, displaying them as dots connected to the TED speakers. Each dot holds the Twitter user’s picture, creating visual information. When other users reply to a tweet, those users get *connected* with each other, and the connection is visualized on the screen.

To decide which tweets to display, the application filters for tweets tagged “#TEDxTokyo”, so it knows the tweet relates to the event. A classification step then identifies which speaker each tweet refers to: the application searches each tweet for a speaker’s name, and if one is found, the tweet is connected to that speaker.
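The two-step pipeline (hashtag filter, then speaker-name matching) can be sketched like this. The speaker names and tweets here are made-up examples, and the matching rule is my assumption of the simplest version of what is described; the actual application may have used a different approach.

```python
# Illustrative sketch of the tweet pipeline: keep only event tweets,
# then connect each one to the first speaker it mentions by name.

SPEAKERS = ["Ken Mogi", "Patrick Newell"]  # hypothetical speaker list

def classify_tweets(tweets):
    """Return (speaker, tweet) pairs for event tweets mentioning a speaker."""
    connections = []
    for tweet in tweets:
        if "#TEDxTokyo" not in tweet:       # step 1: event hashtag filter
            continue
        for speaker in SPEAKERS:            # step 2: speaker-name matching
            if speaker.lower() in tweet.lower():
                connections.append((speaker, tweet))
                break
    return connections
```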

For interaction, we placed a Leap Motion in front of the screen (inspired by the sci-fi-movie style of interaction): users can hover over the dots to select and read tweets, or change the speaker by swiping over it.

On our design team website:
http://eatcreative.jp/en/connecting-the-unconnected/
http://eatcreative.jp/en/case_study/tedxtokyo/

Master

TELUBEE 2.0 (TELexistence platform for Ubiquitous Embodied Experience)

The successor of TELUBEE 1.0: a human-size telexistence robot with a 3-axis rotatable head (tilt, yaw, roll) and a movable base (a Roomba). In this version, the user can access the robot over the internet and, using an Oculus VR headset, see real-time, wide-field-of-view stereo images from the robot’s side, along with real-time audio feedback. The robot is fully portable, currently using a Wi-Fi connection to stream the video and receive control commands. TELUBEE 2.0 was exhibited at the International Conference on Artificial Reality and Telexistence (ICAT 2013) in Tokyo, Japan, while connected with France.
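The head synchronization on the 3-axis (tilt, yaw, roll) head can be sketched as below. This is a rough assumption of the control mapping, not the robot's actual firmware; the axis limits are invented values for illustration.

```python
# Hypothetical sketch: clamp the HMD's Euler angles into the robot
# head's mechanical range and emit them as servo targets. The limits
# below are assumed, not TELUBEE's real specifications.

HEAD_LIMITS = {"yaw": (-90, 90), "tilt": (-45, 45), "roll": (-30, 30)}

def hmd_to_head_command(yaw, tilt, roll):
    """Map HMD orientation (degrees) to clamped robot head targets."""
    def clamp(value, lo, hi):
        return max(lo, min(hi, value))
    return {
        "yaw": clamp(yaw, *HEAD_LIMITS["yaw"]),
        "tilt": clamp(tilt, *HEAD_LIMITS["tilt"]),
        "roll": clamp(roll, *HEAD_LIMITS["roll"]),
    }
```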

Master

TELUBEE 1.0 (TELexistence platform for Ubiquitous Embodied Experience)

We introduce a novel interface where a user can connect to one of multiple avatar-like robots ubiquitously distributed around the world and experience the distant world as if remotely existing there.

The “TELUBee” system includes distributed small-scale telexistence robots and a graphical user interface with a world map for selecting the geolocation you want to experience. By wearing an HMD, the user sees a 3D stereoscopic remote view and can interact with remote participants through binaural audio communication. The remote robot’s head motion is synchronized with the user’s head, and the combined audio-visual sensation lets the user feel the same kinesthetic sensation as being physically there. The robots are small, low-cost, portable, and battery-powered, so they can be used anywhere an Internet connection is present.

The “TELUBee” user interface can be a gateway for traveling between many places around the world without spending much time, and will be useful for sightseeing, attending remote meetings, having face-to-face conversations, and more.


Publications

International Conferences

  • Laval Virtual Revolution 2013

Domestic Conferences

  • The 17th Annual Conference of the Virtual Reality Society of Japan (日本バーチャルリアリティ学会 第17回大会)
  • JSME Robotics and Mechatronics Conference, ROBOMECH (日本機械学会 ロボティクス・メカトロニクス講演会)

Awards

  • Invited Demonstration at Laval Virtual Revolution 2013

Research

Virtual Telesar

A virtual telexistence platform in which designers and engineers can define their prototypes and experience them.

This work was published at System Integration 2012 in Kyushu, Japan.

ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6426950