SlideFusion: Surrogacy Wheelchair With Implicit Eyegaze Modality Sharing
For mobility-impaired people, the wheelchair is a primary navigation and accessibility device. However, due to the chair's inherent design, the user must keep their hands occupied throughout navigation, resulting in a dual impairment. Our proposed system, SlideFusion, expands on previous work on collaborative and assistive technologies for accessibility scenarios. SlideFusion focuses on remote collaboration for the mobility-impaired person: an operator can remotely access an avatar embedded in the wheelchair. To reduce the physical and cognitive load on the wheelchair user, we propose sharing the user's eye gaze modality with the remote operator. Eye gaze enables implicit interactions that do not require the user to point or speak, thus leveraging indirect communication. In this way, the system supports not only wheelchair users and their caregivers, but also people with hearing impairments or speech disorders.
Credits: Ryoichi Ando, Kouta Minamizawa, MHD Yamen Saraiji
Keio University Graduate School of Media Design
Arque: Artificial Biomimicry-Inspired Tail for Extending Innate Body Functions
Arque addresses the long-standing question of why human bodies lack a tail. For most mammals and other vertebrates, the tail plays an important role, providing various functions that expand mobility or serving as a limb for manipulation and gripping. In this work, Arque, we propose an artificial biomimicry-inspired anthropomorphic tail that allows us to alter our body momentum for assistive and haptic-feedback applications. The proposed tail consists of adjacent joints with a spring-based structure that handles shear and tangential forces and allows the length and weight of the tail to be adjusted. The internal structure of the tail is driven by four pneumatic artificial muscles that actuate the tail tip. Here we highlight potential applications for such a prosthetic tail as an extension of the human body: providing active momentum alteration in balancing situations, or altering body momentum for full-body haptic-feedback scenarios.
Credits: Junichi Nabeshima, Kouta Minamizawa, MHD Yamen Saraiji
Keio University Graduate School of Media Design
Radical Bodies Group & Embodied Media
Effective communication is a key factor in social and professional contexts that involve sharing the skills and actions of more than one person. This research proposes a novel system to enable full-body sharing over a remotely operated wearable system, allowing one person to dive into someone else's body. “Fusion” enables body surrogacy by sharing the same point of view between two people, a surrogate and an operator, and it extends the operator's limb mobility and actions through two robotic arms mounted on the surrogate's body. These arms can be used independently of the surrogate's arms for collaborative scenarios, or can be linked to the surrogate's arms for remote assisting and supporting scenarios. Using Fusion, we realize three levels of bodily driven communication: Direct, Enforced, and Induced. Through this system we demonstrate the possibility of truly embodying and transferring one's body actions to another person, realizing true body communication.
This project is done in collaboration between Keio University Graduate School of Media Design and The University of Tokyo.
Media and Press:
ACM SIGGRAPH Blog
MIT Technology Review
IEEE Spectrum
Japan Science and Technology Agency (JST) – Japanese
Fast Company
Dezeen
hackster.io
Seamless – Japanese
DigitalTrends
YouFab Global Creative Award 2018
James Dyson Award 2019
Our world is becoming a connected village of information and social experiences, yet we still experience it as spectators, from a single personal point of view. Telexistence Toolkit, or TxKit, is a compact, novel communication device that virtually transports you from one physical location to another, allowing you to experience a remote place as if from your own point of view. Designed around essential sensory feedback and in a humanoid form factor, TxKit provides stereoscopic vision, binaural audio, speech, and head motion. With this design, both the user and remote participants can communicate naturally and mutually, as if both were in the same location.
MHD Yamen Saraiji, Charith Fernando, Kouta Minamizawa, Yasuyuki Inoue, Susumu Tachi
Designed in collaboration with Karakuri Products, Inc.
“MetaLimbs” is a novel system that expands the number of arms through limb substitution. In this system, leg motion is mapped onto artificial robotic arms mounted on the user's back and used to control the arms' and hands' motion. The system provides immediate, intuitive control over the new limbs, and users can adapt to operating them without any training.
Tomoya Sasaki, MHD Yamen Saraiji, Charith Fernando, Kouta Minamizawa, Michiteru Kitazaki, Masahiko Inami
“Layered Telepresence” is a novel method of experiencing simultaneous multi-presence. The user's eye gaze and perceptual awareness are blended with real-time audio-visual information received from multiple telepresence robots. The system arranges the audio-visual information received through the robots into a priority-driven layered stack. Using image-processing techniques, a weighted feature map is created from the objects recognized in each layer, and the most heavily weighted layer around the user's gaze is pushed into the foreground. All other layers are pushed into the background, providing an artificial depth-of-field effect. The proposed method works not only with robots: each layer could represent any audio-visual content, such as a video see-through HMD, a television screen, or even your PC screen, enabling true multitasking.
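The gaze-driven layer selection can be sketched roughly as follows. This is a minimal illustration, not the actual implementation: the layer names, feature-map values, and data layout are all hypothetical.

```python
# Minimal sketch of the priority-driven layer selection described above.
# Layer names, weight values, and the gaze model are hypothetical.

def select_foreground(layers, gaze_xy):
    """Pick the layer whose weighted feature map scores highest at the gaze point."""
    gx, gy = gaze_xy

    def weight_at_gaze(layer):
        # feature_map[y][x] holds per-pixel saliency weights for recognized objects
        return layer["feature_map"][gy][gx]

    ranked = sorted(layers, key=weight_at_gaze, reverse=True)
    foreground, background = ranked[0], ranked[1:]
    return foreground, background

# Example: two robot streams, gaze at pixel (1, 1).
layers = [
    {"name": "robot-A", "feature_map": [[0.1, 0.2], [0.3, 0.9]]},
    {"name": "robot-B", "feature_map": [[0.5, 0.4], [0.2, 0.1]]},
]
fg, bg = select_foreground(layers, (1, 1))  # robot-A scores 0.9 there
```

In the real system the background layers would additionally be blurred to produce the artificial depth-of-field effect; here only the ranking step is shown.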
Gold Prize Award – ACM Student Research Competition at SIGGRAPH 2016, Anaheim
Best Aural Presentation – Virtual Reality Society Japan September/2016 (Japanese)
In collaboration with ASICS Corporation, we demonstrated real-time 360° video and audio streaming from a marathon runner on February 28th at the ASICS Store Tokyo. The project was initiated together with a former KMD Master's student, Yusuke Mizushina, who currently works in the Wearable Device Business Operation Team, Corporate Strategy Department at ASICS Corporation.
Delivering ultra-low-latency 360° video and audio content over a conventional LTE mobile network was powered by WebRTC technologies from NTT Communications' SkyWay service.
For more details, please refer to the press release (Japanese only).
The idea came from Mr. Norikazu Takagi (of Ducklings Japan), who wanted his grandmother, a resident of Nagoya prefecture, to join an important personal event: his wedding ceremony, held in Tokyo. However, due to her medical condition, she could not attend the ceremony physically.
So, in a collaboration between our research group at Keio Media Design, CREATIVE ORCA, and FOVE, we designed a user-friendly system to let her “virtually” attend the ceremony. To realize this idea, we used the concept of telepresence, with a head-mounted display providing immersive stereo visuals of the event from the remote site. At the ceremony, we used a Pepper robot as the platform to build our system on. A custom 60 FPS stereo camera set was added to the robot's head, and a low-latency stereo video stream was sent to Nagoya (<180 ms). The user (the grandmother) controlled the robot's head using her eye movements and could make simple hand gestures using a joystick.
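The eye-movement-to-head-motion mapping can be illustrated with a small sketch. This is a hypothetical simplification, assuming the gaze tracker reports yaw/pitch angles in degrees; the motion limits and function names are not from the actual system.

```python
# Hypothetical sketch of mapping the user's gaze angles to robot head commands,
# as in the ceremony setup described above. Limits and names are assumptions.

def gaze_to_head_command(gaze_yaw_deg, gaze_pitch_deg,
                         max_yaw=60.0, max_pitch=30.0):
    """Clamp gaze angles (degrees) to the robot head's motion range."""
    def clamp(value, limit):
        return max(-limit, min(limit, value))
    return clamp(gaze_yaw_deg, max_yaw), clamp(gaze_pitch_deg, max_pitch)

# Example: a gaze far to the right is clamped to the head's yaw limit.
yaw, pitch = gaze_to_head_command(75.0, -10.0)
```

In practice such commands would be sent continuously, smoothed, and rate-limited so the robot's head tracks the user's gaze without jitter.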
During the event, the grandmother's reactions were priceless! She almost cried when she saw the bride and her grandson at the ceremony, and she expressed her feelings, saying, “I felt as if I was there with them.”
This type of feedback is what keeps me going and doing what I do!
The event was covered by NHK TV and broadcast across Japan.
Best Award and Best Care and Welfare Prize at SoftBank's Pepper App Challenge 2015 Winter (Japanese)
“The routine of being stuck on Mars can get dull quickly, especially in the situation our character finds himself in. However, with the “NOVA” cards (a set of devices that he can scrape together from available equipment), he can connect to his loved ones by capturing and sharing single moments of his emotional state, each consisting of an image and a haptic recording of his heartbeat. In return, his loved ones can stay connected with him by doing the same, sharing their moments and experiences to reassure him that they are waiting for his return.
The tangibility of the “NOVA” cards lets our character hang up a dedicated card for each of his loved ones, much like a traditional picture, and carry them along on his missions. The tactile message lets him feel their emotional state through the simulated heartbeat. We believe the “NOVA” cards' one-message-per-person-per-day function will present our character with a new challenge each day and keep him occupied creating new positive moments until he finds his way back home.”
A challenge proposed by the Microsoft Inclusive Design Team, Hacking Mars, attracted many designers around the globe to come up with ideas to help an isolated astronaut survive. My team members (Jimi Okelana & Roshan Peiris) and I addressed this topic from a broader perspective: how to maintain the emotional attachment between this astronaut and the people he loves. We began with an ideation session to generate ideas around this topic.
This project was submitted to Microsoft Design Challenge, #HackingMars.
Microsoft Design Challenge 2015 #HackingMars top three finalists
This drone is a type of telexistence system whereby the user can experience the feeling of flight. Telexistence is a technology that enables users to synchronize their motions and emotions with robots, so that they can be at a place other than where they actually exist while interacting with a faraway remote environment.
In the case of the drone, a camera is attached to it, and whatever the camera captures is synchronized with the head-mounted display (HMD) worn by the user, so that the user experiences the drone's flight. By integrating the flight unit with the user, and thus crossing physical limitations such as height and physique, everyone can enjoy a whole new concept of ‘space.’
“Do you see the connections?”
Inspired by the TEDxTokyo 2014 theme, “Connecting the Unconnected”, this year's TEDxTokyo design team aimed to create interactive visual graphics for the event.
After several meetings, we decided to make use of the real-time tweets that event participants wrote and display them as dots connected to the TED speakers. Each dot holds the Twitter user's profile picture, creating visual information. When other users reply to a tweet, those users become *connected* with each other, and this connection is visualized on screen.
To decide which tweets to display, the application filters for tweets tagged “#TEDxTokyo”, so it knows each tweet relates to the event. A classification step then identifies which speaker each tweet is related to: the application searches each tweet for a speaker's name and, if found, connects the tweet to that speaker.
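The filtering and classification steps above amount to a simple two-stage scan over incoming tweets. A minimal sketch, with hypothetical speaker names and example tweets (the real application matched against the actual TEDxTokyo 2014 speaker list):

```python
# Minimal sketch of the tweet filtering and speaker-classification steps above.
# Speaker names and example tweets are hypothetical.

SPEAKERS = ["Ken Mogi", "Patrick Newell"]

def classify_tweets(tweets):
    """Keep tweets tagged #TEDxTokyo and link each to a speaker named in it."""
    links = []
    for tweet in tweets:
        if "#TEDxTokyo" not in tweet:          # filtering step
            continue
        for speaker in SPEAKERS:               # classification step
            if speaker.lower() in tweet.lower():
                links.append((tweet, speaker))
    return links

tweets = [
    "Great talk by Ken Mogi on flow! #TEDxTokyo",
    "Unrelated tweet about lunch",
]
links = classify_tweets(tweets)  # only the tagged tweet is linked, to Ken Mogi
```

Tweets that pass the hashtag filter but mention no speaker simply remain unconnected dots on the visualization.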
For interaction, we used a Leap Motion placed in front of the screen (inspired by the kind of interaction seen in sci-fi movies): users can hover over the dots to select and read tweets, or change the speaker by swiping.
On our design team website:
http://eatcreative.jp/en/connecting-the-unconnected/
http://eatcreative.jp/en/case_study/tedxtokyo/
The successor of TELUBEE 1.0: a human-sized telexistence robot with a 3-axis rotatable head (tilt, yaw, roll) and a movable base (Roomba). In this version, the user can access the robot over the internet and, using an Oculus VR headset, see real-time wide-field-of-view stereo images from the robot's side, along with real-time audio feedback. The robot is fully portable, currently using a Wi-Fi connection to stream video and receive control commands. TELUBEE 2.0 was exhibited at the International Conference on Artificial Reality and Telexistence (ICAT 2013) in Tokyo, Japan, while connected to France.
We introduce a novel interface in which a user can connect to a remote avatar-like robot, one of multiple robots ubiquitously distributed around the world, and experience the distant location as if existing there remotely.
The system, “TELUBee”, comprises distributed small-scale telexistence robots and a graphical user interface with a world map for selecting the geolocation you want to experience. By wearing an HMD, the user can see a 3D stereoscopic remote view and interact with remote participants through binaural audio communication. The remote robot's head motion is synchronized with the user's head, and the combined audio-visual sensation gives the user the kinesthetic sense of physically being there. The robots are small, low-cost, portable, and battery-powered, so they can be used anywhere an internet connection is available.
The “TELUBee” user interface can be a gateway for traveling between many places around the world without spending much time, and will be useful for sightseeing, attending remote meetings, having face-to-face conversations, and more.
International Conferences
Domestic Conferences
Awards