4 – Existing AAL Products




Abstract




In this chapter, several technological, or more specifically robotic, solutions from different companies and developers are presented. This provides an overview of the state of Ambient/Active Assisted Living (AAL), which can be categorized into home care (for independent living in old age), social interaction, health and wellness, interaction and learning, working, and mobility.








4.1 Home Care (Independent Living)


This section gives an overview of robots used in the home care field. Table 4.1 provides a short summary of the robots introduced in this section.






  • System Name: PaPeRo



  • Developer: NEC Corporation



  • PaPeRo is a “Personal Robot” best known for its recognizable appearance and facial recognition abilities (see Figure 4.1). PaPeRo, which stands for “Partner-type-Personal-Robot,” was developed as a personal assistant that can interact with people in everyday life situations. It is equipped with a variety of abilities for interaction. For example, if asked, “Is today a good day for a drive?,” PaPeRo will autonomously connect to the internet, assess the weather forecast for the day, and provide a recommendation. It can also play games with people, provide music for a party, and do impressions, imitating both voice and movement. PaPeRo adapts to various personalities based on the types of interaction it receives (tone of voice, frequency of interaction, type of question asked, etc.) and develops a simulated mood based on the moods of the humans it interacts with. When not given any immediate tasks, PaPeRo roams around searching for faces and, once one is found, begins a conversation with that person.



  • Technology: PaPeRo’s distinguishing “eyes” are in fact two cameras that provide its visual awareness and facial recognition system. A pair of sensitive microphones and a speech recognition system allow PaPeRo to determine where, and from whom, human speech is coming, and it reacts accordingly. It is also equipped with an ultrasound system to detect obstacles: if an object is in its path, the object’s exact location is detected and a route avoiding it is planned and followed. A simplified sketch of the query-and-recommend and mood-adaptation behavior described above follows this entry.



Source: www.nec.co.jp.
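To make the interaction flow described above more concrete, the following minimal Python sketch shows a query-and-recommend routine and a toy mood model. It is not NEC's software: the weather lookup is a stub, and the mood update rule, function names, and values are assumptions made purely for illustration.

```python
# Illustrative sketch only, not NEC's PaPeRo implementation.
# fetch_weather_forecast() is a stub; a real robot would query an online service.

def fetch_weather_forecast(city: str) -> dict:
    """Stand-in for an internet weather query (invented data)."""
    return {"condition": "sunny", "chance_of_rain": 0.1}

def answer_drive_question(city: str) -> str:
    """Mimic the 'Is today a good day for a drive?' behavior."""
    forecast = fetch_weather_forecast(city)
    if forecast["chance_of_rain"] < 0.3 and forecast["condition"] != "stormy":
        return "The forecast looks good; today is a fine day for a drive."
    return "Rain is likely today; maybe postpone the drive."

class MoodModel:
    """Toy stand-in for the simulated mood: the robot's mood slowly
    drifts toward the mood of the people it interacts with."""

    def __init__(self):
        self.mood = 0.0  # -1.0 (gloomy) .. +1.0 (cheerful)

    def observe_user_mood(self, user_mood: float, weight: float = 0.2):
        self.mood += weight * (user_mood - self.mood)

if __name__ == "__main__":
    print(answer_drive_question("Tokyo"))
    mood = MoodModel()
    for observed in (0.8, 0.6, 0.9):  # a cheerful user
        mood.observe_user_mood(observed)
    print(f"Simulated mood after the conversation: {mood.mood:+.2f}")
```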





  • System Name: Mamoru-Kun



  • Developer: Center of IRT (CIRT), Tokyo University



  • Mamoru-Kun is an assistive robot for seniors (see Figure 4.1). The purpose of the robot is to remind users where they may have left often-misplaced items such as keys, glasses, or slippers. Reminders can be provided verbally, or the robot can physically point out the location of the items. Alternatively, it can communicate with its older brother, the “Home Assistance Robot,” and have it retrieve the desired items. Mamoru can also be programmed to provide everyday reminders, such as taking one’s medication.



  • Technology: Mamoru is primarily an “Object-Recognition-Robot,” equipped with a wide-angle lens. It stands 40 cm tall and weighs 3.8 kg. It has four joints (two in the neck and one in each arm), a microphone, and speakers in order to communicate the location of lost items. In order for personal items to be identified and located, users must register their often-misplaced items in advance to create a sort of inventory for the robot. If the item is within the system’s field of view, object recognition software is able to identify it and inform the user of its location.
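The register-in-advance, then locate-on-request workflow can be pictured with the small sketch below. The actual camera-based object recognition is replaced by a dictionary of last-seen positions; the item names, places, and coordinates are invented for illustration.

```python
# Sketch of Mamoru-Kun's item-inventory idea (illustrative only).
# Real recognition from the wide-angle camera is reduced to a lookup table
# of items the user registered in advance.

from typing import Optional, Tuple

class ItemInventory:
    def __init__(self):
        # item name -> (place description, (x, y) position)
        self._last_seen = {}

    def register(self, item: str):
        """The user registers an often-misplaced item in advance."""
        self._last_seen.setdefault(item, ("unknown", (0.0, 0.0)))

    def update_sighting(self, item: str, place: str, xy: Tuple[float, float]):
        """Called whenever the recognition system spots a registered item."""
        if item in self._last_seen:
            self._last_seen[item] = (place, xy)

    def locate(self, item: str) -> Optional[str]:
        place, _ = self._last_seen.get(item, ("unknown", None))
        if place == "unknown":
            return None
        return f"Your {item} are on the {place}."

inventory = ItemInventory()
inventory.register("keys")
inventory.update_sighting("keys", "kitchen table", (1.2, 0.4))
print(inventory.locate("keys") or "I have not seen them yet.")
```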








  • System Name: Emiew 2



  • Developer: Hitachi



  • The second iteration of Emiew (Excellent Mobility and Interactive Existence as Workmate, see Figure 4.2) was developed by Hitachi. Emiew is a service robot with diverse communication abilities, designed to safely and comfortably coexist with humans while carrying out necessary services. Its communication is supported by a vocabulary of about 100 words. In order to achieve higher mobility speeds, Emiew was designed with two wheels as “feet” and can travel up to 6 km/h. At the time of its development, the previous version of Emiew was the fastest-moving service robot to date. With sensors on its head, waist, and near the base, Emiew is able to interact with users and follow various commands. Its speed and abilities make Emiew particularly useful in the office setting for running various errands.



  • Technology: Emiew stands 80 cm tall and weighs just 13 kg, has a maximum speed of 6 km/h and an impressive acceleration of 4 m/s². A specially developed two-wheel mechanism allows Emiew to travel at a rate comparable to humans. An “active suspension” system consisting of a spring and an actuator gives Emiew agility and the ability to roll over small differences in floor levels and various office obstacles such as cables. Emiew is equipped with no fewer than 14 microphones to allow for accurate voice detection including directionality. The robot is also equipped with noise-cancellation technology in order to filter out the noise created by the robot itself, and better focus on that of the users. A series of sensors allows the robot to acutely detect obstacles – either stationary or moving (e.g., people) – and efficiently navigate its way through them in order to quickly reach its destination.
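As a toy illustration of how a microphone array can yield the direction a voice comes from, the sketch below places the 14 microphones on a ring and simply points toward the loudest one. Hitachi's real processing (time-of-arrival analysis combined with self-noise cancellation) is far more sophisticated; the readings and geometry here are invented.

```python
# Toy illustration of direction-of-voice estimation with a ring of 14
# microphones. This is NOT Hitachi's algorithm, only the basic idea.

NUM_MICS = 14

def estimate_bearing(levels):
    """Return the bearing (degrees) of the loudest microphone on the ring."""
    assert len(levels) == NUM_MICS
    loudest = max(range(NUM_MICS), key=lambda i: levels[i])
    return loudest * (360.0 / NUM_MICS)

# Invented sound-pressure readings: a speaker roughly at 77 degrees.
readings = [0.1] * NUM_MICS
readings[3] = 0.9  # microphone 3 sits at 3 * (360 / 14), about 77 degrees
readings[4] = 0.7
print(f"Estimated speaker bearing: {estimate_bearing(readings):.0f} degrees")
```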








  • System Name: Home Assistant AR



  • Developer: Center of IRT (CIRT), Tokyo University



  • AR (shown in Figure 4.2) is a humanoid robot that is able to help out with daily household chores. Its development focused on giving it the ability to make use of tools, equipment, and appliances that were designed under the assumption of human use. The goal of AR was to be able to silently and autonomously clean within the home. AR can clean up a storage room, sweep and mop floors, remove dishes from the kitchen table and insert them into the dishwasher, open and close doors, and even do the laundry. The robot can even move furniture in order to clean underneath it and when it is finished, put the furniture back in its original location. Its mobility is provided by a two-wheel base that implements a simpler mechanism than the more intricate “leg-based” systems. The robot also has the ability to perform multiple consecutive tasks rather than dealing with each one separately and requiring additional instructions after each completed task.



  • Technology: The Home Assistant AR stands at 160 cm tall and weighs approximately 130 kg. It uses a total of five cameras and six lasers in order to map out and efficiently navigate the home. A range finder also gives AR the ability to judge distances to obstacles. Based on its self-created 3D geometries of the home, it can execute various complex movements. The robot is equipped with a total of 32 joints (three in the neck and head, seven in each arm, six in each hand, one in the hips, and two in the wheel base) in order to provide it with the flexibility to complete various tasks originally designed for humans. For instance, the neck alone can move in three directions while the arm can move in an impressive seven. It can also assess whether a job it completed was successful, and if determined unsuccessful, repeat the job. The robot can work a total of about 30–60 minutes per charge.
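The ability to work through several chores in sequence, check each result, and redo a job judged unsuccessful can be summarized as a task loop with verification and retry. The task names, the random success check, and the retry limit below are placeholders, not CIRT's software.

```python
# Sketch of a chained task executor with verification and retry,
# in the spirit of the Home Assistant AR description above (illustrative only).

import random

def execute(task: str) -> None:
    print(f"Executing: {task}")

def verify(task: str) -> bool:
    """Placeholder check; the real robot inspects the result with its cameras."""
    return random.random() > 0.2  # pretend 80% of attempts succeed

def run_chores(tasks, max_retries=2):
    for task in tasks:
        for attempt in range(1 + max_retries):
            execute(task)
            if verify(task):
                break
            print(f"  '{task}' judged unsuccessful, retrying (attempt {attempt + 2})")
        else:
            print(f"  giving up on '{task}' after {max_retries} retries")

run_chores(["sweep floor", "load dishwasher", "move chair back into place"])
```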








  • System Name: Twendy-One



  • Developer: Waseda University



  • Twendy-One (see Figure 4.3) is a type of “human symbiotic” robot that was developed to address the impending labor shortages in the care industry for aging societies. As such, it must incorporate the typical functions carried out by human caregivers, including friendly communication, safe human assistance, and dexterous manipulation. The robot was developed with three core principles in mind: Safety, Dependability, and Dexterity. It possesses a unique combination of dexterity with passivity and a high power output. This allows it to manipulate objects of various shapes with delicacy by passively absorbing the external forces generated during contact. For example, the robot is gentle yet strong enough to support a human getting out of bed but also has the dexterity to remove a piece of toast from the toaster. Other typical tasks carried out by the robot include fetching things from the fridge, picking things up from the floor, and various cleaning tasks.



  • Technology: Twendy-One stands at 1.46 m tall and weighs approximately 111 kg. It has a total of 47 degrees of freedom, including rotational and directional movements. Its shell is covered in a soft silicone skin and equipped with 241 force sensors to detect contact (accidental or intentional) with a human user. This allows the robot to act with care when working with human users as well as adapt to unforeseen collisions instantly and react accordingly. The omnidirectional wheelbase allows the robot to move more quickly and efficiently than a traditional bipedal design would. The robot is also equipped with twelve ultrasonic sensors and a six-axis force sensor to actively detect objects and users and avoid collisions. The robot can operate for approximately 15 minutes on a full charge.
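One way to picture the interplay of force sensing and passivity is a grasp loop that stops tightening as soon as the measured contact force reaches a softness limit. The threshold values and the sensor stand-in below are assumptions for illustration, not Waseda's controller.

```python
# Minimal sketch of force-limited (compliant) grasping, loosely inspired by
# Twendy-One's tactile skin. All values and interfaces are hypothetical.

GRIP_STEP_N = 0.5          # grip force added per control cycle
MAX_CONTACT_FORCE_N = 4.0  # softness limit for delicate objects (assumed)

def measured_contact_force(grip_force: float) -> float:
    """Stand-in for reading the force sensors: the object pushes back
    proportionally once the grip force exceeds 2 N."""
    return max(0.0, grip_force - 2.0) * 1.5

def compliant_grasp() -> float:
    grip = 0.0
    while measured_contact_force(grip) < MAX_CONTACT_FORCE_N:
        grip += GRIP_STEP_N
    print(f"Contact limit reached; holding object at a grip force of {grip:.1f} N")
    return grip

compliant_grasp()
```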








  • System Name: My Spoon



  • Developer: Secom



  • My Spoon (see Figure 4.3) is an assistive eating robot. It provides physical assistance for those who would otherwise require human assistance when eating due to a physical disability. It does still require, however, that the user be able to move their head, eat the food off of the spoon, chew, and swallow while sitting in an upright position. My Spoon can be used with nearly all types of everyday foods and does not require special food packets. The food should, however, be in bite-size pieces, as the robot does not include any cutting mechanisms. My Spoon can also be used for liquids such as soft drinks, coffee, tea, or soup. In contrast to most robots, My Spoon is available not only in Japan but also across Europe.



  • Technology: My Spoon measures 28 cm (W) × 37 cm (D) × 25 cm (H) and weighs about 6 kg. It is designed to sit on the table in front of the user. The robot can be operated in one of three modes: manual, semi-automatic, or fully automatic. In manual mode, the user has maximum flexibility and control via a joystick. Using the joystick, the user selects one of four food compartments, fine-tunes the position of the spoon, instructs the spoon to grasp a bite of food, and directs it toward their mouth. In semi-automatic mode, the user simply selects from which compartment they would like a bite, and My Spoon automatically picks up a bite from that compartment and brings it to the user’s mouth. In the fully automatic mode, with the simple press of a button, My Spoon will automatically select the compartment and bring a bite of the food up to the user’s mouth.
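The three operating modes translate naturally into a small dispatch routine. The compartment numbering, the joystick actions, and the function names below are simplified placeholders, not Secom's control software.

```python
# Sketch of My Spoon's three operating modes (placeholders for illustration).

import random

def pick_up_and_feed(compartment: int):
    print(f"Picking up a bite from compartment {compartment} and moving it to the mouth.")

def manual_mode(joystick_actions):
    """The user steers everything: compartment choice, positioning, grasping."""
    for action in joystick_actions:
        print(f"Joystick: {action}")
    print("Feeding under full manual control.")

def semi_automatic_mode(chosen_compartment: int):
    """The user only chooses the compartment; the arm does the rest."""
    pick_up_and_feed(chosen_compartment)

def fully_automatic_mode():
    """A single button press; the system chooses the compartment itself."""
    pick_up_and_feed(random.randint(1, 4))

manual_mode(["select compartment 1", "lower spoon", "grasp", "raise to mouth"])
semi_automatic_mode(2)
fully_automatic_mode()
```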








  • System Name: Maron-1



  • Developer: Fujitsu



  • The Maron-1 robot (see Figure 4.4) is a cellular phone–operated system for special patient care, home, and office security. The robot is able to monitor its surroundings, take photos, and relay the photos to the user via their cell phone. The robot can store the layout of the house or office and if so directed (via a cell phone), can navigate to a specified location while avoiding obstacles and maneuvering across slight changes in floor height. It can also be given a specified “patrol route” to follow and actively monitor. Maron-1 is then able to detect any moving objects that enter its field of view (e.g., an intruder). Maron-1 is also equipped with an infrared remote-control capability that allows it to operate various appliances such as air conditioners, televisions, etc.



  • Technology: Maron-1 is 32 × 36 × 32 cm and weighs about 5 kg. Its drive mechanism of two powered wheels provides its mobility. It has a head with two cameras that can both pan and tilt to capture as much of the surrounding area as possible. It is also equipped with an infrared sensor/emitter and proximity sensor for appliance operation and obstacle detection. It uses Microsoft’s WinCE 3.0 software that allows for its communication with mobile phones. The user interface consists of a touchpad, five menu keys, two function keys, a 10 cm LCD monitor, a speaker, and a microphone. The robot can operate for about 12 hours on a single charge.
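A patrol loop with simple frame differencing conveys the monitoring-and-alert idea. Camera images are faked as tiny brightness lists and the phone alert is just a print statement; the waypoints, threshold, and data are invented.

```python
# Sketch of a patrol-and-alert loop in the spirit of Maron-1 (illustrative only).
# A real system would compare camera images and send the photo to the
# owner's mobile phone instead of printing.

def frame_difference(prev, curr) -> float:
    """Mean absolute brightness change between two (tiny) frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def patrol(route, frames_along_route, motion_threshold=10.0):
    prev = frames_along_route[0]
    for waypoint, frame in zip(route, frames_along_route):
        print(f"Moving to {waypoint}")
        if frame_difference(prev, frame) > motion_threshold:
            print(f"Motion detected near {waypoint}: photo relayed to the owner's phone")
        prev = frame

route = ["hallway", "living room", "kitchen"]
frames = [
    [50, 50, 50, 50],    # baseline view
    [50, 51, 50, 49],    # nothing moving
    [50, 120, 130, 49],  # something entered the field of view
]
patrol(route, frames)
```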








  • System Name: NetTansorWeb



  • Developer: Bandai and Evolution Robotics



  • The NetTansorWeb (Figure 4.4) robot from Bandai and Evolution Robotics was designed as a house robot for families and hobbyists alike. It was also given the quirky ability to blog, which distinguishes it from most other robots. It can even respond to comments left on the blog. For example, it can write context-related replies or take another picture of something in the house, perhaps from a different angle. Aside from this unique feature, the primary purpose of the robot is surveillance. The robot monitors the home and can take pictures of anything that moves (i.e., in the case of an intruder) and immediately alert the owner. It can receive instructions over the internet or autonomously navigate its way around the home.



  • Technology: The NetTansorWeb measures about 190 × 160 × 160 mm and weighs just 980 grams. Its battery allows for roughly 2.5 hours of operation between charges. It is equipped with cameras, microphones, speakers, and motion sensors. It is also easily connected to the internet or home network over a Wi-Fi connection. The ViPR Vision System from Evolution Robotics provides the robot with a level of intelligence such that it can learn its environment, detect and navigate obstacles, and also perform various tasks including the monitoring and reporting of any unexpected changes in its environment. NetTansorWeb can also collect news from the internet (via RSS) and use it to contribute to the blog. Its blogging abilities include uploading, commenting, and answering, often with witty retorts. Unfortunately, the system is no longer on the market.






Table 4.1 Overview of Robots Introduced in this Chapter

System Name | Developer
PaPeRo (Partner-type-Personal-Robot) | NEC Corporation
Mamoru-Kun | Center of IRT (CIRT), Tokyo University
Emiew 2 (Excellent Mobility and Interactive Existence as Workmate) | Hitachi
Home Assistant AR | Center of IRT (CIRT), Tokyo University
Twendy-One | Waseda University
My Spoon | Secom
Maron-1 | Fujitsu
NetTansorWeb | Bandai and Evolution Robotics




Figure 4.1 Left: Personal Robot PaPeRo petit.


Source: YOSHIKAZU TSUNO/AFP/Getty Images




Figure 4.2 (a) Left: EMIEW3.


Source: TOSHIFUMI KITAMURA/AFP/Getty Images

Figure 4.3



Left: Twendy-One.


Source: Shigeki Sugano Lab., Waseda University




Right: My Spoon.


Source: STR/AFP/Getty Images




Figure 4.4 The NetTansorWeb.


Source: Bandai Co., Ltd.


4.2 Social Interaction


In this section, the focus is on robots and techniques that support the care staff in their work. By relieving the care staff of physical work, these technologies leave them more time for social interaction with the elderly. Table 4.2 provides a short overview of the robots introduced in this section.






  • System Name: Panasonic Life Wall



  • Developer: Panasonic



  • The “Life Wall TV” from Panasonic transforms an entire wall into an interactive touch screen television that provides large amounts of information and ubiquitous communication. The result is a wall embedded with a digital interface that allows any member of the family to independently access its entertainment, productivity, or communication features. It includes facial and voice recognition software in order to recognize which member of the family is present and will display the graphical user interface (GUI) specific to that person. The display can be made as small or large as the user would like (within the dimensions of the actual screen). The wall can also track your movement throughout the room and, for example, pause your movie while you get up to answer the phone or even have the smaller display follow you around the room. The touch screen allows for stimulating interaction, whether it is for games, system navigation, or any other task. When not in use, the system can be set to a scenic background of the user’s choice or it can even disguise itself as a more typical wall by displaying, for example, some bookshelves.



  • Technology: Panasonic’s Life Wall uses a large LCD display. Special facial and voice recognition software allow the system to determine which family member is currently using it and adjust accordingly. The display will show the background, pictures, programs, location of icons and tools, etc., specific to that person. The size, clarity, and interaction ability of the system take photo viewing, videoconferences, and computer games into another dimension. The Life Wall is also equipped with “Wireless HD” abilities to incorporate the internet into any of these aforementioned features.
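The recognize-the-viewer-and-load-their-interface behavior boils down to a profile lookup keyed by the recognized identity. In the sketch below, the recognition stage is stubbed out and the profile contents are invented examples; it is not Panasonic's software.

```python
# Sketch of per-user interface selection as described for the Life Wall.
# Face/voice recognition is stubbed; the profiles are invented examples.

PROFILES = {
    "grandmother": {"font_size": "large", "layout": "photos, weather, video calls"},
    "grandson": {"font_size": "normal", "layout": "games, streaming, homework"},
}
DEFAULT_PROFILE = {"font_size": "normal", "layout": "bookshelf wallpaper"}

def recognize_person(camera_frame) -> str:
    """Stand-in for the facial and voice recognition stage."""
    return "grandmother"

def configure_wall(camera_frame):
    person = recognize_person(camera_frame)
    profile = PROFILES.get(person, DEFAULT_PROFILE)
    print(f"Hello {person}: showing {profile['layout']} (font size: {profile['font_size']})")

configure_wall(camera_frame=None)
```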








  • System Name: Robot Town & Robot Care



  • Developer: Professor Hasegawa, Kyushu University



  • Robot Town (see Figure 4.5) is based on the premise of robots’ limitations in recognizing the environment they are in, reacting to it, and learning from these experiences. The project aims to demonstrate how robots can be made more efficient in the future. According to Professor Hasegawa, in order for this to happen, the environments must be structured in a way that makes them more recognizable to the robots. In this way, the necessary complexities are removed from the robot itself and transferred into the intelligent systems comprising the surrounding environment. These intelligent systems supply the robot with information such as its own location, the locations of other robots, and directions for action. This approach was further developed into a “Town Management System” (TMS) in which the city is in constant interaction with the robots. The robots are provided with specific relevant information and duties. Thanks to the TMS, it is no longer required that the robot itself be equipped with these high-performance systems; it can rely instead on the infrastructure already in place. This allows the robots to be more mobile. At the same time, 1,000 RFID tags dispersed throughout the town provide the robots with real-time information on their precise location and the state of their surroundings.



  • Technology: The TMS was designed specifically with senior homes in mind and was also tested in this setting, where it was used to support the nursing staff. The test setting covered 1,700 m² and was equipped with a video monitoring system.
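The division of labor between a lightweight robot and the Town Management System can be sketched as a simple lookup: the robot reports the RFID tag it has just read, and the TMS answers with a position and any pending instruction. The tag IDs, coordinates, and instructions below are invented.

```python
# Sketch of the Town Management System (TMS) idea: the environment, not the
# robot, holds the map and task knowledge. All data below are invented.

TAG_POSITIONS = {   # RFID tag id -> (x, y) position in the town, in metres
    "tag-0042": (12.5, 3.0),
    "tag-0043": (14.0, 3.0),
}

PENDING_INSTRUCTIONS = {   # location-dependent instructions kept by the TMS
    "tag-0042": "wait, another robot is crossing this corridor",
    "tag-0043": "proceed to the care station and report",
}

def tms_lookup(tag_id: str) -> dict:
    """What a robot receives back after reporting a tag it has just read."""
    return {
        "position": TAG_POSITIONS.get(tag_id),
        "instruction": PENDING_INSTRUCTIONS.get(tag_id, "no instruction"),
    }

print(tms_lookup("tag-0042"))
```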








  • System Name: Ubiquitous Monitoring System



  • Developer: Hitachi Laboratories



  • The “Monitoring System” concept displays not only instantaneous health parameters (i.e., vital signs) but also provides continuous monitoring of these parameters and considers the future trends and implications of any perceived patterns. If a problem is detected with one of the monitored users, the system can take the necessary actions, including alerting a caregiver or emergency services. The system is able to actively track and monitor a variety of users and their daily routines in real time. For example, parents could monitor their children on their way to school to ensure that they arrive there safely. The system uses “Peer-to-Peer” (P2P) technology. This type of system can also be used for tracking everyday objects such as wallets or backpacks and, should these items be misplaced, can notify the user of their location. During the development of the system, an emphasis was placed on universal use in the context of demographic change. This system could be used in a variety of different settings, including public (e.g., by the police) and private (e.g., in the home).
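The emphasis on trends rather than single readings can be sketched as a monitor that raises an alert on a threshold violation and a warning on a sustained drift across recent readings. The thresholds, window size, and data below are invented for illustration; this is not Hitachi's system.

```python
# Sketch of continuous vital-sign monitoring with simple trend detection.
# Thresholds and readings are invented placeholders.

from collections import deque

class HeartRateMonitor:
    def __init__(self, low=50, high=110, window=5, drift_limit=20):
        self.low, self.high = low, high
        self.recent = deque(maxlen=window)
        self.drift_limit = drift_limit

    def add_reading(self, bpm: int) -> str:
        self.recent.append(bpm)
        if bpm < self.low or bpm > self.high:
            return f"ALERT: heart rate {bpm} bpm outside the safe range, notify caregiver"
        if len(self.recent) == self.recent.maxlen:
            drift = self.recent[-1] - self.recent[0]
            if abs(drift) > self.drift_limit:
                return f"WARNING: heart rate drifted by {drift} bpm over recent readings"
        return "ok"

monitor = HeartRateMonitor()
for bpm in (72, 75, 80, 88, 96, 118):
    print(bpm, "->", monitor.add_reading(bpm))
```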



  • System Name: Secure-Life Electronics



  • Developer: Various researchers and companies (e.g., NEC, Toshiba, and Hitachi)



  • The goal of the Center of Excellence (COE) programs is to further develop technologies based on microsystems technology (MST). This is important because it improves the quality of life of individuals who may require assistance in carrying out their necessary daily activities. When developing these technologies, it is important to adopt a universal approach in order not to exclude populations of different cultures, backgrounds, economic standing, etc. These specialized technologies are used for social interaction as well as for technical and physical infrastructure and therefore positively influence the everyday lives of the aging population and their caregivers. As these technologies often include multi-faceted aspects, developers with specializations covering a wide variety of backgrounds are required. This is true for many technologies presented in this book and facilitates the important synergetic technology transfer between fields.



  • Technologies included within these systems include:




    • Sensory systems



    • Information processing networks



    • Actuators



    • Devices with various application cases



    • Systems integration



    • Bio-sensory systems



    • “Right-Brain Computing”









  • System Name: Input Devices & Health Care



  • Developer: Various researchers and companies



  • According to the national “u-Japan” strategy, “ubiquitous” communication, and specifically the user interfaces required for these communication devices, will play an increasing role in the future. This is a result of the significant demographic changes being experienced by many developed countries, specifically in Asia (e.g., Japan) and Europe (e.g., Germany). The interdisciplinary field of microsystems technology is used in various areas of the development of these technologies. Research and experimentation on topics such as sensory systems, actuating systems, bio-MEMS, insect-based robots, etc., is ongoing and will provide significant advancements in this field in the not too distant future. However, a clear focus will remain on different interfaces allowing seamless and intuitive human-machine interaction.



  • Technologies in this field include:




    • Wearable Input Devices



    • Mobile Pointing Shoes



    • Systems that provide health data



    • Organic Semiconductor-based strain sensors









  • System Name: Ubiquitous Communication



  • Developer: YRP & UID Center, Ken Sakamura



  • In Ginza, Tokyo’s business and entertainment district and the most famous shopping quarter in Japan, a large-scale experiment involving the use of RFID tags is ongoing. Approximately 10,000 RFID tags are distributed throughout the district and are interconnected with various Bluetooth systems, internet servers, special reading devices, and information systems. The experiment is multilingual, allowing non-native visitors and businesses to also take part. This large-scale Tokyo Ubiquitous Network project uses an extensive RFID network structure in order to provide users the ability to accurately determine their exact location and easily plan their navigation through the large urban center. Tokyo’s governor, Shintaro Ishihara, inaugurated the experiment at the opening event in Ginza. As each building contains several stores, bars, and clubs, finding the right one can often be difficult. With this new technique, a push of a button will allow you to immediately find your exact location and in which direction you must go to reach your desired destination.



  • Technology: From the beginning of December 2008, RFID tags were dispersed throughout the entire quarter on buildings, street corners, street lamps, and in bars and shops. Reading devices with a 3.5 inch OLED display allowed users to read the RFID tags and obtain location data and directions. Each RFID tag has a specific code indicating its specific location. This data is wirelessly transmitted over WLAN to a centralized server. The server transmits the desired data (i.e., location, specific directions) back to the user’s reading device. This experiment, managed by the Tokyo Ubiquitous Computing Center, was a joint venture between the Japanese government, the city of Tokyo, the Ministry of Land, Infrastructure and Transport (MLIT), as well as several other companies. Similar experiments are currently ongoing in other Japanese cities.
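The round trip from the handheld reader to the server and back amounts to resolving a tag code into a location record plus directions. The tag codes, location texts, and routes below are invented placeholders, and the wireless transport is omitted entirely.

```python
# Sketch of the Ginza location service: a reading device sends the code of a
# nearby RFID tag to a server and receives location and directions in return.
# All codes, locations, and routes are invented placeholders.

TAG_DATABASE = {
    "tag-ginza-0001": {
        "location": "Ginza 4-chome crossing, north-east corner",
        "directions": {"Ginza Station": "walk 50 m south, use entrance A7"},
    }
}

def resolve_tag(tag_code: str, destination: str) -> str:
    """Server-side lookup triggered by the reader's request."""
    record = TAG_DATABASE.get(tag_code)
    if record is None:
        return "Unknown tag."
    route = record["directions"].get(destination, "no route stored")
    return f"You are at: {record['location']}. To {destination}: {route}."

print(resolve_tag("tag-ginza-0001", "Ginza Station"))
```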



  • System Name: Interaction: PARO



  • Developer: AIST, Takanori Shibata



  • The positive influence animals have on the elderly or those suffering from psychological disabilities is already well known. However, the integration of living animals into these homecare situations is often difficult and impractical. To address this, AIST has developed PARO (see Figure 4.6), an advanced interactive robot able to provide the documented benefits of animal companionship. The robot addresses concerns such as misbehavior, hygiene, noises, and smells that may accompany living animals. PARO has been found to reduce stress in both patients and caregivers and even stimulates interaction between the two. By interacting with the users (speaking with them, listening to them, and responding to their touch), PARO effectively simulates an animal companion and provides direct benefits such as increased social interaction, whether visual, verbal, or physical. Additionally, PARO also promotes increased levels of both motivation and relaxation.



  • Technology: PARO is equipped with five different types of sensors: tactile, light, auditory, temperature, and posture. This allows the robot to actively perceive people and its environment. The tactile sensor allows it to detect touch movements such as stroking or petting, while the posture sensor allows it to detect when and in what position it is being held. The audio sensor allows it to detect voices (and the direction of their origin), greetings, and compliments. PARO is also able to learn and remember actions; for example, if it is stroked each time after a certain movement, it will try to repeat that movement in order to be stroked again (a minimal sketch of this behavior follows the source note below). These sensors and movements allow the robot to respond to users as if it were alive, even mimicking the voice of a genuine baby seal.



Source: www.AIST.jp.
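The stroke-reinforced learning mentioned above (repeating a movement that was rewarded with stroking) is essentially reward-driven action selection. The sketch below uses invented action names and a simple preference-weighting rule; it is not AIST's implementation.

```python
# Minimal sketch of PARO-style reinforcement: movements followed by stroking
# become more likely to be repeated. Action names and weights are invented.

import random

class StrokeLearner:
    def __init__(self, actions):
        self.preference = {a: 1.0 for a in actions}

    def choose_action(self) -> str:
        """Pick an action with probability proportional to its preference."""
        actions = list(self.preference)
        weights = [self.preference[a] for a in actions]
        return random.choices(actions, weights=weights, k=1)[0]

    def feedback(self, action: str, was_stroked: bool):
        if was_stroked:
            self.preference[action] += 0.5  # reinforce the rewarded movement

learner = StrokeLearner(["blink", "turn head", "wag tail", "squeak"])
for _ in range(20):
    act = learner.choose_action()
    learner.feedback(act, was_stroked=(act == "wag tail"))  # the user likes tail wags
print(learner.preference)
```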





  • System Name: Interaction and Information: WAKAMARU



  • Developer: Mitsubishi



  • The WAKAMARU (see Figure 4.6) robot from Mitsubishi was primarily developed as a companion and helper for the elderly and disabled. It is not, however, used for housework chores such as vacuuming or unloading the dishwasher. Instead, the robot acts as a sort of secretary for the user. It can move, follow the user around, take notes, and remind them of appointments. It can also provide the user with updated information from the internet, such as the weather forecast, and give advice accordingly, such as what types of clothes to wear and whether to remember an umbrella. It can also be used to ensure that users remember vital tasks such as taking their daily medication.



  • Technology: The robot stands approximately 1 m in height and weighs 30 kg. It has a flat circular base on which it can roll around at a speed of up to 1 km/h. Facial recognition software allows the robot to recognize and remember up to 10 faces. It is equipped with touch and motion sensors, two video cameras, and four microphones to autonomously interact with its environment. Its ultrasound capabilities also allow it to recognize, detect, and avoid obstacles. Using this knowledge, it creates a plan of all the rooms in the house and is constantly aware of its own location. This allows the robot to know where to go and how to get there. The software is Linux-based and includes a vocabulary of over 10,000 words, making the robot fully capable of conversing with human users. The robot is constantly connected to the internet and can readily answer knowledge-based questions. It can recall up to 10 people and remember their daily routines and preferences. It also saves all dates, appointments, etc. it is told and can provide reminders as the time approaches. It can operate for approximately 2 hours before requiring recharging, which it can also initiate by itself.


