To set up a new technology system (smartphone working in combination with a motion sensor) that supports the performance of people with disabilities in multistep activities
Adults with intellectual disabilities and sensory impairments ()
Amazon/Alexa
(i) Cognitive aid (ii) Instruction
Independence aid
Cognitive load during setup
The new system, with a higher mean percentage of correct responses, has the potential to enhance participants’ activity engagement and satisfaction, making it a more effective method for multistep instruction compared to the control system
To explore the experience and response of older adults using voice assistant systems
Older adults with mild cognitive impairment and dementia ()
Amazon/Alexa
(i) Information retrieval (ii) Interface control/device interaction
Independence aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Unique and quickly changing user needs
Older adults are enthusiastic about voice assistant use, but uptake declines with cognitive impairment. Cognitively intact individuals show greater potential for effectively using these devices; 85.9% expressed a desire to have a voice assistant in their homes in the future
To create a database of dysarthric speech to train voice assistants on
Adults with dysarthria () and healthy adults ()
(i) Development/prototype (ii) Generic/nonspecific (iii) Microsoft/Cortana
(i) Communication (ii) Dictation (iii) Interface control/device interaction
Independence aid and participation aid
Speech interpretation
The poor performance of commercial voice assistant systems emphasizes the necessity for dysarthric speech corpora to develop better assistive technologies
To explore the features required in spoken dialogue systems used by people with dementia and to present a system to improve their quality of life
Older adults with dementia () and caregivers ()
Google
(i) Cognitive aid (ii) Scheduling (iii) Information retrieval
Independence aid
Unique and quickly changing user needs
Positive interaction and personalized adaptation are vital for supporting people with disabilities. A prototype outing assistant with voice and visual interactions, encouraging activities over intervention, demonstrates the significance of dementia-friendly technologies
To explore wake-up interactions for deaf and hard of hearing users for potential personal assistant devices that understand sign language commands
Deaf and hard of hearing adults ()
Amazon/Alexa
Interface control/device interaction
Independence aid
(i) Lack of nonverbal control (ii) Privacy (iii) Sign language
Participants preferred the sign name option for waking up Alexa due to convenience and reliability. Hands-free talk-to-talk techniques were favored for equivalence to voice-based interactions. Privacy and accidental wake-ups with camera-based interactions were concerns
To identify studies of automatic speech recognition systems and analyze the interactions between people with dysarthria and those systems/devices
Adults with dysarthria
(i) Amazon/Alexa (ii) Apple/Siri (iii) Google (iv) Microsoft/Cortana
Interface control/device interaction
Independence aid
(i) Response time interval (ii) Speech interpretation
Automatic speech recognition systems struggle with dysarthric speech due to its complexity, limited data, and varied severity. Involving dysarthric users in design and testing is crucial to addressing the challenges faced by voice assistants
To assess the outcomes of the implementation of the codesigned intervention with older people using a technology and coaching package
Older adults ()
(i) Amazon/Alexa (ii) Google
(i) Communication (ii) Environmental controls (iii) Health management (iv) Interface control/device interaction (v) Reminders (vi) Safety
Independence aid and participation aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Cost (iv) Internet connection (v) Lack of nonverbal control (vi) Limited functionality
In-home technology coaching empowered older adults with enhanced skills and confidence in device use. A user-centered approach fostered effective integration of assistive technologies, improving well-being, quality of life, and the sense of safety and security
To explore the habits, preferences, expectations, and experiences of individuals with visual impairments with their daily activities at home and experiences with smart home technologies in a Brazilian context
Adults with visual impairments ()
Generic/nonspecific
(i) Environmental controls (ii) Interface control/device interaction (iii) Media management (iv) Safety
To explore user experiences with a smart speaker to better understand the device’s potential for everyday living and potential impact on health and well-being
Adults with diabetes, dementia, Parkinson’s disease, asthma, Behçet’s disease, Cushing’s syndrome, phenylketonuria, liver disorders, low mood, depression, anxiety, dyslexia, cognitive impairment, or severe visual impairment () and caregivers ()
Amazon/Alexa
(i) Communication (ii) Companionship/connection (iii) Health management (iv) Information retrieval (v) Interface control/device interaction (vi) Learning (vii) Media management (viii) Reminders (ix) Safety (x) Scheduling
Independence aid and participation aid
Speech interpretation
Patients and carers showed optimistic attitudes towards the assistive device, reporting positive impacts on health and well-being
To explore how conversational agents empower older adults with mild cognitive impairment and their care partners
Adults with mild cognitive impairment () and their care partners ()
Google
(i) Communication (ii) Companionship/connection (iii) Environmental controls (iv) Health management (v) Information retrieval (vi) Media management (vii) Reminders (viii) Safety (ix) Scheduling
Independence aid and participation aid
(i) Cognitive load during setup (ii) Cognitive load during use
Conversational agents proved valuable for empowering individuals with mild cognitive impairment and their care partners. Adequate training and support enabled successful integration into daily activities, with benefits reported in information access, media, and caregiving tasks
To test whether amyotrophic lateral sclerosis patients can control a visual brain-computer interface speller
Adults with amyotrophic lateral sclerosis () and healthy adults ()
Generic/nonspecific
(i) Communication (ii) Interface control/device interaction
Independence aid and participation aid
Brain control interfaces
The brain-computer interface speller demonstrated superior performance for amyotrophic lateral sclerosis patients and young healthy participants. It allows for potential applications beyond typing, such as controlling smart home devices. Participants generally rated the system positively
To explore the use of the Dexcom G6 Siri feature in blind patients with diabetes on intensive insulin therapy on glycemic control
Blind adults with diabetes ()
Apple/Siri
(i) Health management (ii) Interface control/device interaction
Independence aid
Upkeep and maintenance
Dexcom G6 with Siri improved glycemic control and reduced severe hypoglycemia. Visually impaired patients accessed real-time glucose levels via voice command. Dexcom app on smartphones offered customizable alerts for hypoglycemia/hyperglycemia
To explore the current practice of mainstream smart home technology delivery as assistive technology
Providers working with people with disabilities ()
Generic/nonspecific
(i) Communication (ii) Environmental controls (iii) Interface control/device interaction (iv) Safety
Independence aid and participation aid
(i) Cognitive load during use (ii) Cost (iii) Lack of educated providers (iv) Limited functionality (v) Privacy (vi) Removal of past assistive features (vii) Upkeep and maintenance
Smart home technology benefits people with disabilities but faces challenges in delivery and maintenance as assistive technology
To examine the feasibility of commercially available smart home automation technology intervention for individuals with Alzheimer’s disease and related dementia with emphasis on their safety and independence and reduction of care burden
Older adults with Alzheimer’s disease and related dementia and their caregivers ( dyads)
Amazon/Alexa
(i) Communication (ii) Environmental controls (iii) Health management (iv) Interface control/device interaction (v) Media management (vi) Reminders (vii) Safety (viii) Scheduling
Independence aid and participation aid
(i) Caregiver hesitance (ii) Cognitive load during setup (iii) Compatibility issues (iv) Internet connection (v) Lack of educated providers (vi) Unique and quickly changing user needs (vii) Upkeep and maintenance
Mainstream home automation technology can enhance safety, activity engagement, and caregiver connectivity for those with Alzheimer’s disease and related dementia. Addressing concerns and introducing the technology early can optimize benefits, supporting aging in place with autonomy
To examine the range and extent of personal voice assistants used for older adults living in the community, their technology readiness level, associated outcomes, and the strength of evidence
Older adults
(i) Amazon/Alexa (ii) Apple/Siri (iii) Development/prototype (iv) Google
(i) Cognitive aid (ii) Communication (iii) Companionship/connection (iv) Health management (v) Information retrieval (vi) Interface control/device interaction (vii) Media management (viii) Reminders (ix) Scheduling
Independence aid and participation aid
(i) Cognitive load during use (ii) Privacy (iii) Speech interpretation
Existing studies on older adults’ use of personal voice assistants found them convenient and useful for tasks like reminders and information searching. However, evidence of their efficacy remains limited because studies have focused on usability rather than effectiveness
To synthesize current research in the context of voice assistant for older adults and propose specific research designs to provide better insights into the adoption and use of voice assistants in advanced age
Older adults with intellectual disabilities
Generic/nonspecific
(i) Communication (ii) Companionship/connection (iii) Health management (iv) Information retrieval (v) Interface control/device interaction (vi) Media management (vii) Reminders
Independence aid and participation aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Compatibility issues (iv) Privacy
The authors stress the need for voice assistant studies in older adults, noting benefits such as social interaction, well-being support, and independence. Concerns include privacy, dependency, and training needs
To investigate whether conversational interfaces might improve the daily experience of students, professors, and various users within a university campus
Blind adults () and wheelchair user ()
Amazon/Alexa
(i) Information retrieval (ii) Interface control/device interaction (iii) Navigation
Independence aid
Speech interpretation
Volunteers accomplished ≥2/3 tasks and gave positive ratings for the voice app. User comfort varied, and noise affected Alexa’s speech recognition
To explore the use of a novel voice-based interaction system that supports users with complex needs by introducing a personalized, human voice command between smart home devices and sensors
Adults with dementia, learning disabilities or autism, older and frail people, and social and healthcare organizations ()
(i) Amazon/Alexa (ii) Development/prototype (iii) Google
(i) Cognitive aid (ii) Environmental controls (iii) Reminders
Independence aid
(i) Cognitive load during use (ii) Lack of nonverbal control (iii) Speech interpretation
The novel solution supports complex needs, enhances independence, reinforces learning of commands, and provides transparent cause-and-effect explanations for home automation events
To develop a functioning brain-computer interface prototype that can be integrated into a smart home to support people with limited motor abilities
Adults with motor and speech impairments ()
(i) Amazon/Alexa (ii) Development/prototype (iii) Google
(i) Communication (ii) Environmental controls (iii) Interface control/device interaction
Independence aid and participation aid
(i) Brain control interfaces (ii) Lack of nonverbal control
The brain-computer interface for smart speakers achieved 72.82% overall accuracy. This interface empowers people with limited motor abilities to perform tasks independently, enhancing their autonomy in daily life
To develop a smart human-environment interactive environment using eye tracking for smart control devices for amyotrophic lateral sclerosis patients
Healthy adults ()
Development/prototype
(i) Communication (ii) Environmental controls (iii) Interface control/device interaction (iv) Safety
Independence aid and participation aid
(i) Lack of nonverbal control (ii) Privacy
The system achieves an average accuracy of 93.2% in different light conditions and demonstrates positive user experiences across education levels and age groups
To develop a brain-computer interface system for home appliance control
Adults with motor impairments () and healthy adults ()
Development/prototype
(i) Environmental controls (ii) Interface control/device interaction (iii) Media management
Independence aid
(i) Brain control interfaces (ii) Cognitive load during use (iii) Lack of nonverbal control
The proposed system achieved improved threshold-free classification accuracies of around 92.44% (healthy) and 89.33% (motor impairment), showing cognitive abilities’ influence on performance, not disability type
To develop a system for nonverbal, sound-based interactions that people with a wide range of speech disorders can use to interact with mobile technology
Adults with speech impairments ()
(i) Apple/Siri (ii) Development/prototype
Interface control/device interaction
Independence aid
(i) Lack of nonverbal control (ii) Speech interpretation
The system achieved an 82% success rate in detecting nonverbal sounds for users with disabilities. It showed promise in enhancing communication for those with speech impairments and providing accessibility for individuals with limited mobility. Users gave positive feedback in situations where they could not otherwise interact with technology
To improve augmentative and alternative speech recognition performance of smart speakers by developing a text-to-speech production
Augmentative and alternative communication device users
(i) Development/prototype (ii) Google (iii) Samsung/Bixby
Interface control/device interaction
Independence aid
(i) Lack of nonverbal control (ii) Response time interval
Augmentative and alternative interaction symbols and boards, along with optimized text-to-speech production format, improve speech recognition and enable independent device control and Internet interaction
To control a home automation system through a brain-computer interface that allows the construction of voice commands for people with motor impairments
Healthy adults ()
(i) Development/prototype (ii) Google
(i) Communication (ii) Environmental controls (iii) Information retrieval (iv) Interface control/device interaction (v) Media management (vi) Scheduling
Independence aid and participation aid
(i) Brain control interfaces (ii) Cognitive load during use (iii) Lack of nonverbal control
Brain-computer interface enabled users to access diverse media and environmental controls, along with a texting app. Integrating voice commands with brain-computer interface enhanced device control
To understand whether, how, and why voice assistants are suitable for blind and visually impaired people and what must be done to achieve even better acceptance and benefit
Adults with visual impairments ()
(i) Amazon/Alexa (ii) Apple/Siri (iii) Google
(i) Communication (ii) Dictation (iii) Environmental controls (iv) Information retrieval (v) Interface control/device interaction (vi) Math (vii) Media management (viii) Navigation (ix) Reminders (x) Scheduling
Independence aid and participation aid
(i) Internet connection (ii) Limited functionality (iii) Privacy (iv) Removal of past assistive features (v) Response time interval (vi) Speech interpretation
Voice assistants benefit visually impaired users with tasks like entertainment, Internet access, time, calendars, and notes. They appreciate smart home integration, existing functionalities, and voice commands. Privacy and data security concerns persist, but 86% find the systems helpful, indicating an overall positive outlook with room for improvement
To provide an overview of the usability and barriers of smart speakers for people with disabilities
People with disabilities
(i) Amazon/Alexa (ii) Apple/Siri (iii) Generic/nonspecific (iv) Google (v) Microsoft/Cortana (vi) Samsung/Bixby
(i) Cognitive aid (ii) Communication (iii) Dictation (iv) Environmental controls (v) Information retrieval (vi) Interface control/device interaction (vii) Reminders (viii) Scheduling (ix) Spelling
Independence aid and participation aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Lack of nonverbal control (iv) Limited functionality (v) Response time interval (vi) Speech interpretation (vii) Unique and quickly changing user needs (viii) Upkeep and maintenance
Voice assistants show promise as inclusive assistive technologies for enhancing quality of life. Acceptance by diverse user groups, nonstigmatizing nature, and accessibility are key strengths. Addressing challenges and adopting an inclusive approach is vital for maximizing their potential in different situations and contexts
To determine whether Canadians with cognitive disabilities could benefit from voice-activated intelligent personal assistants to access digital services and increase their participation in the digital economy
Adults with cognitive impairments ()
Amazon/Alexa
(i) Environmental controls (ii) Health management (iii) Information retrieval (iv) Instruction (v) Interface control/device interaction (vi) Learning (vii) Math (viii) Media management (ix) Navigation (x) Reminders (xi) Safety (xii) Scheduling
Independence aid
(i) Cognitive load during use (ii) Compatibility issues (iii) Limited functionality (iv) Speech interpretation (v) Response time interval (vi) Unique and quickly changing user needs
Users’ opinions on voice assistants were positive, citing increased agency and access to well-being features. Echo Dot proved cost-effective for enhancing the quality of life for vulnerable groups. However, certain complex Alexa Skills, like transportation planning, caused hesitation and confusion due to extensive interactive dialogues
To evaluate the perception of smart home device benefits in 6 different consumer groups, including people with disabilities
Adults with disabilities ()
(i) Amazon/Alexa (ii) Google
(i) Environmental controls (ii) Interface control/device interaction (iii) Safety
Independence aid
(i) Cognitive load during setup (ii) Cost (iii) Lack of skills/apps (iv) Privacy
Persons with disabilities prioritize safety, security, and well-being. Participants did not show a knowledge increase in voice-controlled smart home tech, indicating the need for improved demonstrations. Well-being was their primary benefit, followed by safety and security. Key barriers were cost and consumer privacy concerns
To develop a smart home and wheelchair application controlled by a steady-state visual evoked potential-based brain-computer interface system
Healthy adults ()
Development/prototype
(i) Environmental controls (ii) Interface control/device interaction (iii) Navigation
Independence aid
(i) Brain control interfaces (ii) Cost (iii) Lack of nonverbal control
Subjects achieved nearly perfect accuracy in device interaction and wheelchair navigation tasks (>90%). The system controlled a virtual power wheelchair and various household devices with ease. Advantages included low cost, wireless operation, portability, user-friendliness, and high control accuracy without extensive training. Subjects demonstrated increased confidence and competence over time, finding the system easily adaptable and learnable
To investigate the use and usefulness of virtual home assistants among older adults and their support persons
Older adults with chronic health conditions and their caregivers ( dyads)
Amazon/Alexa
(i) Communication (ii) Companionship/connection (iii) Health management (iv) Information retrieval (v) Interface control/device interaction (vi) Media management (vii) Reminders (viii) Safety (ix) Scheduling
Independence aid and participation aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Compatibility issues (iv) Lack of educated providers (v) Limited functionality (vi) Unique and quickly changing user needs (vii) Upkeep and maintenance
Participants found virtual health assistants beneficial for aging in place, but faced challenges in learning and adapting to the technology. They utilized these devices for various activities, valuing features like timers and reminders. These devices were perceived as enhancing security and providing convenience, but more training and periodic assessments were desired to maximize their potential
To investigate the benefits of personalized augmented assistive technology to support people living with dementia in their daily activities in the home
Older adults with early-to-mild stage dementia ()
(i) Amazon/Alexa (ii) Google
(i) Companionship/connection (ii) Environmental controls (iii) Health management (iv) Information retrieval (v) Interface control/device interaction (vi) Media management (vii) Reminders (viii) Scheduling
Independence aid and participation aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Privacy (iv) Speech interpretation (v) Unique and quickly changing user needs (vi) Upkeep and maintenance
Participants welcomed the technology, finding it helpful, but it is currently unsuitable for long-term use in dementia. Inclusive development approaches can inform future advancements. Successful deployment at home requires addressing both the person with dementia and their support person’s needs
To investigate healthcare providers’ perspectives on using smart home systems for self-management and care in people with heart failure
Healthcare providers ()
Generic/nonspecific
(i) Communication (ii) Health management (iii) Information retrieval (iv) Reminders (v) Safety
Independence aid and participation aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Cost
Participants saw potential in smart home systems for improving self-management of heart failure but had reservations. Benefits included remote monitoring, services, and independent living for patients
To evaluate how participants with heart failure interacted with a voice app version of a digital therapeutics
Adults with heart failure ()
Amazon/Alexa
(i) Companionship/connection (ii) Health management (iii) Information retrieval (iv) Media management (v) Reminders
Independence aid and participation aid
(i) Cognitive load during use (ii) Speech interpretation (iii) Upkeep and maintenance
Participants’ engagement declined due to app unreliability and adaptation challenges. Older users showed higher engagement, while middle-aged participants found the app more acceptable. Integration into daily routines and convenience for heart failure measurements were observed, but technical limitations and usability issues existed
To present the challenges of designing and evaluating conversational agents derived from recent healthcare projects conducted in the last 2 years
Older adults with chronic health conditions
Generic/nonspecific
(i) Health management (ii) Interface control/device interaction
Independence aid
(i) Cognitive load during use (ii) Cost (iii) Internet connection (iv) Privacy (v) Speech interpretation
Common challenges for conversational agents include domain integration, conversational competence, user-system interaction, and evaluation. In healthcare, additional issues arise, like empathy, safety, recruitment of vulnerable populations, and testing in real-world settings
To evaluate the ability of common voice recognition systems to transcribe dysphonic voices
Adults with speech impairments () and healthy adults ()
(i) Amazon/Alexa (ii) Apple/Siri (iii) Google
Interface control/device interaction
Independence aid
Speech interpretation
Voice disorders are linked to lower word recognition. Surprisingly, a faster speech rate improved word recognition. Dysphonia severity consistently affected word recognition, while voice technology accuracy depended on dysphonia perception, not articulation disorders
To develop a hand gesture dataset to recognize the gestures of sign language and convert it into verbal language
Deaf and hard of hearing adults
Development/prototype
(i) Dictation (ii) Interface control/device interaction
Independence aid
(i) Lack of nonverbal control (ii) Limited functionality (iii) Sign language
The system achieves 99.13% accuracy in recognizing and translating sign language gestures into speech, promising significant assistance to speech-impaired individuals, enhancing communication, and providing them with a virtual voice
To create a system to facilitate communication between a brain-computer interface and devices in the environment using voice commands
Adults with amyotrophic lateral sclerosis () and healthy adults ()
(i) Development/prototype (ii) Google
(i) Communication (ii) Environmental controls (iii) Interface control/device interaction (iv) Media management
Independence aid and participation aid
(i) Brain control interfaces (ii) Cognitive load during use (iii) Compatibility issues (iv) Response time interval
Amyotrophic lateral sclerosis participants found the brain-computer interface system easy to use and rated mental demand and effort between 50 and 60 (out of 100). The system’s uniqueness lies in its user-friendliness and customizable menus for controlling devices, making it valuable to the field
To review all the papers on brain-computer interface-based smart home systems published in the last 6 years
Adults with motor and speech impairments
(i) Development/prototype (ii) Generic/nonspecific
(i) Communication (ii) Environmental controls (iii) Interface control/device interaction (iv) Media management (v) Navigation
Independence aid and participation aid
(i) Brain control interfaces (ii) Cognitive load during use (iii) Cost (iv) Lack of nonverbal control (v) Limited functionality (vi) Upkeep and maintenance
Promising results in brain-computer interface system accuracy. These systems are practical for interaction with smart home devices but lack user-friendliness. User comfort was identified as a major issue
To present a brain-computer interface paradigm for controlling home appliances
Healthy adults ()
Development/prototype
(i) Communication (ii) Environmental controls (iii) Interface control/device interaction (iv) Media management
Independence aid and participation aid
Brain control interfaces
Participants achieved satisfactory control of appliances and phone calls. The system enhanced freedom, offered more devices to control, and enabled dialing phone numbers, improving quality of life, especially for individuals with disabilities
To create a communication bridge between a brain-computer interface speller platform and the messaging services of WhatsApp, Telegram, e-mail, and SMS through the use of a virtual assistant running in a smartphone
Healthy adults ()
Google
(i) Communication (ii) Interface control/device interaction
Independence aid and participation aid
(i) Brain control interfaces (ii) Cognitive load during use (iii) Lack of nonverbal control (iv) Response time interval (v) Upkeep and maintenance
Brain-computer interface-based spelling system showed promising results with healthy subjects, achieving 86.14% accuracy and positive usability feedback. However, for subjects with motor impairment, accuracy is generally lower (67-68%) due to difficulty in gaze control, requiring improvements for real-world implementation
To develop a prototype smart home gesture-based control facility that can control various home appliances
Adults with speech impairments
Development/prototype
(i) Environmental controls (ii) Interface control/device interaction
Independence aid
Lack of nonverbal control
The gesture-based smart home control system enables easy appliance operation with hand movements, aids people with disabilities, eliminates the need for multiple controllers, integrates well with smart speakers, and promotes user-friendliness
To develop an automatic speech recognition model for use between the smart assistant and the user with improved accuracy for atypical speech
Adults with speech impairments
(i) Development/prototype (ii) Google
Interface control/device interaction
Independence aid
Speech interpretation
Automatic speech recognition models show promise in learning atypical speech patterns, highlighting the significance of accessible technology for those with speech impairments
To explore how interactions between people with disability and voice assistant technology have an impact on the individual and collective well-being
Adults with visual and motor impairments () and their family members ()
Google
(i) Environmental controls (ii) Information retrieval (iii) Interface control/device interaction (iv) Media management (v) Reminders (vi) Scheduling
Independence aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Internet connection (iv) Limited functionality (v) Privacy (vi) Sign language (vii) Speech interpretation (viii) Upkeep and maintenance
Voice assistant technology enhances independence, supports daily activities, and reduces disparities for people with disabilities. Challenges include integration with smart home technology, performance improvements, and disability-focused features. Socioeconomic barriers should be considered to widen access. Overall, the technology had a positive impact on well-being and quality of life
To examine deaf and hard of hearing users’ experience and attitude towards personal assistants, as well as potential interactions with such devices
Deaf and hard of hearing adults ()
Generic/nonspecific
(i) Communication (ii) Environmental controls (iii) Information retrieval (iv) Interface control/device interaction (v) Media management (vi) Notes (vii) Reminders
Independence aid and participation aid
(i) Lack of nonverbal control (ii) Privacy (iii) Sign language (iv) Speech interpretation
Over 60% of participants expressed interest in personal assistant devices that understand sign language and show sign language videos/animations on screen. They preferred text output but were interested in unique applications like receiving alerts about sounds and requesting sign language interpretation
To explore a device that interprets and recognizes American sign language to control smart home components
Adults who use sign language
Development/prototype
(i) Environmental controls (ii) Interface control/device interaction
Independence aid
Sign language
The device is capable of recognizing 4 signs (lights on, play, stop, and idle) with 89.4% accuracy. It can also initiate corresponding actuation commands with the same accuracy level
(i) Brain control interfaces (ii) Cognitive load during use (iii) Compatibility issues (iv) Response time interval
Amyotrophic lateral sclerosis participants found the brain-computer interface system easy to use and rated mental demand and effort between 50 and 60 (out of 100). The system’s uniqueness lies in its user-friendliness and customizable menus for controlling devices, making it valuable to the field
To review all the papers on brain-computer interface-based smart home systems published in the last 6 years
Adults with motor and speech impairments
(i) Development/prototype (ii) Generic/nonspecific
(i) Communication (ii) Environmental controls (iii) Interface control/device interaction (iv) Media management (v) Navigation
Independence aid and participation aid
(i) Brain control interfaces (ii) Cognitive load during use (iii) Cost (iv) Lack of nonverbal control (v) Limited functionality (vi) Upkeep and maintenance
Brain-computer interface systems show promising accuracy and are practical for interacting with smart home devices, but they lack user-friendliness; user comfort was identified as a major issue
To present a brain-computer interface paradigm for controlling home appliances
Healthy adults ()
Development/prototype
(i) Communication (ii) Environmental controls (iii) Interface control/device interaction (iv) Media management
Independence aid and participation aid
Brain control interfaces
Satisfactory control of appliances and phone calls was achieved. The system enhanced freedom by expanding the range of controllable devices and enabling phone dialing, potentially improving quality of life, especially for individuals with disabilities
To create a communication bridge between a brain-computer interface speller platform and the messaging services of WhatsApp, Telegram, e-mail, and SMS through the use of a virtual assistant running in a smartphone
Healthy adults ()
Google
(i) Communication (ii) Interface control/device interaction
Independence aid and participation aid
(i) Brain control interfaces (ii) Cognitive load during use (iii) Lack of nonverbal control (iv) Response time interval (v) Upkeep and maintenance
Brain-computer interface-based spelling system showed promising results with healthy subjects, achieving 86.14% accuracy and positive usability feedback. However, for subjects with motor impairment, accuracy is generally lower (67-68%) due to difficulty in gaze control, requiring improvements for real-world implementation
To develop a prototype smart home gesture-based control facility that can control various home appliances
Adults with speech impairments
Development/prototype
(i) Environmental controls (ii) Interface control/device interaction
Independence aid
Lack of nonverbal control
The gesture-based smart home control system enables easy appliance operation with hand movements, aids people with disabilities, eliminates the need for multiple controllers, integrates well with smart speakers, and promotes user-friendliness
To develop an automatic speech recognition model for use between the smart assistant and the user with improved accuracy for atypical speech
Adults with speech impairments
(i) Development/prototype (ii) Google
Interface control/device interaction
Independence aid
Speech interpretation
Automatic speech recognition models show promise in learning atypical speech patterns, highlighting the significance of accessible technology for those with speech impairments
To explore how interactions between people with disabilities and voice assistant technology affect individual and collective well-being
Adults with visual and motor impairments () and their family members ()
Google
(i) Environmental controls (ii) Information retrieval (iii) Interface control/device interaction (iv) Media management (v) Reminders (vi) Scheduling
Independence aid
(i) Cognitive load during setup (ii) Cognitive load during use (iii) Internet connection (iv) Limited functionality (v) Privacy (vi) Sign language (vii) Speech interpretation (viii) Upkeep and maintenance
Voice assistant technology enhances independence, supports daily activities, and reduces disparities for people with disabilities, with a positive impact on well-being and quality of life. Challenges include integration with smart home technology, performance improvements, and disability-focused features. Socioeconomic barriers should be addressed to enable wider access