Review Article

Voice Assistant Utilization among the Disability Community for Independent Living: A Rapid Review of Recent Evidence

Table 6

Summary of included articles.

Each entry lists the author (year), study aim, participants, technology, tasks, utilization, barriers, and findings.

Lancioni et al. (2021) [24]
Study aim: To set up a new technology system (smartphone working in combination with a motion sensor) that supports the performance of people with disabilities in multistep activities
Participants: Adults with intellectual disabilities and sensory impairments ()
Technology: Amazon/Alexa
Tasks: (i) Cognitive aid; (ii) Instruction
Utilization: Independence aid
Barriers: Cognitive load during setup
Findings: The new system, with a higher mean percentage of correct responses, has the potential to enhance participants’ activity engagement and satisfaction, making it a more effective method for multistep instruction compared to the control system

Driesse et al. (2022) [25]
Study aim: To explore the experience and response of older adults using voice assistant systems
Participants: Older adults with mild cognitive impairment and dementia ()
Technology: Amazon/Alexa
Tasks: (i) Information retrieval; (ii) Interface control/device interaction
Utilization: Independence aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use; (iii) Unique and quickly changing user needs
Findings: Older adults are enthusiastic about voice assistant use, but uptake declines with cognitive impairment. Cognitively intact individuals show greater potential for effectively utilizing these devices, and 85.9% expressed a desire to have a voice assistant in their homes in the future

Turrisi et al. (2021) [26]
Study aim: To create a database of dysarthric speech to train voice assistants on
Participants: Adults with dysarthria () and healthy adults ()
Technology: (i) Development/prototype; (ii) Generic/nonspecific; (iii) Microsoft/Cortana
Tasks: (i) Communication; (ii) Dictation; (iii) Interface control/device interaction
Utilization: Independence aid and participation aid
Barriers: Speech interpretation
Findings: The poor performance of commercial voice assistant systems emphasizes the necessity for dysarthric speech corpora to develop better assistive technologies

Omata et al. (2021) [27]
Study aim: To explore the features required in spoken dialogue systems used by people with dementia and to present a system to improve their quality of life
Participants: Older adults with dementia () and caregivers ()
Technology: Google
Tasks: (i) Cognitive aid; (ii) Scheduling; (iii) Information retrieval
Utilization: Independence aid
Barriers: Unique and quickly changing user needs
Findings: Positive interaction and personalized adaptation are vital for supporting people with disabilities. A prototype outing assistant with voice and visual interactions, encouraging activities over intervention, demonstrates the significance of dementia-friendly technologies

Mande et al. (2021) [28]
Study aim: To explore wake-up interactions for deaf and hard of hearing users for potential personal assistant devices that understand sign language commands
Participants: Deaf and hard of hearing adults ()
Technology: Amazon/Alexa
Tasks: Interface control/device interaction
Utilization: Independence aid
Barriers: (i) Lack of nonverbal control; (ii) Privacy; (iii) Sign language
Findings: Participants preferred the sign name option for waking up Alexa due to convenience and reliability. Hands-free talk-to-talk techniques were favored for equivalence to voice-based interactions. Privacy and accidental wake-ups with camera-based interactions were concerns

Jaddoh et al. (2023) [29]
Study aim: To identify studies of automatic speech recognition systems and analyze the interactions between people with dysarthria and those systems/devices
Participants: Adults with dysarthria
Technology: (i) Amazon/Alexa; (ii) Apple/Siri; (iii) Google; (iv) Microsoft/Cortana
Tasks: Interface control/device interaction
Utilization: Independence aid
Barriers: (i) Response time interval; (ii) Speech interpretation
Findings: Automatic speech recognition systems struggle with dysarthric speech due to its complexity, limited data, and varied severity. Involving dysarthric users in design and testing is crucial to addressing the challenges faced by voice assistants

Gordon et al. (2022) [30]
Study aim: To assess the outcomes of the implementation of the codesigned intervention with older people using a technology and coaching package
Participants: Older adults ()
Technology: (i) Amazon/Alexa; (ii) Google
Tasks: (i) Communication; (ii) Environmental controls; (iii) Health management; (iv) Interface control/device interaction; (v) Reminders; (vi) Safety
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use; (iii) Cost; (iv) Internet connection; (v) Lack of nonverbal control; (vi) Limited functionality
Findings: In-home technology coaching empowers older adults with enhanced skills and confidence in using devices, fostering effective integration of assistive technologies through a user-centered approach, resulting in improved well-being, quality of life, and a greater sense of safety and security

Oliveira et al. (2023) [31]
Study aim: To explore the habits, preferences, expectations, and experiences of individuals with visual impairments regarding their daily activities at home and smart home technologies in a Brazilian context
Participants: Adults with visual impairments ()
Technology: Generic/nonspecific
Tasks: (i) Environmental controls; (ii) Interface control/device interaction; (iii) Media management; (iv) Safety
Utilization: Independence aid
Barriers: (i) Cost; (ii) Limited functionality; (iii) Low market availability; (iv) Privacy; (v) Speech interpretation
Findings: Visually impaired individuals seek affordable smart home tech for enhanced independence: identifying lights’ status, appliance information, environment details, and comprehensive appliance control. Cost and availability remain barriers

Balasubramanian et al. (2021) [10]
Study aim: To explore user experiences with a smart speaker to better understand the device’s potential for everyday living and potential impact on health and well-being
Participants: Adults with diabetes, dementia, Parkinson’s disease, asthma, Behçet’s disease, Cushing’s syndrome, phenylketonuria, liver disorders, low mood, depression, anxiety, dyslexia, cognitive impairment, or severe visual impairment () and caregivers ()
Technology: Amazon/Alexa
Tasks: (i) Communication; (ii) Companionship/connection; (iii) Health management; (iv) Information retrieval; (v) Interface control/device interaction; (vi) Learning; (vii) Media management; (viii) Reminders; (ix) Safety; (x) Scheduling
Utilization: Independence aid and participation aid
Barriers: Speech interpretation
Findings: Patients and carers showed optimistic attitudes towards the assistive device, reporting positive impacts on health and well-being

Zubatiy et al. (2021) [32]
Study aim: To explore how conversational agents empower older adults with mild cognitive impairment and their care partners
Participants: Adults with mild cognitive impairment () and their care partners ()
Technology: Google
Tasks: (i) Communication; (ii) Companionship/connection; (iii) Environmental controls; (iv) Health management; (v) Information retrieval; (vi) Media management; (vii) Reminders; (viii) Safety; (ix) Scheduling
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use
Findings: Conversational agents proved valuable for empowering individuals with mild cognitive impairment and their care partners. Adequate training and support enabled successful integration into daily activities, with benefits reported in information access, media, and caregiving tasks

Verbaarschot et al. (2021) [33]
Study aim: To test whether amyotrophic lateral sclerosis patients can control a visual brain-computer interface speller
Participants: Adults with amyotrophic lateral sclerosis () and healthy adults ()
Technology: Generic/nonspecific
Tasks: (i) Communication; (ii) Interface control/device interaction
Utilization: Independence aid and participation aid
Barriers: Brain control interfaces
Findings: The brain-computer interface speller demonstrated superior performance for amyotrophic lateral sclerosis patients and young healthy participants. It allows for potential applications beyond typing, such as controlling smart home devices. Participants generally rated the system positively

Akturk et al. (2021) [34]
Study aim: To explore the effect of the Dexcom G6 Siri feature on glycemic control in blind patients with diabetes on intensive insulin therapy
Participants: Blind adults with diabetes ()
Technology: Apple/Siri
Tasks: (i) Health management; (ii) Interface control/device interaction
Utilization: Independence aid
Barriers: Upkeep and maintenance
Findings: Dexcom G6 with Siri improved glycemic control and reduced severe hypoglycemia. Visually impaired patients accessed real-time glucose levels via voice command. The Dexcom smartphone app offered customizable alerts for hypoglycemia/hyperglycemia

Ding et al. (2023) [35]
Study aim: To explore the current practice of mainstream smart home technology delivery as assistive technology
Participants: Providers working with people with disabilities ()
Technology: Generic/nonspecific
Tasks: (i) Communication; (ii) Environmental controls; (iii) Interface control/device interaction; (iv) Safety
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during use; (ii) Cost; (iii) Lack of educated providers; (iv) Limited functionality; (v) Privacy; (vi) Removal of past assistive features; (vii) Upkeep and maintenance
Findings: Smart home technology benefits people with disabilities but faces challenges in delivery and maintenance as assistive technology

Arthanat et al. (2022) [36]
Study aim: To examine the feasibility of a commercially available smart home automation technology intervention for individuals with Alzheimer’s disease and related dementia, with emphasis on their safety and independence and on reducing care burden
Participants: Older adults with Alzheimer’s disease and related dementia and their caregivers ( dyads)
Technology: Amazon/Alexa
Tasks: (i) Communication; (ii) Environmental controls; (iii) Health management; (iv) Interface control/device interaction; (v) Media management; (vi) Reminders; (vii) Safety; (viii) Scheduling
Utilization: Independence aid and participation aid
Barriers: (i) Caregiver hesitance; (ii) Cognitive load during setup; (iii) Compatibility issues; (iv) Internet connection; (v) Lack of educated providers; (vi) Unique and quickly changing user needs; (vii) Upkeep and maintenance
Findings: Mainstream home automation tech can enhance safety, activity engagement, and caregiver connectivity for those with Alzheimer’s disease and related dementia. Addressing concerns and introducing the technology early can optimize benefits, supporting aging in place with autonomy

Arnold et al. (2022) [16]
Study aim: To examine the range and extent of personal voice assistants used for older adults living in the community, their technology readiness level, associated outcomes, and the strength of evidence
Participants: Older adults
Technology: (i) Amazon/Alexa; (ii) Apple/Siri; (iii) Development/prototype; (iv) Google
Tasks: (i) Cognitive aid; (ii) Communication; (iii) Companionship/connection; (iv) Health management; (v) Information retrieval; (vi) Interface control/device interaction; (vii) Media management; (viii) Reminders; (ix) Scheduling
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during use; (ii) Privacy; (iii) Speech interpretation
Findings: Existing studies on older adults’ use of personal voice assistants found convenience and usefulness for tasks like reminders and information searching. However, evidence supporting their efficacy remains limited due to the focus on usability rather than completed effectiveness studies

Cave et al. (2021) [37]
Study aim: To provide an overview of the literature on the use of automatic speech recognition by users with amyotrophic lateral sclerosis
Participants: Adults with amyotrophic lateral sclerosis
Technology: (i) Amazon/Alexa; (ii) Apple/Siri; (iii) Google; (iv) Microsoft/Cortana; (v) Samsung/Bixby
Tasks: (i) Communication; (ii) Environmental controls; (iii) Health management; (iv) Interface control/device interaction
Utilization: Independence aid and participation aid
Barriers: Speech interpretation
Findings: Voice recognition accuracy for dysarthric speech is generally lower than for nondysarthric speech

Schlomann et al. (2021) [6]
Study aim: To synthesize current research on voice assistants for older adults and propose specific research designs to provide better insights into the adoption and use of voice assistants in advanced age
Participants: Older adults with intellectual disabilities
Technology: Generic/nonspecific
Tasks: (i) Communication; (ii) Companionship/connection; (iii) Health management; (iv) Information retrieval; (v) Interface control/device interaction; (vi) Media management; (vii) Reminders
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use; (iii) Compatibility issues; (iv) Privacy
Findings: The authors stress the need for voice assistant studies focused on older adults, noting benefits like social interaction, well-being support, and independence. Concerns include privacy, dependency, and training needs

Furini et al. (2021) [38]
Study aim: To investigate whether conversational interfaces might improve the daily experience of students, professors, and various users within a university campus
Participants: Blind adults () and wheelchair user ()
Technology: Amazon/Alexa
Tasks: (i) Information retrieval; (ii) Interface control/device interaction; (iii) Navigation
Utilization: Independence aid
Barriers: Speech interpretation
Findings: Volunteers accomplished ≥2/3 tasks and gave positive ratings for the voice app. User comfort varied, and noise affected Alexa’s speech recognition

Salai et al. (2021) [39]
Study aim: To explore the use of a novel voice-based interaction system that supports users with complex needs by introducing a personalized, human voice command between smart home devices and sensors
Participants: Adults with dementia, learning disabilities or autism, older and frail people, and social and healthcare organizations ()
Technology: (i) Amazon/Alexa; (ii) Development/prototype; (iii) Google
Tasks: (i) Cognitive aid; (ii) Environmental controls; (iii) Reminders
Utilization: Independence aid
Barriers: (i) Cognitive load during use; (ii) Lack of nonverbal control; (iii) Speech interpretation
Findings: The novel solution supports complex needs, enhances independence, reinforces learning of commands, and provides transparent cause-and-effect explanations for home automation events

Shah et al. (2021) [40]
Study aim: To develop a functioning brain-computer interface prototype that can be integrated into a smart home to support people with limited motor abilities
Participants: Adults with motor and speech impairments ()
Technology: (i) Amazon/Alexa; (ii) Development/prototype; (iii) Google
Tasks: (i) Communication; (ii) Environmental controls; (iii) Interface control/device interaction
Utilization: Independence aid and participation aid
Barriers: (i) Brain control interfaces; (ii) Lack of nonverbal control
Findings: The brain-computer interface for smart speakers achieved 72.82% overall accuracy. This interface empowers people with limited motor abilities to perform tasks independently, enhancing their daily life autonomy

Chen et al. (2022) [41]
Study aim: To develop a smart human-environment interaction system using eye tracking to control smart devices for amyotrophic lateral sclerosis patients
Participants: Healthy adults ()
Technology: Development/prototype
Tasks: (i) Communication; (ii) Environmental controls; (iii) Interface control/device interaction; (iv) Safety
Utilization: Independence aid and participation aid
Barriers: (i) Lack of nonverbal control; (ii) Privacy
Findings: The system achieves an average accuracy of 93.2% in different light conditions and demonstrates positive user experiences across education levels and age groups

Shukla et al. (2021) [42]
Study aim: To develop a brain-computer interface system for home appliance control
Participants: Adults with motor impairments () and healthy adults ()
Technology: Development/prototype
Tasks: (i) Environmental controls; (ii) Interface control/device interaction; (iii) Media management
Utilization: Independence aid
Barriers: (i) Brain control interfaces; (ii) Cognitive load during use; (iii) Lack of nonverbal control
Findings: The proposed system achieved improved threshold-free classification accuracies of around 92.44% (healthy) and 89.33% (motor impairment), indicating that performance was influenced by cognitive abilities rather than disability type

Kurtoglu et al. (2022) [43]
Study aim: To develop a system that enables trigger sign detection for device activation and sequential recognition of American Sign Language
Participants: Deaf and hard of hearing adults ()
Technology: Development/prototype
Tasks: Interface control/device interaction
Utilization: Independence aid
Barriers: Sign language
Findings: The trigger sign detection approach achieved a detection rate of 98.9% for American Sign Language users

Lea et al. (2022) [44]
Study aim: To develop a system for nonverbal, sound-based interactions that people with a wide range of speech disorders can use to interact with mobile technology
Participants: Adults with speech impairments ()
Technology: (i) Apple/Siri; (ii) Development/prototype
Tasks: Interface control/device interaction
Utilization: Independence aid
Barriers: (i) Lack of nonverbal control; (ii) Speech interpretation
Findings: The system achieved an 82% success rate in detecting nonverbal sounds for users with disability. It showed promise in enhancing communication for those with speech impairments and providing accessibility for individuals with limited mobility. Positive feedback was received from users in situations where they could not otherwise interact with technology

Ryu et al. (2022) [45]
Study aim: To improve smart speakers’ recognition of augmentative and alternative communication output by developing a text-to-speech production format
Participants: Augmentative and alternative communication device users
Technology: (i) Development/prototype; (ii) Google; (iii) Samsung/Bixby
Tasks: Interface control/device interaction
Utilization: Independence aid
Barriers: (i) Lack of nonverbal control; (ii) Response time interval
Findings: Augmentative and alternative interaction symbols and boards, along with an optimized text-to-speech production format, improve speech recognition and enable independent device control and Internet interaction

Velasco-Álvarez et al. (2022) [46, 47]
Study aim: To control a home automation system through a brain-computer interface that allows the construction of voice commands for people with motor impairments
Participants: Healthy adults ()
Technology: (i) Development/prototype; (ii) Google
Tasks: (i) Communication; (ii) Environmental controls; (iii) Information retrieval; (iv) Interface control/device interaction; (v) Media management; (vi) Scheduling
Utilization: Independence aid and participation aid
Barriers: (i) Brain control interfaces; (ii) Cognitive load during use; (iii) Lack of nonverbal control
Findings: The brain-computer interface enabled users to access diverse media and environmental controls, along with a texting app. Integrating voice commands with the brain-computer interface enhanced device control

Oumard et al. (2022) [48]
Study aim: To understand whether, how, and why voice assistants are suitable for blind and visually impaired people and what has to be done for even better acceptance and benefit
Participants: Adults with visual impairments ()
Technology: (i) Amazon/Alexa; (ii) Apple/Siri; (iii) Google
Tasks: (i) Communication; (ii) Dictation; (iii) Environmental controls; (iv) Information retrieval; (v) Interface control/device interaction; (vi) Math; (vii) Media management; (viii) Navigation; (ix) Reminders; (x) Scheduling
Utilization: Independence aid and participation aid
Barriers: (i) Internet connection; (ii) Limited functionality; (iii) Privacy; (iv) Removal of past assistive features; (v) Response time interval; (vi) Speech interpretation
Findings: Voice assistants benefit visually impaired users with tasks like entertainment, Internet access, time, calendars, and notes. They appreciate smart home integration, existing functionalities, and voice commands. Privacy and data security concerns persist, but 86% find the systems helpful, indicating an overall positive outlook with room for improvement

Sciarretta et al. (2021) [1]
Study aim: To provide an overview of the usability and barriers of smart speakers for people with disabilities
Participants: People with disabilities
Technology: (i) Amazon/Alexa; (ii) Apple/Siri; (iii) Generic/nonspecific; (iv) Google; (v) Microsoft/Cortana; (vi) Samsung/Bixby
Tasks: (i) Cognitive aid; (ii) Communication; (iii) Dictation; (iv) Environmental controls; (v) Information retrieval; (vi) Interface control/device interaction; (vii) Reminders; (viii) Scheduling; (ix) Spelling
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use; (iii) Lack of nonverbal control; (iv) Limited functionality; (v) Response time interval; (vi) Speech interpretation; (vii) Unique and quickly changing user needs; (viii) Upkeep and maintenance
Findings: Voice assistants show promise as inclusive assistive technologies for enhancing quality of life. Acceptance by diverse user groups, nonstigmatizing nature, and accessibility are key strengths. Addressing challenges and adopting an inclusive approach is vital for maximizing their potential in different situations and contexts

Lewis et al. (2021) [49]
Study aim: To determine whether Canadians with cognitive disabilities could benefit from voice-activated intelligent personal assistants to access digital services and increase their participation in the digital economy
Participants: Adults with cognitive impairments ()
Technology: Amazon/Alexa
Tasks: (i) Environmental controls; (ii) Health management; (iii) Information retrieval; (iv) Instruction; (v) Interface control/device interaction; (vi) Learning; (vii) Math; (viii) Media management; (ix) Navigation; (x) Reminders; (xi) Safety; (xii) Scheduling
Utilization: Independence aid
Barriers: (i) Cognitive load during use; (ii) Compatibility issues; (iii) Limited functionality; (iv) Speech interpretation; (v) Response time interval; (vi) Unique and quickly changing user needs
Findings: Users’ opinions on voice assistants were positive, citing increased agency and access to well-being features. The Echo Dot proved cost-effective for enhancing the quality of life for vulnerable groups. However, certain complex Alexa Skills, like transportation planning, caused hesitation and confusion due to extensive interactive dialogues

Hugo et al. (2021) [50]
Study aim: To evaluate the perception of smart home device benefits in 6 different consumer groups, including people with disabilities
Participants: Adults with disabilities ()
Technology: (i) Amazon/Alexa; (ii) Google
Tasks: (i) Environmental controls; (ii) Interface control/device interaction; (iii) Safety
Utilization: Independence aid
Barriers: (i) Cognitive load during setup; (ii) Cost; (iii) Lack of skills/apps; (iv) Privacy
Findings: Persons with disabilities prioritize safety, security, and well-being. Participants did not show a knowledge increase in voice-controlled smart home tech, indicating the need for improved demonstrations. Well-being was their primary benefit, followed by safety and security. Key barriers were cost and consumer privacy concerns

Uyanik et al. (2022) [51]
Study aim: To develop a smart home and wheelchair application controlled by a steady-state visual evoked potential-based brain-computer interface system
Participants: Healthy adults ()
Technology: Development/prototype
Tasks: (i) Environmental controls; (ii) Interface control/device interaction; (iii) Navigation
Utilization: Independence aid
Barriers: (i) Brain control interfaces; (ii) Cost; (iii) Lack of nonverbal control
Findings: Subjects achieved nearly perfect accuracy in device interaction and wheelchair navigation tasks (>90%). The system controlled a virtual power wheelchair and various household devices with ease. Advantages included low cost, wireless and portable design, user-friendliness, and high control accuracy without extensive training. Subjects demonstrated increased confidence and competence over time, finding the system easily adaptable and learnable

Corbett et al. (2021) [52]
Study aim: To investigate the use and usefulness of virtual home assistants among older adults and their support persons
Participants: Older adults with chronic health conditions and their caregivers ( dyads)
Technology: Amazon/Alexa
Tasks: (i) Communication; (ii) Companionship/connection; (iii) Health management; (iv) Information retrieval; (v) Interface control/device interaction; (vi) Media management; (vii) Reminders; (viii) Safety; (ix) Scheduling
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use; (iii) Compatibility issues; (iv) Lack of educated providers; (v) Limited functionality; (vi) Unique and quickly changing user needs; (vii) Upkeep and maintenance
Findings: Participants found virtual home assistants beneficial for aging in place but faced challenges in learning and adapting to the technology. They utilized these devices for various activities, valuing features like timers and reminders. These devices were perceived as enhancing security and providing convenience, but more training and periodic assessments were desired to maximize their potential

Berrett et al. (2022) [53]
Study aim: To investigate the benefits of personalized augmented assistive technology to support people living with dementia in their daily activities in the home
Participants: Older adults with early-to-mild stage dementia ()
Technology: (i) Amazon/Alexa; (ii) Google
Tasks: (i) Companionship/connection; (ii) Environmental controls; (iii) Health management; (iv) Information retrieval; (v) Interface control/device interaction; (vi) Media management; (vii) Reminders; (viii) Scheduling
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use; (iii) Privacy; (iv) Speech interpretation; (v) Unique and quickly changing user needs; (vi) Upkeep and maintenance
Findings: Participants welcomed the technology, finding it helpful, but it is currently unsuitable for long-term use in dementia. Inclusive development approaches can inform future advancements. Successful deployment at home requires addressing the needs of both the person with dementia and their support person

Islam et al. (2022) [54]
Study aim: To investigate healthcare providers’ perspectives on using smart home systems for self-management and care in people with heart failure
Participants: Healthcare providers ()
Technology: Generic/nonspecific
Tasks: (i) Communication; (ii) Health management; (iii) Information retrieval; (iv) Reminders; (v) Safety
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use; (iii) Cost
Findings: Participants saw potential in smart home systems for improving self-management of heart failure but had reservations. Benefits included remote monitoring, services, and independent living for patients

Caselgrandi et al. (2021) [55]
Study aim: To assess satisfaction and engagement with a voice assistant among older people with postacute COVID-19 syndrome
Participants: Older adults with postacute COVID-19 syndrome ()
Technology: Google
Tasks: (i) Companionship/connection; (ii) Health management; (iii) Information retrieval; (iv) Media management
Utilization: Independence aid and participation aid
Barriers: N/R
Findings: The voice assistant tool was rated useful by 96% of participants, with overall improvement at 6-month follow-up. It empowered participants and enhanced their lifestyle, notably physical activity

Barbaric et al. (2022) [56]
Study aim: To evaluate how participants with heart failure interacted with a voice app version of a digital therapeutic
Participants: Adults with heart failure ()
Technology: Amazon/Alexa
Tasks: (i) Companionship/connection; (ii) Health management; (iii) Information retrieval; (iv) Media management; (v) Reminders
Utilization: Independence aid and participation aid
Barriers: (i) Cognitive load during use; (ii) Speech interpretation; (iii) Upkeep and maintenance
Findings: Participants’ engagement declined due to app unreliability and adaptation challenges. Older users showed higher engagement, while middle-aged participants found the app more acceptable. Integration into daily routines and convenience for heart failure measurements were observed, but technical limitations and usability issues existed

Kocaballi et al. (2022) [57]
Study aim: To present the challenges of designing and evaluating conversational agents derived from recent healthcare projects conducted in the last 2 years
Participants: Older adults with chronic health conditions
Technology: Generic/nonspecific
Tasks: (i) Health management; (ii) Interface control/device interaction
Utilization: Independence aid
Barriers: (i) Cognitive load during use; (ii) Cost; (iii) Internet connection; (iv) Privacy; (v) Speech interpretation
Findings: Common challenges for conversational agents include domain integration, conversational competence, user-system interaction, and evaluation. In healthcare, additional issues arise, such as empathy, safety, recruitment of vulnerable populations, and testing in real-world settings

Rohlfing et al. (2021) [58]
Study aim: To evaluate the ability of common voice recognition systems to transcribe dysphonic voices
Participants: Adults with speech impairments () and healthy adults ()
Technology: (i) Amazon/Alexa; (ii) Apple/Siri; (iii) Google
Tasks: Interface control/device interaction
Utilization: Independence aid
Barriers: Speech interpretation
Findings: Voice disorders are linked to lower word recognition. Surprisingly, a faster speech rate improved word recognition. Dysphonia severity consistently affected word recognition, while voice technology accuracy depended on dysphonia perception, not articulation disorders

Stellin et al. (2022) [59]
Study aim: To develop a hand gesture dataset to recognize sign language gestures and convert them into verbal language
Participants: Deaf and hard of hearing adults
Technology: Development/prototype
Tasks: (i) Dictation; (ii) Interface control/device interaction
Utilization: Independence aid
Barriers: (i) Lack of nonverbal control; (ii) Limited functionality; (iii) Sign language
Findings: The system achieves 99.13% accuracy in recognizing and translating sign language gestures into speech, promising significant assistance to speech-impaired individuals, enhancing communication, and providing them with a virtual voice

Velasco-Álvarez et al. (2022) [46, 47]
Study aim: To create a system to facilitate communication between a brain-computer interface and devices in the environment using voice commands
Participants: Adults with amyotrophic lateral sclerosis () and healthy adults ()
Technology: (i) Development/prototype; (ii) Google
Tasks: (i) Communication; (ii) Environmental controls; (iii) Interface control/device interaction; (iv) Media management
Utilization: Independence aid and participation aid
Barriers: (i) Brain control interfaces; (ii) Cognitive load during use; (iii) Compatibility issues; (iv) Response time interval
Findings: Amyotrophic lateral sclerosis participants found the brain-computer interface system easy to use and rated mental demand and effort between 50 and 60 (out of 100). The system’s uniqueness lies in its user-friendliness and customizable menus for controlling devices, making it valuable to the field

Maleki et al. (2021) [60]
Study aim: To review all the papers on brain-computer interface-based smart home systems published in the last 6 years
Participants: Adults with motor and speech impairments
Technology: (i) Development/prototype; (ii) Generic/nonspecific
Tasks: (i) Communication; (ii) Environmental controls; (iii) Interface control/device interaction; (iv) Media management; (v) Navigation
Utilization: Independence aid and participation aid
Barriers: (i) Brain control interfaces; (ii) Cognitive load during use; (iii) Cost; (iv) Lack of nonverbal control; (v) Limited functionality; (vi) Upkeep and maintenance
Findings: Brain-computer interface systems show promising accuracy and are practical for interacting with smart home devices but lack user-friendliness. User comfort was identified as a major issue

Akram et al. (2022) [61]
Study aim: To present a brain-computer interface paradigm for controlling home appliances
Participants: Healthy adults ()
Technology: Development/prototype
Tasks: (i) Communication; (ii) Environmental controls; (iii) Interface control/device interaction; (iv) Media management
Utilization: Independence aid and participation aid
Barriers: Brain control interfaces
Findings: The paradigm provided satisfactory control of appliances and phone calls, offering enhanced freedom, more devices to control, and the ability to dial phone numbers, improving quality of life, especially for individuals with disabilities

Velasco-Álvarez et al. (2021) [62]
Study aim: To create a communication bridge between a brain-computer interface speller platform and the messaging services of WhatsApp, Telegram, e-mail, and SMS through a virtual assistant running on a smartphone
Participants: Healthy adults ()
Technology: Google
Tasks: (i) Communication; (ii) Interface control/device interaction
Utilization: Independence aid and participation aid
Barriers: (i) Brain control interfaces; (ii) Cognitive load during use; (iii) Lack of nonverbal control; (iv) Response time interval; (v) Upkeep and maintenance
Findings: The brain-computer interface-based spelling system showed promising results with healthy subjects, achieving 86.14% accuracy and positive usability feedback. However, for subjects with motor impairment, accuracy is generally lower (67-68%) due to difficulty in gaze control, requiring improvements for real-world implementation

Wang et al. (2021) [63]
Study aim: To develop a prototype smart home gesture-based control facility that can control various home appliances
Participants: Adults with speech impairments
Technology: Development/prototype
Tasks: (i) Environmental controls; (ii) Interface control/device interaction
Utilization: Independence aid
Barriers: Lack of nonverbal control
Findings: The gesture-based smart home control system enables easy appliance operation with hand movements, aiding people with disabilities, eliminating the need for multiple controllers, integrating well with smart speakers, and promoting user-friendliness

Adams et al. (2021) [64]
Study aim: To develop an automatic speech recognition model for use between the smart assistant and the user with improved accuracy for atypical speech
Participants: Adults with speech impairments
Technology: (i) Development/prototype; (ii) Google
Tasks: Interface control/device interaction
Utilization: Independence aid
Barriers: Speech interpretation
Findings: Automatic speech recognition models show promise in learning atypical speech patterns, highlighting the significance of accessible technology for those with speech impairments

Vieira et al. (2022) [14]
Study aim: To explore how interactions between people with disability and voice assistant technology impact individual and collective well-being
Participants: Adults with visual and motor impairments () and their family members ()
Technology: Google
Tasks: (i) Environmental controls; (ii) Information retrieval; (iii) Interface control/device interaction; (iv) Media management; (v) Reminders; (vi) Scheduling
Utilization: Independence aid
Barriers: (i) Cognitive load during setup; (ii) Cognitive load during use; (iii) Internet connection; (iv) Limited functionality; (v) Privacy; (vi) Sign language; (vii) Speech interpretation; (viii) Upkeep and maintenance
Findings: Voice assistant technology enhances independence and daily activities and reduces disparities for people with disabilities. Challenges include integration with smart home tech, performance improvements, and disability-focused features. Socioeconomic barriers should be considered for wider access. Positive impact on well-being and quality of life

Glasser et al. (2021) [65]
Study aim: To examine deaf and hard of hearing users’ experience and attitude towards personal assistants, as well as potential interactions with such devices
Participants: Deaf and hard of hearing adults ()
Technology: Generic/nonspecific
Tasks: (i) Communication; (ii) Environmental controls; (iii) Information retrieval; (iv) Interface control/device interaction; (v) Media management; (vi) Notes; (vii) Reminders
Utilization: Independence aid and participation aid
Barriers: (i) Lack of nonverbal control; (ii) Privacy; (iii) Sign language; (iv) Speech interpretation
Findings: Over 60% of participants expressed interest in personal assistant devices that understand sign language and show sign language videos/animations on screen. They preferred text output but were interested in unique applications like receiving alerts about sounds and requesting sign language interpretation

Rajasekhar et al. (2022) [66]
Study aim: To explore a device that interprets and recognizes American Sign Language to control smart home components
Participants: Adults who use sign language
Technology: Development/prototype
Tasks: (i) Environmental controls; (ii) Interface control/device interaction
Utilization: Independence aid
Barriers: Sign language
Findings: The device is capable of recognizing 4 signs (lights on, play, stop, and idle) with 89.4% accuracy. It can also initiate corresponding actuation commands with the same accuracy level
