Extension of CES 2020 AVTR: VR in Autonomous Vehicles Communicating with Cognitively Impaired Patients

Yulu (Laura) Lin
4 min read · Jan 13, 2020

Mercedes-Benz AVTR at CES 2020

Mercedes-Benz’s new Vision AVTR concept car debuted at CES 2020 this year. The stated goal of the car is “interacting with the occupants”. Equipped with an advanced AI, the VISION AVTR comes to life by sensing the heartbeat and breathing patterns of the driver and passengers. As with many other electric vehicles, “sustainability” and “battery” are also selling points. The battery is fully compostable and recyclable, and its capacity provides a travel range of 435 miles (700 kilometers) per charge, which is also made possible by a high level of energy recuperation while driving and braking.

There is no denying that the vehicle focuses on the bond between human and machine, and that it could be a milestone for EV (electric vehicle) range. However, the German automaker giant only presented its in-cabin virtual reality experiences; trust and interaction outside the vehicle were not introduced, such as how the vehicle could scan a person’s impairments, or how a person with a disability could interact with the vehicle.

To explore the concept car further and imagine what it could become, I put forward the following thoughts.

Considering Cognition-Based Impairments

Even the same cognition-based impairment, such as an auditory impairment, can be classified by different criteria; for instance, two patients with the same linguistic impairment may not demonstrate the same communicative competency. Take aphasia as an example: the Agency for Health Care Policy and Research Post-Stroke Rehabilitation Clinical Practice Guidelines define aphasia as “the loss of ability to communicate orally, through signs, or in writing, or the inability to understand such communications; the loss of language usage ability.”

The reason for distinguishing between impairments is that the language and cognitive-communication problems associated with non–language-dominant hemisphere damage, dementia, and traumatic brain injury differ from the problems associated with focal brain damage to the language-dominant cerebral hemisphere. Besides, auditory impairments are classified based on differences in semantics, phonology, syntax, and pragmatics.

Aphasia occurs mostly in adults who have suffered a cerebrovascular accident or a traumatic brain injury. This means that the only way to prevent social isolation for these patients is to assess functional communication directly, and then to work on skills that will allow the individual greater opportunity to integrate into social situations of their choice.

VR for Speech-Language Patients

In the past 20 years, researchers and clinicians have become increasingly interested in how environmental factors interact with an individual’s level of language impairment.

VR systems can be used not only as testing and validation environments but also as a training environment for people who are coming in contact with this kind of interface for the first time. The technology permits behavioral tracking and performance recording so that the data collected can be quantified, analyzed, and graphically represented in an easily interpretable format (Rizzo et al., 2004). The key to the effectiveness of the technology used for speech-language patients is related to the degree of presence experienced by the user. Presence implies “involvement, realism and naturalness” with the artificial environment (Freeman & Avons, 2000). If VR were to be used more extensively in speech–language pathology, there would need to be another level of presence, one that gives a “sense of being with another” (Biocca et al., 2003, p. 456) or social presence.
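To make “behavioral tracking and performance recording” concrete, here is a minimal sketch, in Python, of how a VR session might log a patient’s responses and summarize them for a clinician. The task names, event fields, and metrics are hypothetical illustrations, not part of any specific VR platform.

```python
# Illustrative sketch: log timestamped VR task events, then quantify them
# per task so a clinician can read the session at a glance.
from dataclasses import dataclass
from statistics import mean

@dataclass
class SessionEvent:
    timestamp: float      # seconds since session start
    task: str             # e.g. "name_object", "follow_instruction" (hypothetical)
    correct: bool         # did the participant respond correctly?
    response_time: float  # seconds from prompt to response

def summarize(events: list[SessionEvent]) -> dict:
    """Group events by task and compute simple accuracy/latency metrics."""
    by_task: dict[str, list[SessionEvent]] = {}
    for e in events:
        by_task.setdefault(e.task, []).append(e)
    return {
        task: {
            "trials": len(evts),
            "accuracy": sum(e.correct for e in evts) / len(evts),
            "mean_response_time_s": round(mean(e.response_time for e in evts), 2),
        }
        for task, evts in by_task.items()
    }

if __name__ == "__main__":
    session = [
        SessionEvent(3.1, "name_object", True, 2.4),
        SessionEvent(9.8, "name_object", False, 5.1),
        SessionEvent(15.0, "follow_instruction", True, 3.0),
    ]
    print(summarize(session))
```

A summary like this is what makes the collected data “easily interpretable”: per-task accuracy and response times can be graphed across sessions to track progress.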

This technological interface may provide what Wallesch and Johannsen-Horbach (2004) describe as an interactive communicative environment, where VR substitutes information received from a computer for information received normally through our senses in natural environments (Yoh, 2001).

One of the user tests includes the notion that the user attributes some level of intelligence to the virtual human, to the extent that an intention can be deduced from the virtual human’s behavior. At a yet higher level, this behavior would include nonverbal communication cues such as eye contact or turn taking. These would suggest not only that the virtual human is there, but also that the character’s behavior carries some communicative intention, and hence some level of “cognition.”

Through sensory perceptions, the user would infer a communicative intention on the part of virtual others. At this point, social presence could be construed as having been achieved (Linda J. Garcia et al., 2007).
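As one way such nonverbal cues might be operationalized, the sketch below checks for eye contact from gaze geometry and applies a simple pause-based turn-taking rule. The thresholds and function names are assumptions made for illustration, not a published implementation.

```python
# Hypothetical sketch: two nonverbal cues a virtual human could act on.
import math

def eye_contact(agent_gaze: tuple[float, float, float],
                to_user: tuple[float, float, float],
                max_angle_deg: float = 10.0) -> bool:
    """True if the agent's gaze direction points at the user within a tolerance."""
    dot = sum(a * b for a, b in zip(agent_gaze, to_user))
    norm = math.dist((0, 0, 0), agent_gaze) * math.dist((0, 0, 0), to_user)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg

def user_turn_finished(last_speech_time: float, now: float,
                       silence_threshold_s: float = 1.2) -> bool:
    """Simple turn-taking rule: the agent may speak after a long enough pause."""
    return (now - last_speech_time) >= silence_threshold_s

# Example: the agent looks roughly toward the user, and the user has paused.
print(eye_contact((0.0, 0.0, 1.0), (0.05, 0.0, 1.0)))       # True
print(user_turn_finished(last_speech_time=10.0, now=11.5))   # True
```

Even rules this simple can make a virtual character appear to “notice” the user and wait its turn, which is exactly the kind of behavior from which users infer communicative intention.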

VR Communication for Autonomous Vehicles

Now that we have briefly gone through how VR can be used to assess aphasia and related impairments, we can begin to talk about VR applications in autonomous vehicles (AVs).

A VR system can feed AV sensor data into many entertainment applications inside the vehicle, such as games, leisure-based backdrops, and talking cartoon characters. Japanese carmaker Nissan showcased a set of goggles for drivers and passengers that could deliver real-time information and project a talking cartoon character that communicates with the wearer.
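To show the kind of plumbing this implies, here is a hypothetical sketch of how vehicle telemetry could drive short utterances from an in-cabin VR character. The telemetry fields and the announce callback are made up for illustration; they are not Nissan’s or Mercedes-Benz’s APIs.

```python
# Illustrative sketch: translate raw AV sensor readings into short
# spoken/visual messages for a talking VR character.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Telemetry:
    speed_kmh: float
    battery_pct: float
    pedestrian_nearby: bool

def vr_character_update(t: Telemetry, announce: Callable[[str], None]) -> None:
    """Turn sensor readings into messages the character can say or display."""
    if t.pedestrian_nearby:
        announce("Someone is near the car; I'm slowing down for them.")
    if t.battery_pct < 20:
        announce(f"Battery at {t.battery_pct:.0f}% - shall I find a charger?")
    announce(f"We're cruising at {t.speed_kmh:.0f} km/h.")

# Example: print the messages instead of rendering them in a headset.
vr_character_update(Telemetry(speed_kmh=48.0, battery_pct=15.0,
                              pedestrian_nearby=True), announce=print)
```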

The key component of the communication between VR devices and patients rests on Social Presence Theory. According to Patrick R. Lowenthal, definitions of social presence lie on a continuum. On one side of the continuum are perceptions of a person’s being or existence, which focus on whether people project themselves into the environment and whether others can recognize them. On the other side of the continuum, the focus is on whether there is positive interaction or an interpersonal or emotional connection between the communicators.

Robotics has leveraged VR for many decades; in fact, the partnership between robotics and VR can be found in the literature since at least the 1980s (Alexandre M. Nascimento et al., 2019). AVs can be considered autonomous robots in a vehicle form factor that navigate the infrastructure built for regular vehicles (with human drivers).

Conclusion

VR for healthcare has been explored for a long time and has produced many achievements. If we see VR as a virtual assistant that engages people as they approach the car, we can consider using it for cognitively impaired patients when designing autonomous robots (vehicles).


Yulu (Laura) Lin

Cognitive Neuroscience / City Science / Human-Machine Interaction. Trained at Stanford, Harvard University, and UMN-Twin Cities.