A Multisensory Literacy Approach to Biomedical Healthcare Technologies: Aural, Tactile, and Visual Layered Health Literacies

by Kristin Marie Bivens, Lora Arduser, Candice A. Welhausen, & Michael J. Faris

Introduction

Exploring Interactions/Connections and Directions among Layered Literacies: A Crosswalk Conversation

Healthcare providers have long used technologies that employ multiple modalities—including sound, sight, and touch—to construct knowledge about the human body and interpret health-related information. Medical sonography (ultrasound), which relies on sound and sight, for instance, has been used for decades. And the reflex hammer, the tool used to perform the well-known knee-jerk test, assesses patients' reflexes through a tactile connection between instrument and flesh.

For those with no (or limited) specialized knowledge of medicine, many of these commonly used healthcare technologies may seem to work as if by magic because they rely on processes that are invisible to the human eye. However, understanding and interpreting the information collected by these tools requires extensive training and professional expertise. Indeed, knowing how to read a sonogram, or how high a patient's lower leg should "jerk" upwards in response to the external stimulus of a reflex hammer, requires engaging in aural, visual, and/or tactile literacy practices, often in overlapping ways.

In this webtext, we borrow Kelli Cargile Cook’s (2002) concept of "layered literacies" in technical communication to argue that health literacy is an embodied, multisensory experience that is invariably mediated by healthcare technologies (see our literature review). We illustrate this concept through three case studies that describe scenarios in which non-experts and lay experts engage in non-discursive literacy practices: parents caring for an infant in a neonatal intensive care unit (NICU), people with type 1 diabetes (T1D) self-managing their treatment, and public audiences reporting symptoms to a crowdsourced flu-tracking program. We propose that the literacy practices we identify in each scenario—aural, tactile, and visual, respectively—are fundamentally shaped by the use of specific healthcare technologies unique to that scenario: physiological monitors, insulin pumps, and crowdsourced flu maps. More specifically, we argue that these technologies enable, constrain, and integrate multisensory literacy practices in ways that complicate the concept of health literacy. Further, we propose that understanding health literacy practices entails attending to the interconnectedness of our senses and sensibilities, which pushes back against the mind/body divide often invoked in Western cultures (see Segal, 2005). A better understanding of the ways that the public, which includes lay experts and non-experts, develops the "set of skills needed to function in the healthcare environment" (DeWalt, Berkman, Sheridan, Lohr, & Pignone, 2004, p. 1228) can ultimately lend insight into strategies for improving health-related communication.

Three Multisensory Healthcare Technologies

In our first case study, Kristin Marie Bivens uses an echo methodological framework to examine the sounds of healthcare technology in two NICUs: one in the United States and one in Denmark. Her work explores physiological monitors that aurally track and biomedically surveil a baby's vital signs and focuses on two particular biomedicalized events: a baby's sneeze and a father's reaction to a monitor's noise. Based on these examples, she suggests that non-experts need an aural layer of health literacy to navigate many healthcare spaces more effectively.

In our second case study, Lora Arduser discusses the practices of people with T1D who use medical devices and other technologies (e.g., insulin pumps, continuous glucose monitors, smartphones, smartwatches, and the internet), not only adjusting but also hacking these technologies as part of their daily work as expert patients. Her case study shows that by making certain hacks to their technologies (changes that medical providers or device companies explicitly advise against), people with T1D practice a tactile form of literacy.

Finally, in our third case study, Candice A. Welhausen analyzes a crowdsourced flu-tracking program, Flu Near You (FNY), which allows non-expert public audiences to report flu symptoms and visualize flu activity through the program's mapping features. Theorizing the visual conventions used to communicate information about the spread of the flu, she argues that these conventions both enable and constrain participants' visual literacy practices.

Integrating Layered Health Literacies

In Western cultures, healthcare technologies use sound, touch, and sight to track and surveil the body in order to better explain and understand its processes. Physiological monitors represent vital signs aurally (and visually on a screen); blood glucose monitors read glucose levels through a tactile interface (and alert aurally when attention is needed); flu symptoms are visualized on digital maps (and entered tactilely through touchscreen interfaces like smartphones). Users of these healthcare technologies must then be able to make sense of the information these technologies collect and the messages they produce through a multisensory approach.

As surveillance tools, these healthcare technologies might be viewed as replacing non-experts' and lay experts' agency. However, we propose that they "drive" decision-making and agency by extending health literacy practices to their embodied (aural, tactile, visual) dimensions. Further, by engaging in these embodied, multisensory literacies, users are not simply acquiring and understanding information. Rather, through these literacy practices, they make particular kinds of health-related decisions and take particular actions—such as deciding whether to solicit the help of a NICU nurse (Bivens), administer insulin (Arduser), or get the flu shot (Welhausen). Thus, health literacy does not refer to a static state involving simple comprehension of health-related messages in order to comply with behaviors suggested by healthcare providers; instead, it is a dynamic, engaged, participatory set of practices that enables and facilitates particular kinds of actions. This understanding of literacy breaks down the purported binary between expert and non-expert and exposes the levels of expertise needed by healthcare technology users.

Our case studies also account for various levels of involvement by users in their healthcare decisions. The site of healthcare decision-making, for example, plays a significant role in shaping what non-experts and lay experts do and what they don't do. In a hospital intensive care unit, for instance, parents are much less likely to take action—like turning off a beeping monitor that is not communicating important information—because the healthcare context dictates that only experts (the nurses and physicians in the unit) have the authority to make this decision and to act. Conversely, a lay expert managing her diabetes at home, as Arduser demonstrates, might easily decide to adapt the alerts of her blood glucose monitor to suit her health needs.

FNY users, on the other hand, represent a very different kind of non-expert audience. Most non-experts take the healthcare contexts that Bivens and Arduser describe seriously. However, the opposite is often true of the flu, as Welhausen points out at the beginning of her section: because most healthy adults recover from the flu, contracting the disease is often not seen as particularly dangerous. Further, most people will not be persuaded to participate in a program like FNY (regardless of the potential benefits to public health knowledge about the spread of seasonal flu) unless there are particularly compelling reasons to do so (e.g., more people are becoming ill during that year's flu season or a particularly virulent strain of the virus emerges). FNY maps show that the risk of contracting a communicable disease like seasonal flu is often strongly associated with geographic space: The more dots on the map, the greater the risk to the individual and the more pressing the need to get a flu shot.

Through each case study's examination of a single sense, we complicate Western medicine's mind/body binary in approaches to health and disease (see Segal, 2005) and demonstrate an intermingling or "layering" of health literacies. In her discussion of aural literacies, for example, Bivens examines an incident in which the father of a NICU infant, after hearing an IV pump alarm, fell down in the unit in his rush to find someone to respond to it. The incident suggests that, as a non-expert user, he was unaware the alarm was just noise rather than a purposeful sound and that, conceivably, he could have silenced it himself with no adverse consequences.

In waiting for a nurse to notice and attend to the alarm rather than acting himself, the father exhibited behavior that contrasts with the agency of Arduser's bio-technology hackers. For example, her discussion of Kerri Sparling’s Dexcom in a Glass hack demonstrates the ways tactile practices and aural cues are linked. For people with T1D, aural alarms go off on their blood glucose monitoring devices if their blood sugar is too high or too low. But these alarms can be ignored because these lay experts know what their own bodies can tolerate and when a specific blood sugar number is actually dangerous for them. These body-hacking practices are made possible, to a large extent, not only because users have the requisite knowledge in these contexts but also because they have the initiative and the authority to modify and adapt this technology to their uses.

The flu trackers Welhausen discusses also engage in tactile literacies when they use the touchscreen of their mobile devices to zoom in or out of the user-contributed mapping feature on FNY's smartphone app, rearranging this visual information to meet their specific interests. Interacting with the program through touch is then also layered with the visual literacies needed to interpret, and potentially make decisions about, the information conveyed in FNY's maps, such as whether to be concerned about the spread of the flu at a particular point in time. As FNY continues to evolve, particularly the capabilities of the program's mobile app, users might also be able to engage in hacking practices similar to the ones Arduser describes, adapting visual information about flu activity to better meet their individual risk assessment needs.

Further, the push notification alert that FNY users receive every Monday on their smartphones, when the system prompts them to submit a weekly flu report, is interpreted much as Arduser's hackers interpret their devices' alarms, but very differently from how the parent Bivens discusses interpreted the IV pump alarm: FNY users know the sound is simply a reminder to submit a report. Consequently, they can choose to ignore the alert or disable it, or they can silence it altogether while still allowing the program to send them a visual reminder via text message.

Through our case studies, we show how health literacies are integrated. And in the conclusion of this webtext, we propose a heuristic that medical rhetoricians and technical communicators can draw on when considering the ways that healthcare technologies shape and integrate aural, tactile, and visual literacies in differing healthcare communication scenarios.