Keynote speaker Rachel (Rae) Walker, Ph.D., RN, FAAN, opened the 2024 Zeigler Forum with a description of one of their early experiences with Artificial Intelligence (A.I.), a study that used eye tracking technology and machine learning to help cancer patients better manage chronic fatigue syndrome, a complicated disorder with no known cause and no single test to confirm diagnosis. The study helped ease physical suffering for participants, garnered positive attention for the computer engineering team that built the technology, and thrust Walker into unfamiliar territory as a consultant addressing health issues involving technologies like A.I. and machine learning.

But, said Walker, "The study didn't move us any closer to justice for patients experiencing chronic fatigue."

Walker (they/them), whose scholarship focuses on equity-centered and community-directed health innovation and digital defense against technologies that cause harm, is the only nurse Invention Ambassador for the American Association for the Advancement of Science. In 2023, they co-founded Health Tech for the People, a multidisciplinary research initiative focused on tech ethics and accountable design.

Explaining why mitigating chronic fatigue was a challenge that A.I. was unequipped to meet, Walker pointed to medical gaslighting, a behavior in which a physician or other medical professional dismisses or downplays a patient's physical symptoms or attributes them to something else, such as a psychological condition.

"Fatigue is one of those invisible symptoms that can be incredibly disabling," said Walker. "It's something we're seeing now with syndromes like long COVID. And depending on who you are in the world and where you sit, this can determine whether anyone takes your symptoms seriously."

Walker's point was that for A.I. to help patients with chronic fatigue, the technology would also have to convince providers that patients were actually experiencing symptoms of the condition.

"We would have had to have designed a means by which to address the underlying structures that lead to medical gaslighting and chronic undertreatment of certain symptoms: things like racism, sexism, ableism and the structural poverty required by capitalism," said Walker.

Walker focused the rest of the talk on what they've learned from their experiences with A.I. and underscored the importance of questioning its potential for both good and harm, particularly in the context of how care work is, and historically has been, valued and undervalued.

"These are questions about A.I., but they're not just about A.I. They're about the forces that shape the society and systems in which we live, the ways in which we recognize and give value to resources and labor in our society and health systems, and who decides," said Walker. "The 'who' is critical here: Who is defining, who is assigning value, who makes the decisions about what is health, and what is care, and what should that look like going forward?"

Walker emphasized that A.I. technologies like machine learning and generative A.I.'s predictive text draw upon past data to create statistically representative patterns, effectively replicating the past in the present.

"At a time when A.I.-animated technologies are in the headlines and on everybody's lips and quite literally in our bedrooms and our pockets in the form of smart speakers and smartphones and the internet of things, we should make sure aspirations for health justice and our commitments to each other and all those we accompany in care stay front and center," said Walker.

How technology impacts clinical care

"Technology isn't flawless," said Clinical Assistant Nursing Professor Brandon Brown, M.S.N., RN, reflecting on Walker's keynote and its value for students in the health professions. "It has bias, just like people do, and I think that's important to think about for nursing, especially as people are sorted into algorithms for care. Thinking about who gets sorted, where, and why is important for the nurse to keep in mind because that can cause harm."

Brown also noted the limitations of some widely used healthcare technologies: "There's an interesting case of the automatic soap dispensers in the bathrooms, how the technology was meant to be for white skin and wasn't working for people of color. And a study found that the pulse oximeter can be inaccurate for people who are not white."

"It's important for our students to think about how technology might impact the way in which they care for folks,鈥 said Brown. 

For Kristen Koeller, who earned her Doctor of Nursing Practice degree this spring, Walker's perspective prompted an additional question.

"The key that's missing with A.I. is that there's no person-to-person connection," said Koeller. "I question if A.I. will ever replace that. I don't think it will."

The 2024 Zeigler Research Forum featured 70 student poster presentations, a talk by Dr. Melissa Scheiber, recipient of the 2023 CNHS Research Incentive Award, and a series of data blitz presentations by students.

Walker is a co-author, and Brown is co-editor, of "Nursing a Radical Imagination: Moving from Theory and History to Action and Alternate Futures."