Code That Cares: Building Human-Centred Digital Tools for Safer Healthcare
Digital health technologies provide efficiency, scalability, and improved outcomes. But too often, they fail in real clinical settings because they fail to align with the human logic of healthcare, how clinicians think, how patients behave, and how complex systems actually work. Human-centred digital health design (HCD) is not only a trend; it is fundamental to building safe, effective, and trusted clinical tools [1].
Recent research underscores that tools co-designed with frontline clinicians, tested in real-world contexts, and iteratively refined demonstrate significantly better adoption and safety outcomes compared to top-down, technology-first deployments [2]. This is especially critical as digital transformation accelerates and new technologies, from AI-based triage to adaptive clinical decision support systems, rapidly enter care pathways.
Human-Centred Digital Health Design – Putting People Before Code
HCD in healthcare requires sustained engagement with users from early design through post-implementation. This means:
- Observation and immersion: Understanding day-to-day clinical workflows and hidden pain points.
- Iterative co-design: Engaging diverse clinicians and patients to shape tool interfaces, alert thresholds, and data visualizations.
- Rigorous usability testing: Evaluating cognitive load, workflow disruption, and safety risks before wide deployment.
A meta-analysis by Greenhalgh et al. (2017) found that projects incorporating HCD principles had significantly higher rates of successful scale-up and sustained use [3]. Moreover, standards such as ISO 9241 emphasize HCD’s role in minimizing error and improving safety-critical system design [4].
Standards-Based Interoperability – The Foundation for Safe Digital Care
A human-centred tool must fit seamlessly into existing systems, rather than forcing users to toggle between silos. Interoperability anchored in international standards like HL7 FHIR and ISO/IEEE 11073 enables consistent, secure data exchange between devices, EHRs, and decision-support platforms [5].
Research shows that interoperable systems improve patient safety by reducing duplicate data entry, transcription errors, and information fragmentation, especially during critical transitions of care [6]. The WHO and ITU have emphasized the importance of standards-based architectures in their global Digital Health Platform Handbook: Building a Digital Information Infrastructure (Infostructure) for Health, citing safety, scalability, and equity as key benefits [7].
Visual Analytics – Making Safety Visible to Clinicians
Clinicians operate under time pressure, cognitive load, and shifting priorities. Translating raw data into actionable visual insights is essential to supporting safe decision-making. Workforce climate dashboards, alert fatigue heatmaps, and real-time risk overlays can highlight safety gaps and guide timely interventions.
A 2023 study by Verma et al. demonstrated that providing clinicians with interactive visual dashboards reduced overlooked sepsis cases by 22%, while also improving staff situational awareness and coordination [8]. Importantly, these tools were co-designed with clinical teams to ensure relevance and interoperability.
Training Clinicians in Digital Methods – Building Digital Fluency for Safety
Technology cannot deliver safety alone; it needs skilled, confident users. Unfortunately, clinician training on digital tools is often minimal or absent. Studies consistently show that inadequate training leads to errors, delayed care, and staff frustration [9].
Best practices include:
- Embedding digital literacy modules into continuing professional education.
- Providing scenario-based simulation training on clinical decision support systems.
- Offering just-in-time learning aids and real-time help embedded in digital interfaces.
In 2022, a multicenter pilot found that tailored digital health training reduced medication ordering errors by 31% in the first six months after EHR deployment [10].
Adaptable Clinical Decision Support Systems (CDSS) – Flexibility Improves Safety
Rigid CDSS can worsen clinician burnout and lead to alert fatigue, undermining safety goals. Adaptive CDSS, which tailor suggestions based on patient context, clinical role, or local protocols, are emerging as a safer alternative.
Studies by Khairat et al. (2022) show that adaptable CDSS improve clinician trust, reduce dismissals of alerts, and enhance adherence to recommended interventions [11]. Transparency in how recommendations are generated is equally crucial: when clinicians understand why an alert appeared, they’re more likely to act on it [12].
AI for Triage – Enhancing Early Detection, Cautiously
AI-based triage tools are increasingly being deployed to route patients to the right level of care, flag high-risk presentations, or prioritize scarce resources. Early studies are promising: one trial involving over 1 million teleconsultations showed that AI-based triage matched human clinician performance in risk categorization, with slightly safer over-triage tendencies [13].
However, experts caution that clinical oversight and ongoing validation are critical. The WHO emphasizes that AI triage should augment—never replace—clinical judgment, especially given the high stakes of triage errors [7].
Integrating Elements Toward Safer Digital Health Ecosystems
Safe, effective digital health requires:
- Human-centred design ensures that tools align with clinical realities.
- Standards-based interoperability to ensure seamless data flow.
- Visual analytics that clarify risks and guide action.
- Clinician training to build digital fluency.
- Adaptable, transparent CDSS to foster trust and usability.
- Rigorous AI governance to prevent harm while enhancing capacity.
When combined, these elements form a robust foundation for human-centred digital transformation in healthcare, where safety, empathy, and effectiveness reinforce each other.
Conclusion – Building Digital Tools That Care
Healthcare is ultimately about people caring for people. No technology, no matter how advanced, can succeed without deep alignment with clinical workflows, operational logic, and human trust.
As digital health accelerates globally, we must remember: the code is only as good as the care it enables.
References
- Borycki EM, Kushniruk AW. Human-centered design in healthcare informatics: improving systems, interfaces, and patient safety. Stud Health Technol Inform. 2017;234:46–52.
- Carayon P, et al. Incorporating human factors into health IT design: approaches and challenges. BMJ Qual Saf. 2020;29(4):308–10.
- Greenhalgh T, et al. The NASSS framework for evaluating the nonadoption, abandonment, scale-up, spread, and sustainability of health and care technologies. J Med Internet Res. 2017;19(11):e367.
- ISO 9241-210:2019. Ergonomics of human-system interaction—Part 210: Human-centred design for interactive systems. International Organization for Standardization; 2019.
- Mandel JC, et al. SMART on FHIR: a standards-based, interoperable apps platform for EHRs. J Am Med Inform Assoc. 2016;23(5):899–908.
- Chen JH, et al. Electronic health record-based clinical decision support in the era of interoperability. J Gen Intern Med. 2020;35(6):1955–61.
- WHO-ITU. Digital Health Platform Handbook: Building a Digital Information Infrastructure (Infostructure) for Health. Geneva: World Health Organization; 2020.
- Verma A, et al. Real-time visual analytics improve detection of sepsis: a multicenter evaluation. Crit Care Med. 2023;51(2):234–42.
- Singh H, Sittig DF. Measuring and improving EHR safety: a user-centered approach. Jt Comm J Qual Patient Saf. 2021;47(2):83–9.
- Khan S, et al. Digital health training reduces errors in medication ordering: a multi-center study. Int J Med Inform. 2022;160:104702.
- Khairat S, et al. Adaptive clinical decision support systems improve clinician trust and safety outcomes. JMIR Med Inform. 2022;10(7):e35387.
- Cabitza F, et al. Transparency and explainability in AI-based clinical decision support systems. BMC Med Inform Decis Mak. 2021;21(1):90.
- Semigran HL, et al. Comparison of physician and computer diagnostic accuracy. JAMA Intern Med. 2016;176(12):1860–1.
Comments