Findings

Of 124 survey-invitation recipients, 40 (32 percent) responded, a rate within normal limits for such queries. Respondents from developed countries and regions (the United States, Europe, Australia, China) predominated. Most respondents listed “medicine” as their profession, with “information technology / other” the second-largest cohort; 26 different specialties were reported.

Most respondents were not familiar with prognostic scoring systems; those who were familiar named 24 different systems. Likewise, most had not used such a system. In contrast, most respondents were aware of the Sequential Organ Failure Assessment (SOFA), whose scores have been widely used in efforts to allocate critical care resources during the COVID-19 pandemic. Even so, most had not used this system or any of its variations.
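
For readers unfamiliar with SOFA: the total score is conventionally the sum of six organ-system subscores (respiration, coagulation, liver, cardiovascular, central nervous system, renal), each graded 0–4. The sketch below illustrates only that aggregation step; the clinical thresholds that map measurements to subscores are omitted, and the class and function names are ours, not taken from any particular implementation.

```python
# Minimal sketch of how a SOFA-style total is aggregated from organ-system
# subscores (each graded 0-4). Clinical scoring thresholds are intentionally
# omitted; this is illustrative, not a clinical implementation.
from dataclasses import dataclass

ORGAN_SYSTEMS = (
    "respiration", "coagulation", "liver",
    "cardiovascular", "central_nervous_system", "renal",
)

@dataclass
class SofaSubscores:
    respiration: int
    coagulation: int
    liver: int
    cardiovascular: int
    central_nervous_system: int
    renal: int

    def total(self) -> int:
        """Sum the six subscores (each 0-4), giving a total from 0 to 24."""
        values = [getattr(self, name) for name in ORGAN_SYSTEMS]
        if any(not 0 <= v <= 4 for v in values):
            raise ValueError("each subscore must be an integer from 0 to 4")
        return sum(values)

# Example: a hypothetical patient with dysfunction in three organ systems.
print(SofaSubscores(2, 1, 0, 3, 0, 1).total())  # -> 7
```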

Challenges

There was consistent agreement that the following issues or challenges were “very important” or “important”: software quality, user knowledge and education, racial or ethnic bias, use for rationing or triage, and system accuracy; patient ignorance was regarded as somewhat less important.

Oversight

The clearest finding was a rejection of “no oversight” of prognostic scoring systems; 84% of those who answered this question signaled support for oversight by one or more entities. Those who “strongly agreed” or “agreed” that some sort of oversight was needed ranked the entities thus: professional associations and government (88%), “institutions” (76%), and industry (48%). It is not clear whether the comparatively low support for industry oversight reflects a view that self-regulation is inadequate or that it is inappropriate.
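
The percentages quoted above are derived from the response counts tabulated at the end of this section. A minimal sketch of that arithmetic follows; the function name and the rounding and denominator conventions are our assumptions, so small discrepancies with the figures reported in the text are possible.

```python
# Sketch: share of respondents choosing one of the top two response categories
# for an item, given per-category counts of the form shown in the tables below.
# The function name and rounding convention are assumptions for illustration.

def top_two_share(counts: list[int]) -> float:
    """counts = [strongly agree, somewhat agree, not sure, somewhat disagree, strongly disagree]."""
    total = sum(counts)
    if total == 0:
        raise ValueError("no responses recorded for this item")
    return 100.0 * (counts[0] + counts[1]) / total

# Oversight items, with counts copied from the oversight table below.
oversight = {
    "Oversight by professional associations": [17, 6, 2, 1, 0],
    "Oversight by government":                [11, 12, 1, 2, 0],
    "Oversight by institutions":              [13, 7, 2, 4, 0],
    "Oversight by industry":                  [3, 9, 8, 1, 5],
}
for item, counts in oversight.items():
    print(f"{item}: {top_two_share(counts):.0f}%")
```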

Evaluation

Asked “What are your views regarding evaluation of prognostic support systems in hospitals?”, respondents consistently rated the following as of more-or-less equal importance: evaluation before use or implementation, after a specified period, continuously, in the context of actual use, and in simulations. More respondents “somewhat agreed” than “strongly agreed” with evaluation in simulations, reversing the emphasis seen in the other contexts.

Liability

Respondents consistently expressed support for widespread “liability and legal responsibility.” Most “strongly agreed” or “somewhat agreed” that legal responsibility should lie with hospital administrators, clinical ethics committees, individual clinicians, system designers, and software developers. This parallels current legal debate about the breadth of responsibility and accountability for the use of decision support, AI-based and otherwise. It is surprising that clinical ethics committees were thought to share liability, given that such committees are rarely implicated in legal challenges.

Patient Consent

“No patient consent or disclosure is needed” consistently elicited “somewhat disagree” or “strongly disagree” responses. Very few strongly agreed that patients should always consent to computer use, that patients should consent to decision support, that disclosure is required but consent is not, or that hospitals should determine the need for consent case by case. Overall, the strongest support was signaled for “patients should be informed about decision support, but consent is not necessary.”

Other Issues

The strongest views expressed were disagreement with the statements that most clinicians and most patients are well informed about medical computing tools. This is suggestive and supports the view that greater “health information technology literacy” is needed. A significant cohort strongly or somewhat agreed with the statement that “traditional decision support raises the same key issues as AI systems.” This, too, is significant: recent excitement about machine learning and artificial intelligence too often seems unaware of a long history of ethical analysis of traditional decision support.

Limitations

This pilot study had several limitations:

  • Respondents were identified by the principal investigator and co-investigator based on their knowledge of colleagues and others with an interest in ethics and biomedical informatics. Others might have selected a different cohort.
  • The sample size was small.
  • Only two queries were sent over a short time in December, a holiday month.

Discussion

This is apparently the first survey of its kind to identify ethical issues and related best practices in the use of computers in hospitals, with special regard for public health informatics. Given its limitations, it appears worthwhile to broaden both the scope of the questions and the cohort asked to complete the survey.

We nevertheless detected nontrivial inter-rater reliability (a simple illustrative measure of such agreement is sketched after this list) on issues related to:

  • Decision support system limitations and shortcomings
  • The need for oversight and evaluation
  • The need for improved literacy regarding health information technology
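
One simple way to quantify this kind of agreement, given count data of the form tabulated at the end of this section, is the share of respondents choosing the single most common response category for an item. The sketch below is our illustrative measure, not necessarily the statistic used in this analysis.

```python
# Sketch: per-item consensus as the fraction of respondents in the modal
# (most frequent) response category. Illustrative only; not necessarily the
# agreement measure used in this study.

def modal_share(counts: list[int]) -> float:
    """Fraction of all responses falling in the most frequent category."""
    total = sum(counts)
    if total == 0:
        raise ValueError("no responses recorded for this item")
    return max(counts) / total

# Example counts, copied from the evaluation table at the end of this section.
items = {
    "Evaluate before use, implementation": [25, 1, 0, 0, 0],
    "Evaluate in simulations":             [12, 13, 0, 1, 0],
}
for item, counts in items.items():
    print(f"{item}: {modal_share(counts):.2f}")
```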

Moreover, we find it significant that most respondents recognized that the use of computers in health care has long raised ethical issues. As the world now grapples with appropriate uses and users of artificial intelligence systems and big data repositories, there is an opportunity to learn from previous work.

Prognostic Scoring System Survey: Questions & Responses

Assess the importance of the following challenges, criticisms, or issues related to the use of prognostic scoring systems for resource allocation.

Question Very Important Important Not Sure Not Very Important Not Important
Software quality 19 7 0 0 0
User knowledge and education 18 8 0 0 0
Racial, ethnic, or other bias 22 2 1 0 0
Use of system for rationing or triage 13 6 4 1 1
System accuracy 21 4 1 0 0
Patient ignorance 6 9 8 2 0

What are your views regarding oversight of prognostic support systems in hospitals?

Question Strongly Agree Somewhat Agree Not Sure Somewhat Disagree Strongly Disagree
No oversight needed 0 1 2 1 22
Oversight by industry 3 9 8 1 5
Oversight by institutions 13 7 2 4 0
Oversight by professional associations 17 6 2 1 0
Oversight by government 11 12 1 2 0

What are your views regarding evaluation of prognostic support systems in hospitals?

Question Strongly Agree Somewhat Agree Not Sure Somewhat Disagree Strongly Disagree
Evaluate before use, implementation 25 1 0 0 0
Evaluate after a specified period 19 6 1 0 0
Continuous Evaluation 22 2 2 0 0
Evaluate in the context of actual use 25 1 0 0 0
Evaluate in simulations 12 13 0 1 0

Liability and legal responsibility raise challenging questions. Who should be legally responsible for the use of a decision support system for resource allocation?

Question Strongly Agree Somewhat Agree Not Sure Somewhat Disagree Strongly Disagree
Hospital administrators 11 10 3 1 1
Clinical ethics committees 10 11 2 1 2
Individual clinicians 5 16 2 1 2
System designers 11 11 0 3 1
Software developers 8 10 3 3 2

Some have suggested that patients should agree to, or at least be informed of, the use of a computer to provide decision support.

Remembering that alarms and alerts are a form of decision support, please respond to the following.

Question Strongly Agree Somewhat Agree Not Sure Somewhat Disagree Strongly Disagree
Patients should always consent to computer use. 3 6 3 7 7
Patients should consent for decision support. 5 4 4 7 6
Patients should be informed about decision support, but consent is not necessary. 2 14 3 5 2
Hospitals should determine need for consent on a case-by-case basis. 1 11 5 5 4
No patient consent or disclosure needed. 0 3 2 10 10

Other issues

Question Strongly Agree Somewhat Agree Not Sure Somewhat Disagree Strongly Disagree
Similar rules should apply to diagnostic and prognostic systems. 9 9 2 3 2
Traditional decision support raises the same key issues as AI systems. 8 9 0 5 4
Most clinicians are well informed about medical computing tools. 0 2 1 14 8
Most patients are well informed about medical computing tools. 1 0 0 7 18
Prognostic support tools should be included in electronic health records. 4 14 7 0 1