HIMSS: Online model aids enterprise image viewer selection

NEW ORLEANS - Clinical user evaluation of vendor enterprise image viewers can be performed online, saving time and increasing participation in the purchasing decision-making process, according to a Tuesday presentation at the Healthcare Information and Management Systems Society (HIMSS) meeting.

An online evaluation approach conducted at Johns Hopkins University yielded a more than tenfold decrease in clinical user evaluation time compared to a traditional onsite demonstration, said Dr. Paul Nagy, director of quality in the university's department of radiology.

"Because of that, we ended up getting 300 physicians to be involved in our process and we thought that was a useful model," he said. Nagy spoke during an education session that was cosponsored by the Society for Imaging Informatics in Medicine (SIIM).

In a perfect world, there are three groups of people who should be represented in the technology decision-making process: the executive suite, who will be paying for the technology; the technical group (the IT department); and the clinical users, he said. Common decision-making dysfunctions can prevent meaningful engagement between the respective groups, however.

If business needs are ignored, no one pays for the technology, and if technical needs are ignored, the system can end up isolated in a closet, archaic and poorly integrated with the rest of the enterprise, Nagy said.

"And if you ignore clinical needs, no one is going to end up using the system at the end of the day," he said. "Your job, if you want to try to lead adoption of successful IT implementations, is to engage these three groups in a meaningful way."

However, navigating the cultural chasm between these three groups can be a difficult process, Nagy said.

Clinical feedback

Getting adequate and representative clinical feedback for evaluating IT systems presents a difficult quandary. Nagy cited what he calls the inverse participation law, in which the smaller the number of people who show up for the evaluation, the larger the number of people who show up later to complain about the system.

In addition, the self-selection paradox applies: the clinical staff members interested enough to show up for evaluations are generally advanced users whose working methods differ from those of average users, Nagy said.

"Those clinicians who you do get to come to the demos are probably the wrong clinicians to help you decide which vendor to choose," he said. "The very few hardcore physicians who come to the demonstrations are going to be technophiles -- the power users who want to have advanced functionality and aren't as interested in usability."

As a result, the challenge is not just to get physicians, but to get the right kind of physicians to be involved in the evaluation process.

Disruptive technology

Disruptive technology change in medical imaging is presenting an opportunity today. For example, new HTML5 image viewers are redefining how medical images can be manipulated in a Web browser, Nagy said.

"What we saw was a new technology here that enables not just the 30,000 workstations that we have at Johns Hopkins to be able to access medical images, but also the new mobile devices that people are carrying around in their pockets," he said.

Two teams were formed to support the project. The technical team's job was to work with the business side to get it on board and to make sure the technology was right for the institution. After the list of prospective vendors was winnowed from 24 to the three that satisfied the group's technical and business requirements, the final evaluation was left to the clinical evaluators. The clinical team's role was to create a marketing strategy to capture clinical users' attention, he said.

The technical team created a Web platform to facilitate evaluation by hundreds of clinical users throughout the enterprise. This Web-based model was similar to the old Pepsi Challenge taste test, asking users to compare the same images on three different systems from unnamed vendors. Users could perform the evaluation on their own schedule in a self-paced manner without sales reps or trainers around, he said.
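The article does not describe how the blinding was implemented, but a minimal sketch of the general idea, assuming anonymized vendor labels assigned per evaluator and a randomized viewing order (the ordering is an assumption, not stated in the talk), might look like this:

```python
import random

# Hypothetical sketch of a "Pepsi Challenge"-style blinded comparison:
# each evaluator reviews the same study on all three viewers, but the
# vendor identities are hidden behind neutral labels and the order is
# shuffled per user. Names and structure are illustrative only.

VENDORS = ["vendor_a", "vendor_b", "vendor_c"]  # real identities hidden from users


def assign_blinded_session(user_id: str) -> list:
    """Return an anonymized, randomly ordered viewer list for one evaluator."""
    order = random.sample(VENDORS, k=len(VENDORS))
    return [
        {"label": "Viewer " + chr(ord("A") + i), "vendor": vendor}
        for i, vendor in enumerate(order)
    ]


if __name__ == "__main__":
    print(assign_blinded_session("user123"))
```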

Convenient, fast evaluation

The three vendors' viewers could be evaluated side by side, all within 15 to 20 minutes. It was also a task-driven process, asking users to perform the same task on all three software applications.

The experiment was designed to evaluate the three potential image viewer systems based on usability, perceived usefulness, and overall user satisfaction, he said. Participants were also entered into a drawing for an iPad 2.

Users were given a task script to evaluate a head CT from all three vendors for intracranial hemorrhage (including size of hematoma, if present, and degree of midline shift, if present), using the window/level, scroll, and measurement tools. They were then asked to submit a clinical image viewer evaluation form.
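The article lists the evaluation criteria (usability, perceived usefulness, overall satisfaction) and the demographic questions reported below, but not the form itself. A hypothetical sketch of the kind of record such a form might capture, with illustrative field names only, could be:

```python
from dataclasses import dataclass

# Hypothetical record for one submitted clinical image viewer evaluation.
# Fields are inferred from the criteria and demographics described in the
# article; the actual Johns Hopkins form is not published here.


@dataclass
class ViewerEvaluation:
    evaluator_role: str            # e.g., resident, attending, medical student
    years_electronic_review: int   # prior experience with electronic image review
    viewer_label: str              # anonymized viewer, e.g., "Viewer A"
    usability: int                 # rating, e.g., 1-5 (scale assumed)
    perceived_usefulness: int      # rating, e.g., 1-5 (scale assumed)
    overall_satisfaction: int      # rating, e.g., 1-5 (scale assumed)
    comments: str = ""
```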

During the one-month evaluation period, the group received 304 completed evaluations, with an average completion time of 15 minutes. Evaluator roles included residents (30%), attendings (25%), medical students (21%), fellows (16%), others (6%), and nurses (1%); in 1% of responses, the evaluator role was left blank.

The evaluators' experience level in electronic image review was as follows:

  • More than five years: 38%
  • Five years: 11%
  • Four years: 9%
  • Three years: 13%
  • Two years: 13%
  • One year: 16%

The evaluation resulted in one vendor slightly edging out the other two, Nagy said.

The online evaluation approach dramatically reduced the cost of participation for the 304 evaluators.

A traditional onsite model of scheduling a one-hour demonstration with each of the three vendors would have consumed approximately 900 hours of clinical time (0.5 FTE), while the Web-based, task-oriented model took only about 80 total hours, he said.
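Those figures are roughly consistent with the participation numbers reported above. A quick back-of-the-envelope check, assuming a roughly 2,000-hour work year for the FTE conversion (an assumption not stated in the talk):

```python
# Rough check of the time comparison: 304 evaluators, one hour with each of
# three vendors onsite versus ~15 minutes total online; ~2,000-hour work year
# assumed for the FTE conversion.

evaluators = 304
onsite_hours = evaluators * 3 * 1.0      # one-hour demo per vendor, three vendors
online_hours = evaluators * (15 / 60)    # ~15 minutes per online evaluation

print(f"Onsite: {onsite_hours:.0f} hours (~{onsite_hours / 2000:.2f} FTE)")
print(f"Online: {online_hours:.0f} hours")
# Onsite: 912 hours (~0.46 FTE)
# Online: 76 hours
```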
