Saturday, June 30, 2012

Overview of academic health and medical informatics training

Academic training in medical informatics takes place at the junction of the clinical and library sciences and information technology. It is at this intersection where prospective students and graduates learn the skills necessary to develop, manage, integrate and perform research on healthcare imaging and IT systems. Examples of these systems are hospital and radiology department information systems, typically known by the acronyms HIS for hospitals, RIS for radiology, CIS for cardiology and LIS for laboratories, as well as electronic patient record systems such as the EMR and EHR, and imaging systems such as PACS. Graduates may also be employed by manufacturers who design and support medical devices, and by academic institutions that are heavily involved in clinical trials research.
There is a large demand for professionals with these skills, especially with the US federal incentives for physicians and hospitals to (finally) start implementing EMRs. Industry estimates of the need for professionals with these skills range from ten thousand to forty thousand annually. As usual, formal institutions and colleges are lagging behind in training and/or retraining enough professionals to serve these new and growing areas of employment. This is especially true for the healthcare IT professions, as new technologies and devices are introduced at a rapid pace. As an example, when I started my own career in healthcare technology I worked on software development for second-generation CT scanners, which are now in their fourth or fifth generation, while at the time MR and ultrasound were still in the laboratory. That is why the professional societies play such an important role in providing paths for continuing education.
Vendors provide a wide variety of training opportunities; however, such training is more product oriented and lacks coverage of the fundamentals. That is where the colleges and universities should play a role. For example, it is hard to understand the workings of an MRI without knowing some fundamentals of physics, and the same applies to knowing how a PACS works without any knowledge of HL7 and DICOM interfaces. Similarly, one needs knowledge of coding systems for EMR support and database knowledge to manage a hospital information system.
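To make the point about fundamentals concrete, here is a minimal sketch of the pipe-delimited HL7 v2 message structure that RIS-to-PACS and EMR interfaces rely on. The sample message and its values are invented for illustration; real messages come from an interface engine.

```python
# Minimal sketch of HL7 v2 message structure. The sample order message
# below is made up for illustration purposes only.

SAMPLE_HL7 = "\r".join([
    "MSH|^~\\&|RIS|HOSPITAL|PACS|RADIOLOGY|201206300800||ORM^O01|MSG0001|P|2.3",
    "PID|1||123456||DOE^JOHN||19500101|M",
    "OBR|1|ORD0001||71020^CHEST 2 VIEWS",
])

def parse_segments(message):
    """Split an HL7 v2 message into {segment_id: [fields]} lists.

    Segments are carriage-return separated; fields are pipe-delimited,
    with the segment ID as field zero.
    """
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields
    return segments

segs = parse_segments(SAMPLE_HL7)
print(segs["PID"][5])   # patient name field (PID-5): DOE^JOHN
print(segs["OBR"][4])   # universal service ID (OBR-4): 71020^CHEST 2 VIEWS
```

Even this toy parser shows why fundamentals matter: without knowing that PID-5 carries the patient name and that `^` separates components, the message is just a wall of delimiters.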
One of the issues with medical informatics is that there is no generally accepted curriculum. This type of training is typically provided by three major academic tracks. The clinical track focuses on training for nurses, physicians and public health scientists. The engineering track supports those working toward biomedical engineering or bio-informatics degrees. Finally, medical information management, with its focus on coding, organizing and indexing, typically supports those on a library sciences track.
Even within this general framework there are many variations and terms used to identify this type of training. Here is a list generated from our own search:

Course/program name: number of programs
Health Informatics: 29
Translational Bioinformatics: 19
Healthcare Informatics: 19
Clinical Research Informatics: 17
Public Health Informatics: 17
Bioinformatics: 10
Nursing Informatics: 9
Health Information Management: 8
Bioinformatics and Computational Biology: 6
Biomedical Informatics: 6
Medical Informatics: 5
Clinical Informatics: 4
Health Care Informatics: 3
Applied Health Informatics: 2
Bioinformatics Management: 2
Clinical Research: 2
Computational Biology & Bioinformatics: 2
Health Administration: 2
Health Systems Management: 2
Medical Imaging Informatics: 2
Other related programs, each with a different name: 21


In our search we found 78 colleges and universities offering a medical informatics or similar program. Most of these programs are offered at the master's level, but there are also doctorates and fellowships, as well as quite a few certificate programs, which typically require a master's degree. A complete list of the institutions, with links and contact information, is available. Many of these programs are available as online or e-learning courses.
Also, several non-profit initiatives have sprung up to address a very specific certification need; one in particular is the profession of PACS administrator, which is addressed by ABII and PARCA. To date, these two organizations have jointly awarded about 2,000 PACS administrator certificates. A few institutions have geared their curricula toward a similar certificate, but most of these particular types of training and continuing education courses are handled by private organizations and professional societies.

Friday, June 29, 2012

PACS Administrator Tools and Tricks of the Trade: Using a DICOM Sniffer


Dogs can sniff hundreds of times
better than humans;
Sajiv, our boxer, is no exception
A DICOM sniffer is an essential tool for troubleshooting DICOM connectivity issues for any PACS administrator and for service, integration, and biomedical engineers. It is not something one needs on a daily basis, but it is invaluable for the trickier, difficult-to-diagnose issues.
What is a sniffer? It is a passive software application that listens to network communications, captures the raw frames that are exchanged, and interprets the communication protocol that is used. In the case of DICOM communication, we are looking at the TCP/IP packets and the DICOM application-level protocol. A sniffer is an essential tool for network and infrastructure engineers; however, not all commercial sniffers have a DICOM plug-in that allows for DICOM protocol interpretation. Wireshark, which used to be called Ethereal, is a free, open source tool that does include the DICOM plug-in as part of the basic software package.
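What the DICOM plug-in actually does is interpret captured raw bytes according to the DICOM upper-layer protocol (DICOM PS3.8). As a minimal illustration, the sketch below hand-builds the fixed part of an A-ASSOCIATE-RQ PDU and parses it back; the AE titles are made up, and a real sniffer would of course decode captured frames rather than self-generated ones.

```python
import struct

# A sniffer's job, in miniature: decode raw bytes per the DICOM
# upper-layer protocol. The A-ASSOCIATE-RQ fixed header is: PDU type
# (1 byte, 0x01), reserved (1), PDU length (4, big-endian), protocol
# version (2), reserved (2), called AE title (16, space-padded),
# calling AE title (16), reserved (32). AE titles below are made up.

def build_associate_rq(called_ae, calling_ae):
    """Build the fixed part of an A-ASSOCIATE-RQ PDU (no variable items)."""
    body = struct.pack(">H", 1)                      # protocol version 1
    body += b"\x00\x00"                              # reserved
    body += called_ae.ljust(16).encode("ascii")      # called AE title
    body += calling_ae.ljust(16).encode("ascii")     # calling AE title
    body += b"\x00" * 32                             # reserved
    return struct.pack(">BBI", 0x01, 0, len(body)) + body

def parse_pdu(raw):
    """Decode PDU type, length and AE titles from captured bytes."""
    pdu_type, _, length = struct.unpack(">BBI", raw[:6])
    called = raw[10:26].decode("ascii").rstrip()
    calling = raw[26:42].decode("ascii").rstrip()
    return {"type": pdu_type, "length": length,
            "called_ae": called, "calling_ae": calling}

pdu = parse_pdu(build_associate_rq("PACS_ARCHIVE", "CT_SCANNER1"))
print(pdu)  # type 0x01 identifies an A-ASSOCIATE-RQ
```

Seeing the AE titles in the association request is often the first troubleshooting win: a mistyped AE title is one of the most common reasons an association is rejected.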
Because a sniffer is a passive device, there is no noticeable interaction with the devices being diagnosed. That makes it a great tool for non-reproducible issues or errors. For example, imagine that a modality produces an error when sending an image to the PACS, but when you send the same image from another device, such as a workstation or another modality, there is no problem. And, even worse, the error might occur only once every hour or so. Another scenario might be that the error code is not easily interpreted, e.g. it might display something like “communication error” or “DICOM error.” Or there could be a finger-pointing contest going on between two vendors, each arguing that the problem lies with the other party, and you are in the middle trying to find out which one actually causes the problem.
The nice feature of a DICOM sniffer is that it shows the actual data going across the wire, i.e. it provides hard evidence of what is exchanged, including the timing, as every captured frame carries a timestamp. That means it can also be used to troubleshoot performance issues. For example, if there is a lot of noise on the communication line causing multiple resends of corrupted packets, or if it takes a long time for an application to reply to a DICOM command, this overhead is recorded and visible.
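As a toy illustration of how the timestamps expose slow replies, the sketch below scans a list of (timestamp, event) pairs, such as one might summarize from a capture, for gaps that exceed a threshold. The sample values and event names are invented.

```python
# Timestamps in a capture make latency problems visible. Given
# (timestamp, event) pairs as a sniffer would log them (sample values
# invented), flag gaps where the next event took too long to arrive.

CAPTURE = [
    (0.000, "C-STORE-RQ sent"),
    (0.004, "TCP ACK"),
    (2.510, "C-STORE-RSP received"),   # suspiciously slow reply
    (2.512, "A-RELEASE-RQ sent"),
]

def slow_gaps(events, threshold=1.0):
    """Return (delay, event_before, event_after) for gaps over threshold seconds."""
    gaps = []
    for (t1, e1), (t2, e2) in zip(events, events[1:]):
        if t2 - t1 > threshold:
            gaps.append((round(t2 - t1, 3), e1, e2))
    return gaps

print(slow_gaps(CAPTURE))  # flags the 2.506 s wait for the C-STORE-RSP
```

In practice one would apply the same idea inside Wireshark with its time-delta columns, but the principle is identical: the gap between request and response tells you which side is slow.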
Where to run or install the sniffer can be a little tricky. The best location is on a dedicated diagnostic computer, e.g. a service laptop, connected to an active listening port on a router or switch. Another trick is to place a simple hub between the sender's or receiver's network cable and your laptop. However, these solutions often require approval and/or cooperation from your IT department. Worst case, you might ask them to capture the session on their own network sniffer and share the capture file with you for analysis. Another possibility is to run the sniffer at the source or destination itself, where it can be set up to listen on the sending or receiving network adapter. The latter might require cooperation from your vendor and/or PACS administrator.
Configuring the sniffer and capturing, analyzing, and saving the file are relatively easy; see the short video demo. After the analysis and/or validation using some of the open source validation tools, one can share the results, including the capture file, with the vendors or integration engineers so they can fix the problem or create a work-around.
In conclusion, a DICOM sniffer is an invaluable troubleshooting tool, especially when there is limited access to log files at the source or destination of the troubled exchange. It is particularly useful for semi-random or not easily reproducible errors, which makes it an essential part of a PACS administrator's set of tools and tricks.

Saturday, June 23, 2012

Reading Radiology from your Smartphone, Really?

I wouldn't read this type
of image from my smartphone
There has been an explosion of healthcare “apps” for wireless intelligent devices such as smartphones and tablets. Not only are numerous utilities available for consumers, such as the one I have on my phone to record my heart rate, but there are also add-ins used by physicians. An example is the one that connects the ICU physicians at Cook Children's Medical Center in Fort Worth with real-time data from the neonatal unit, or the one the physicians at the Dallas Medical Center used to show the fractured leg of my little grandson to his mom after he fell on the playground at his day care. And there is no end in sight as developers look to develop tests for people who have diabetes, and I can imagine that pacemaker data can be exchanged as well, as many of these new devices now have Bluetooth or other wireless interfaces.
This is all great, but every fast new development carries the risk that the regulatory agencies and trade associations will be playing catch-up in providing appropriate guidelines and checks and balances to guide the use of these devices and protect the interests of patients. Case in point: I just came back from onsite PACS training at a major hospital where the radiologists are reading chest radiographs on their laptops and even tablets. These devices are great for communication between physicians and also with their patients, but one should never, ever, ever use them to make a diagnosis. The AAPM has developed a set of guidelines for testing and evaluating monitors (see also the white paper on “what monitor to use”). If I take one of their most important images, the so-called TG-18 QC test pattern, and display it on my laptop, I can barely see half of the grayscale values that are supposed to be visible. The impact of this poor presentation is obvious: anatomical findings that differ by only a few grayscale values will totally disappear.
The problem with using un-calibrated commercial-grade displays for diagnostic purposes is not new. Some PACS vendors have been known to offer poor-quality monitors as part of their package to undercut the price of others in order to get a sale. Not only does this impact patient care, but it also exposes the institution to major liability. I was called as an expert witness in a case where a radiologist was asked by the judge about his familiarity with the ACR guidelines for teleradiology and the use of the appropriate monitor for the job. I have also had some of our PACS administrator students tell me that they had to pull calibration records from prior years to show that the monitor used to make a disputed diagnosis was indeed calibrated.
The good news is that many reputable institutions do indeed have monitor quality control policies and procedures in place. Interestingly enough, for digital mammography, regular calibration and the use of FDA-approved monitors are required. I am puzzled by the fact that this is not the case for other diagnostic monitors, as I would assume that missing an indication of breast cancer carries the same level of risk as missing an indication of lung cancer. The FDA, however, does not appear to think the same.
Now, back to the use of intelligent wireless devices: there is no question that these will make a major impact on the delivery of healthcare and that they will improve the exchange of information, making healthcare more effective and safe. However, I surely hope that they will be used with caution and that regulatory agencies and trade associations such as the RSNA and HIMSS will catch up, put out guidelines, and educate users on how to use these devices effectively and safely.

Monday, June 18, 2012

What’s In and What’s Out: Top 10 PACS Trends

Listening to the latest innovations
at the 2012 SIIM conference
Listening to the speakers at the recent SIIM 2012 PACS conference in Orlando, Florida, one could distinguish several new trends and technologies: some clearly disruptive, some merely an evolutionary development, and some just a repackaging with a sexy buzzword (aka marketing hype) of technology that has existed for many years. Especially with the latter, it sometimes takes a little questioning and investigation to find out what the big deal is, which is unfortunate, as it makes it seem as though it has been done purposely to confuse potential customers. In any case, here's what I found:
1.       In with iPad, Out with the Workstation. This is a clear example of a disruptive technology, as it is overtaking an existing technology while reaching new markets and applications. Physicians can use the iPad not only to access images in conjunction with the EMR to review a case, but also for patient education and for discussing images at the bedside with patients and families. Almost 50 percent of all physicians already use tablets, and use by nurse practitioners, physician assistants and other healthcare practitioners is expected to follow. A Johns Hopkins group reported that the security of these devices is far superior to that of a regular laptop because of the ability to remotely wipe a lost device, access-code expiration, auto-lock and several other security features. They also noted that the viewing capability was indeed the “killer app.”
2.       In with AVN, Out with VNA. The Vendor Neutral Archive (VNA) vs. the Archive Vendor Neutral (AVN) solution seems to be a typical example of a new word for the same solution. There is still confusion about what a “real VNA” means (see also the white paper). The good news is that VNA solutions seem to have matured; however, it has become obvious that several PACS vendors have problems disconnecting their archive from their image manager/database. Almost all PACS vendors have now renamed their own enterprise archive a VNA, which seems to defeat the purpose. It still takes quite an effort to “connect” an existing archive to a VNA, as reported by the Mayo Clinic, which has done 13 migrations to date and is in the process of doing three more right now. They reported that it can still take from a few weeks up to two years for “discovery” of the images from a PACS by the VNA image manager.
3.       In with Zero Footprint, Out with Thin Client. The zero-footprint viewer is a logical evolution from the thick-client and thin-client workstations, emphasizing the fact that no trace is left on a device as soon as the software application exits. All information in the cache is flushed, making it a perfect solution for addressing patient privacy and security concerns. The technology most often mentioned for viewers and other applications is HTML5. While not quite considered mature, it seems to be emerging as the implementation of choice, whether for stand-alone viewers or for viewer solutions connecting to an enterprise archive or VNA. It also seems to be favored over protocol solutions such as WADO (Web Access to DICOM Objects), which has limitations, such as not (yet) being able to exchange multiple images in a single transaction or to bundle them by series or study.
4.       In with MINT, out with DICOM. The native DICOM protocol is not quite suitable for web access, especially as used by zero-footprint viewing applications. Issues arise because the DICOM protocol was defined in an era of fixed IP addresses, hierarchical queries, and application-level addressing (AE Titles) instead of URLs.
The web version of DICOM, called WADO, has its limitations, as mentioned above, and therefore a new protocol called MINT was developed, which claims to provide a significant (although maybe overrated) performance improvement for retrieval. It is especially effective if the data is preprocessed into the MINT format. Some vendors have implemented a MINT interface and even use it natively as a data storage format to optimize access to large multi-frame datasets. Whether this will actually prevail or be displaced over the next few years by DICOM web services, an improved WADO, or another protocol remains to be seen.
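For context, a WADO-URI request retrieves exactly one DICOM object per HTTP call, which is the single-transaction limitation mentioned above. Below is a minimal sketch of building such a request URL; the server address and UIDs are placeholders, not a real endpoint.

```python
from urllib.parse import urlencode

# Minimal sketch of a WADO-URI request URL (one object per HTTP call).
# The base URL and the study/series/object UIDs are placeholders.

def wado_uri(base, study_uid, series_uid, object_uid):
    """Build a WADO-URI request URL for a single DICOM object."""
    params = {
        "requestType": "WADO",           # mandatory for WADO-URI
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": "application/dicom",   # ask for the raw object
    }
    return base + "?" + urlencode(params)

url = wado_uri("http://pacs.example.com/wado",
               "1.2.840.1.1", "1.2.840.1.2", "1.2.840.1.3")
print(url)
```

A viewer fetching a 500-image CT study this way needs 500 round trips, one per object UID, which is exactly the overhead that MINT (and, later, DICOM web services) set out to reduce.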
5.       In with Facebook, Out with MIRC. The RSNA has created an image exchange service called MIRC, which can be used for teaching files: it describes image study characteristics, including the diagnosis, in a meta-header that accompanies the study, and it provides an exchange service. Based on a survey, it appears that 60 percent of all patients are willing to share their images (with identifiers stripped). However, the image sharing activity between several academic institutions has only resulted in a few thousand studies being made available. Personal Health Records (PHRs) are also not used to the degree expected. It might just take one of the giant social networks such as Facebook to jump on the bandwagon and provide this type of service.
6.       In with texting, Out with emails. An email is responded to within an average of 90 minutes; a text message, by contrast, gets reviewed within an average of 90 seconds. The use of mobile applications may shift the emphasis from radiologists generating reports to “impacting patient status.” As a poll taken by one of the presenters showed, more than one in five results is never followed up. Using mobile technologies may allow radiologists to follow up with their referring physicians and ultimately with patients to improve patient care.
7.       In with Video, Out with MPEG. The Mayo Clinic reported that it currently archives about one million images per day, and its archive now holds 1.3 billion images spread over nine PACS systems. For this type of institution, a Vendor Neutral Archive is not just a luxury, but a requirement for cross-departmental access. After successfully storing MPEG-encoded DICOM files, such as those generated by endoscopes and other visible-light modalities, one of their next projects is the storage of “regular videos,” because they found that many departments keep DVDs with diagnostic patient information acquired for all kinds of reasons; an example is filming the gait of a patient walking before and after a specific therapy.
8.       In with Image sharing, Out with CDs. Exchanging CDs to import a patient's images created at another institution is still an area of concern, as there are still users who create non-DICOM-compliant CDs. I believe one of the main reasons for this is a lack of education. Almost all vendors are by now able to create standards-compliant CDs; however, it might require a non-default setting at the CD burner, as the default might be a proprietary CD with images that can only be opened by the embedded viewer. Most third-party CD readers are now able to read the proprietary image formats generated by these “rogue” vendors, further increasing interoperability. However, it is clear that electronic exchange of the information is far preferable, which is where the image exchange services come in. These services allow users to exchange information through a “cloud” where it is stored and to which they can connect securely through a plug-in or edge server. A good way to think of such a service is as an “ad hoc” HIE. Several third-party companies now provide this service.
9.       In with middleware, Out with tag morphing. Tag morphing was a big deal over the past few years, as it was portrayed that without a lot of header modifications it would be impossible for a VNA to exchange information with a PACS system. However, based on feedback from VNA vendors, this was apparently grossly overrated. Basic image viewing has proven possible without the changes, and whatever change is needed can easily be included in the migration process.
10.   In with RWF, Out with SWF. The traditional “Scheduled Work Flow” (SWF) sequence specifies how an order is placed, filled and scheduled, followed by generation of a worklist, retrieval by a modality, image acquisition and storage, and finally retrieval and reporting. All this information is useful for business analytics showing how a department is performing. What is needed instead, however, is clinical analysis whereby the sequence is reversed, hence the name “Reverse Work Flow” (RWF). To make a clinical impact, the result should be fed back into the next order: the image characteristics and diagnosis should drive the image processing for the next procedure for this patient and/or patients with similar characteristics. The same applies to setting the modality technique. For example, if a radiologist decides that the technique for a follow-up exam can be reduced by a factor of two, the radiation dose the patient is subjected to goes down, resulting in better care.
This year's new healthcare imaging and IT technologies are mostly just new “creatology,” and some are just an evolutionary step, but in my opinion the single most disruptive technology is the use of mobile devices such as smartphones and tablets, along with the proliferation of new applications and the capture of a new group of users. It will be interesting to see how this will evolve, how regulatory agencies and trade organizations will react, and whether they will be able to facilitate it rather than choke it.

Saturday, June 9, 2012

2012 SIIM Conference message: Why is PACS not smarter?

The exhibition is always a great venue
 to get an update on the latest technology
The major question posed during the keynote speech at the annual PACS conference in Orlando, Florida this past week was, “Why are PACS systems not smarter?” Dr. Eliot Siegel from the VA in Maryland says there is a big need for applying artificial intelligence (AI) to workstations, to allow the radiologist to become at least twice as efficient and have more clinical context information available at his or her fingertips. Vendors should start thinking “out of the box” and apply technologies that are commonly used in other applications and practices.
Currently there is only one-way communication: an acquisition device sends images to a PACS system, where they are reviewed and reported on, and then sent to an EMR. Siegel says there should be a bidirectional exchange between the PACS, EMR and modality. The EMR should share other pertinent clinical information with the radiologist, who should be able to feed information back to the PACS and modality. For example, if a certain X-ray exposure was not optimal with regard to noise, presentation, image processing, or even the contrast agent used, that information could be fed back to the modality to optimize future imaging.
Most EMR systems are merely a direct representation of a paper record in electronic format and do not necessarily have much structured text, which makes it hard to extract important information. For example, a physician might need to browse a complete lab report to find one particular value, such as the creatinine level, which could be important for the procedure being scheduled. In addition, instead of a block of text, a patient history could be represented as a picture of the patient with specific issues highlighted, which can then be used to drill down to specific details. Image processing and/or CAD analysis could be based on the patient's medical history and genetic information.
As several other speakers mentioned, the ideal viewer is the so-called “zero-footprint” viewer. This is a new term for “thin client,” which basically means that the image handling and processing are done at a server, whether it is a local PACS, an enterprise server (also referred to as a VNA), or a cloud storage device. When a user exits the viewer, there is no “footprint” left: no local images or any trace of the software, which is also good for security reasons. One of the features could be the display of reference papers and resources, including a direct link to one or more teaching-file images that can be opened directly in the same zero-footprint viewer to demonstrate a particular finding.
With regard to peripherals, a lot can be done to improve them. For example, new technologies used in gaming could reduce the risk of wrist and/or hand injuries to radiologists from overuse of a mouse and keyboard. These input devices and the graphical user interface are “ancient” in comparison with the technologies used in the gaming industry or provided by search engines. Images are displayed in a linear, one-dimensional format instead of, for example, through a “Google Earth”-style user interface. There are devices that can interpret “thoughts” and allow a physician to command simple image manipulations such as scrolling through an image series, or one could adapt the Xbox Kinect user interface to manipulate images.
Hanging protocols have traditionally been the weak point of any PACS viewing station. Image presentation, sorting and ordering could be much improved by applying some intelligence that matches the radiologist's preferences.
It was very refreshing to have visionaries such as Dr. Siegel telling the audience in blunt language what he thought of the state of the art of PACS technology, pointing out what is lacking and what the technology could be. There is a lot of technology out there; it is just a matter of vendors implementing it to make these systems more efficient and provide better, more pertinent clinical information to improve patient care.