Friday, November 30, 2012

RSNA 2012: It's all about the patient


There are two types of people who make the annual pilgrimage to the RSNA trade show in Chicago: those who love it and thrive on the adrenaline and activities, and those who hate it. I belong to the first group, as it is exciting to see new products and gadgets, and to listen to the different languages around me while trying to figure out their speakers' nationalities. It is also a good venue to gauge trends, get an idea of who is working where and who has moved, which new startup companies are up and coming, who has been acquired, and, last but not least, where this industry is going.

My perception was that there are still significant investments being made in healthcare; however, the spending is definitely shifting from buying devices and even PACS systems toward building and expanding IT and infrastructure. For example, I heard many users complain that “PACS systems have been commoditized” and that vendors are not making any significant investments in this technology anymore. In addition, users have started weighing the benefits of buying yet another more powerful, bigger device not only in terms of the bottom line but also, and even more importantly, in terms of patient care.
Interestingly enough, the theme of this year’s event was all about the patient. 

However, as several speakers noted, imaging, and especially radiology, has become more removed than ever, not only from the patient but also from the physician. One of the often-expressed advantages of a PACS system is that the radiologist is no longer “bothered” by incoming calls or by technologists asking for advice or a consult, as images and results are readily available on-line. This has had the unexpected negative effect of radiologists becoming isolated in a cubicle instead of talking with colleagues, which might not be a good development.

In a nutshell, what I learned is that healthcare imaging and IT are still good businesses, but the emphasis is shifting from imaging to IT. In addition, practitioners seem to be forgetting about human interaction, as it is easy to just stay behind a computer screen, hidden in an office. It is important to remember that emails and texting are no substitute for human interaction, which is still a critical part of healthcare.

Thursday, November 29, 2012

The OTech 2012 RSNA Awards.

The aisles are always bustling with traffic

The 2012 RSNA in Chicago will most likely go down in history as one of the less exciting and more uneventful meetings compared with some previous years. Despite that, I was still able to find a couple of notable products worth sharing. Before you read the list I would like to add a disclaimer: this list is purely subjective and created with my engineering bias, so I tend to look more for new gadgets and exciting technology than for clinical breakthroughs. So, for any totally non-geeky person, this list is probably boring, but if you like new toys and gadgets, you'll probably appreciate it.

Incredible engineering inside
1.       The Best Improvement Award goes to… the new 320-slice CT scanner. The Japanese might not be known as the most innovative and/or creative product designers and engineers, but they are masters at taking an existing concept and continuously improving, enhancing and refining it. This new Toshiba CT is a great example of that. It has a record scan time of about 0.25 seconds; just imagine four rotations every time you say out loud “twenty-one, twenty-two, twenty-three,” and so on. The G-forces on the X-ray generator, which is mounted on the gantry, must be enormous given that amount of weight and speed. The bore has been widened as well, so it is ideal for obese heart patients who need a cardiac scan, of whom we regrettably have plenty in this day and age. No question that this is a major engineering accomplishment.

CCFL and LED next to each other,
no visible differences on the outside.
2.       The Ultimate Green Award goes to… the new series of LED displays. CCFL (Cold Cathode Fluorescent Lamp) technology is on its way out and being replaced by LEDs. For consumer lighting, this might still be a few years away as the price difference is still rather significant (imagine paying $20-$50 for a light bulb?), but for professional applications such as display backlights, the advantages are significant. I would expect that in another year these CCFL-based displays will go the same way the old CRTs went about 10 years ago. Every display vendor showed several samples of this new technology, but I found Eizo to have the most complete LED-based product line, with monitors of up to 8 MPixel using this technology. It is to be expected that the 10 MPixel displays, which are primarily used for digital mammography, will be available soon as well. Depending on the display type, the LEDs consume 30 to 50 percent less power; I noticed a significant difference in temperature just by touching the front screen, indicating the energy efficiency. These LEDs are also more durable and don't degrade as fast, therefore requiring less frequent calibration, which saves support and maintenance costs. Overall, a much better carbon footprint.

Definitely not a zero footprint
3.       The Most Over-hyped Award goes to… the “zero-footprint” viewers. Unlike the green technology, which is measured in carbon footprint, the “zero-footprint” concept refers to the fact that a viewing application can run on multiple platforms, including tablets and smart devices, without leaving a trace behind. There is no software to be downloaded and/or executed on the local client, which addresses the fear most IT professionals have of potentially introducing malware and/or viruses. From a support perspective this is also highly preferred, as a new release only needs to be installed on the server side. However, this concept is not new; it is merely a logical evolution of browser-based web viewers. But because of a smart marketing ploy, suddenly every PACS vendor is hastening to announce that they too have released their zero-footprint viewer. Next year, this will be old news, similar to the VNA and cloud hype we saw in previous years. Oh well, we need a new marketing ploy every year to keep the users engaged and confused.

The control monitor on the bottom
manages the gestures while the top
shows the results of browsing
through a series of slices
4.       The Innovation Award goes to… the mouse-less, gesture-based interface. One of the recurring themes in the RSNA informatics sessions is the lack of innovation in radiology and how we should take applications from the IT and consumer fields to heart and apply them to this specialty. The gesture-based interface shown as a commercial product for use in the OR by GestSure Technologies, as well as the poster in the informatics section describing a similar technology used in a research setting by a group of Swiss pathologists, are prime examples. A user can control a viewer application with simple left and right mouse controls assigned to his or her left and right hands. I tried it and got the hang of it within a few minutes, while according to the manufacturer it takes about 15 minutes to get trained and familiar with it. I am sure that it also depends on how familiar one is with this type of application; for example, I will bet that my 7-year-old grandson, who beats me regularly at his Wii games, can pick this up in half the time and become more proficient than I ever will be. We need more of these kinds of toys.

Display shows an install
in Afghanistan
5.       The Most Ruggedized Award goes to… the vendor who deploys numerous teleradiology systems in the areas where our soldiers are serving, such as Afghanistan and surrounding countries, as well as previously in Iraq. I am talking about MedWeb, who has been able to provide systems that can reliably transfer images from these areas of conflict to the medical centers in Europe for review and consults. By ruggedized implementation, I don't mean only the hardware, but also the software, especially the communication protocols. The workflow also has to be foolproof, which is quite different from using teleradiology for emergency medicine in a typical setting. The so-called workflow “exception cases,” where patient information is unknown at the time of diagnosis, are the rule rather than the exception in this environment. The reason for the study or admitting diagnosis, which is typically part of the examination requisition, is frequently missing as well. Many PACS systems automatically create an exception or flag images as “unverified” or “broken” if information such as the Accession Number or patient demographics is missing; again, this is the norm in this environment. I have a great deal of respect for these folks, especially their support people who install and maintain these systems on-site. There is a lot to be learned from these applications for use in a non-battle zone as well, to make products more robust and durable.

Pick your ambiance...
6.       The Ultimate Feng-Shui Award goes to… the vendor who pays attention not only to product design but also to its surroundings. Anyone who has ever had a CT or MR done and, while lying on his or her back, had to stare at a sterile ceiling with those blinding fluorescent lights knows what I am talking about. I am talking about Philips, which has been in the lighting industry since 1891 and is actually the world's largest lighting producer today. Philips argues that it is not just about the light, but also, more importantly, about the ambiance. There have been studies done about the impact of lighting on productivity, and I can imagine that similar studies could be done about the anxiety and stress levels that may be reduced by the proper ambiance. I can even imagine that more relaxed patients would cooperate more, listen better to instructions from technologists, and therefore have a positive impact on workflow and efficiency. Now I am waiting for the vendor who could also take care of the typical “hospital” smell by providing an aromatic, soothing environment as well.

Dental cone-beam CT
7.       The Most Disruptive Technology Award goes to… cone beam CT, which has evolved from dental-only applications to spine imaging, extremity imaging, and ENT applications as well. Because of its relatively small size, it is possible to image patients in a standing position, and therefore to image the impact of putting weight on certain joints. For dental applications it seems the most disruptive, however, as it could possibly replace the traditional Panorex devices in most offices with a full-blown CT: this scanner can take panoramic images in addition to the slice data, which can be used for 3D imaging. The precision that can be achieved when creating 3D models is better than 0.01 mm, which is definitely much better than can be achieved with conventional CT imaging. Especially for dental implants, this increased accuracy is a major benefit, as it will result in a much better fitting implant. The bad news is that this opens a whole new can of worms, because those dentists now have to deal with these high-tech, heavy-duty, dose-generating devices.

Additional spine stretching included
8.       The Best Pragmatic Product Award goes to... the company Dynawell, which came up with a very simple device that simulates the upright position while you are lying on your back, using a set of adjustable straps and a scale. It couldn't be any simpler; you won't need to invest in a new extremity CT or “stand-up” MRI to achieve the same result. This is a good example of thinking outside the box and coming up with a relatively simple solution. We need more of these to lower the cost of these procedures.

Quite a large footprint
9.       The Most Promising Technology Award goes to… the many multi-modality devices that are becoming mainstream, such as CT/PET, CT/SPECT and now also PET/MRI. These modalities are not new by themselves, but combining them in one device creates a fixed reference point, so that sophisticated mapping and fusion of the images, presenting not only anatomy but also function in a single view, is relatively simple to accomplish. It definitely greatly enhances the utility of the nuclear medicine images, which are traditionally very small, noisy and of very poor resolution because of the limits on how much radioactive tracer or agent a human body can tolerate. These systems are not inexpensive, and the PET/MR in particular requires a lot of square footage to operate, but for certain diagnoses these examinations will very likely become the standard of care. However, these devices will definitely not help in lowering the cost of healthcare.

A typical DR plate in its storage bin
10.   The Most Over-Priced Product Award goes to… digital X-ray plates, aka DR. It took about 30 years for CR technology to become mature, commoditized and affordable. I worked on one of the first CR units made by FUJI in the 1980s, which required a complete air-conditioned room to be installed and cost in excess of $100k. Today, you can get a simple tabletop CR unit for about $15,000 to $20,000, which includes several plates, software, training, and warranty. Digital plate technology, or DR, has also matured: image quality is good, plates are now wireless and no longer require a cable, and they are relatively robust, drop-safe and not as heavy as they used to be. However, they still cost in excess of $50,000 per plate. There are many small clinics around the world, especially in emerging countries, that still use film in locations where even $1.50 for a film is a major expense, if one can get film and chemicals to those locations to start with. This is in addition to the fact that two-thirds of the world population does not have access to basic radiology services, creating a need for an estimated 80,000 affordable X-ray units, for which digital technology could be a potential solution. The first manufacturer willing to price a plate based on the potential sales opportunity of tens of thousands of these plates will be able to create a true revolution in healthcare. As with many of these innovations, it might have to be someone from the outside, similar to what Apple did with phones, killing Nokia, or Canon with cameras, killing Kodak.

GE deserves an honorable mention for
their kid-friendly MRI
In conclusion, RSNA 2012 did not show a lot of revolutionary developments, but rather several significant improvements and a couple of small, fun innovations. I am sure I missed some, so don't hesitate to point them out to me, or to share any comments and/or opinions about these awards (even if you agree, I'd like to hear it!).

Monday, November 12, 2012

How to deal with finger-pointing between Imaging vendors.


As healthcare imaging and IT systems get more complex and the number of systems to be integrated increases, it gets harder to identify and troubleshoot interoperability issues. Information crosses several system boundaries, many of which are not under the control of a healthcare imaging and IT professional, and the systems involved are almost always from different vendors. Upgrades and changes can occur in various systems and subsystems, adversely impacting the operation.

The key to resolving these issues is, first, locating the area of concern and, second, visualizing it. This will help the vendors involved to address the issue instead of finger-pointing at each other and not taking any immediate action.

An image might be incorrectly identified and/or processed because of incorrect header information, which might have originated from an error in the Personal Health Record (PHR), which was loaded into a Computerized Physician Order Entry (CPOE) system, which placed an order through an interface engine onto a RIS scheduler and a modality worklist broker; the worklist was then queried at a modality, which copied the information into the image header; the image was archived at the PACS and finally retrieved by a viewer plug-in showing this information to a physician looking at the electronic health record of this patient.
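Much of that chain ends up encoded in the DICOM image header, so when the data looks wrong, inspecting the header attributes at the point of failure is often the quickest way to localize which hand-off corrupted them. Below is a minimal sketch, assuming the open-source pydicom library; the file name and the particular attributes listed are illustrative choices on my part, not a prescribed checklist.

```python
# Minimal sketch: inspect the DICOM header attributes that typically travel
# through the PHR -> CPOE -> RIS -> worklist -> modality -> PACS chain.
# Assumes the open-source pydicom library; "study.dcm" is a hypothetical file name.
import pydicom

ds = pydicom.dcmread("study.dcm")

# Attributes that are usually copied from the order/worklist into the image header.
for keyword in ("PatientName", "PatientID", "AccessionNumber",
                "StudyInstanceUID", "ReferringPhysicianName"):
    value = ds.get(keyword, None)
    print(f"{keyword:25s}: {value if value not in (None, '') else '<missing>'}")
```

Comparing this output against the order in the RIS or CPOE system usually makes it obvious whether the error was introduced upstream of the modality or somewhere between the modality and the viewer.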

To be able to address these kinds of issues, a healthcare imaging and IT professional needs to follow a systematic approach: first locating the cause of the problem using a decision tree, and then using the appropriate tools to visualize the issue. The good news is that vendors have stepped up over the past few years to improve their capabilities for logging, auditing and monitoring their interfaces; however, in many cases the errors are still vague and not to the point. Examples of such vague errors are “processing errors,” time-outs due to unidentified problems, resets or aborts, and many others.

In addition, there are also more tools available, most of them in the public domain, that visualize issues at many levels of the interfaces, down to the actual bits and bytes that are exchanged between the devices. The only barriers to using these tools are a general lack of knowledge and training among healthcare imaging and IT professionals as well as vendor service personnel, and in some cases a lack of access to networks and routers due to security concerns of IT departments. The latter can in many cases be resolved by partnering with the people who are responsible for the IT infrastructure and getting them involved in the resolution of the issue.

The first step in diagnosing the interoperability issue is to characterize and identify the type of issue. Tools to perform the diagnosis can be grouped as follows:
·         Utilities, such as those accessible through a command line interface or service menu. These can be used to test basic connectivity, for example with a ping or DICOM Echo (a minimal sketch of the latter follows after this list).
·         Active simulators such as modality worklist simulators (see link for demo), RIS/PACS simulators, and viewers, all of which are available in the public domain.
·         Passive tools such as DICOM sniffers (see link for demo), also available for free, which not only make the information exchange visible but also allow these interactions to be saved and processed by validators to find out whether there are any violations and/or issues with the data formats or protocol.
·         Validators to validate data formats (headers) as well as the protocol (see link for demo). Fortunately, vendors who have developed an extensive set of validators have also made these libraries and utilities available in the public domain.
·         Test transactions in the form of scripts, as well as many test images to evaluate image quality and the image processing pipeline, are available, mostly as a byproduct of the many IHE Connectathons.
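As a concrete example of the first category, the sketch below performs a DICOM Echo (C-ECHO) against a remote application entity, assuming the open-source pynetdicom library; the IP address, port, and AE titles are hypothetical placeholders for whatever device is under test.

```python
# Minimal sketch: DICOM Echo (verification) against a remote node, assuming the
# open-source pynetdicom library. The IP address, port and AE titles are
# hypothetical placeholders for the device under test.
from pynetdicom import AE

VERIFICATION_SOP_CLASS = "1.2.840.10008.1.1"  # DICOM Verification SOP Class UID

ae = AE(ae_title="TEST_SCU")
ae.add_requested_context(VERIFICATION_SOP_CLASS)

assoc = ae.associate("192.168.1.10", 104, ae_title="PACS_SCP")
if assoc.is_established:
    status = assoc.send_c_echo()
    if status:
        print(f"C-ECHO response status: 0x{status.Status:04x}")
    else:
        print("No response (connection timed out or was aborted)")
    assoc.release()
else:
    print("Association rejected, aborted, or never connected")
```

If the association is rejected outright, the problem is usually addressing or configuration (IP address, port, or AE titles); if the echo succeeds but the real transfers still fail, the investigation moves up to the simulators, sniffers and validators listed above.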

The interoperability issues that are to be identified using the tools above can be categorized into four areas:
·         Connectivity errors, which can be due to networking issues, incorrect addressing, problems with negotiating a connection between the applications, performance issues and status errors.
·         Display errors, which can be related to worklist issues (for example, populating a worklist incorrectly), to the correct display of the image and related information, to hanging protocols, or to incorrect handling of Structured Reports, such as those used to display measurements, CAD marks, key image identifications, or other information such as radiation dose. Overlay and presentation state information is a category by itself, including how to handle incorrect “burned-in” text.
·         Image quality issues can be hard to identify as the source can be the image acquisition, modality processing, view station imaging pipeline or display itself. Test images and test objects inserted at various locations in the imaging chain will assist in troubleshooting these.
·         Exchange media problems, which are becoming less common but are still present due to non-DICOM-compliant CDs that may contain non-DICOM images, lack a DICOMDIR, or store objects in a format not supported by the applicable DICOM profile definition (a quick sanity check is sketched after this list).
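For that last category, a quick first check is whether the CD actually contains a readable DICOMDIR and whether the files it references are really on the media. The sketch below again assumes the open-source pydicom library; the mount path is a hypothetical placeholder.

```python
# Minimal sketch: sanity-check a patient CD by reading its DICOMDIR and listing
# the referenced image files. Assumes the open-source pydicom library; the
# mount path "/media/cdrom" is a hypothetical placeholder.
from pathlib import Path
import pydicom

cd_root = Path("/media/cdrom")
dicomdir_path = cd_root / "DICOMDIR"

if not dicomdir_path.exists():
    print("No DICOMDIR found - media is not DICOM/PDI compliant")
else:
    dicomdir = pydicom.dcmread(dicomdir_path)
    for record in dicomdir.DirectoryRecordSequence:
        if record.DirectoryRecordType == "IMAGE":
            # ReferencedFileID holds the relative path components of the image file.
            parts = record.ReferencedFileID
            rel_path = Path(*([parts] if isinstance(parts, str) else list(parts)))
            print(rel_path, "exists" if (cd_root / rel_path).exists() else "MISSING")
```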

In conclusion, in order to troubleshoot interoperability issues, the first step is to follow a decision tree to identify the type of problem, and then to select the appropriate tools to visualize it. Despite the increasing complexity and the many additional systems that are to be integrated, the availability of tools in the public domain makes it possible for healthcare imaging and IT professionals to troubleshoot and diagnose these problems themselves.


Monday, November 5, 2012

To upgrade or not to upgrade, that’s the question.


This would have been nice...

Upgrading when traveling sometimes poses the same dilemmas and choices as upgrading your software. Let me share with you my most recent travel upgrade experience, which was somewhat disappointing. Initially I was excited to get the email about my automatic upgrade to business class for my flight. As it was an early morning flight, I was looking forward to a decent breakfast. However, at the gate the agent told me that I owed them another $90 because I had run out of “stickers.” I reluctantly paid for my “automatic upgrade,” as the flight was over 4 hours (which is my pain threshold for sitting in economy). Then, as I hungrily anticipated my breakfast, I was told they were out and only had cereal left, as I was sitting in the last row. (Hint from a frequent flyer: odd-numbered American Airlines (AA) flights start serving breakfast in the back, even-numbered ones in the front.) Lastly, I took out my laptop only to find I could barely fit it in front of me, as the person ahead had reclined his seat all the way back. So, lessons learned: I would have been better off keeping my exit row seat in economy, which would have been less expensive and actually provided more legroom and workspace, something I'll consider next time I'm offered an “automatic upgrade.”

Upgrading software can be a painful experience as well. I would classify these upgrades into the following categories: Operating System (OS) security upgrades, OS version upgrades, utility software upgrades and application software upgrades.

Security upgrades
These are a necessary evil. I say necessary because, typically, the longer you wait with these updates, the more vulnerable you are to a new virus or other malware hitting your computer and potentially impacting your system integrity. Although remote, there is the possibility that the upgrade will interfere with your other software; therefore, if it concerns a major upgrade, the vendor of your application software should typically test and release the upgrade for implementation. If the vendor takes too much time, you should do a risk analysis to assess the chance that you could be hit by a new threat, which depends on the firewalls and other measures you have in place to isolate your system, and weigh that against the risk that the upgrade itself could impact system integrity. As a general rule, I suggest never allowing automatic updates; rather, do updates manually after looking at the risk, and always test the upgrade yourself first.

OS version upgrades
This is a major issue, especially as we are about to go through it once again with Microsoft Windows 8. I would guess that the majority of institutions are still on XP, which, if you include Vista and Windows 7, is three versions behind Windows 8. Why change if something works? If there is no reason and/or need for additional functionality, I would stay with the old version, as any new version requires training and testing, and impacts device integration as well. Some of the older peripherals might not even work anymore due to a lack of driver support from the vendors for new OS versions. Unfortunately, you might be forced to upgrade as support for the old OS expires, but my suggestion is to postpone this type of upgrade as long as possible.

Utility software upgrades
Also a major issue, although most vendors have become smarter after being burned a few times. A notorious example used to occur every time a web browser such as Internet Explorer was upgraded, which would break web viewing software. Most software packages are now implementing solutions that are as platform-independent as possible. Make sure to test any upgrades and, again, postpone the change as long as possible unless you need specific new functionality.

Application software upgrades
Some new releases are known to have more bugs and/or be less reliable than their previous versions. A general rule of thumb is to stay away from any release ending in “0” and wait for the next level, such as “x.1” or even later, to make sure that all bugs have surfaced and the fixes have been made and tested by someone other than yourself. If you use any “plug-ins” or other tightly connected applications, for example a special processing package or a voice recognition application, make sure you upgrade those at the same time, or verify compatibility, as in many cases they need to be modified as well.

In conclusion, software upgrades are in general a necessary evil, and, as with upgrading during my travels, I would not upgrade automatically, but rather look at the alternatives, as you might be better off staying where you are.