Tuesday, December 31, 2019

Top ten healthcare imaging informatics trends for 2020.


Several new trends have emerged over the past five years in the imaging and informatics field. Using the terminology of the Gartner hype cycle[1], some of them have not made it beyond the innovation trigger (yet), some sit at the peak of inflated expectations, others have landed in the trough of disillusionment, and some have emerged to become somewhat mature technologies. I used the hype cycle categorization to show where the top ten trends are right now and where I believe they might end up a year from now.


1.       Augmented reality (AR) - Augmented reality superimposes a computer-generated image on a user's view of the real world, which in this context is typically a patient. It has great potential: imagine medical students working on a virtual patient, or performing a virtual surgery with a virtual scalpel instead of practicing on a human cadaver. Dissections can be done virtually, and one could even simulate errors, similar to what happens with a flight simulator.

There are several start-ups working on this technology, but much improvement is still needed. It might take another iteration of Google Glass-like goggles to replace the big, somewhat cumbersome headsets I have seen so far. It will also be challenging to replace manual controls with 100 percent voice control, or a combination of voice and gesture controls. This technology is still in its infancy: there are a couple of trial sites, mostly for surgery, and we don't yet know all of the pitfalls or where it will be used, so it is definitely at the beginning of the hype curve.

2.       Artificial Intelligence (AI) - If you attended the recent RSNA tradeshow or follow the literature, you will have seen that AI is very much in the hype phase right now. I counted more than 100 dedicated AI companies, most of which have not yet received FDA clearance for their new algorithms, and the FDA itself is struggling with these submissions, as it is unclear how to regulate an application that is "self-learning" and therefore has potentially unpredictable future behavior.

There are ethical concerns as well, especially as these algorithms need a lot of data, which is currently stored in the archives of many hospitals and/or in the clouds of the big three cloud providers, and which is supposedly being mined in a way that protects patient privacy, but in many cases without patient consent. There are also concerns that some of the algorithms were tested on a limited, biased subset of patients that does not include the various races and cultures with their different gene pools.

Given the momentum, this technology will continue along its hype curve as several new applications are cleared by the FDA. It will take another three years before the first rounds of financing run out and these companies have to show a realistic ROI and potential to their initial investors. I expect that it will take another few years before AI reaches its peak and users see what it can and can't do, before it starts to drop into the trough of disillusionment and eventually matures.

3.       Blockchain (BC) - This technology still gets a lot of attention, but it is closer to its peak as it has become clear what you can and cannot do with it. Blockchain provides a distributed ledger that is public, which limits its application in healthcare, as most healthcare applications need to preserve privacy. The fact that it is immutable, however, provides opportunities for registries, which you want to be widely available and accessible. Occasionally you might hear about a physician practicing without proper licensing in a certain state; that would become a much less likely event if we had a publicly available registry in place.

Another application might be large public data sets with anonymized patient information that can be used to test new AI algorithms or for healthcare practitioner training. People have become aware that blockchain has limited applications in healthcare, and we are waiting for some of those to materialize so we can learn its pros and cons before it matures.
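To make the immutability argument concrete, here is a minimal, hypothetical sketch of a hash-chained registry (the license data is made up, and a real blockchain adds distributed consensus on top of this basic mechanism):

```python
import hashlib
import json
import time

def make_entry(prev_hash: str, payload: dict) -> dict:
    """Append-only ledger entry whose hash also covers the previous entry's hash."""
    entry = {"timestamp": time.time(), "payload": payload, "prev_hash": prev_hash}
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    return entry

def verify(chain: list) -> bool:
    """Recompute every hash; tampering with any entry breaks the chain."""
    for i, entry in enumerate(chain):
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["hash"] != expected:
            return False
        if i > 0 and entry["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Toy licensing registry: two events for a fictitious license
chain = [make_entry("0" * 64, {"license": "TX-12345", "status": "active"})]
chain.append(make_entry(chain[-1]["hash"], {"license": "TX-12345", "status": "revoked"}))
print(verify(chain))                              # True
chain[0]["payload"]["status"] = "active forever"  # attempt to rewrite history
print(verify(chain))                              # False: tampering is detectable
```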

4.       3-D printing - 3-D printing is not new; it has been used for more than 25 years to build prototypes and rare parts. Its prerequisite is a computer-generated model that can be interpreted by the printer and then printed using the so-called additive manufacturing technique, instead of the conventional machining, casting and forging processes.

What is new is that these printers have become less expensive, 3-D modeling has become more sophisticated, and the recent standardization of a PACS workstation interface has given this application a boost. Its application is somewhat limited, as it is currently used mostly for surgery planning and simulation; a 3-D print can provide a visual model, especially for complex procedures. There is still a great opportunity for surgical implants and prosthetics, assuming one can print with the right materials. What better use of the technology than an artificial limb that exactly matches its counterpart in a paired body part, or an implant that fits exactly? Storing and labeling these 3-D models is still somewhat of a challenge, especially if one creates many of them. This technology still has to go up toward its peak before it falls back and becomes a mature technology.

5.       FHIR - This new interface standard has skyrocketed in its hype. It is widely touted as the solution to all of the current interoperability problems and has strong support from the Office of the National Coordinator (ONC) in the US. There are a few limited applications deployed on a very big scale; for example, the Apple iPhone has a plug-in allowing you to access your medical record at participating hospitals. I have seen more deployments internationally than domestically, for example, some in Western Europe and one in Saudi Arabia where patients can access nationwide scheduling using their phone apps. There are a couple of challenges that will cause it to reach its peak of inflated expectations over the next one or two years.

The biggest issue is the speed of its introduction and the corresponding lack of maturity. The first release with formal, normative content (R4) was not approved until early 2019. The term "normative" is deceiving, as FHIR is basically a compilation of many individual mini-specifications for the various resources, and in R4 only 11 of the close to 150 resources are in a final, normative state. One could therefore argue that less than 10 percent of the spec is stable and ready for implementation, while the other 90-plus percent can, and most likely will, see substantial changes to its interface.

Also, I believe that the current lack of interoperability in healthcare is not so much due to the lack of a well-defined interface, despite the issues with HL7 version 2 and the overkill and verbosity of CDA documents, but more due to the siloed architectures of our current EMRs and other healthcare applications, and the resistance of vendors and institutions to sharing information. It might require the pending information-blocking rule from the ONC to get some real teeth, success stories to become more widely known, and the standard to mature further before it reaches its peak. I am worried about the speed and momentum, because the faster you go, the more damaging a crash will be. As of now, FHIR is still going full speed ahead, and it might take another two or three years before we see it go past its peak.
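To illustrate how lightweight the FHIR REST approach is compared with HL7 v2 or CDA, here is a small Python sketch that searches for patients on the public HAPI FHIR R4 test server; the server and its test data are public examples, not any production system, and availability is not guaranteed:

```python
import requests

BASE = "https://hapi.fhir.org/baseR4"  # public HAPI FHIR R4 test server

# A FHIR search is a plain RESTful GET with query parameters
resp = requests.get(
    f"{BASE}/Patient",
    params={"family": "Smith", "_count": 3},
    headers={"Accept": "application/fhir+json"},
)
resp.raise_for_status()
bundle = resp.json()  # search results come back as a Bundle resource

for entry in bundle.get("entry", []):
    patient = entry["resource"]
    name = patient.get("name", [{}])[0]
    print(patient["id"], name.get("family"), name.get("given"))
```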

6.       Cloud - The cloud right now is beyond its hype and traveling down the negative slope. Using the cloud as a backup for disaster recovery has been mature for many years, but the advantages and disadvantages of outsourcing your IT solutions to a cloud platform have become clearer. From a security perspective, most healthcare institutions spend between three and six percent of their IT budgets on cyber security, and they are becoming aware that it is hard to compete with the thousands of security experts employed by the big cloud providers. It has also become clear that you still need to manage your cloud applications well, especially the configuration settings, as demonstrated by the 2019 Capital One breach, which is touted as one of the largest breaches ever. There is also a lack of trust by the general public in how their data in the cloud is being used by the big cloud providers and whether it is sufficiently anonymized.

The cloud is not always a good solution either: there could be a shift from the cloud to edge computing when processing real-time data. A typical response time from the cloud would be about one second, which is fine when accessing and retrieving information for human interpretation, but when a split-second decision is needed, such as for remote surgery, the cloud is too slow. The good news is that in healthcare we typically don't need to make these fast decisions, unlike self-driving cars that need to avoid potential obstructions. So, the cloud is definitely past its initial hype, and next year we'll discover more of its inflated expectations before we see it mature in a few years.

7.       IOMT - The number of IOMT (Internet of Medical Things) devices will continue to explode. The problem is that people are becoming highly dependent on these devices, as illustrated by the recent data outage at Dexcom, a company whose devices allow caregivers to monitor the blood-sugar levels of their kids, parents and others. When this communication suddenly became disrupted, there was a semi-panic among caregivers.

Glucose monitors are not the only IOMT devices being introduced: there are intelligent extensions to a person's handheld device that measure vital signs to allow for a telemedicine consult, wearables that can record and communicate with pacemakers, intelligent drug dispensers, scales and other devices. The challenges with these devices are the unrealistic reliance placed on them and their corresponding immaturity, unreliability and lack of redundancy.

These IOMT devices interface easily with your mobile devices, using Bluetooth for example, but what about the next step, i.e. how does the data get into an EMR? Every EMR has a mechanism for uploading the observations and vitals measured by a nurse, but how about uploading that information from my smart watch when I come into the doctor's office?
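As a sketch of how that smart-watch upload could work, assuming the EMR exposes a FHIR R4 endpoint (the base URL and patient reference below are hypothetical placeholders), a heart-rate reading maps naturally onto a FHIR Observation resource:

```python
import requests
from datetime import datetime, timezone

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical EMR endpoint

# One heart-rate reading from a smart watch as a FHIR Observation.
# LOINC 8867-4 is the standard code for heart rate; the patient
# reference is a placeholder.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "category": [{"coding": [{
        "system": "http://terminology.hl7.org/CodeSystem/observation-category",
        "code": "vital-signs"}]}],
    "code": {"coding": [{"system": "http://loinc.org",
                         "code": "8867-4", "display": "Heart rate"}]},
    "subject": {"reference": "Patient/example-123"},
    "effectiveDateTime": datetime.now(timezone.utc).isoformat(),
    "valueQuantity": {"value": 72, "unit": "beats/minute",
                      "system": "http://unitsofmeasure.org", "code": "/min"},
    "device": {"display": "Consumer smart watch"},
}

resp = requests.post(f"{FHIR_BASE}/Observation", json=observation,
                     headers={"Content-Type": "application/fhir+json"})
print(resp.status_code)  # 201 Created on success
```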

Last but not least, there is a concern about the cyber security provisions in this technology, as potential weaknesses in pacemakers and IV pumps have been published. All of that makes IOMT still immature, and it will take a few more years before it starts to slope up again and reaches the plateau of productivity.

8.       VNA - Vendor Neutral Archives (VNAs) used to be the biggest hype in medical imaging two or three years back. Initially the VNA was positioned as the best solution to avoid costly and lengthy data migrations for hospitals switching PACS vendors. Each major PACS vendor scrambled to catch up by relabeling its PACS archive as a VNA, which created confusion in the marketplace about the functionality of a "true" VNA. Subsequently, the VNA became the source for image display for non-radiology physicians, requiring web interfaces and connections with a Health Information Exchange using XDS protocols.

As of now, the VNA is being repositioned as an enterprise archive, with its own set of problems. There is a lack of synchronization of the data between the various PACS systems and the VNA: for example, when patient demographics change, images have to be deleted, or other adjustments are made, the same changes must be reflected on both sides. IHE has defined standards to address this issue (notably the Imaging Object Change Management, or IOCM, profile), but there is little uptake for them on the PACS side.

The biggest stumbling block is the lack of a uniform workflow among the specialties other than radiology and cardiology, and inconsistent access to patient orders and/or encounter information. Also, as institutions are starting to rely on a single VNA to manage images from 100+ institutions, there are some serious concerns and issues around redundancy and scalability.

The VNA is definitely not mature yet, but its pitfalls have been identified, and it is slowly going up the slope of enlightenment, which will continue for the next two or three years. The concern is that, because most of the independent VNA vendors have been acquired by PACS vendors, their rate of innovation will slow down due to the transition and lack of focus.

9.       DR - Digital Radiography (DR) has been replacing Computed Radiography (CR) for the past several years, as it captures the digital X-ray exposure and converts it directly into an electronic signal, producing a picture within a few seconds instead of requiring a CR plate to be scanned in a reader. CR technology is, however, still great for developing countries that are still converting from film to digital. DR plate technology has been greatly improved and has come down in price, but it is still not cheap: plates used to be more than $50k and are now getting closer to $20k. They are now wireless, so you don't need a cable to connect a plate used for portable X-rays, which can be a safety hazard; as a matter of fact, I heard firsthand from a radiology administrator about a technologist who tripped over the cable, leaving the administrator to deal with a worker's compensation case.

The battery life of the removable plates is getting better, with some lasting up to half a day or more. In addition, the way they are sealed has also improved, providing better protection against leakage of bodily fluids. However, most of them are still based on silicon on a glass substrate, so they are heavy and subject to damage if dropped. All of these factors (price, battery life, leakage protection and weight) can still be improved, which is why the technology has not reached its plateau of productivity and is still ascending the slope of enlightenment. This will continue for the next few years.

10.   POC Ultrasound - Point of Care (POC) ultrasound has big potential. It is inexpensive (~$2k to $6k), portable, and adds value when used correctly. It could potentially become the equivalent of the stethoscope for physicians. In addition, it can become a tool for non-physicians such as midwives and physician assistants.

Because of its low price point, there is a huge market opportunity in developing countries, where in the majority of cases no diagnostic tools are available at all. Several factors will cause it to get over the hill of inflated expectations over the next one to two years. Hardware and software product features are still immature: some of the probes are known to get really warm, and the software still lacks clinical measurements such as those needed for cardiac echo and OB/GYN. There is no universal architecture yet, as some of these devices can use a standard iPhone or Android phone/tablet while others require a proprietary handheld device provided by the company. Some of the POC devices require a cloud connection, which is a problem when working in an area without connectivity, and the business models vary between monthly fees and one-time purchases.

Last but not least, acquired images need to be archived, and there is an issue with matching the images to the correct metadata containing the patient information and any other important encounter-based information.

In conclusion, the Gartner hype cycle has been criticized for a lack of evidence that every technology actually goes through this cycle; however, in my opinion, it seems to apply to most of the new technologies I have seen develop over the past several decades. Also, note that the ranking of these technologies in this article is my own personal opinion, and I might be wrong; I promise to produce an update a year from now and admit any assumptions that turn out to have been incorrect. The main purpose of this ranking is to serve as input when deciding whether to implement these technologies. It is fine to take a bet if you are a risk-taker and like to be on the "bleeding edge," but if not, you might want to think twice about using a technology that is labeled immature or super-hyped. And of course, you can disagree with my ranking; I always encourage feedback and discussion.


Thursday, December 19, 2019

Inside perspective on the Fujifilm-Hitachi acquisition.


It was about 30 years ago that I first visited Hitachi in Kashiwa, about one hour from Tokyo, as a young CT product manager to discuss the implementation of a DICOM interface in their CT scanners.

Philips had just closed down its CT manufacturing in the Netherlands and was relying on Hitachi to provide it with a low-cost, reliable and robust CT scanner, initially for the US market and then for worldwide distribution. This turned out to be a costly mistake for Philips, as its management at that time underestimated the Japanese mentality of "business is war." It killed the CT market for Philips: Hitachi was slow to implement innovations and used the Philips channel to learn about the US market, which it promptly entered under its own name, selling not only CT but also MRIs.

I was amazed at that time by the Hitachi modular approach. Philips modalities such as CT and MRI had completely different architectures, including the backend and even the OS (the MRI used a DEC VAX, the CT a Philips minicomputer). Hitachi cloned its backend and connected whatever frontend it needed (CT, MR or another modality), thus achieving great economy of scale. Philips eventually had to buy CT technology back by purchasing first Picker and then Elscint, but it never totally recovered in the CT marketplace, where GE and Siemens (as well as Toshiba/Canon) have been dominant.

Hitachi has always made pretty good ultrasounds; that is something Japanese companies do well. I visited the ultrasound manufacturing facility at that time and was amazed by the cleanliness of their manufacturing. As visitors, we had to wear yellow caps to distinguish ourselves, and before we entered the manufacturing floor, we left our shoes outside and put on slippers (way too small for my large Western feet, of course). I saw the most spotless manufacturing I had ever seen, despite the fact that we at Philips ran a pretty clean shop as well.

Fast forward 10 years: I had moved from Philips to Kodak and was project manager for computed radiography (CR). Kodak had some of the smartest scientists in Rochester, New York, and had patented the CR technology several years prior; we referred to this as the "Lucky" patent, after the person who filed it. However, Kodak was so fixated on analog film, which eventually led to its demise, that it had sold the CR patent to Fuji, who promptly commercialized it and in addition applied for many patents around it to lock down this technology. When Kodak woke up and saw its potential ten years later, it had to get those patents back, and I am sure it paid more for them than it sold them for. This was so embarrassing that Kodak management told me to strip all references to the original patent from my presentations, as I was telling the world that "Kodak invented CR." Kodak had some serious catching up to do, and to speed up its CR commercialization it bought the rights to a small tabletop CR made by Lumisys, a manufacturer in California.

Fast forward another decade, and Kodak sells off everything in its portfolio to avoid bankruptcy, including its imaging business with CR, which was bought by a private investment group that still owns it and rebranded it as Carestream. Fuji has maintained its market position as one of the premier CR providers and also became a pioneer in the PACS business as one of the first companies offering a software-only PACS and viewer. Interestingly enough, the software was mainly developed in the US: Japan is a very good place to make hardware, but its companies have never excelled at developing software, while the opposite is the case in the US. Just think about their motorcycles and look at the sophistication and refinement of a Honda Goldwing vs. a Harley. Hitachi still makes pretty good ultrasounds but never quite made a big dent in the CT/MR market.

As of today, the Fuji PACS business has matured; they have a good market share in some regions, e.g. my own area, the Dallas metroplex, where they are number one. They also have some large contracts, especially with the US government, although they seem to lag in innovation in this market. However, they are without any question number one in digital detector technology: CRs are being replaced with DR plates, and at RSNA 2019 they just introduced the first super-lightweight DR detector without a glass (silicon) substrate, using a thin, flexible layered semiconductor carrier.

Hitachi never quite made a big dent in the US market with its "big iron" devices, i.e. CT and MR, compared with the big four (GE, Philips, Siemens, Canon/Toshiba), except for selling to outpatient imaging centers. In my opinion, they were managing their business too much from Japan with a Japanese mindset. Had they taken their car manufacturing counterparts as an example, which design and manufacture their cars in the US to meet local market preferences, it could have been a different picture. Their ultrasounds are still pretty good; however, it will be hard to compete with a $2,000 Butterfly or a $6,000 Philips Lumify. In contrast, Fuji has a nice complement with its Sonosite product line of low-cost portable ultrasounds and just announced a handheld called the iViz air. But the main issue for Hitachi is the bottom line: the company is planning to lift operating margins to 10 percent or above by 2021, and its medical business does not meet that objective.

Fuji needs economy of scale. It missed out on the Toshiba deal, which went to Canon, and it apparently missed out on the AGFA deal, which was just bought by a European holding company called Dedalus, and I bet that the Carestream private investors wanted too much money. So Fuji was looking for new opportunities, which Hitachi provides. Samsung is also on the prowl, as it needs to diversify beyond its electronics and mobile business and healthcare is one of its growth initiatives; however, it is funny how culture and politics sometimes dominate business, and South Koreans just don't play well with the Japanese.

The Hitachi acquisition will get Fuji to a market share of close to 10 percent in the medical device and IT market, still about half that of the big four, so it might be looking for other potential targets. GE seems to have changed its mind about selling off its healthcare division, but who knows, maybe IBM's Watson is next? We'll see what happens.

In the meantime, maybe it is time for Fuji to change its name from FUJIFILM to FUJI-DIGITAL, or to take Kodak as an example and call itself FUJI-STREAM. Regardless, there will be hundreds if not thousands of employees changing their emails and business cards, while others, including myself, update their address books. But we are used to that in this fast-changing and interesting business of healthcare.

Friday, December 13, 2019

Top 10 healthcare IT cybersecurity recommendations from the HIMSS forum.


The HIMSS Healthcare Security Forum in Boston is where CISOs (Chief Information Security Officers) come together to listen to their peers, government representatives and vendors about what keeps them up at night. And yes, stories about security and privacy breaches are kind of scary, as they often create significant damage to the reputation of the institutions involved and cause financial loss, often in the form of penalties, but even more in recovery costs. For example, as one of the speakers told us, a stolen laptop holding more than 10,000 unencrypted emails from one of their physicians resulted in a $300,000 fine, but also required hiring 30 temps to go through each individual email to find the 4,000 that contained significant PHI and whose subjects had to be notified. This incident amounted to a direct cost of more than $1 million.

The best part of this conference, however, was not the swapping of anecdotal stories about breaches but learning what a hospital should be worried about the most, and what should be low on the priority list because one might feel overwhelmed with the many potential threats and breaches.

Here are my top 10 takeaways:

1.       Zero-day events are overrated. A zero-day event occurs when a vulnerability becomes known before a security patch is available, during which time the weakness can be exploited by a hacker or malware. It is rare for exploits to take advantage of these zero-days on short notice, although there was one that was identified and exploited within one hour. By far the majority of breaches are due to weaknesses that had been known for a long time and that people had not gotten around to fixing for months or longer. Case in point: the WannaCry ransomware attack that infected 70,000 devices at NHS hospitals in the UK for a few days in May 2017 exploited a Microsoft security flaw for which a patch had been available for several months.

2.       Put pressure on medical device vendors to allow for endpoint security. Most large medical device vendors refuse to allow a hospital IT department to put any software or agent on their devices, let alone security-related software. However, if you negotiate it upfront as part of the purchasing process and/or are a big enough player in the provider field, they can be swayed to allow this. The argument that it invalidates their FDA approval is a myth that is used inappropriately by vendors. Many medical devices run VxWorks, the most common embedded real-time operating system. Eleven vulnerabilities, aka "URGENT/11," have actually been discovered in the network software underlying VxWorks, which has resulted in an FDA safety communication bulletin.

3.       Network segmentation and monitoring are essential. If you are a small hospital and have no leverage with your vendors to negotiate endpoint security, and/or have old legacy devices, the next best step is to monitor these devices externally. The reason for monitoring is that many of them have obsolete operating systems (XP, Windows 7 or old embedded OSs) and are vulnerable to exploitation; and by the way, telling your hospital or radiology administrator to replace a CT or MR that cost $1 million+ because it has a security vulnerability is almost certainly not going to fly. This does not only affect medical devices; it can also be a lab system running an equally obsolete database or webserver (notably Apache).
In these cases, there are two bywords to live by: micro-segmentation and zero trust. Micro-segmentation allows networks to be configured in software such that certain devices only talk with each other; if a device or application moves, the security policies and attributes move with it. Zero trust means that it is not sufficient to protect only the perimeter: nothing can be trusted anymore, as internal devices might become infected as well, so the focus shifts to internal protection.

4.       Most password schemes are pretty much useless. A vendor demonstrated that a hashed 6-character password using SHA-256, with the required upper case, lower case and special character, can be cracked in less than one minute using open-source tools on a relatively fast server (not even a supercomputer); the arithmetic behind that claim is sketched after this item. In addition to the fact that 52 percent of people use their birthdays or the names of their kids, spouses or pets as passwords, with the first character being the uppercase one, followed by a "1" and the special character "!" (easy to guess for anyone browsing their Facebook profile in a targeted attack), many people re-use their passwords.

Almost everyone's account has been hacked at some point, whether through the Target, Equifax, Bank One or any other major breach of the past, so if you re-use the same password, someone will now be able to access your current bank, Facebook, retirement or other accounts. One should use more advanced password hashing and, even better, two-factor authentication or, best of all, biometric identifiers. In addition, passwords should be changed at a minimum every 90 days (the generally recommended 30 days was suggested as being overkill).
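The back-of-envelope arithmetic behind the one-minute claim is simple; the hash rate below is an assumed order of magnitude for a GPU cracking rig, not a measured figure:

```python
import math

# ~70 usable characters: 26 lower + 26 upper + 10 digits + ~8 symbols
charset, length = 70, 6
keyspace = charset ** length                 # about 1.2e11 candidates
entropy_bits = length * math.log2(charset)   # about 37 bits

# Assumed rate: a single modern GPU manages on the order of
# 1e9-1e10 SHA-256 hashes per second with common cracking tools
rate = 5e9
print(f"keyspace: {keyspace:.2e} ({entropy_bits:.0f} bits)")
print(f"worst case at {rate:.0e} H/s: {keyspace / rate:.0f} seconds")
# Each extra character multiplies the keyspace by 70, which is why
# length matters far more than complexity rules.
```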

5.       Inventory and purchasing management is critical. One needs to know what devices are purchased, make sure they meet basic security requirements, and know what is connected to the network at which location. Not only do you need to know which devices are connected, you also have to know their "typical" behavior. For example, a CT scanner might access an EMR for a worklist and send images to the PACS. If it suddenly starts to query the hospital billing system or tries to send images to an IP address in Russia, there is an obvious issue.

Characterizing behavior is often done using a network sniffer such as Wireshark. Network security tools can monitor this behavior, and there is a good opportunity to use AI to "learn" the typical behavior so that any deviation from it can be flagged. This goes back to the "zero trust" principle mentioned earlier.
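As a minimal illustration of the allowlist idea, assuming flow records are already being captured from a tap or a tshark export (all device names and addresses below are hypothetical):

```python
# Known-good flows per device; in practice this baseline would be learned
# from traffic captures rather than typed in by hand.
ALLOWED_FLOWS = {
    "CT_SCANNER_01": {
        ("10.0.5.20", 11112),   # PACS archive, DICOM port
        ("10.0.5.30", 8080),    # worklist broker
    },
}

def check_flow(device: str, dst_ip: str, dst_port: int) -> None:
    """Flag any connection that deviates from the device's known behavior."""
    if (dst_ip, dst_port) not in ALLOWED_FLOWS.get(device, set()):
        # A real deployment would raise a SIEM alert instead of printing
        print(f"ALERT: {device} -> {dst_ip}:{dst_port} deviates from baseline")

check_flow("CT_SCANNER_01", "10.0.5.20", 11112)   # expected traffic: silent
check_flow("CT_SCANNER_01", "203.0.113.7", 443)   # unexpected: alert
```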

6.       Manage and monitor your service providers. In one case, a service engineer connected a legacy CT scanner running XP to an external, unprotected connection to download upgrades, and it was promptly infected with malware. This was fortunately detected, as the institution had proper network detection in place. In another case, an x-ray unit crashed its hard disk and, as the service engineer did not want to rebuild it from scratch, he used a cloned disk from another nearby hospital, which was infected with malware. And of course, any USB flash drive with upgrades must be scanned for viruses; it sounds almost too hard to believe, but one of the speakers had a major security incident because a physician who received a "free" USB stick at the airport in Moscow put it in his hospital PC, causing great havoc.

7.       BYOD (Bring Your Own Device) is a major challenge. Of all hospitals in the US, 71 percent allow some form of BYOD. Physicians like to use their own devices, whether for texting a colleague for advice or taking a picture of a patient in the ER as evidence. In addition, it will not be unusual for a physician to connect an ultrasound probe to a smart phone; the latter may soon become a replacement for the stethoscope. I personally think that the remaining 29 percent of hospitals that do not allow BYOD will not be able to hold out long. Allowing BYOD has major implications. First of all, the attack surface is exponentially increased; second, there is big resistance against IT "taking over" personal devices. Early attempts at IT protection actually caused interactions and interference with other usage: it wouldn't be the first time that a VPN that encrypts the clinical messaging impacts the physician's access to, let's say, email or, even worse, Amazon or a brokerage account.

8.       Double your security budget. Compared with other industries, the amount of money spent on healthcare cyber security is many factors less, while the potential gain for hackers is many factors more: a medical record fetches 10 times the price of a credit card on the dark web. Security budgets have been decreasing, to about 3 percent of the overall IT budget in healthcare. Knowing that the limited resources of hospital IT departments make it impossible to boost spending to the level of other industries, the conclusion was that you should spend at least twice as much as you do today. An external security consultant should be able to benchmark your current spending against your peers and other industries if you need to convince your management.

That security can be a life-or-death factor was illustrated by the UK ransomware incident, where ERs shut down, which means that a stroke victim, who has a 30-minute window to be treated, could be left out. Imagine if that had happened in the US: it would be a perfect class-action suit for negligence, because IT failing to keep up with patches caused serious patient harm.

9.       Limit your attack surface. There are several ways to do this. First of all, reduce the on-device footprint: use Zero Footprint (ZFP) viewers, preferably running in standard browsers, which means that as soon as you log off, all information is erased. If there is any confidential information on an electronic device that can easily be carried around, stolen or accessed, make sure all of it is encrypted.

Running every application in the cloud is feasible, but with some caveats. Make sure that cloud access is guaranteed and secure, and that there is redundancy and backup. The overriding argument for the cloud is that the cloud providers have literally thousands of security professionals managing their security, which your own resources are no match for. However, moving to the cloud means that 80 percent of what your cyber security staff knows today becomes irrelevant, as managing an application in the cloud is basically an entirely new job. Therefore, be prepared to retrain your security staff.

Consumerization of healthcare is another major issue impacting the attack surface. Consumerization has many aspects; first, it requires a different mindset from providers. Intermountain Healthcare out of Salt Lake City has been a pioneer in this, as it started to call patients "consumers" instead of patients and hired an ex-Disney executive as Chief Consumer Advocate. Regardless of whether the institution is ready, patients/consumers will come to the hospital with wearables that record an EKG, heart rate and vitals, and with information provided by apps that record their glucose levels or connect with their pacemakers to record cardiac events. Note that seven out of ten Americans track healthcare data on their mobile phones. If we want consumers to take responsibility for their health, they should also be able to contribute their own health data to the EMRs and patient records used by healthcare providers. Imagine the attack surface, which has just been exponentially increased again.

10.   Concentrate on high-risk areas. A recent publication reported that CDs with DICOM images could be exploited by embedding an executable in the so-called preamble of these files. This caused quite a stir; however, when a new threat is discovered, one should analyze the actual risk, i.e. how likely is it that someone would "execute" a DICOM image file? (A quick check for this particular threat is sketched at the end of this item.)

The same applies to the potential hacking of pacemakers, infusion pumps, anesthesiology equipment and other devices that appear in YouTube videos or in the news as hacker targets. Instead of worrying about these high-visibility, low-likelihood threats, concentrate on your legacy equipment, worry about patch management, inventory your systems, segment your network and use security dashboards to manage your cyber security. Just implementing a dashboard causes the number of incidents to decrease by as much as 30 percent in six months.
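For the DICOM preamble threat mentioned above, triage is straightforward: a DICOM Part 10 file starts with a 128-byte free-form preamble followed by the magic bytes "DICM," and the published exploit hides an executable header in that preamble. A minimal checker, as a sketch, might look like this:

```python
import sys

SUSPICIOUS_MAGIC = [b"MZ", b"\x7fELF", b"#!"]  # Windows PE, Linux ELF, script

def scan(path: str) -> None:
    with open(path, "rb") as f:
        preamble = f.read(128)   # free-form preamble
        magic = f.read(4)        # must be b"DICM" for a Part 10 file
    if magic != b"DICM":
        print(f"{path}: not a DICOM Part 10 file")
        return
    for sig in SUSPICIOUS_MAGIC:
        if preamble.startswith(sig):
            print(f"{path}: WARNING, preamble starts with {sig!r} (possible polyglot)")
            return
    print(f"{path}: preamble looks benign")

if __name__ == "__main__":
    for p in sys.argv[1:]:
        scan(p)
```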

Even though you might not directly be involved with cyber security, a conference such as the HIMSS security forum is very useful as it gives an inside perspective of the challenges we are facing in healthcare.

Friday, December 6, 2019

My RSNA2019 top ten.

Welcome to my 36th (!) RSNA

I always enjoy RSNA: it is good to catch up with old and new friends, see what is new in our world of radiology, and, last but not least, enjoy a piece of deep-dish pizza or a Wiener Schnitzel and Apfelstrudel at the Christmas market.

Here are my observations:

1. RSNA this year was all about AI. Several major vendors were exhibiting AI-driven workflows and new clinical applications for this new phenomenon. In addition, if you made it to the basement of McCormick Place, you would find a dedicated hall just for the AI vendors. However, the size of this so-called AI Showcase was in inverse proportion to the amount of traffic, the maturity of the products, and the number of real-world implementations.
The AI "basement"
There is no question in my mind that AI is still very new and, except for some niche clinical applications, still has a long way to go before large-scale deployment is going to happen. I asked several vendors how many installs they had, and the answers ranged from a couple to maybe a few hundred, which compared with the number of hospitals worldwide is a drop in the bucket. In addition, there was relatively little traffic in the dedicated AI hall, much less than at the other two main exhibit floors, so AI did not appear to be top of mind for most attendees.

AI at its best:
integrated with a PACS viewer
There is no question that AI in the long term will become ingrained in the daily workflow and add significant value, increasing the specificity and sensitivity of the diagnosis by supporting the diagnostic process. However, it might be a couple of years before we see an impact, especially in the day-to-day work of radiologists outside the major academic centers, where most of the initial implementations are being tested and deployed.

This is how it should look:
Path on left and Xray on right
2. Digital pathology is taking off in the US. Several Western and Northern European countries are at least five years ahead of the US, as they started implementing digital pathology 5+ years ago. FDA approvals held up deployment in the US, but recent clearances are allowing its implementation. There is also an issue with the return on investment, which is negative, as you cannot get rid of the slides containing the specimens; there are actually extra costs, as you'll now need slide scanners, view stations and an image display and management infrastructure. The good news is that the lag in implementation allows the US to learn from early experiences and become leading edge instead of bleeding edge.

Why is pathology important for radiology? The reason is that pathology images and reports provide a valuable additional datapoint for the radiologist. Initially, physicians would only look at shared pathology images during tumor board discussions, but there are other applications, such as screening immigrants, who typically get an x-ray and possibly a lab test to look for infectious diseases.

Another major impact of the implementation of digital pathology will be on image and archive management. It is very likely that these images will be stored on the radiology PACS archive, and almost certainly on the enterprise archive or VNA, assuming the facility has one. Most departments are still trying to manage the onslaught of additional data from 3-D breast images (DBT), which fill up the available data storage at least twice as fast, if not more. Wait until you get whole-slide scanned images from pathology that are multiple gigabytes in size.

POCUS from GE,
innovative 2-sided probe
3. POC (Point of Care) ultrasound is continuing to make inroads. Stanford recently put a POC-US in the hands of every resident and faculty physician. The top three players in this market are Philips with the Lumify, which seems to have the most comprehensive set of features, especially OB/GYN measurements and templates, the GE unit, and the Butterfly. Butterfly is somewhat of an outlier, as it has a subscription model for its usage and uploads images to its cloud. Pricing is between $2k and $6k for these units. A major challenge with these devices is how to archive the images that the physician wants to keep, as they have to be properly identified with metadata to make sure they end up in the correct patient folder.
EMR vendors are pushing solutions to upload these directly into their systems, which is a mistake: images belong in an enterprise image management system together with all other images. However, these archives, often a VNA, have been slow to adapt to the specific workflow requirements of these devices, even though IHE has already published a specification defining how to do this.

4. In addition to POC-US, there are POC-DX, POC-CT and POC-MR. The POC-DX units, also known as x-ray portables, have been around for a long time; they are mainly used in the OR, ER and ICUs to provide bedside diagnostic x-ray.

Cute portable for kids
These portables use digital x-ray plates that are wirelessly connected, so the images can be transferred automatically from the plate to the portable console for processing and QA, and then sent wirelessly to a PACS for physician and radiologist viewing. The DR plates are getting less expensive and battery life is getting better, but they are still rather heavy, and one has to be careful to protect them from body fluids, as many are not 100% sealed.

Most innovative product IMHO:
flex detector
Fuji showed a flexible sensor detector, which brought the weight of the plate back to a mere 4 lbs. Except for developing countries, where price is still a big determinant, DR is now replacing CR at a rapid pace. Sedecal showed a "ruggedized" version of its portable unit, which can be transported in a "box," has big wheels and is mainly used in the field by specialized users such as the Red Cross or the military in areas of conflict and natural disasters.

Looks like a CT,
moves like a portable
POC-CT has grown up as well. These CT scanners have evolved from a "CT on wheels" to truly portable units and can be moved around as easily as portable x-ray units. They have built-in radiation shielding as part of the gantry and a lead flap at the front and back to screen off any additional radiation.

POC MRI,
my second most innovative product choice
The POC-MR was a newbie at the show. It is still subject to regulatory approval, which can be expected later this year. Its application is somewhat limited due to its low field strength (0.064T), but the advantage of the low magnetic field is that there are no issues with shielding; as a matter of fact, they were scanning in real time in the booth. The images are very noisy, but new advanced image processing and AI can improve the image quality to a point where it is usable for the application at hand.

5. Photographs can assist in diagnosis. Photographs can provide important contextual information and can be taken by providers as well as patients using a camera or smartphone. There are clinical and technical challenges to recording and managing these pictures. The clinical challenges include privacy and how to deal with sensitive photos, including the definition of what constitutes a sensitive photo. Technical challenges include security as well as how to capture the appropriate metadata such as patient information and body part. 
Good example showing photo and image
There are two working groups, supported jointly by HIMSS and SIIM, established to address these issues: the Photo Documentation Workgroup, dealing with the clinical and technical issues, and the Data Standards Evaluation Workgroup, analyzing the existing standards for nomenclature related to body part and anatomic region. White papers can be expected from these workgroups in the near future.

Still need huge glasses but effect is amazing
6. Virtual Reality (VR) is moving to Augmented Reality (AR). VR has been somewhat of a niche application, mostly used by surgeons to prepare for surgery, as it can show true 3-D models of the organs using CT or MR source data. VR has always been a little disconnected from the real patient, as there was no direct link between the actual subject and the images shown in 3-D space. AR is changing that, as there is a direct connection between the patient and the 3-D rendering. For example, a surgeon can look at the patient through special AR glasses and see the synthetic image superimposed on the body part of interest. Again, VR and AR are somewhat of a niche application, but it is quite fascinating and really cool to have "x-ray vision," to look inside a body and see its organs from different angles and perspectives, which should be of great help to surgeons. A great example of how radiology supports other specialties.

7. Monitor management for home reading is a challenge. Imagine that you want to read from home, and for your worklist and reporting you use a laptop computer. One would typically add two medical-grade monitors, but it could be three or four as well. The good news is that most radiologists are starting to learn that a medical-grade monitor is a requirement for reading anything CR/DR, and certainly mammography.
Monitor management black box
This means that the monitors are calibrated to map each pixel value to a greyscale value that an observer can distinguish, so as not to miss any subtle changes in pathology. They are typically managed remotely, including the possibility of keeping the calibration curves in case the quality of the monitor display is ever challenged in a potential malpractice lawsuit (which is not uncommon).

However, when connecting those multiple monitors to a standard Windows PC, the hanging protocol, i.e. where the images are displayed, is hard to maintain, and it might vary upon rebooting the PC. Therefore, one might use one of those small "black boxes," which contains a video board and a controller that can be reached remotely by the calibration management software. It manages the display order so that it is consistent any time a radiologist connects his or her laptop again.

MRI with built-in recliner
8. New open MRIs are being introduced. Open MRIs have been around for a long time; the advantage is accessibility to the patient, which is especially important when doing surgery. Another reason for doing an exam in an open MRI might be that the patient is claustrophobic. Lastly, if a patient has a condition that only shows up when he or she is standing or sitting, i.e. if there is a need to image load-bearing, there is now a unit that allows the patient to remain seated. Another example of how common devices are being adapted for niche applications.

Amazing detail
9. 3-D printing is maturing. The novelty of 3-D printing has somewhat worn off compared with last year's RSNA, but there was still quite a bit of interest, and several vendors displayed some amazing examples. Also, since 2018, the DICOM standard includes the so-called STL (stereolithography) file format, which is commonly used by CAD software. This format can be used to send to 3-D printers, but it can also be encapsulated in a DICOM file, i.e. with the typical DICOM header and the modality set to "M3D," similar to the encapsulated PDF files. It can then be managed on a PACS archive such as a VNA, added to the study (e.g. the CT), and used to reprint if so desired. There is no question that for surgery planning of difficult and rare cases, this is a great tool that is becoming available.
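As a sketch of what creating such an encapsulated STL object might look like with the open-source pydicom toolkit (the patient data and file names are placeholders, and a fully conformant object needs more required attributes than shown here):

```python
from datetime import datetime
from pydicom.dataset import Dataset, FileMetaDataset
from pydicom.uid import generate_uid, ExplicitVRLittleEndian

ENCAPSULATED_STL = "1.2.840.10008.5.1.4.1.1.104.3"  # Encapsulated STL Storage

with open("heart_model.stl", "rb") as f:  # placeholder file name
    stl_bytes = f.read()
if len(stl_bytes) % 2:                    # DICOM values must have even length
    stl_bytes += b"\x00"

meta = FileMetaDataset()
meta.MediaStorageSOPClassUID = ENCAPSULATED_STL
meta.MediaStorageSOPInstanceUID = generate_uid()
meta.TransferSyntaxUID = ExplicitVRLittleEndian

ds = Dataset()
ds.file_meta = meta
ds.SOPClassUID = ENCAPSULATED_STL
ds.SOPInstanceUID = meta.MediaStorageSOPInstanceUID
ds.Modality = "M3D"
ds.PatientName = "Doe^Jane"               # in practice, copy from the source CT
ds.PatientID = "123456"
ds.StudyInstanceUID = generate_uid()      # in practice, reuse the CT StudyInstanceUID
ds.SeriesInstanceUID = generate_uid()
ds.ContentDate = datetime.now().strftime("%Y%m%d")
ds.MIMETypeOfEncapsulatedDocument = "model/stl"
ds.EncapsulatedDocument = stl_bytes

ds.is_little_endian = True
ds.is_implicit_VR = False
ds.save_as("model.dcm", write_like_original=False)
```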

Looking for volunteers!
10. In case you missed the friendly ladies at the RAD-Aid booth, you can visit their website and sign up as a volunteer. I have been very fortunate to have first-hand experience with the impact that you can make by teaching in developing countries and supporting your peers in your area of expertise. Remember, you don't have to be a radiologist teaching interpretation or IR; there is also a major need for people teaching basic x-ray as well as CT, MR and US, and even how to procure and maintain systems, how to manage a department, and how to troubleshoot image quality and technical problems.

Excellent Tech support built in
The good news is that some of the vendors are incorporating features in their products that "guide" a technologist through a procedure. A good example is the Carestream CR console, which shows how to expose an extremity and reminds the user to collimate, something that is obvious to anyone taking an x-ray in the developed world but is often overlooked in these emerging markets. I can promise you that volunteering will not only make a major difference in the lives of the ones you touch and interact with, but you'll also become a different person.

Cabs and Ubers lining up for drop-off
In conclusion, this was another great year with some great talks. My favorite was "AI in developing countries," where I think AI can make a major impact due to the limited resources and lack of training. Some African countries have fewer radiologists than my hometown has, and AI can therefore be a major help. Remember, in those cases we are not concerned about gaining a few percentage points in specificity or sensitivity; if you start with zero, anything is pretty much a gain.

However, regarding the state of AI, I have never seen so many vendors without FDA clearance promoting solutions based on limited datasets from only a subset of the population. For example, how valid is an AI algorithm based on a clinical study in China for a population in a downtown US city where the majority is African-American?

I am curious to see the progress made by the same time next year, if I missed you this time, I hope to see you next year!

Monday, November 25, 2019

DICOM Modality Installation Checklist part 2


So you did all your homework prior to the installation of the new modality, as described in part 1 of this post: you checked the conformance statements, used a simulator to query a worklist and send test images, and checked that they display correctly at the PACS workstation. However, when you connect the new modality to the PACS, it does not work. What do you do?

1.       Check connectivity: ping the IP address of the worklist provider and your destination(s), and then do a DICOM ping (aka Echo or Verification). The DICOM Verification feature might sometimes be hidden or only available in a service menu, but in many cases it is right there on the desktop or as a menu item. In rare cases there is no DICOM Verification implemented; shame on those vendors, because it robs the service and support engineers of a very valuable tool. Failure of the network or DICOM ping indicates network issues, addressing (port, IP, AE-Title) misconfiguration, or failure to add the device to the ACL list at the PACS.
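If the modality hides its Verification feature, you can run the DICOM ping yourself from any laptop on the same VLAN, for example with the open-source pynetdicom toolkit; the addresses and AE-Titles below are placeholders:

```python
from pynetdicom import AE

VERIFICATION = "1.2.840.10008.1.1"  # Verification SOP Class UID

ae = AE(ae_title="TEST_SCU")
ae.add_requested_context(VERIFICATION)

assoc = ae.associate("10.0.5.20", 11112, ae_title="PACS_ARCHIVE")
if assoc.is_established:
    status = assoc.send_c_echo()
    print(f"C-ECHO status: 0x{status.Status:04X}")  # 0x0000 means success
    assoc.release()
else:
    # A rejected association points at ACL/AE-Title configuration;
    # a refused TCP connection points at IP/port/firewall issues instead.
    print("Association failed")
```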

2.       Assuming you have connectivity but your images don't show up on your PACS, the first line of defense is to check the logs on either side, i.e. client and server, or in DICOM lingo, SCU and SCP. The images at the PACS might have ended up "unverified" or "broken," which means that there is something wrong with the metadata, or header; it is most likely an Accession Number or ID integrity issue. Usually these issues can be fixed with the standard tools available to the PACS administrator; however, in rare cases you might need access to the PACS database to find out what happened, and in some very rare cases you might need to do an off-line validation of the metadata to see what causes the issue. The off-line validation can take a while and runs a check against the DICOM data dictionary; there are several DICOM validators that do this, among them David Clunie's validator and the one from DVTK. In case the modality worklist does not show up, you again look at the logs and, as a last resort, use a DICOM sniffer to see where the communication has broken down. A good illustration of such a problem was an ultrasound from a major manufacturer that did not display the worklist; only after using the sniffer could we prove that the information was actually received by the modality, and that the failure to display it was therefore a problem at the modality. I actually found out, after running the validator, that one of the worklist attributes had an illegal value, which is why the modality did not display it.
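For a quick off-line look at the metadata of a "broken" study, a few lines of pydicom will dump the identifiers that PACS matching typically keys on (the file name is a placeholder):

```python
from pydicom import dcmread

ds = dcmread("unverified_image.dcm", stop_before_pixels=True)
for kw in ("PatientID", "PatientName", "AccessionNumber",
           "StudyInstanceUID", "SeriesInstanceUID", "SOPInstanceUID",
           "StudyDate", "Modality"):
    print(f"{kw:18}: {ds.get(kw, '<missing>')}")
# Compare AccessionNumber and PatientID against the HL7 order; a mismatch
# or missing value here is the usual cause of an "unverified" study.
```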

3.       Assuming you have a worklist at the modality, there might be information missing in the list, or there may be too many or too few entries, meaning that the attributes used to filter the list were not applied correctly. In that case you will have to work with the interface specialist to map the HL7 orders to the worklist. Filters that determine which worklist items are displayed typically include the Modality, the Scheduled Station AE-Title and/or the Station Name; these have to be mapped from procedure codes, patient location and other elements in the HL7 order message.
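To test the broker's filtering independently of the modality, you can issue the same worklist query yourself; here is a sketch using pynetdicom, with placeholder addresses and AE-Titles:

```python
from pydicom.dataset import Dataset
from pynetdicom import AE

MWL_FIND = "1.2.840.10008.5.1.4.31"  # Modality Worklist Information Model - FIND

ae = AE(ae_title="CT_SCANNER_01")
ae.add_requested_context(MWL_FIND)

query = Dataset()
query.PatientName = ""                          # return keys
query.PatientID = ""
item = Dataset()
item.Modality = "CT"                            # matching keys the broker filters on
item.ScheduledStationAETitle = "CT_SCANNER_01"
item.ScheduledProcedureStepStartDate = ""
query.ScheduledProcedureStepSequence = [item]

assoc = ae.associate("10.0.5.30", 11112, ae_title="WORKLIST_SCP")
if assoc.is_established:
    for status, ds in assoc.send_c_find(query, MWL_FIND):
        if status and status.Status in (0xFF00, 0xFF01) and ds:  # pending = a match
            print(ds.PatientName, ds.PatientID)
    assoc.release()
```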

4.       Assuming you are able to look at an image on a workstation, there could still be a display issue with the image ordering and viewport positioning, which is typically determined by series and study descriptions as well as orientation information. If there is an image quality issue, there could be a problem in the pixel interpretation pipeline. The latter can be tested using the test set developed for the IHE display protocol, which contains every possible permutation and combination of image types, photometric interpretations, presentation states, lookup tables and other parameters impacting the display.
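A significant part of that pixel pipeline can also be exercised off-line; this pydicom sketch applies the Modality and VOI LUTs to a test image (the file name is a placeholder) so you can compare the result with what the workstation renders:

```python
import matplotlib.pyplot as plt
from pydicom import dcmread
from pydicom.pixel_data_handlers.util import apply_modality_lut, apply_voi_lut

ds = dcmread("test_image.dcm")
raw = ds.pixel_array                    # stored pixel values
modality = apply_modality_lut(raw, ds)  # rescale slope/intercept (e.g. to HU)
display = apply_voi_lut(modality, ds)   # window center/width or VOI LUT

# Note: MONOCHROME1 images still need grey-scale inversion before display
plt.imshow(display, cmap="gray")
plt.title(f"{ds.get('PhotometricInterpretation')} WC={ds.get('WindowCenter')}")
plt.show()
```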

After troubleshooting these issues, it should work! Congratulations on a job well done. Remember, with the proper training and tools you are empowered to solve these kinds of tricky issues and problems by yourself, instead of having to rely on your vendors, who in many cases resort to pointing fingers at each other. That is one of the very frequent reasons that IIP professionals show up for our training classes, in addition to gaining additional career opportunities. Hope to see you at one of our training classes; see our schedule here.

Monday, November 18, 2019

DICOM Modality Installation Checklist.


One of the typical responsibilities of a PACS administrator is adding a new image acquisition modality to the PACS system. It is also one of the more challenging tasks, as it is often hard to predict how the device will interact, since these are still not quite "plug-and-play." To make it worse, this is often a visible and highly anticipated task, as in many cases the new modality has been expected for a long time. So, when it finally arrives at the loading dock, users want to see it up and working as soon as possible.
With proper preparation prior to and during the actual installation, the success rate of the install can be increased, the time to get it up and running greatly reduced, and frustration kept to a minimum.

This is the check list I recommend prior to the install:
1.       Do a "paper validation" between the modality and its connections, i.e. the DICOM worklist provider and the DICOM destination(s) for image Store, Storage Commitment, Modality Performed Procedure Step, and Structured Reports. Get the DICOM conformance statements for these devices and compare them against each other. Make sure you get the right version of these conformance statements, as functionality can differ substantially between releases. Specifically look for the following in these documents:
a.       Make sure that there is support for the types of DICOM files (SOP Classes) you will be exchanging. Be aware of and look for support of the new "enhanced" SOP Classes, such as those for CT, MR, Angio, RF, breast tomosynthesis, IV-OCT and others.
b.       If you want to compress the images at the modality, make sure the type of compression is supported at both the source and the destination(s) (JPEG lossless, lossy, wavelet, MPEG for video, etc.).
c.       If you want to use Storage Commitment, make sure its behavior matches between the SCU and SCP with regard to the handling of the associations for the reply.
d.       If you want to use Modality Performed Procedure Step (MPPS), make sure that the implementation matches your workflow; for example, you don't want MPPS to report the study as complete if there are still images to be sent, processed, or imported.
e.       Match the worklist attributes between the modality and the worklist provider, and look for alternate mappings in case attributes are missing on the modality side. An example would be mapping a missing patient weight or allergy into a Patient Comments field if that information is required at the modality but not otherwise displayed.
2.       Do a "file validation" by asking the vendor to send you a CD with images, making sure that each type of image is on the CD. In addition, get sample Structured Reports, such as dose reports for CT or measurements for ultrasound and echo. Import these files into a test PACS, voice recognition system and dose management system and verify the proper display of the images and measurements. Make sure that the hanging protocols work at the workstations and, if not, troubleshoot to find the cause (study descriptions, body part, etc.).
3.       Do an "install validation" by using a modality simulator that is able to query a worklist using the same attributes as the new modality and to simulate a Store of the various file types to the test PACS, as shown in the sketch below. Simulate Storage Commitment and MPPS as well. There are commercial modality simulators available (e.g. OT-DICE) as well as open-source ones (DVTK). When doing the simulation, use the same IP address, port and AE-Title that the new modality will be using. It is strongly recommended to follow best practices for AE-Titles and port numbers, i.e. use an all-caps AE-Title that indicates the institution, location and modality, and use the standard port number (11112) assigned to DICOM devices by IANA. Work with IT to get a new, fixed IP address assigned for the new modality and make sure they configure the VLAN and routers to allow access.
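As a sketch of what the store part of such a simulation boils down to, here is a minimal C-STORE using the open-source pynetdicom toolkit; the addresses and names are placeholders following the all-caps convention:

```python
from pydicom import dcmread
from pynetdicom import AE

ae = AE(ae_title="MAINHOSP_CT01")          # institution + location + modality

ds = dcmread("vendor_sample.dcm")          # one of the vendor's sample files
# Request a context for exactly the SOP Class we are about to send;
# for compressed samples, also request the file's transfer syntax.
ae.add_requested_context(ds.SOPClassUID)

assoc = ae.associate("10.0.5.20", 11112, ae_title="TESTPACS")
if assoc.is_established:
    status = assoc.send_c_store(ds)
    print(f"C-STORE status: 0x{status.Status:04X}")  # 0x0000 means accepted
    assoc.release()
```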
If you have taken all these precautions, you should be able to swap out the simulator for the actual device, and chances are that it will be "plug-and-play," assuming you addressed all the issues during the pre-install phase.
However, if it still does not work, you might want to do some troubleshooting using the tools described in part 2 of this post.