When encryption doesn’t mean more secure

I have had a number of clients reach out to me about how to implement whole disk encryption, SQL transparent data encryption, and encryption of VMware VMDK files in order to satisfy “data at rest” security requirements. My response is usually something like “Say that again?”

These types of encryption approaches are designed to better protect data at rest on media that may be accessible to individuals who are not authorized to access such data. This is usually some form of portable media such as a hard drive in a notebook computer, a portable USB hard drive, a USB stick, a backup tape, etc.

And by “at rest” we are talking about files that have been saved to media and are not currently open or active. So to summarize, these types of encryption solutions are intended to protect data at rest on some form of portable media, or on media that is generally accessible to individuals who should not have access to the sensitive data stored on it. What I’m seeing, however, is that this type of encryption is being adopted to address “encrypt sensitive data” compliance requirements such as PCI DSS.

The intent of such “encryption of data at rest” requirements is to protect specific data from unauthorized access whether it be via application access, network file system access, or physical access. If the sensitive information is on storage media that is physically secured in a data center and this data is protected with appropriate network file system access controls, then the only thing remaining is to render the data unreadable to any unauthorized party at the application access level. 

This is where column or field level encryption comes in. Only authorized individuals or processes have access to the sensitive information in unencrypted form, and only authorized individuals or processes have access to the decryption keys that allow such access.
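To make the distinction concrete, here is a minimal sketch of field-level encryption in Python, assuming the third-party `cryptography` package; the inline key handling and the card number are simplified, hypothetical stand-ins, not a production design:

```python
# A minimal sketch of column/field-level encryption using the `cryptography`
# package (pip install cryptography). In a real deployment the key would come
# from a key-management service, not be generated inline.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # stand-in for a key fetched from a KMS
fernet = Fernet(key)

def encrypt_field(value: str) -> bytes:
    """Encrypt a single sensitive column value before writing it to storage."""
    return fernet.encrypt(value.encode("utf-8"))

def decrypt_field(token: bytes) -> str:
    """Decrypt a column value; only callers holding the key can do this."""
    return fernet.decrypt(token).decode("utf-8")

ciphertext = encrypt_field("4111-1111-1111-1111")  # hypothetical card number
print(ciphertext)                 # opaque token: what a DBA or backup reader sees
print(decrypt_field(ciphertext))  # plaintext: only for code paths holding the key
```

The ciphertext is what actually lands in the database, so an administrator browsing the table, or anyone reading a backup, sees only an opaque token unless they can also obtain the key.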

Let’s switch back to whole disk encryption and SQL transparent data encryption. When a system that’s running either of these is brought online, all users of the system have access to unencrypted data. Not just specific users who have been authorized to access specific sensitive information, but all users. 

When a server running BitLocker has finished booting, every process and user running on that host has access to data that BitLocker is decrypting for them on the fly every time it’s read from disk. A SQL database server running TDE makes all of its data accessible to all processes and users that have access to the database. 

While the database is running, the encrypted data is decrypted on-the-fly for all to see. The decryption keys are automatically presented regardless of who is requesting them. This isn’t really “protecting specific data from unauthorized access with encryption” is it?
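Here is a hedged sketch of what that looks like from the application side, in Python with `pyodbc` against a SQL Server database where TDE has already been enabled; the server, database, table, and login names are hypothetical:

```python
# With TDE, pages are decrypted inside the database engine as they are read,
# so an ordinary query needs no keys and no explicit decrypt step. Any login
# with SELECT rights gets plaintext back, "authorized" for the data or not.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=db.internal.example;DATABASE=Payments;"  # hypothetical names
    "UID=report_user;PWD=change-me"                  # ordinary low-privilege login
)
cursor = conn.cursor()

# Note that no key material appears anywhere in this code.
for row in cursor.execute("SELECT card_number FROM dbo.Orders"):
    print(row.card_number)  # plaintext, even though the pages on disk are encrypted
```

Contrast this with the field-level sketch above, where the application cannot recover the plaintext without explicitly obtaining the key.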

With the proliferation of virtualization and cloud-based systems, we are now seeing this same thinking applied to protecting sensitive virtual systems. In a VMware environment, VMDK files can be encrypted to protect them from unauthorized access and use, but this approach carries the same weakness as whole disk encryption and SQL TDE.

The data is protected only after it has been written to disk and while the VM is not actually running, and the encryption helps only if the decryption keys are accessible solely to the specific services and users that require access to the sensitive data. In most environments, this is not the case.

This type of encryption does have its place. For example, in multi-tenant or public cloud environments, it may be desirable to allow only specific authorized hypervisors to use certain virtual instances. It may make sense for SQL TDE to encrypt every database write to disk if you are using a public cloud provider’s storage and backup solutions.

It might be a good idea to use whole disk encryption on a system that is physically at risk of being stolen. But just throwing these types of solutions at a system because they have the word “encryption” in them and they are easy to deploy doesn’t mean that you’re actually doing a better job of protecting sensitive information.

The Ups and Downs of Electronic Medical Records

The case for electronic medical records is compelling: They can make health care more efficient and less expensive, and improve the quality of care by making patients’ medical history easily accessible to all who treat them.

Small wonder that the idea has been promoted by the Obama administration, with strong bipartisan and industry support. The government has given $6.5 billion in incentives, and hospitals and doctors have spent billions more.

But as health care providers adopt electronic records, the challenges have proved daunting, with a potential for mix-ups and confusion that can be frustrating, costly and even dangerous.

Some doctors complain that the electronic systems are clunky and time-consuming, designed more for bureaucrats than physicians. Last month, for example, the public health system in Contra Costa County in California slowed to a crawl under a new information-technology system.

Doctors told county supervisors they were able to see only half as many patients as usual as they struggled with the unfamiliar screens and clicks. Nurses had similar concerns. At the county jail, they said, a mistaken order for a high dose of a dangerous heart medicine was caught just in time.

The first national coordinator for health information technology, Dr. David J. Brailer, was appointed in 2004, by President George W. Bush. Dr. Brailer encouraged the beginnings of the switch from paper charts to computers. But in an interview last month, he said: “The current information tools are still difficult to set up. They are hard to use. They fit only parts of what doctors do, and not the rest.”

Long before computers, many hospitals and doctors charged for services in ways that maximized insurance payments. Now critics say electronic records make fraudulent billing all too easy, and suspected abuses are under investigation by the Office of the Inspector General at the Department of Health and Human Services.

Like all computerized systems, electronic records are vulnerable to crashes. Parts of the system at the University of Pittsburgh Medical Center were down recently for six hours over two days; the hospital had an alternate database that kept patients’ histories available until the problem was fixed.

Even the internationally respected Mayo Clinic, which treats more than a million patients a year, has serious unresolved problems after working for years to get its three major electronic records systems to talk to one another. Dr. Dawn S. Milliner, the chief medical informatics officer at Mayo, said her people were “working actively on a number of fronts” to make the systems “interoperable” but acknowledged, “We have not solved that yet.”

Still, Dr. Milliner added that even though there are a lot of challenges, the benefits of information technology are “enormous” — improved safety and quality of care, convenience for patients and better outcomes in general.

Patients at Mayo’s headquarters in Rochester, Minn., and its Arizona and Florida sites can see their records online, even via an iPhone app; those in Mayo’s network of doctors’ offices and hospitals in the upper Midwest will eventually have similar access.

In the rare event that a large-scale system goes down at Mayo, backup measures are ready, teams are called in to make rapid repairs, and if necessary “everyone is ready to go on paper,” Dr. Milliner said.

Reliable data about problems in the electronic systems is hard to come by, hidden by a virtual code of silence enforced by fears of lawsuits and bad publicity. A recent study commissioned by the government sketches the magnitude of the problem, calling for tools to report problems and to prevent them.

Based on error rates in other industries, the report estimates that if and when electronic health records are fully adopted, they could be linked to at least 60,000 adverse events a year.

The report, to the Agency for Healthcare Research and Quality, analyzed ways of forestalling hundreds of information technology “hazards” at seven hospitals and health systems. A typical example would be drug orders transmitted by an electronic app to a pharmacy using a different app. 

“It’s hard to keep them speaking the same language, to automatically link a medication in one app to exactly the same medication and dose in the other app,” said Dr. James M. Walker, chief health information officer of the Geisinger Health System, who led the study.
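A simplified Python sketch of the kind of mismatch Dr. Walker describes, with hypothetical drug names and concept identifiers standing in for a shared vocabulary such as RxNorm (this illustrates the hazard, not any vendor’s actual code):

```python
from typing import Optional

# Each application keeps its own drug catalog, keyed to a shared concept id.
# Names and ids below are hypothetical stand-ins for RxNorm-style identifiers.
PRESCRIBER_CATALOG = {"Metoprolol Tartrate 25 MG Oral Tablet": "concept-0042"}
PHARMACY_CATALOG = {"metoprolol 25 mg tab": "concept-0042"}

def match_order(prescriber_name: str) -> Optional[str]:
    """Find the pharmacy-side name for a prescriber-side order, if any."""
    concept = PRESCRIBER_CATALOG.get(prescriber_name)
    if concept is None:
        return None  # the prescribing app sent a name the map does not know
    for pharmacy_name, concept_id in PHARMACY_CATALOG.items():
        if concept_id == concept:
            return pharmacy_name
    return None  # an unmatched medication or dose is exactly the hazard flagged

print(match_order("Metoprolol Tartrate 25 MG Oral Tablet"))  # metoprolol 25 mg tab
```

When either catalog drifts (a renamed product, a new dose form), the lookup returns nothing and the order must be reconciled by hand.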

The Obama administration will issue a report on patient safety issues in early November, the current national coordinator, Dr. Farzad Mostashari, said in an interview. That report was requested last year by a panel on health I.T. safety at the Institute of Medicine, a unit of the National Academy of Sciences.

The institute recommended that the government create an independent agency like the National Transportation Safety Board to deal with patient safety issues, and it called for an end to “hold harmless” clauses that protect software manufacturers from lawsuits but can limit the freedom of doctors and hospitals to publicly raise questions about errors or defects.

Elisabeth Belmont, a lawyer for the MaineHealth system, based in Portland, advises hospitals to reject contract language that could leave them responsible for settling claims for patient injuries caused by software problems. “One software vendor was surprised when my client opted to walk away and purchase the software from another vendor who had a more reasonable approach on these issues,” she said.

The institute also recommended that software manufacturers be required to report deaths, serious injuries or unsafe conditions related to information technology. So far, however, neither a new safety agency nor such a reporting system has been adopted.

Some of the largest software companies have opposed any mandatory reporting requirement. But Gail L. Warden, chairman of the institute’s patient safety panel, said in an interview that the industry was divided on the issue; some companies were accustomed to regulations for their widely used medical devices for imaging, for example.

Critics are deeply skeptical that electronic records are ready for prime time. “The technology is being pushed, with no good scientific basis,” said Dr. Scot M. Silverstein, a health I.T. expert at Drexel University who reports on medical records problems on the blog Health Care Renewal. He says testing these systems on patients without their consent “raises ethical questions.”

Another critic, Dr. Scott A. Monteith, a psychiatrist and health I.T. consultant in Michigan, notes that Medicare and insurance companies generally do not pay for experimental treatments that have not proved their effectiveness.

A Medicare administrative contractor, National Government Services, said recently that it would deny payment for treatments using “cloned documentation” copied from electronic records rather than individualized patient notes composed by doctors and nurses to show medical necessity. Dr. Monteith said the electronic systems were “disrupting traditional medical records and, beyond that, how we think” — the process of arriving at a diagnosis. For example, the diagnosing process can include “looking at six pieces of paper,” he said. “We cannot do that on a monitor. It really affects how we think.”

Deborah Burger, a registered nurse for more than 30 years who works with pain medicines and anti-anxiety drugs for colonoscopy patients, said electronic systems offered “drop-down menus of so-called best practices.”

“The problem is each patient is an individual,” said Ms. Burger, who is president of the California Nurses Association. “We need the ability to change that care plan, based on age and sex and other factors.”

She acknowledged that the system had one advantage: overcoming the ancient problem of bad handwriting. “It makes it easier for me to read progress notes that physicians have written, and vice versa,” she said.

Some experts said they were hopeful that the initial problems with electronic records would be settled over time. Dr. Brailer, who now heads Health Evolution Partners, a venture capital firm in San Francisco, said that “most of the clunky first-generation tools” would be replaced in 10 years. “As the industry continues to grind forward, costs will go down,” he said. “Tools are being simplified.”

Mark V. Pauly, professor of health care management at the Wharton School, said the health I.T. industry was moving in the right direction but that it had a long way to go before it would save real money.

“Like so many other things in health care,” Dr. Pauly said, “the amount of accomplishment is well short of the amount of cheerleading.”

Two Texas HIEs join forces, enable inter-network exchange

Health information exchange (HIE) continues to move forward in the Lone Star State. Two of Texas’s 12 regional health information networks, Healthcare Access San Antonio (HASA) and Austin-based Integrated Care Collaboration (ICC), have announced that they are the first in the state to establish the exchange of patient health information between their HIEs.

Through this inter-network connection, physicians using the HIEs are able to reach providers in 89 counties in Central, South, and East Texas. The ability of HASA and ICC to facilitate inter-HIE exchange is the result of collaboration between their respective HIE vendors, Medicity and Centex System Support Services, and their use of Direct exchange, the platform and standards for exchange established by the Direct Project with significant guidance from the Office of the National Coordinator for Health IT (ONC).
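At the transport level, Direct exchange is essentially ordinary SMTP carrying S/MIME-protected payloads between trusted endpoints. The Python sketch below shows the shape of such a message; the addresses, HISP host name, and placeholder payload are hypothetical, and a real implementation would perform full S/MIME encryption, signing, and trust-anchor validation:

```python
# A minimal sketch of a Direct-style message: standard SMTP transport, with
# the clinical document (e.g., a CCD) carried as an S/MIME-encrypted payload.
# Addresses and host names are hypothetical.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "dr.garcia@direct.hasa.example.org"   # hypothetical Direct address
msg["To"] = "dr.lee@direct.icc.example.org"
msg["Subject"] = "Patient summary (CCD)"
msg.set_content("placeholder for an S/MIME-encrypted CCD attachment")

# The sender's HISP (health information service provider) relays the message.
with smtplib.SMTP("hisp.example.org", 587) as smtp:
    smtp.starttls()          # transport encryption to the relay
    smtp.send_message(msg)   # content protection comes from S/MIME, omitted here
```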
Physicians often are limited to exchanging information within the boundaries of the HIE in which they participate. Physicians in the HASA and ICC networks can now communicate conveniently about their patients’ care regardless of those boundaries, which helps them coordinate care and make more informed clinical decisions.

Not only are participating providers able to send secure messages to each other, but they are also able to exchange patient summary documents. HASA intends to use the new inter-HIE connectivity to inform analytics projects aimed at quality improvement as well as to support initiatives focused on population health management. Additionally, the experience should serve as a stepping stone for connecting HIEs to federal agencies in the future.

The Benefits of Inter-Network Exchange

The partnership between HASA and ICC is the latest health IT development in Texas that involves HIE. Last week, the University of Texas at Austin announced the opening of its HIE laboratory, in which participants in its 9-week certificate program will be able to simulate and test HIE using real software provided by two major HIE vendors, Orion Health and Informatics Corporation of America (ICA), and several EHR developers. “We’re trying to stay out in front,” UT Austin’s Dr. Leanne Field told EHRintelligence.com, “and our teaching and our hands-on experiences by design will help students be prepared for the latest technology that they may see in the workplace.”

The news of the connection between HASA and ICC confirms Texas’s place in moving HIE forward in advance of Stage 2 Meaningful Use, which is set to begin in 2014 for eligible professionals and hospitals in the EHR Incentive Programs.