The healthcare industry’s transformation isn’t just taking place in consult rooms, retail clinics, or EHR developers’ office buildings.
Health IT infrastructure, the engine that drives clinical, operational, and administrative improvements, is also undergoing its own revolution as organizations turn to the cloud to reduce costs, improve security, and speed up workflows for users across the care continuum.
The lower deployment costs and increased flexibility of cloud technologies and virtualization strategies support rapid infrastructure growth as entities urgently expand their health IT capabilities to meet the high demands of users.
“Healthcare is transforming and the needs of healthcare infrastructure teams are transforming,” Nebraska Medicine Vice President of IT Brian Lancaster told HITInfrastructure.com during the 2017 VMworld conference in Las Vegas.
“To achieve our goals, we have to invest in new approaches, data warehousing, analytics, and virtual care. All these things are needed to pivot from fee-for-service to value-based care.”
Organizations tend to struggle with how to plan for and adapt to these changes. Entities have to understand where their infrastructure’s current needs lie, what their future goals are, and how to use cloud and virtualization to their advantage, all without resorting to downtime that busy providers simply cannot afford.
The need to be always-on in healthcare can be a significant challenge for IT decision-makers, who have to plan for the future while continuing to accommodate and support their current infrastructure. This often leads to infrastructure improvement plans that span five or more years.
IT decision-makers are also challenged by the demand to produce an infrastructure with a positive ROI while keeping upgrade spending within budget.
“Every year I sit down with my CFO, and she wants more but she also wants me to cut five percent of my budget,” said Lancaster. “How can you do more and cut five percent of your budget?”
The answer is clear, Lancaster said. “Be virtualized. Move to the public cloud.”
Cutting back infrastructure costs with virtualization
Virtualization is a way for healthcare organizations to cut back on infrastructure costs while providing a more advanced, secure environment.
Nebraska Medicine’s long-term infrastructure improvement plans involve virtualizing much of their environment. Virtualization gives organizations more visibility and control over their IT environment.
“We have a ton of work to do over the next three or four years with virtualizing every aspect of the network,” Lancaster explained. “So far we've only done our Epic EHR environment.”
“We have plans for the whole stack. Anytime anyone needs an environment, resource, or an IT service, it's now automated and it can be deployed. Right now, we're a private cloud, so it's on-premises, but it's deployed using virtualization technology.”
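The self-service model Lancaster describes, where a requested environment resolves to a pre-approved, automatically deployable configuration, can be sketched in miniature. Everything below (the catalog entries, the spec fields, the request names) is hypothetical; a real deployment would hand the resolved spec to the hypervisor’s provisioning API rather than print it:

```python
from dataclasses import dataclass

@dataclass
class VMSpec:
    """A pre-approved, deployable virtual machine configuration."""
    name: str
    cpus: int
    memory_gb: int
    template: str

# Hypothetical service catalog: each self-service request type maps
# to a configuration that IT has already vetted.
CATALOG = {
    "dev-sandbox": VMSpec("dev-sandbox", cpus=2, memory_gb=8, template="linux-base"),
    "analytics":   VMSpec("analytics", cpus=16, memory_gb=64, template="linux-data"),
}

def provision(request_type: str) -> VMSpec:
    """Resolve a self-service request to its deployable spec."""
    try:
        return CATALOG[request_type]
    except KeyError:
        raise ValueError(f"No catalog entry for {request_type!r}")

if __name__ == "__main__":
    print(provision("dev-sandbox"))
```

The point of the catalog is the one Lancaster makes: requests become repeatable, automated deployments rather than one-off tickets.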
Part of Lancaster’s plan involves creating more value through micro-segmentation as the organization optimizes while paying close attention to cybersecurity.
“The industry’s rapid digitization is outpacing security controls,” Lancaster stated. “Healthcare providers are also becoming a larger target due to the premium people can get for PHI on the dark web. It's a perfect storm.”
“The other complexity is the data also has connections to hundreds of different systems, laboratory systems, radiology systems, x-ray systems,” he added. “We really didn't know what to open or shut. We looked at our environment and discovered we had over five million open ports. The attack surface was huge.”
Virtualization and automation bring value to IT infrastructure and allow organizations to be smarter and more proactive with their environments.
“After implementing advanced infrastructure solutions, we were able to virtualize those ports and we went from millions of ports to a handful of ports,” Lancaster said. “The direct correlation to the attack surface is amazing. We have a tiny attack surface now, which can translate to a huge return on investment because the potential of a breach is so much smaller.”
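The inventory step behind that effort, finding out which ports are actually open, can be approximated with a small script. This is a minimal sketch using Python’s standard `socket` library; the host address and candidate ports are placeholder inputs, not anything from Nebraska Medicine’s environment:

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the candidate ports on `host` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if sock.connect_ex((host, port)) == 0:
                found.append(port)
    return found

if __name__ == "__main__":
    # Placeholder host and a few well-known service ports to check.
    print(open_ports("127.0.0.1", [22, 80, 443, 3306, 5432]))
```

A production micro-segmentation effort would rely on the hypervisor’s distributed firewall and flow monitoring rather than ad-hoc scanning, but the loop is the same: inventory what is open, then restrict it to what each workload actually needs.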
Virtualization begins with the data center
Data centers are the foundation for many other infrastructure solutions including disaster recovery and analytics, but they tend to be expensive because of the hardware and software investments they require.
Improving the data center includes data migration, which is a laborious process that needs to be completed carefully to ensure the migrated data ends up where it belongs in its new environment.
At Moffitt Cancer Center in Tampa, Florida, virtualization is a way to improve data centers and prepare for other future infrastructure technologies while ensuring adequate redundancies in case a disaster strikes.
“We had one monolithic data center and a lot of single points of failure with no resiliency,” said CTO Tom Hull. “Two years ago, we embarked on this massive infrastructure modernization. We started to build virtualization, then it was a cloud strategy, an availability strategy, our data center strategy, and our disaster recovery transformation.”
Moffitt was at risk of losing business continuity if it had a system failure with no standby. A hybrid environment of multiple data centers and the use of a public cloud, as well as a software-defined data center, helped to reduce the risks.
“We had several thousand servers, both physical and virtual,” Hull recalled. “Some of them were running Windows 2000, Windows 2003. They were old and susceptible to threats and unable to be patched.”
“We virtualized all those old physical servers and upgraded them. Now I can recommission those in a very easy way to have a warm standby. It helps with our backup and recovery and our disaster recovery, as well.”
Lancaster faced similar challenges when he arrived at Nebraska Medicine in 2015. He handled the transition to a new data center by virtualizing parts of the data center, making it more elastic for current and future infrastructure demands.
“Prior to that time, we were about 60 percent virtualized,” Lancaster explained. “Before the move we virtualized every system we possibly could and landed at about 80 to 90 percent server and workload virtualization. That set the stage for what we decided to do with our new virtualized data center. We added more capacity and have the latest thinking in architecture, but what were we going to do next?”
Modernizing the data center gave both organizations the ability to explore more advanced infrastructure technology and plan for what their future infrastructure would include in several years’ time.
Using public cloud to increase infrastructure flexibility
Leveraging a public cloud can potentially save organizations a significant amount of money on storage by removing the need to maintain an on-premises environment.
Organizations going through IT transitions or growth periods can also use the public cloud to their advantage for extra space while they’re growing.
“The fundamental benefit of cloud is around its cost of the environment,” Lancaster explained. “But with the cost, there's a trade-off on risk, especially with security concerns.”
The public cloud is useful for developing new applications because of the flexibility it offers during the building and testing process, he added. Once the application is tested, it can be moved to the on-premises data center or a private cloud hosted on site if the organization does not wish to keep the public cloud.
“Our firewalls now protect both private and public cloud when they're properly configured and we can start to consume more public cloud resources securely,” said Lancaster. “The benefit is that I no longer have to spend millions and millions of dollars on infrastructure in my data center. At some point, I can start moving more and more workloads to cloud resources.”
Organizations need to use the public cloud smartly and be aware of how it will affect their infrastructure and their budget in the long term.
“The only advantage of public cloud in my opinion is convenience,” said Hull. “There are challenges with the public cloud. One is that it's an operational expenditure.”
“If I'm going to use a public cloud like AWS or Azure software and get those on an expense basis, I must predict what that's going to be every month,” Hull continued.
“That means my operational expenditures are going to go up, and that's not good. The challenges of correctly sizing what we have in our private cloud and then managing what parts to put in public cloud and how much public cloud space we’ll need is an architectural challenge.”
What to keep in mind when building an infrastructure strategy
To truly embrace advancing technology, Lancaster stresses the importance of establishing positive and open relationships with vendors as well as the IT decision-makers within the organization.
“We have to establish effective relationships across the board,” said Lancaster. “We spend a lot of time doing business relationship management to understand where CFOs and strategy officers want to go.”
“Then IT comes back as a consultant and tells them what we've heard, the five things we can do today with technology we have, and some technology we probably need to look at to enable our long-term vision."
Lancaster advised IT departments to avoid becoming barriers. Instead, they must work with their teams to figure out how to make things happen instead of telling the organization they simply can’t complete certain tasks because the technology is not there.
“Work through what the CFO and strategy officers are trying to achieve and come back with ideas,” Lancaster advised. “Move towards more of a consultant mode and try to establish those relationships within the enterprise.”
Hull agreed with Lancaster’s sentiment about opening up lines of communication across the organization.
“You have to put together a strategy that includes a cloud strategy, a mobile strategy, a technology review board that includes stakeholders, and a service management model,” he said.
Organizations mapping out their infrastructure strategy should build those four elements into a single roadmap, he advised, and should be able to bring them into one framework no matter the size of the organization.
“The point is to be able to rate those strategies, govern them through good IT governance, and have that hybrid approach that is expandable,” Hull said.
Virtualization and cloud give organizations more freedom because they are not bound by what hardware they can afford and physically maintain.
Taking advantage of virtualization along with a flexible cloud strategy cuts back on costs while allowing IT decision-makers to plan for more innovative technology for the future.
Organizations need to begin with their data center and how to make it more elastic and accessible. Once flexibility is established using virtualization and cloud, adding applications and expanding the infrastructure further becomes much easier with the added visibility and control.