www.dkl.com
May 2016
Contents

Separating signal data from noise when dealing with Big Data in the age of genomics
Embracing Open Source Technology For The Mainframe
Healthcare IT Organizations Face Critical Challenges
The New World of Virtual Medicine
Six ways to improve IT efficiency and profitability
Letter from the Editor
It is no surprise that our world is changing at a rate never before seen in human history. Our reliance on technology, and the impact it has on our lives, has created a duality that is both burden and freedom. From our mobile devices to our television screens, our social lives, and of course our work, the amount of data we feed to and consume from the global machine is staggering.

To all this we now add our own health: how we care for ourselves and for others. The impact of data within the healthcare industry has again brought us, simultaneously, to a new level of dependence and freedom. Political views aside, the Affordable Care Act means millions of new files, accounts, and claims for U.S. citizens, a workload that falls directly on the shoulders of the IT department. Couple that with our new dependence on mobile communications: doctors' visits are now conducted by video conference while real-time data flows from patient to medical staff, giving insight into a patient's wellbeing and medical status. That data is not to be taken lightly, both for its content and for the sheer volume that must be managed by global IT infrastructures.

It is for this reason that we bring you our healthcare issue of DirectionIT Magazine: to give you insight into leading-edge technology, into how medical and insurance data affects our lives and the companies and institutions that manage it, and into what our future holds as we look toward a brave new data-filled world. We hope you enjoy this issue.
Allan Zander
Editor-in-Chief / Publisher
Separating signal data from noise when dealing with Big Data in the age of genomics by Andrew Armstrong
As we know, Big Data refers to vast volumes of both structured and unstructured data, so huge and complex that they are difficult to process and analyze by conventional means. Just think of data sources such as DNA sequencing and other molecular technologies, diagnostic medical imaging, behavioral factors, financial transactions, and geographic and social media information, and you will have an idea of the magnitude of the processing and analysis involved.

A major contributor to Big Data is the genome sequencing of humans and other organisms, along with other types of data that are becoming more diverse, vast, and complex. So much so that our ability to effectively store, manage, share, analyze, and interpret data has been not only challenged but exceeded. As the American biologist Leroy Hood commented, "We predict that in five to ten years each person will be surrounded by a virtual cloud of billions of data points."

We have all heard about the Big Data revolution and how it will transform everything, including health and healthcare. Just look at articles such as "Why 'Big Data' is a Big Deal: Information science promises to change the world" by Jonathan Shaw in Harvard Magazine, or "Can Big Data Tell Us What Clinical Trials Don't?" by Veronique Greenwood in The New York Times Magazine. Today, we see several promising applications of Big Data in improving health, incorporating the use of genome sequencing technologies.
For an example of how these applications can solve a problem, take a look at the cholera outbreak that swept through London in the mid-nineteenth century. John Snow, considered the father of modern epidemiology, mapped his investigation on paper, marking the homes where cholera had struck. After spending much valuable time searching for the source of the outbreak, he found that it was the Broad Street pump, long before the cause of cholera was known. Today, of course, Snow would have had access to GPS information and disease prevalence data, and he would have solved the problem in hours.

Even so, today we frequently have more noise than signal, and it is difficult to distinguish useful data from data that is mere noise. An example of noise skewing an analysis occurred in 2013, when influenza hit the United States hard and early: an analysis of flu-related internet searches drastically overestimated peak flu levels relative to those determined by traditional public health surveillance. Careless analysis can also produce false alarms, suggesting associations between Big Data points and disease outcomes that do not hold up. We need to establish causality correctly so that analysis does not lead to harmful or ineffective interventions. Still, some of the resulting correlations are funny:

• The number of people who drowned after falling out of a fishing boat correlates with the marriage rate in Kentucky
• The number of people who drowned by falling into a pool correlates with the number of films Nicolas Cage appeared in
• Per capita cheese consumption correlates with the number of people who died by becoming tangled in their bedsheets
• The divorce rate in Maine correlates with per capita consumption of margarine
• The age of Miss America correlates with murders by steam, hot vapors, and hot objects
These spurious correlations are courtesy of tylervigen.com.
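How easily does pure chance produce such correlations? Here is a minimal sketch, assuming only NumPy and using illustrative parameters: it generates a few hundred independent random-walk series, then finds the pair that happens to correlate most strongly despite having no relationship at all.

```python
# A minimal sketch: unrelated random walks still yield strongly
# correlated pairs if you test enough combinations.
import numpy as np

rng = np.random.default_rng(0)
n_series, n_points = 200, 50

# Each row is an independent random walk; no series influences another.
walks = rng.normal(size=(n_series, n_points)).cumsum(axis=1)

corr = np.corrcoef(walks)          # pairwise correlations between all rows
np.fill_diagonal(corr, 0)          # ignore each series' correlation with itself
i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"Strongest spurious correlation: r = {corr[i, j]:.2f} "
      f"between unrelated series {i} and {j}")
```

With 200 series there are nearly 20,000 pairs to test, so a very high correlation between two unrelated series is all but guaranteed, which is exactly the trap behind the examples above.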
The genomics field has addressed the problem of signal and noise by requiring replication of study findings, and by demanding stronger signals in terms of statistical significance (the short sketch at the end of this article shows why). When analyzing Big Data, the genomics field now calls for collaborative epidemiological studies, animal models, and other supporting work alongside the Big Data analysis itself. The strength of Big Data lies in finding associations, but it does not reveal what those associations mean; therefore, the first step is to find a stronger signal. Knowing where to look, identifying the right dataset as Snow did with his paper map, is the place to begin. If you had to rely on Big Data alone as your source, you could end up on Tyler Vigen's website of spurious correlations. Snow found where to look by experimenting: he removed the handle from the pump and dramatically moved from correlation to causation, rapidly reducing the spread of cholera in the London population.

Here are three ways to realize the potential of Big Data to improve health and prevent disease in the world of genomics:

1. An epidemiological base

When studying Big Data in health and disease, it is critical to have a strong, supportive epidemiological base. You need to replicate the associations you find using Big Data to confirm your findings and make them generalizable. Consider, for example, the study of representative and well-characterized populations such as that conducted by the NCI Cohort Consortium. This is an extramural-intramural partnership formed by the National Cancer Institute (NCI) to address the need for large-scale collaborations to pool the massive amount of data and bio-specimens necessary to conduct a wide range of cancer studies.

2. A translational research agenda

Translational research is simply research that applies discoveries generated in the laboratory to studies in humans (bench to bedside), or that speeds the adoption of best practices into community settings (bedside to practice). To reap the benefits of using Big Data for genomics research, you need a translational research agenda that moves beyond the bench-to-bedside model, or initial discovery.

3. The integration of knowledge

It is crucial to link and integrate disparate data sources to learn everything possible so you can apply advanced analysis. When you leverage heterogeneous datasets and securely link them, you have the potential to improve healthcare by identifying the right treatment for the right individual or subgroup. An evidence-based knowledge integration process applies to all Big Data, not just genomics.

Separating the true signal from the massive amount of noise is neither easy nor straightforward. But the challenge needs to be met because, according to many researchers, lives are lost every day that data sits in storage. The combination of a strong epidemiological base, a translational research agenda, the integration of knowledge, and Big Data will help. Furthermore, as disease researchers now have access to human genetic data and genomic databases of millions of bacteria, they can combine the data to study treatment outcomes. According to McKinsey & Company, with the right tools Big Data could be worth $9 billion to U.S. public health surveillance alone, improving detection of and response to infectious disease outbreaks. Big Data is helping to make the world a better place, especially in the field of healthcare, where it is being used to predict epidemics, cure disease, avoid preventable deaths, and improve the quality of life.
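Why does genomics demand stronger significance thresholds? A back-of-the-envelope calculation makes it clear. The sketch below is illustrative only; the one-million-test figure is the conventional approximation for the number of independent common variants tested in a genome-wide association study.

```python
# A minimal sketch: at the usual p < 0.05, a genome-wide scan is
# guaranteed to produce false alarms by sheer number of tests.
n_tests = 1_000_000   # rough count of independent common variants tested
alpha = 0.05          # conventional single-test significance threshold

expected_false_positives = n_tests * alpha   # noise that looks like signal
bonferroni = alpha / n_tests                 # corrected per-test threshold

print(f"Expected false positives at p < 0.05: {expected_false_positives:,.0f}")
print(f"Bonferroni-corrected threshold:       {bonferroni:.0e}")
```

The corrected threshold, 5e-08, is the widely used genome-wide significance level: a simple, mechanical way of insisting on a much stronger signal before an association is believed.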
Embracing Open Source Technology For The Mainframe
6 reasons to start now

Increasingly, the passion of IT teams for open source means they bring a "default to open source" mentality to every project they work on. Open source in the enterprise has been quite successful in meeting many business needs, and open source solutions are increasingly the first choice for operating systems, middleware, and cloud. Beyond that, they are also rapidly becoming the first choice for other business needs, such as user authorization and telephony. Open source is a good fit for the enterprise, and for many it comes ahead of traditional COTS solutions. Here are six reasons why open source is winning in the enterprise, all of which apply directly to the mainframe environment.
1. Speed
Is your organization competing on speed? If it isn't, it should be. Open source gives you speed: one of its advantages is that you can take the community version of a product and get started quickly. IT can determine whether the community version solves your business dilemma, and deliver immediately. Once you reach that stage, professional support and services for open source products aren't far behind, especially for those supported by Red Hat. There is nothing better than having the best of both worlds: agility, flexibility, and a start that is not only quick but also inexpensive. You can then mature to a large-scale, fully supported, enterprise-grade implementation, and because there is no proprietary licensing to worry about, the path is even faster.
2. Agility & flexibility

If you're an IT leader, no doubt you're tasked with providing agility and flexibility for your enterprise. If you can't compete on either, the competition is going to leave you behind. Open source enables technology agility, because it usually offers many ways to solve a problem. It also helps keep your IT organization from getting blocked because a vendor doesn't offer a particular capability you need. Why wait? You can create the capability yourself.

Here's an example. You could stand up OpenStack, Red Hat Enterprise Linux or its community equivalent, or perhaps MongoDB, and with the open source software freely available over the internet, you could easily do it on your own (see the short sketch below). Develop your skills: start building a project or a platform, or test feasibility. Imagine doing that with comparable proprietary products from VMware, Microsoft, or Oracle: just to get started, you would spend days or weeks negotiating terms, conditions, and fees.
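As a concrete illustration of that speed, here is a minimal sketch of a feasibility test against a community MongoDB instance. It assumes MongoDB is already running locally and the pymongo driver is installed; the database and collection names are illustrative.

```python
# A minimal sketch: prove out a document store in minutes, with no
# licensing negotiation, using a local community MongoDB instance.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
db = client["pilot"]                               # illustrative database name

# Store and retrieve a sample document.
db.claims.insert_one({"claim_id": "C-1001", "status": "open", "amount": 125.50})
print(db.claims.find_one({"claim_id": "C-1001"}))
```

From here, the same few lines can be pointed at a fully supported, enterprise-grade deployment as the project matures.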
3. Information security

If you're looking for a solid information security record, commercial open source offers high responsiveness to security challenges from both the open source community and vendors; their reputation in this regard excels. With code that is in some cases decades old, the ability to proactively identify and fix problems, rather than letting the code slowly decay in a proprietary environment, helps you avoid security breaches. That is another advantage of open source.
4. Cost effectiveness

When you compare the cost of open source against the cost of a proprietary solution, you'll find that open source is usually much more cost effective. Typically, open source solutions are less expensive in an enterprise environment for equivalent or superior capability. Not only that, but open source solutions also enable enterprises to start small and scale, which makes a lot of financial sense for a cost-conscious enterprise.
5. More attractive talent

It's a fact: enterprises that embrace open source attract much better talent. Many good tech people are familiar with open source and believe that's where the industry is heading. They are passionate about open source because they can create their own projects and interact with people outside the enterprise to develop solutions. That flexibility and freedom is exactly what draws talented developers.
6. Looking at the future

It's said that open source is the future, especially since web, mobile, and cloud solutions are increasingly built on open source infrastructure. Some data and analytics solutions are only available as open source, further reinforcing that perception. Future architectures are also more likely to be based on open source: mobile solutions with the Android platform; web solutions, where the majority of websites are built on open source technology; and cloud solutions, where almost all offerings, with the exception of Microsoft's cloud, are also based on open source.
Healthcare IT Organizations Face Critical Challenges

Healthcare IT organizations face enormous challenges. Providers such as hospitals, HMOs, and PPOs must deal with spiraling costs, increasingly strict regulatory requirements, rising patient expectations, constrained reimbursements, and more. Payers such as healthcare insurers must deal with equally spiraling costs, constrained premiums, even more regulatory requirements, customer defection, and more.
Here are four of the challenges faced by healthcare IT organizations, along with some possible solutions:
1. Regulatory pressures

With the healthcare industry in America being one of the most heavily regulated in the world, insurers must comply with many layers of state and federal laws and controls. These include HIPAA (Health Insurance Portability and Accountability Act of 1996) privacy and security requirements, rigorous auditing and reporting mandates, and exceedingly complex and restrictive rules and payment mechanisms for Medicare, Medicaid, and other programs. Assuring compliance on so many fronts severely strains resources, while increased state and federal regulatory scrutiny adds to the urgency.

Similarly, healthcare providers face a myriad of compliance requirements related to patient information security, operational practices, and Electronic Health Records (EHR) management, including HIPAA, HITECH (Health Information Technology for Economic and Clinical Health Act), SOX (Sarbanes-Oxley), JCAHO (Joint Commission on Accreditation of Healthcare Organizations), and other standards.

For insurers, increased regulation can force downward pressure on premium prices, making it increasingly difficult to remain profitable. For healthcare providers, reduced funding and regulated government reimbursements combine to squeeze margins.
2. The Patient Protection and Affordable Care Act

Many in the healthcare insurance business were concerned that the Patient Protection and Affordable Care Act (PPACA) would provide healthcare insurance to individuals at the expense of healthcare insurers, and that the insurers would suffer. The exact opposite turned out to be true: in part due to new business from clients with government-subsidized premiums, insurers have in fact benefited tremendously from the PPACA. That is the good news for healthcare insurers; the bad news is that they now face a sharp increase in business and are searching for ways for their IT systems to cope with that growth. Like any growing business, healthcare IT organizations must find ways to meet growing demands without incurring drastic capital costs for what could be completely unnecessary forklift-replacement IT projects.
3. Increasing costs

Increased healthcare costs. It is common knowledge that healthcare costs are spiraling upwards at a yearly rate many percentage points higher than cost increases in any other sector of the economy. For insurers, this means constant increases in claims and related costs, and constant plan adjustments to stay ahead of the curve. For healthcare providers, it means having to charge more and more for services. Complying with increased regulation significantly increases costs for both insurers and providers.

Increased healthcare IT costs. Increased regulation has a direct and measurable impact on IT costs, as it requires organizations to regularly update and modify many, if not all, of their applications. Adding further to IT costs for organizations running mainframe systems, hardware and licensing fees continue to rise, even for legacy systems. Organizations that have invested heavily in modernization and migration programs often find themselves with higher costs than before: for some, increased personnel costs were not carefully considered, and equipment and licensing costs were higher than expected. Others have learned that they must continue to support and pay for their mainframe systems longer than planned or, in some cases, indefinitely, due to project delays and limits on the capabilities of the new systems.
4. The rock and the hard place

These considerable pressures on both ends of the healthcare landscape place payers and providers between the proverbial rock and hard place: how can they provide high-quality service and fulfill growing regulatory requirements, while contending with rapidly growing demands on their systems, continuously rising costs, and limited resources? Some of the most critical problems include, but are not limited to:

• Healthcare patient, insurance, and payment data resides in different locations and on different platforms, challenging both the users who need the information and the IT organizations tasked with delivering it
• Constant concern over growing demands on systems and rising operational costs
• Web and mobile access is a critical need, while applications and data remain on the mainframe
• Mainframe applications perform well enough, but present a challenge when market conditions change, such as during mergers
• Some mainframe applications do not perform well and offer no obvious remedy short of yet another system upgrade or a complete forklift infrastructure replacement
• Some packaged applications also do not perform well
• Despite maintenance and tuning efforts, DB2 applications drive high mainframe resource usage and correspondingly high operational costs
As healthcare IT organizations do their part to reduce costs, they will look long and hard at their IT systems for operational and capital cost savings before committing to new technology investments and costly system upgrades. Many IT vendors will encourage healthcare organizations to modernize by replacing their mainframe-based systems with distributed systems, playing the "mainframes are old and expensive" card. In actuality, such solutions can be very costly and may be unnecessary, since many mainframe-based HIT systems already provide capable business intelligence, simplified billing, real-time claims adjudication, and efficient solutions for online transactions. Migrations like this also often result in systems that turn out to be less capable and costlier in the long run due to dependency creep, scope creep, and significantly increased IT personnel costs.

To many, it appears that any "solution" is going to cost a lot. That is not necessarily the case. Mainframe systems can be optimized, and operating costs dramatically reduced, for a fraction of the cost of grand migration schemes. This applies both to organizations running modern mainframe applications under constant development and maintenance, and to organizations running legacy mainframe applications.
Healthcare IT solutions for mainframe optimization

IT optimization solutions can reduce a healthcare organization's IT costs by improving mainframe system performance and cost-effectiveness, providing a tremendous competitive advantage for both healthcare insurers and healthcare providers. Healthcare IT organizations can:

• Get healthcare patient, insurance, and payment data where it needs to be, in real time
• Increase system capacity to handle workload growth and control operating costs with a single solution
• Give healthcare professionals, practitioners, and patients access to data and services through new web and mobile applications, while leveraging legacy mainframe systems
• Transform existing mainframe applications into low-maintenance, market-agile, high-performance, merger-friendly applications
• Accelerate mainframe applications using high-performance mainframe in-memory technology
• Accelerate packaged legacy applications without touching the code
• Reduce DB2 application resource usage and operational costs by improving SQL quality
Healthcare IT solutions for mainframe optimization include:

• Automate mainframe soft capping – There are solutions that help healthcare IT organizations handle current online and batch workloads by employing existing system capacity more efficiently. Better still, they can handle current and new workloads with the CPU capacity being consumed now. These solutions automate mainframe soft capping, making the process far more flexible and predictable, without actually capping any business-critical application processing. Healthcare IT organizations can avoid capping altogether by dynamically controlling LPAR defined capacity limits and leveraging available capacity from other LPARs, essentially providing capacity on demand.
• Performance-optimized replication – The best solution for healthcare IT organizations with this particular challenge is a well-integrated replication solution with performance-optimized Change Data Capture (CDC). It promotes workflow continuity by allowing near real-time data replication, eliminating the problems found in many other data replication and ETL systems.
• New web and mobile applications that leverage legacy services – IT organizations can leverage existing assets by building new applications on distributed systems (or on a mainframe zLinux LPAR), with developers writing in server-side JavaScript, Java, .NET, and the like, communicating with the existing assets through tight but powerful APIs.
• Transformative mainframe applications – IT organizations can have the best of both worlds: the power, throughput, security, and reliability of the mainframe combined with the agility of modern distributed systems, better, faster, and more responsive than any other solution. Business applications, particularly business rules, do not need to be handled by mainframe IT personnel, and the mainframe can host an organization's most important market-agile business applications.
• High-performance in-memory technology – High-performance mainframe in-memory technology is an ideal solution for healthcare IT organizations facing performance problems. Using their current applications and the same mainframe hardware as before, IT organizations can dramatically improve application performance.
• Accelerate packaged mainframe applications – A relatively painless solution for this problem is a technique that combines high-performance in-memory technology with SQL interception. IT organizations can turn proprietary legacy DB2 applications into in-memory applications without changing any application code at all.
• Automate DB2 SQL quality assurance – An ideal answer to DB2 SQL quality challenges is an automated quality assurance approach, which improves DB2 SQL quality and performance in both development and production environments. Organizations can analyze their current SQL to identify where inefficiencies exist and know precisely where to make improvements (the kind of check such a tool might automate is sketched below). Applied improvements sharply reduce the organization's mainframe resource usage: less resource consumed for each access request from applications reading mainframe DB2 data.
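The article does not name a specific product, so the following is purely an illustrative sketch of the sort of static check an automated SQL quality assurance pass might perform; the patterns and advice strings are assumptions, not any vendor's actual rule set.

```python
# An illustrative sketch (not a vendor tool): flag SQL patterns that
# commonly inflate DB2 resource usage, as an automated QA pass might.
import re

RULES = [
    (r"(?i)select\s+\*",
     "SELECT * retrieves unneeded columns; list the columns explicitly"),
    (r"(?i)where\s+\w+\s*\(",
     "a function applied to a column in WHERE can block index use"),
    (r"(?i)like\s+'%",
     "a leading-wildcard LIKE forces a scan instead of an index probe"),
]

def review(sql: str) -> list[str]:
    """Return a human-readable finding for each rule the statement trips."""
    return [msg for pattern, msg in RULES if re.search(pattern, sql)]

sample = "SELECT * FROM claims WHERE YEAR(service_date) = 2016"
for finding in review(sample):
    print("warning:", finding)
```

A real product would work from the DB2 access plan rather than regular expressions, but the principle is the same: find the statements that waste resources, then fix them at the source.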
By leveraging existing IT investments, healthcare organizations can achieve improved throughput capacity and overall system performance, a sharp decrease in cost per transaction and in application maintenance costs, and greatly enhanced market responsiveness, all for much less than the cost of competing solutions. Want more information? Download the White Paper, Optimization Solutions for Health Insurance IT Organizations, here: http://www.dkl.com/optimization-solutions-for-healthcare-it-organizations
The New World of Virtual Medicine

How we interact with the healthcare industry is changing rapidly. We're all familiar with virtual medicine; although it is not new, it is seeing rapid growth and increasing adoption. In 2013, according to Ken Research, the market for virtual medicine generated annual revenue of $9.6 billion, 60% growth from 2012, when overall revenue was $6 billion. The same research states that the virtual medicine market is expected to grow at a compound annual growth rate (CAGR) of approximately 32% from 2013 to 2018. Factors driving the adoption of virtual medicine include secular trends in technology, policy and reimbursement shifts, and evolving consumer preferences.
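Those two cited figures imply a concrete 2018 projection, which is easy to check with a compound-growth calculation. A quick illustrative sketch; the inputs are just the numbers quoted above:

```python
# A quick check of the cited projection: $9.6B in 2013 at ~32% CAGR.
market = 9.6          # $ billions in 2013, per Ken Research
cagr = 0.32

for year in range(2014, 2019):   # compound forward to 2018
    market *= 1 + cagr
print(f"Implied 2018 market size: ${market:.1f}B")   # roughly $38.5B
```

In other words, the cited growth rate implies a market roughly four times its 2013 size by 2018.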
Let's take a closer look at some of the key drivers of these trends:

1. Consumer expectations of convenience

Today, consumers expect convenience. It's called the "On-demand Economy," and so far it has attracted over $4.8 billion in investment from institutional investors. ("The 'On-Demand Economy' Is Revolutionizing Consumer Behavior—Here's How." Mike Jaconi. Business Insider.) It's inevitable that healthcare should be affected, as evidenced by the recent rise of urgent care centers, health kiosks, e-visits, and a multitude of health and wellness mobile apps, now numbering over 100,000. ("Mobile health app revenue to grow tenfold by 2017, study predicts." John N. Frank. Modern Healthcare.) This will, of course, stimulate virtual medicine adoption.

2. Omnipresent and inexpensive mobile broadband

Around the world and in the U.S., broadband use and smartphone adoption are increasing. This year, 2016, will show whether smartphone subscriptions outnumber those for basic phones, as predicted, and whether mobile data traffic grows at a compound annual growth rate of 40%. ("Ericsson Mobility Report: On the Pulse of the Networked Society." February 2015.) With omnipresent mobile technology and a data-enabled infrastructure, healthcare is poised to deliver new and more convenient access points for consumers who need to interface with the medical system.

3. Rising healthcare consumerism

In 2014, the single largest one-year increase in enrollment in high-deductible, consumer-driven health plans occurred: from 18% to 23% of all covered employees. ("Costs Slow as Health Care Consumerism Grows." Tracy Watts and Beth Umland. December 31, 2014.) Employers increasingly offer these plans, and employees increasingly take on more of the out-of-pocket costs; virtual medicine thus becomes increasingly attractive with its lower cost and greater degree of choice.
4. Changing revenue models that reward value, not volume

Today, 744 Accountable Care Organizations (ACOs) cover a total of 23.5 million lives. ("Growth And Dispersion Of Accountable Care Organizations in 2015." David Muhlestein. Health Affairs Blog.) Since 2011, the number of organizations and covered lives has grown more than tenfold. The U.S. Department of Health and Human Services (HHS) announcement in January 2015 that it would move 50% of Medicare payments toward alternative payment models by the end of 2018 will reinforce this trend. ("Better, Smarter, Healthier: In historic announcement, HHS sets clear goals and timeline for shifting Medicare reimbursements from volume to value.") In February 2015, the largest payer in the U.S., UnitedHealthcare, announced that its total payments to physicians and hospitals tied to value-based arrangements had nearly tripled in three years, to $38 billion. Rewarding providers for value means moving to lower-cost options while maintaining quality healthcare delivery. Virtual medicine will be crucial in this scenario: it will help enable the cost-effective and safe provision of value-based care.

How we interact with the healthcare industry is shaped by the fact that virtual medicine solutions vary widely in the degree to which they integrate back into the traditional delivery system, connect to the Primary Care Physician (PCP)-patient relationship, and load data to the main health record. As a result, tradeoffs between consumer convenience and data cohesion have emerged as a potential issue for patient safety and continuity of care. Here are four levels that describe the spectrum from integrated to fractured care:

1. Integrated Care Level

The first level contains 100% of the consumer's health relationships, whether with primary care providers (PCPs), family and caregivers, or ancillary professionals, and represents 100% of all health record data. It corresponds to industry terms such as the "medical home" and "universal patient record," and supports patient (and caregiver) self-efficacy. At this level, consumers and their care team gather, diagnose, translate, and act on healthful insights based on their health literacy.
2. Virtual Medicine-enabled Level

This level reinforces the original integrated care relationships between consumers and their care team, including caregivers. All virtual medicine does at level two is enable the in-person care team to be more available, efficient, and scalable. It does not alter pre-existing, long-term, in-person care team relationships; it enriches those relationships through technology, capturing all data and care back to the main health record.

3. Extended Integration Level

This is where the risk of fracturing care and data emerges, because care is extended through additional provider relationships that are only peripherally linked to the care team. At this level integration still occurs, but with inherent risk. For example, virtual medicine used in home visits or during rural clinician shortages, integrated urgent care, and online e-visits with referral clinicians all add clinicians who could fracture care and data if they were not integrated back into the primary health record.

4. Outside Care Level

This level operates outside of, and separately from, the existing care team and its network: one-off encounters such as employer-sponsored e-visits, community-based retail clinics and kiosks, and consumer apps and devices. When data isn't integrated, the risk of diffusing patient data and disrupting continuity of care is high, and the consumer is exposed to repeated tests and incomplete diagnoses based on partial data.

Our interaction with the healthcare industry is increasing, especially in light of the new offerings and models of virtual medicine. What we need is objective evidence on the cost, quality, and access implications of these new virtual medicine innovations, particularly outside care that is far from the core. There is also a need for a consumer-driven data exchange to support these models and connect them back to the core. The Blue Button initiative, spearheaded by the ONC (Office of the National Coordinator for Health Information Technology), focuses on giving consumers easier online access to their health data, enabling portability so that patients can securely move their data as they please, rather than leaving it to reside in legacy systems owned by payers and health systems. The Blue Button symbol signifies that a site lets consumers go online and download their health records. Consumers can use their health data to improve their health, and to gain more control over their personal health information and their family's healthcare. Initiatives like Blue Button are paving the way for a more integrated, cost-effective, and efficient way for consumers to manage their health records. Perhaps one day the right balance will be achieved between consumer convenience and data cohesion.
6 ways to improve IT Efficiency & Profitability for Healthcare Insurance Companies

By Wayne Sadin – CIO Advisor
If you're a healthcare insurance company, you've probably felt the repercussions of the Affordable Care Act. Since its inception, it has created the largest growth in the number of insured people in four decades, greatly adding to the IT burden across the insurance industry. In fact, since the passage of the Affordable Care Act five years ago, about 16.4 million uninsured people have gained health coverage, and more than 12.3 million additional individuals were enrolled in Medicaid and CHIP as of April 2015, compared with before October 2013. It's this growth in the number of insured people, and in the associated insurance claims, that continues to challenge IT organizations managing the workload that comes with more policies and more claims.
Here are six ways that will help you improve IT efficiency and profitability:
1. Obliterate Wasteful Steps
When we think of "efficiency," we usually think in terms of speeding up the steps of a process: if each step can go faster, we're more efficient, right? Not quite. An even better way to increase efficiency is to take some steps out of the process completely. Do you remember "reengineering"? What most people saw done in the name of reengineering was staff cuts, but that misses the point. The original Harvard Business Review article that coined the term, "Reengineering Work: Don't Automate, Obliterate" by Michael Hammer, was published in the July/August 1990 issue. The article didn't talk about cutting jobs per se: it talked about rigorously documenting a process, decomposing it into steps, looking at the cost/benefit of each step, and eliminating the steps that didn't have an adequate ROI.
2. Consider IT's Superpower

It's the job of IT to help the business understand how its processes work, from end to end, before looking at how to improve them. IT is in a unique position: it's one of only two corporate functions with the remit to trace a process from start to finish across departmental lines. This is its superpower: the power to break down arbitrary corporate silos and actually understand what's happening in the "claim-to-pay" cycle, especially as work is handed off from one silo to another.

3. Look at the Edges
Look at the "edges" of a process for improvement opportunities. An "edge" is any place the process starts, stops, restarts, reworks, or transitions. Here are three types of edges to get you started:
• Silo handoffs: If department "A" has a "check outgoing quality" step at the end of its process, and department A+1 has a "check incoming quality" step at the beginning of its process, there's an opportunity to become more efficient.
• Startup/Shutdown: Most continuous processes work pretty well when things are humming along. But if a process must be stopped (for maintenance, shift change, or weekends), restarting it without breakage can be tricky. Look for this breakage, because reworking errors can be very expensive.
• Dirty Data: Most processes work well when data is complete and correct; many fall apart when missing data must be found (By whom? When? How?), or incorrect data must be fixed (often in a much later process step). Look closely at how erroneous or incomplete data is handled, and I bet you'll find manual, ad hoc, poorly documented (i.e., wasteful) recovery steps just begging to be cleaned up (a minimal intake check is sketched after this list).
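To make the Dirty Data point concrete, here is a minimal illustrative sketch of an intake-time check; the field names and rules are hypothetical, not drawn from any particular claims system.

```python
# An illustrative sketch: catch incomplete or erroneous claim records
# at intake, instead of paying for rework many process steps later.
REQUIRED = ("claim_id", "member_id", "service_date", "amount")

def validate(record: dict) -> list[str]:
    """Return a list of problems found in one claim record."""
    problems = [f"missing {field}" for field in REQUIRED if not record.get(field)]
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        problems.append("negative amount")
    return problems

record = {"claim_id": "C-1001", "member_id": "", "service_date": "2016-05-01", "amount": -40}
print(validate(record))   # ['missing member_id', 'negative amount']
```

Rejecting or routing such records at the edge is far cheaper than the manual recovery steps that otherwise appear downstream.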
4. Make friends with IA
IT is one of two functions that can see a process in its entirety. The other is Internal Audit (IA). Here's a hint for the CIO: make friends with the head of IA and include auditors on project teams. While IA's official mission is ferreting out fraud (that's what everyone thinks, anyway), inefficiency and fraud share many of the same symptoms: manual steps, lots of errors, downstream rework, and so on. Internal Audit is already identifying process steps being done manually, which are a red flag for Sarbanes-Oxley (SOX) and HIPAA compliance and an opportunity for efficiency through automation, and it will discover data errors and rework that also present opportunities for efficiency through better data. Remember your superpower? You'll need it here, because data is best fixed at the source, and the source is often many steps earlier than the problem step found by audit.
5. View SOX as a friend

Another CIO tip: SOX is your friend. While everyone complains about the "red tape of SOX," the vast majority of SOX controls are common sense and help the business run better. Many SOX analyses discover Excel spreadsheets in the midst of critical processes, and every Excel "SOX critical control" is an opportunity to remediate a SOX issue while replacing a relatively expensive human being (or maybe two, because there's often someone else maintaining the poorly documented and complex spreadsheet) with automation. And how many chances does the CIO get to bring a smile to the CFO's face, and perhaps get additional funding for a "SOX remediation" project?

6. Consider Company Efficiency versus IT Efficiency
CIOs are under constant pressure to cut the IT budget, while still delivering everything everyone wants, of course. When faced with that pressure, it's easy to forget the concept of "financial leverage." What I mean by financial leverage is this: the IT budget is a very small percentage of total firm expenses, but it can have a disproportionately large impact on overall firm efficiency. Think of a seesaw with the fulcrum set very far to one side (that's the size of IT spend versus the firm as a whole): a small movement in IT produces a much larger movement in the company. That's leverage. Here's an example (the arithmetic is worked in the short sketch after these bullets):

• Most IT budgets are set at 1% to 4% of sales. Assuming a $2B company with IT at 2.5% of sales, the IT budget is $50M.
• Imagine cutting the IT budget by 10%—a very deep cut, indeed—and the "hero" CIO has just dropped $5M to the bottom line. Well done!
• What if the CIO could make the whole $2B firm just one percent more efficient instead of focusing internally? That's a $20M impact, or 4 times the benefit.
• What if adding 10% to the IT budget ($5M) could improve company efficiency by another one percent? That's another $20M contribution, or a three-month payback.
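Here is the leverage arithmetic from those bullets made explicit, as a minimal sketch; the company size and percentages are the illustrative figures used above.

```python
# The financial-leverage arithmetic from the bullets, made explicit.
sales = 2_000_000_000            # a $2B company
it_budget = sales * 0.025        # IT at 2.5% of sales -> $50M

cut = it_budget * 0.10           # a deep 10% IT cut saves $5M
firm_gain = sales * 0.01         # 1% firm-wide efficiency is worth $20M
print(f"10% IT cut saves:            ${cut / 1e6:.0f}M")
print(f"1% firm efficiency is worth: ${firm_gain / 1e6:.0f}M "
      f"({firm_gain / cut:.0f}x the benefit)")

# Spending an extra 10% of the IT budget to gain another 1% of firm
# efficiency pays for itself in about three months.
payback_months = cut / firm_gain * 12
print(f"Payback on the extra spend:  {payback_months:.0f} months")
```

The seesaw metaphor is visible in the numbers: a CIO focused only on the IT line can save $5M, while the same organization pointed outward can move $20M or more.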