WEBINAR: PHI in the ACO - A Focus on Data: Analytics, Collection, Risks and Contracting Considerations

July 17, 2014 2:00 pm


Healthcare IT and security attorney Tatiana Melnik and accountable care policy attorney Carrie Nixon join Online Tech to discuss PHI in the ACO: A Focus on Data - Analytics, Collections, Risks and Contracting Considerations.

Title: PHI in the ACO: A Focus on Data - Analytics, Collections, Risks and Contracting Considerations

Description: Accountable Care Organizations cannot succeed without a strong information technology framework because they must collect, analyze and report data. This session will discuss:

  • The ACO model and the related technology infrastructure
  • The role of so-called big data, the need for data analytics and the ability to combat fraud
  • Using technology to engage patients, assess quality of care and meet reporting requirements
  • Legal risks including data breaches and other privacy violations
  • Contracting considerations with IT and software vendors



View Slides

April Sage: Tatiana is an attorney who specializes in IT, data privacy and security, and in regulatory compliance, especially all things HIPAA and HITECH. She's the managing editor of the Nanotechnology Law and Business Journal and a former council member of the Michigan Bar Information Technology Law Council. Carrie joins us today from Nixon Law Group and she's the President of Accountable Care Law and Policy. She's a founding member of Health Care Solutions Connection, a network of experts and consultants providing integrated service solutions for the health care industry. Without further ado, Carrie, I'm going to turn things over to you to kick off our webinar. Thanks so much for joining us again today.

Carrie Nixon: Absolutely, thank you April, for having us again. I'll start off by giving you a little bit of an outline of what we're going to cover today. We're going to start off by doing just a little bit of level setting, talking about what an ACO is in brief and how we got here. Then we'll move to some of the ACO technology requirements and look more specifically at the role of big data and the analytics around it. Finally, talk about some legal risks and vendor contracting considerations.

Next slide. Level setting; why are we looking at the accountable care organization model? First of all, it’s not news to anyone on this call that our current system of health care is simply unsustainable. Health care spending has been running at 18 percent of our gross domestic product. The next closest country in the world to that is The Netherlands, at about 12 percent. Despite our number one ranking in spending, the US is ranked 37th in quality by the World Health Organization.

What we're seeing here is a real disconnect. This is a problem, and it’s especially a problem with the baby boomers who are coming of age and increasing our aging population. We know that most of the money spent on health care comes in the last decade or so of a person's life, and because the baby boomers are aging and that population is growing, our costs will only increase if we stick with the current system, and that’s just not sustainable.

The current system is a fee-for-service model, and that model creates inefficient incentives, where unnecessary or ineffective treatments or tests are rewarded monetarily, and this doesn't make sense. What we need to do is align the incentives. We need to look at integration for lower costs and for better care. What does this look like more specifically? Next slide, please. Basically, the bottom line is that this looks like a fundamental change, there’s no two ways about it. The goal is moving toward what many of us refer to as the Triple Aim, and that will be a familiar term to many of you on this webinar. That is moving towards better care for individuals and better health for populations, all at a lower cost.

This move is generally referred to as a shift towards accountable care. That’s not just accountable care organizations; accountable care methodology can also include bundled payment initiatives and anything where the use of evidence-based tests and treatments to achieve a successful outcome is rewarded. Today, though, we're going to focus mainly on the accountable care organization, which is one of the linchpins of the reform effort. Next slide, please.

What is an ACO exactly? The NCQA definition of an ACO, which I referred to in our last webinar as well but is worth a refresher, is an alliance of physicians, hospitals and other providers that coordinates care for a particular group of patients to improve quality and to reduce costs. An ACO, or accountable care organization, can be either a private ACO or a public ACO.

There are approximately 14 million commercially insured patients that are being served by non-Medicare ACOs today but let’s take a closer look at the Medicare ACOs and where they are. In terms of geography the Medicare Shared Savings Program ACOs are centered largely in highly populated and urban areas, most particularly on the coasts.

You can see that in the middle of the country, or in areas where the population is more rural, you don't see a lot of ACO activity at this point. I think that’s going to change due to CMS’s realization that this kind of model really applies to rural populations as well, and that we need to make it work for rural populations by overcoming some of the barriers that the current ACO model poses for rural providers.

Right now there are about 259 Medicare Shared Savings Program ACOs to date, and on January 1st we will see a new round of ACOs certified by CMS. We don't know how many there will be yet, but the deadline for the application for the new round of ACOs, the 2015 ACOs, is I believe the end of July. There may be some people on this call who are busily applying to become an ACO, or there may not, because the application process itself takes a lot of effort, so maybe you are frantically working on this and will look at the webinar at a later date.

The application is coming up and we'll know on January 1st who got through that process and how many ACOs will be added to the MSSP Program. Next slide.

In the final ACO rule, CMS specifically set forth that ACO participants must have a meaningful commitment to the ACO’s clinical integration program to ensure its likely success. This is a big deal; it seems like a pretty simple requirement, but it’s actually a big deal. All of the ACO participants have to, in some way, demonstrate a meaningful commitment to the success of the program and to clinical integration.

Technology plays a huge, huge role in this, there’s no question about that. This includes internal assessments of cost, quality of care, patient engagement and outcomes, financials, and much more. All of this requires data; data is the key thing here, and even further than that, it requires data management.

If we take a look at the ACO technology requirements that were set forth in the proposed rule, the ACO must have an infrastructure, such as information technology, that enables the ACO to collect and evaluate data and provide feedback to the ACO providers and suppliers across the entire organization, including providing information to influence care at the point of care via, for example, shared clinical decision support, feedback from patient experience of care surveys, or other internal or external quality and utilization assessments.

The final rule also goes on to specifically address data sharing between ACOs and CMS and this is a very important part as well so we'll talk about that a little more in a moment. Next slide please.

Let’s talk first of all about some of the fundamental requirements for technology to support an ACO. I want to delve into this a little more deeply and talk about things that you should consider as you go about building a technology infrastructure for an ACO.

Some of the fundamental requirements include just in broad terms, things like the information sharing capacity. For example, electronic medical records, electronic health records, data collection and systems integration which is actually a very crucial part of all of this. Patient safety is another fundamental requirement and privacy and security concerns of course, that’s no surprise to anyone.

Let’s go a little bit deeper into each of these and talk first of all, about clinical aspects of the technology requirements. First and foremost, from my point-of-view, I think it is absolutely critical that when you are looking at a technology that’s going to play a part in your ACO infrastructure, you have to be sure that it is user friendly. In my mind that’s first and foremost. If something is not user friendly and easy to manage and understand, people aren’t going to use it or they certainly aren’t going to use it properly and that’s a really big deal. I see this a lot when I'm talking to physicians who are transitioning to or using electronic medical records.

Many of them absolutely hate the interface for their electronic medical records, and therefore they don’t really do a good job of using them. If the doctors and the clinicians who are supposed to be inputting this information aren’t really using the tool as it should be used, then you're really defeating the purpose. From my perspective, from a clinical standpoint, user friendliness has to be absolutely top of mind.

I say you should have some clinical staff take a very close look at some of the clinical tools that you are considering using in your ACO infrastructure. Let them test drive those tools and make sure that they're comfortable using them. Ask a lot of questions of your vendor and ask for references from current customers and then call those references. Actually go the next step, call those references and say, “Hey, what do you like about this system? What do you not like about it? Is it user friendly? Do you actually enter data into it? Do you actually use it how it’s supposed to be used?”

They’ll generally give you a pretty honest answer, so that’s something to really take into consideration on the clinical front. Some other aspects of the clinical part include clinical decision support. For example, does the technology that you're using include clinical decision support like prompts, notifications and reminders at certain key stages for patients, with respect to whatever treatment or condition they're dealing with?

Does it support good internal communication? For example, does whatever technology you're using support good communication between the ACO staff, the providers and the patient? Does it notify the providers, for example, when a patient has reached out to try to contact them or contact the system? Does it let them know the substance of that communication, and does it pop up with a reminder to say, “Hey, you need to give this person a call back.” Or, “Hey, you need to check in with them because you said at their visit that you would give them a call.”

The next thing to think about is external communication. Does your tool allow you to actually communicate in an efficient fashion with organizations or providers that are outside of your particular network? Do you get, for example, a notification when a patient of yours goes to the ER? Are you able to give access to records as necessary to some kind of an outside provider? Those are some things to think about, and they're among the most critical in my mind.

Let’s turn to quality for a moment; this is where the patient safety aspect comes in. Some of you may have been surprised to hear me name patient safety as a fundamental requirement for technology, but there is an enormous patient safety aspect to some of the tools that are out there. First and foremost, and we'll talk about this more in a moment, the quality performance measures are key.

Under the ACO rule there are four domains, or four sections, of quality measures, and they have to be reported to CMS. Divided across those four domains, there are actually 33 measures where information has to be collected and recorded. A technology tool can also be used in a provider’s quality assurance program, where a provider is reporting back something that went well or something that went poorly to the QA Committee and is making sure that follow up happens so that any poor outcomes don’t happen again. It’s important for monitoring and it’s obviously important for quality improvement as well.

Next, one that everyone is very interested in, of course, is the financial aspect. Technology that focuses on the financials needs to look at risk sharing analytics, and this is a huge, huge thing for ACOs. Does your technology have shared savings algorithms, cost sharing algorithms that really give you insight into where your finances are, the direction they're headed and the direction you need to go?

There are a number of platforms and technologies out there that are looking at this, that are constantly improving, trying to crunch the data a little more effectively. When you're looking at a technology on the financial side, be sure to ask what kinds of algorithms are in place with respect to shared savings, cost sharing and predictive capacity.
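To make the shared savings idea a bit more concrete, here is a minimal sketch of how a one-sided ("upside only") shared savings payment might be computed. The benchmark, spending figures, sharing rate and minimum savings rate below are hypothetical placeholders for illustration, not actual MSSP parameters.

```python
# Illustrative sketch of a one-sided shared savings calculation;
# all figures and rates here are hypothetical, not MSSP values.

def shared_savings(benchmark, actual_spend, quality_score,
                   sharing_rate=0.5, min_savings_rate=0.02):
    """Return the shared savings payment, or 0 if no savings are earned."""
    savings = benchmark - actual_spend
    # Savings must clear a minimum savings rate before any sharing occurs.
    if savings < benchmark * min_savings_rate:
        return 0.0
    # The payment is scaled by the ACO's quality performance score.
    return savings * sharing_rate * quality_score

# $100M benchmark, $95M actual spend, 90% quality score:
print(shared_savings(100_000_000, 95_000_000, 0.90))  # 2250000.0
```

Note how the quality score directly scales the payment: this is the sense in which reporting quality measures accurately is tied to the money.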

Do your financial technologies have the ability to handle multiple types of payment arrangements? Do they track profit and loss? These are other important considerations. Next slide. Here we go: the technology requirements go on to include care coordination, for example real-time insurance coverage information. When someone comes in, does your technology allow you to immediately undertake an insurance verification process to make sure that, A, the person is covered and, B, you have all of the correct information that you need to process that patient?

Another thing to consider is partnering with payers regarding case management arrangements. For example, is your technology able to allow you to enter into certain case management type arrangements with certain payers? The arrangement might be different with Blue Cross than it is with Humana, or with Cigna. Does your care coordination software have the capacity to deal with that? Does your software, or your technology, also have the ability to adequately share data with other providers during care transitions?

This goes right into the reporting mechanism as well. We talked briefly about quality before. Your technology absolutely, 100% has to be able to gather and retrieve the data for the quality measures and to report those measures accurately. If there is inaccurate or inadequate reporting, there are no savings for the ACO, and that’s just the bottom line. The reporting really is everything; it’s what CMS bases your shared savings payments on, so you've got to be sure to get that one right. When you're looking at technology, there again, it’s important to ask for references and see how much work the vendor has done in this area before.

On the population management side, you want to look at identifying high risk patients and monitoring those patients. This is where some of the telehealth technology can come in as well; ACOs are starting to use telehealth a little bit more. We'll go to the next slide and I’ll turn it over to Tatiana to talk a little bit more about the importance of technology.

Tatiana Melnik: Thank you, Carrie, and thank you everyone for joining our session today. As Carrie discussed, technology plays a pivotal role in an ACO, but the technology is in place to allow ACO participants to gather the data that fuels the ACO. Collecting and analyzing the data is truly the key to the success of any accountable care organization, and that’s because in order to improve, ACOs must be able to consistently measure themselves to determine where they stand in terms of the requirements and then, of course, distribute those measures to their key members so that those key members can improve. Having the right technology in place, one that fits an organization’s workflow and processes, makes this data gathering and sharing effort possible.

Conversely, though, when you have the data you can also analyze whether the organization's workflow should be changed. As those on the technology side know, oftentimes it’s not necessarily the technology that’s the problem; it’s the way the organization is operating and using the technology that’s creating the inconsistencies and the difficulties that some providers encounter.

Healthcare certainly does not lack data. Consider, for example, the clinical environment, where data comes in from admissions, patient history and charts, imaging, lab results and clinical notes, among many other sources. The health care space is probably creating at least an exabyte of data annually, and that number will grow as we have more data sources, such as mobile devices, connecting into the system.

Some, particularly clinicians, will probably say that there is data overload in health care, too much information. The question that comes to the minds of many is, what can you possibly do with all of this data? Maybe more directly, how can you handle and process all of this data? That’s really where this notion of big data comes into play, and what big data means depends on who you ask.

There are many definitions of big data, but there is a group of terms that are generally used to characterize certain aspects of big data, commonly called the three V’s of big data. Those are volume, velocity and variety. Volume, as is probably evident, means that there is a lot of data being produced. IDC, which is a technology research firm, estimates that the world’s data is doubling every two years; that’s a significant amount of data. Worldwide, we're up to measuring data in yottabytes.

As you may have heard me mention on the last slide, in health care we're creating at least an exabyte of data annually. In case you're wondering where that falls in the naming scheme, I've included a graphic for you so you can see where it falls in terms of the amount of data we're talking about here. As you can see, we're looking at a significant amount of data.
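That doubling estimate compounds quickly. As a rough back-of-the-envelope sketch (the 4 ZB starting volume here is a hypothetical figure, purely for illustration):

```python
# Back-of-the-envelope sketch of the "data doubles every two years" estimate.
# The 4 ZB starting point is a hypothetical figure for illustration only.

def projected_volume(start_zb, years, doubling_period=2):
    """Project data volume if it doubles every `doubling_period` years."""
    return start_zb * 2 ** (years / doubling_period)

# Five doublings over a decade turns 4 ZB into 128 ZB.
print(projected_volume(4, 10))  # 128.0
```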

The next V in big data is velocity, which refers to the rate at which data flows into and out of an organization. Given the improvements in data processing technologies, some data can be processed as it streams into a system, and it’s important to recognize that only some of the data can be processed this way. By no means can all of the data be processed in real time, because at this point in time we really don’t have the technologies, and certainly not the capacity, to deal with that much information. It’s also important to recognize that it’s not just about the input data; we also want to consider the output, because the tighter the feedback loop in a system, generally the greater the competitive advantage. The quicker you have information on both the problems and the good aspects, the faster you can replicate the good things you're doing and correct the problems you're seeing.

It’s important to have a process in place that’s analyzing those outputs and flagging them, sending the necessary notices to the responsible teams in an organization, so that if there’s, for example, a drug-to-drug interaction that’s going to impact patient safety, someone knows about that problem right away.
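As a sketch of what that kind of output flagging might look like in code, the following checks each new medication order against a tiny, invented interaction table. The drug pairs, event format and notification hook are all hypothetical, not any real clinical decision support system.

```python
# Minimal sketch of flagging streamed outputs, here a hypothetical
# drug-to-drug interaction check; the interaction table is invented.

# Pairs are stored order-independently as frozensets.
KNOWN_INTERACTIONS = {frozenset({"warfarin", "aspirin"}),
                      frozenset({"simvastatin", "clarithromycin"})}

def flag_interactions(active_meds, new_order):
    """Return active medications that interact with a newly ordered drug."""
    return [med for med in active_meds
            if frozenset({med, new_order}) in KNOWN_INTERACTIONS]

def process_order_stream(orders, notify):
    """Check each order as it arrives; notify the care team on a hit."""
    for patient_id, active_meds, new_order in orders:
        hits = flag_interactions(active_meds, new_order)
        if hits:
            notify(patient_id, new_order, hits)

alerts = []
process_order_stream(
    [("pt-1", ["warfarin", "lisinopril"], "aspirin"),
     ("pt-2", ["metformin"], "aspirin")],
    lambda pid, drug, hits: alerts.append((pid, drug, hits)))
print(alerts)  # [('pt-1', 'aspirin', ['warfarin'])]
```

The point of the sketch is the shape of the pipeline: check each output as it arrives and route an alert to a responsible party immediately, rather than batching the review for later.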

The last V in big data is variety, which simply means that data comes in many forms and successful systems and good analytics tools must be able to handle all kinds of data. It’s important to understand that different analytics tools are built for different types of data. You really need to understand the kind of data that you're processing so you have a better handle on the types of outputs you're going to get.

More recently, researchers have added another V, veracity, which speaks to data assurance: that the data, the analytics and the outcomes are error free. We're at a point now where we're no longer looking at just the three V’s of big data; we're now actually looking at the four V’s of big data, with volume, velocity, variety and veracity, because it’s important that the data is good. If you use bad data to make decisions, inherently your decisions are going to be bad. Certainly, it’s very important that you're using good data.

It’s fairly obvious, I think, from the comments Carrie and I have made so far that ACOs are going to produce a lot of data. In our data driven environment, data is power. Consider, for example, comments that were reported earlier this year from Greg Sheff, who’s the Executive Vice President of Clinical Services at Seton Health Care Family in Austin. He noted that one of their biggest successes was using predictive analytics to define the high risk patients and get a handle on them.

That’s pretty powerful, when you can specifically pluck out patients in a population that are high risk and then figure out how to address and change your processes to better accommodate their needs. That’s really what we're talking about when we're looking at patient centered care. You're looking at what’s going to be best for the patient, and analytics really gives you an opportunity to analyze what’s going to be best for your particular patient.
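A very simplified sketch of that kind of risk stratification is a weighted score with a threshold. The risk factors, weights and threshold below are invented stand-ins for a real predictive model, purely to show the mechanics of plucking out high risk patients from a population.

```python
# Toy risk stratification: weighted score plus threshold. The factors,
# weights and threshold are hypothetical, not a validated clinical model.

RISK_WEIGHTS = {"chronic_conditions": 2.0,
                "er_visits_last_year": 1.5,
                "missed_appointments": 1.0}

def risk_score(patient):
    """Weighted sum of a patient's recorded risk factors."""
    return sum(weight * patient.get(factor, 0)
               for factor, weight in RISK_WEIGHTS.items())

def high_risk(patients, threshold=8.0):
    """Return IDs of patients whose score meets or exceeds the threshold."""
    return [p["id"] for p in patients if risk_score(p) >= threshold]

patients = [
    {"id": "pt-1", "chronic_conditions": 3, "er_visits_last_year": 2},  # 9.0
    {"id": "pt-2", "chronic_conditions": 1, "missed_appointments": 2},  # 4.0
]
print(high_risk(patients))  # ['pt-1']
```

A real system would learn the weights from claims and clinical data rather than hand-picking them, but the workflow is the same: score the population, then focus care management on the patients above the line.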

I should note that data is becoming the new commodity. In this respect I want to specifically highlight the data broker industry, which has been under the congressional spotlight over the last several months. In 2012, for example, the data broker industry generated about 150 billion, with a B, dollars in revenue. Clearly, data is important and someone is willing to pay for it. Big data is not going away, and the importance of analytics is not going away. The organizations that survive are going to be the ones that grasp these big notions and the need for using data to improve everything about their organization.

Let’s look at, for example, quality, and using data to improve quality. By measuring and comparing the quality of care within a specific organization, you can identify unnecessary or potentially harmful treatments and, even more importantly, determine the pervasiveness of these tests within a specific population. ACOs can find where these tests are happening, figure out how often they're happening and stop them, because when you have the data, when you have the information, you can then take action on that information.

ACOs can also improve care management. The analytics can be used to identify gaps in treatment as well as patients who may not be adhering to their treatment regimen. ACOs can also identify high performing and low performing providers. For example, I've heard of organizations publishing these so-called good and bad lists to their entire provider network, taking advantage of the, I think, naturally competitive instincts of most providers. No one wants to see themselves at the bottom, and when they see where they stand as compared to their colleagues, they have more of an incentive to do better.

For those of you, for example, who are dealing with providers who are maybe not responding as well to the technology as you'd like, this may be a way for you to generate some more interest from them, perhaps by giving them some sort of ranking to see where they stand compared to some of their better performing colleagues. Naturally, they're a little competitive; they're going to want to improve. Of course, those kinds of methods have their own downsides, so you certainly want to consider how your providers would react to something like that before you institute that kind of program.

You can see from these few examples that there are plenty of uses for big data, and analytics can be quite powerful. There are, of course, a number of challenges with using information technology. At the forefront is that organizations have limited resources; there’s limited money, talent, time and so forth. Many of these analytics tools are expensive because you're not getting only the system; generally you're also getting the staff and the team that comes with that analytics tool to help you implement and to help you process.

Many of these analytics tools, as I mentioned, are expensive, but there are also open source options. As I mentioned, though, when you're buying a service from someone you're getting that additional support staff, which you're not going to get if you're using an open source option. While you're saving cost on the one hand, you need to have the talent, and be prepared to have the time, on hand to implement and learn those tools.

As I've mentioned previously, data is also coming in from a large variety of sources. This raises issues with interoperability, interface costs, and general data integrity and management concerns, and this means, of course, that an ACO must have a plan for how to handle all of this data before it gets started. Once you start handling these large amounts of data, it’s very difficult to change your processes and plans for these systems if you have not really given it some thought at the starting point.

That creates all kinds of downstream challenges that you may not be in a position to change very easily, because your initial startup costs are very high, both in actual money and time. When you have to retool your systems and your infrastructure, that simply derails your process, so it definitely is good to consider a lot of these issues from the start.

Additionally, of course, all of the components must work together, and as Carrie will discuss briefly in a few minutes when she covers implementation, IT integration can be very, very challenging. In the ACO context this can be quite problematic, because shared savings payments linked to quality performance are based on a sliding scale that rewards attainment.

As Carrie mentioned earlier, if you don’t have the numbers, if your reporting tools are not working properly and you can't show the numbers, then you're not going to get the rewards, because you need to have proof that you have met the quality metrics. If those are not demonstrated and documented, there are going to be some real problems. When you start looking at the way your systems are integrated, this really does become quite problematic with the numerous EHR systems out there, not to mention e-prescribing, lab systems and billing systems. Again, it will work out well for you if you consider some of these challenges from the start and work with someone who's an experienced project implementation specialist, and of course, make sure you have the supporting documentation in terms of your contracts to actually back up all of these requirements. Carrie, I’ll send it over to you now.

Carrie: Sure, given the importance of data with respect to accountable care organizations, it shouldn't come as any surprise that technology vendors are absolutely key and integral. It’s a little bit counterintuitive, because everyone's saying an accountable care organization is a provider organization, so what does technology really have to do with it, and aren't the technology vendors just the IT guys on the outside?

It’s just not the case in this new era of health care. The technology vendors are key, and I'm going to take this opportunity to get up on my soapbox a little bit, and I know Tatiana joins me in this. I cannot underscore enough the importance of contracts between technology vendors and providers, on both sides. The importance is often overlooked. These contracts are a pain to go through and read word by word; they're long, they're often confusing, and everyone is feeling the urgency of simply getting the project up and running, so they don’t want to deal with really paying attention to the contracts.

I get it, it’s understandable, but the contracts really lay the foundation for a lot of these things, and it is important because these contracts can be drafted in a way that is very, very one sided, either on the provider side or on the vendor side. Both parties need to be aware of that and pay attention to that. Have your counsel take an active role in reviewing the contracts before they are signed, definitely before they are signed, because your counsel really can't help you too much after they're signed.

Get input on what these contracts actually say, ask for help in understanding what implications are for your business model. The bottom line is that working with vendors raises your risks and the risks can be minimized through the way your contracts are developed. Next slide.

Part of minimizing your risk is putting project implementation at the forefront of your thinking. Let’s take healthcare.gov as an example, and let’s actually look at it as an example of how not to do technology infrastructure development and project implementation; no surprise there. With over 50 vendors involved in the healthcare.gov project, the question arises, were there just too many cooks in the kitchen? The answer is maybe, but maybe not, if there had been a real point person at the helm overseeing the integration of these different systems and the implementation of the infrastructure as a whole. Part of the problem was that there really wasn’t a point person playing that role. As Tatiana mentioned, each technology is very specific. The clinical technology is different in a lot of ways from the financial technology, and some of that is different from the reporting technology that is used to get information directly to CMS.

The point is, how does all of this stuff interact together and how does it implement together? Healthcare.gov had problems with their interface, they had problems with capacity, they had lots and lots of problems which all of you on this call, on this webinar are aware. We just wanted to look to this as a lesson underscoring the importance of carefully thinking through your ACO technology infrastructure development and incorporating that planning into your contracts. Next slide please.

If we look more closely at project implementation the first thing I think you have to emphasize is outlining the scope of work with as much detail as possible but also in a way that doesn't box you in unnecessarily. For example, we've all probably been in situations where we think we understand the scope of work of a project and we get into it and there are unforeseen circumstances or something changes, and suddenly your scope of work isn’t really the right scope of work anymore.

How can you be as detailed as possible but not box yourself in? One way, as you're thinking through how to structure your contract, is to include periodic check-ins between the parties on the status of a project, with the ability for either party to renegotiate for unforeseen circumstances or for additional scope of work. You might consider dividing a project into phases with an evaluation period at the end. You should certainly include clear deadlines for stages of the work and for payment for that work.

A very important thing to think about in vendor contracting, which is something that people don’t like to focus on, is termination clauses. The fact of the matter is that breakups happen, and they are often unpredictable. No one ever goes into a relationship expecting that something's going to go wrong. It’s important in this kind of a relationship to make sure there’s a termination provision that protects your interests and doesn't leave you high and dry if a problem arises. Next slide.

Tatiana: Another important consideration when we're looking at legal risks and vendor contracting is privacy and security. HIPAA issues always arise in the health care setting, and given the number of penalties and enforcement actions, it's important that these requirements are discussed thoughtfully. There's been some commentary just recently, over the last few days, stemming from remarks made by a chief regional civil rights counsel at HHS. He was speaking on his own behalf at the recent American Bar Association conference in Chicago. Speaking on OCR enforcement, he told the conference attendees that the past 12 months of enforcement will likely pale in comparison to the next 12 months. He said that knowing what's in the pipeline. I suspect that the number will be low compared to what's coming up.

Take from that what you will, but certainly it's something that everyone is taking quite seriously, and these issues really do need to be addressed in your contracts. Please don't take for granted that because someone has signed a business associate agreement, they're living up to their expectations. Speaking as someone who has read a large number of business associate agreements: they're all different, they all say different things and they all have different requirements. There's no one size fits all.

Those agreements need to be negotiated to fit your specific requirements. On that note, friends don't let friends have data breaches; remember that, because everyone ends up getting burned in the end. As we recently saw in the New York Presbyterian and Columbia University settlement with the OCR, when two parties are in an affiliation agreement and one of them has a data breach, the other may get investigated as well, which is what happened with Columbia, where the breach happened at New York Presbyterian.

Similarly, the FTC has made clear that it expects parties to flow down privacy and security obligations to their vendors. For an example of that, you should take a look at the settlement agreement with GMR Transcription Services that was entered into earlier this year, where the FTC specifically called out GMR's failure to require its contractors to implement security measures. In other words, GMR failed to flow down appropriate privacy and security obligations.

In considering privacy and security, make sure that you're reviewing the business associate agreement in conjunction with any master services agreement and the statements of work. It's rare that a business associate agreement is segregated from other contracts, although this is something that I generally try to do, so I know it happens; trying to box in those requirements. Most of the time, though, the obligations are going to be split over a number of agreements, and what happens in one agreement is going to impact the rights and obligations under the other agreements. It's really important to understand how those pieces play together. As penalties and compliance obligations increase, it's also important to consider whether your agreement should require that your vendors carry cyber liability insurance and, probably more from the vendor side, whether or not you want to implement a damages cap and how much those caps should be. Certainly they can be different: one size in the master services agreement and a different size in the business associate agreement.

Another issue to consider is secondary uses of data. With data being the new commodity, many IT vendors and data analytics firms want the right to use the data outside a specific contract relationship. I know I've seen this come up in a number of agreements. Is this permissible under HIPAA? Maybe. It may depend on who's doing the de-identification. It's really something that is case specific and has to be analyzed carefully.

The de-identification that happens must be measurable. There are many tools available, and many of them have been developed with HIPAA considerations in mind, because HIPAA standards are generally considered to be a strong source of guidance for anonymization. There are certainly a lot of options out there when you're looking at de-identification.
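To make the idea concrete, here is a minimal sketch of what rule-based removal of direct identifiers can look like. The field names are hypothetical, and real HIPAA Safe Harbor de-identification covers 18 identifier categories plus an absence-of-actual-knowledge requirement, so treat this as illustrative only, not a compliant implementation:

```python
# Illustrative removal of a few direct identifiers from a patient
# record, in the spirit of the HIPAA Safe Harbor method. A real
# implementation must address all 18 identifier categories
# (names, geography smaller than state, dates, contact info, etc.).

DIRECT_IDENTIFIERS = {"name", "ssn", "phone", "email", "address"}

def strip_direct_identifiers(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {"name": "Jane Doe", "ssn": "000-00-0000",
          "state": "MI", "diagnosis_code": "E11.9"}
print(strip_direct_identifiers(record))
# Only non-identifying fields (state, diagnosis_code) remain.
```

The point Tatiana makes about measurability is why even a simple approach like this needs to be documented: you have to be able to show what was removed and why.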

Really, you have to look at this issue a little more broadly, in the scope of the entire process of how data flows throughout your network, and that includes how it got to you if you're the provider. Consider whether, if you allow this, it now constitutes a sale that should have been disclosed in the notice of privacy practices. If it wasn't, and the data falls within the definition of a sale, that can be quite problematic for providers.

In terms of big data, there are also issues that are specific to analytics. Having big data means that you'll need analytics to make sense of the data that you have; there's just no getting around that, because no person can handle that much information. Data science is not perfect, information is going to be missed, and organizations must be prepared to live with a certain amount of uncertainty when using analytics. It's important, though, to understand the algorithms that are being used, so that the good and the bad of a specific algorithm, and the kind of results it will produce or fail to produce, are understood. This is particularly an issue with predictive analytics, where you're using your existing data to predict what's going to happen in the future, or segregating patients based on X, Y, Z requirements and then predicting what's going to happen with those patients. It's really important to have someone on staff who can explain those nuances to you, so that when you're purchasing a particular product you know how it's going to work out, or at least have some idea of how it may work out.

If analytic methods are applied improperly, then mistakes are likely to propagate through the entire analysis. As I mentioned earlier, bad data gives you bad results, so it's important to understand what those methods are, how they are used and how they impact your data. For this reason it's really important that contract terms specify that the analysts have the skills and experience necessary to perform the services, and that the services are going to comply with prevailing industry standards.

You need to use language like that, and it's even better if you can define with specificity what you want those standards to be. Certainly that makes it easier to say, "Hey, you didn't follow these X, Y, Z requirements, this is a problem for us." It's much easier to hold someone to the requirements if you identify expressly what those requirements are.

It's also important that both sides consider damages caps, and providers should definitely require that their vendors carry E&O insurance, because that's a big deal. Again, if you have bad data, or you use the wrong analysis, that can cause some serious repercussions when the ACO goes to report the data to CMS. Carrie?

Carrie: Sure. I want to run briefly through some other contract considerations before we turn to any questions that you all have, which we'll be eager to hear and eager to answer. Who owns the data that an ACO is dealing with? Is it the ACO participants? Is it the management services organization? Is it the vendor that's crunching the data? That's an important consideration, and sometimes people may instinctively think that of course the ACO participants own the data.

That's not always the case. What I'm seeing in some structures is that, for example, the management services organization may own the data, the participants are contracting with that in mind, and the management services organization may then do with the data what it wishes. This is not a bad thing at all; it's just a consideration when a provider is entering into a relationship. Where is the data being stored? Who's storing the data for you? Is your vendor outsourcing the storage to others? Does it even matter? Should there be flow-down requirements? Do you need to detail the system infrastructure? What happens if you want to leave, if you want to exit the relationship? We talked about that a little bit as a breakup issue. Are there transition costs to doing that? The fact of the matter is there usually are some transition costs, but sometimes the cost is worth it for the long term. You need to be sure you know what happens to your data when you want to leave a particular vendor relationship.

Of course you need to pay attention to issues of indemnification, representations and warranties, and jurisdiction. I would say you also may want to consider having a mediation or arbitration clause in there, where if a dispute arises the parties agree to arbitrate rather than taking it to court right away. Those are some other things to think about. With that for now, I'm sure you all have some interesting questions that you'd like to pose, and I'd be happy to turn it over to April to facilitate those questions for us.

April: Great, thank you so much Tatiana and Carrie. A question about the insurance expectations. You mentioned a couple different types of insurance: one was a basic cyber insurance or liability insurance against data breaches, and the other one I think you mentioned was E&O, which I believe is errors and omissions insurance; is that correct, Tatiana?

Tatiana: Yes, that’s correct.

April: Can you describe if you have an example of what is a reasonable expectation for those insurance policies?

Tatiana: Sure. In terms of the cyber liability insurance, it really depends on the amount of data that's being processed and handled. Look at the Ponemon Institute data breach studies, the most recent of which I think came out last month, where it talks about the cost of remediation per record. From my perspective, that's really the way you want to look at it: if there's a breach, how much is it going to cost me to fix whatever happened?

You want to use the cost to remediate per record as at least your initial point of discussion, or your initial point for internal analysis, for how much insurance you want to carry and how much insurance you want your vendor to carry. I think the Ponemon Institute report said the cost per record was maybe around $250. Multiply that by the number of records you're handling and I think you can see that it's going to be millions of dollars, so it can be quite frightening.
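As a back-of-the-envelope illustration of the internal analysis Tatiana describes, the arithmetic looks like this. The per-record figure and the record count below are illustrative assumptions, not advice on coverage levels:

```python
# Rough estimate of breach remediation exposure, used as a starting
# point for deciding how much cyber liability coverage to require.
# Both inputs are assumptions; check the current Ponemon cost-of-a-
# data-breach report for up-to-date per-record figures.

COST_PER_RECORD = 250      # assumed remediation cost per record (USD)
RECORDS_HANDLED = 100_000  # assumed number of patient records

estimated_exposure = COST_PER_RECORD * RECORDS_HANDLED
print(f"Estimated remediation exposure: ${estimated_exposure:,}")
# prints "Estimated remediation exposure: $25,000,000"
```

Even a modest six-figure record count puts the exposure in the tens of millions, which is why the coverage discussion has to happen before a breach, not after.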

In terms of errors and omissions, again, you want to think about if someone makes a mistake in the analytics, really truly a serious mistake that causes harm to the ACO, how much money are you losing? Not only how much money are you losing, how much is it going to cost for you to get back to where you were supposed to be?

When you're looking at data analytics and the kind of processes you're using for improvement, some of the mistakes may not be as harmful, because you can compare over time and see whether or not that analysis is working. If you're looking at a continuous improvement model, which is what you're supposed to be doing, the ramifications may not be so bad. If, on the other hand, you're looking at quality reporting and now all of your numbers are wrong, that can be quite harmful, because now the shared savings you were expecting to get, you didn't get, as a direct result of this company's mistake.

You really, truly want to do an internal analysis of what this will cost you, and to the extent you can, you also want to take a look at your own insurance policy: what do you carry? See what your insurance companies will cover versus what you want the other side to carry, because certainly you should be insured as well.

Do not rely on the other side to be well insured unless you're getting a routine attestation from them, and routine proof that they continue to carry that insurance, because if they drop that policy, maybe they don't have millions of dollars to deal with a data breach and you're left holding the bag.

You certainly want to assess that, and when you're looking at who would make good partners, please, please consider these issues, because it doesn't matter if you're getting the lowest price point if that partner is judgment proof. Your partners are only as good as what they can live up to. Your determining factor should not be who is the most cost effective for what I need; it should be who is going to be the best partner long term for how I want to grow.

April: That makes a lot of sense, Tatiana, thanks very much for that. Carrie, anything to add to that question before we move on to the next one?

Carrie: No, I think Tatiana covered it quite well, thanks.

April: The next question is a bit broader. This is one for each of you. What are the trends that you see in the physician-led ACO market, and do you think they'll be able to compete with the larger ACOs? Carrie, do you want to take it first?

Carrie: Sure, I’ll take a first crack at that. Interestingly, the prediction at the outset I think was that most ACOs would be hospital driven. What we have found thus far is that the majority of ACOs actually are physician driven. I think this is a trend that is going to continue. I think as physicians have gotten on board with health care reform and as they continue to get on board with health care reform, they are going to be looking more carefully at technologies that can help them achieve what they need to achieve.

I think that physician groups are coming to realize that the old way of doing business isn’t necessarily going to continue to be the best way of doing business. I think we're seeing a new openness to technology. Technology in practices these days is not just electronic medical records. The fact of the matter is there’s a lot more to it than that.

Last week or perhaps the week before, I attended the Health Datapalooza conference in Washington DC, and it is amazing to see the new technologies that are popping up out there, many of them specifically around accountable care and ACOs. I'm seeing physicians, particularly entrepreneurial physicians of course, attending these types of things and wanting to test out these technologies.

I think that physicians are beginning to realize the important role that some of these technologies can play in allowing them to engage in an easy way with their patients and improve the quality of care, particularly when we're talking about the use of mobile devices by patients and about telehealth-type technologies as well.

I think there was some resistance to technology at the outset, but particularly as a new generation of physicians comes on board and acceptance of the new era of health care reform grows, we're seeing increased acceptance of technologies.

Tatiana: As you heard me mention a couple of times, data is power, and I've seen physicians really start to recognize that and embrace it. Hospital groups need them because the patients go to the physicians first; they have all of that information. To the extent that physicians are really embracing control over their patient data, that's the key to their staying independent and continuing to run their practices as they see fit. In terms of successes, I think there have been successes on both sides. Some hospital-led ACOs have been successful and some physician-led ACOs have been successful. It really comes down to who is at the helm and whether they grasp the influence of data: the kind of control it gives them and the kind of opportunities it gives them to improve the lives of their patients.

Carrie: Yeah, I would agree with that. I think to the degree that physicians want to maintain their autonomy and control over their own practices, and they become convinced that data is the way to do that, they are even more likely to embrace it.

April: Great feedback, thanks so much. All right, we have time for one more question before we close out today's webinar. It goes like this: with diversified EHR adoption, the cost of integration is higher. Can you compare the pros and cons of an isolated, perhaps more affordable and easy-to-implement data analytics solution with a more expensive, completely integrated solution? Where do you see the advantages falling out between those two?

Tatiana: That's a really great question, and from my perspective it really comes down to the skill set that staff members have in handling the technology. Doing data analytics, and I'm not necessarily talking about designing the algorithms but about reading and running the reports, requires a specific skill set that I don't think most physician groups, or even most hospital groups, will necessarily have on site.

To the extent the question is whether you should use internal systems or external systems, it really depends on the staff members and their skills at a particular organization. Certainly I've heard of a number of organizations using an outside vendor to come in, run the analytics and set up the entire process, and then training the internal staff members so that they can control the processes more directly.

It's true that the integration costs in dealing with a lot of these systems are very high. For those of you with an interest in this space, in the cost of integration and some of the potential antitrust and anti-competitive concerns surrounding it, please pay attention to what the Federal Trade Commission is doing on that front, because they're definitely interested in how IT vendors are behaving in our current system.

In analyzing the costs, you want to look and see what kind of outputs you can get from your existing data sources, and then figure out what processes, technologies and analytics tools are going to be best able to read those outputs. A big problem I've seen, in terms of being able to truly analyze the data, is that the data is locked up, and when you go to export you don't actually get all of the sources of data you probably need to run good analytics, so you're stuck at that point using whatever rudimentary systems the software vendors can provide.

As Carrie mentioned earlier, there are a lot of other vendors in this space who are building other systems, including wrap-around-type systems, that can help you pull out some of those additional data sources. I think it's to your benefit to really look at what's out there and the kind of opportunities you have, and certainly to work with someone who's experienced in the space who can help you navigate some of these very difficult issues.

Carrie: Great feedback from Tatiana. I would just say very briefly, on the question of a solution that is relatively inexpensive and easy to implement: I talked before about how ease of use is really important, because if people aren't using the technology, or they aren't using it appropriately, then it's no good. If you can find an inexpensive, easy-to-use system that can easily be implemented, then maybe that's okay. By the same token, integration is incredibly important, and you have to be sure that you have the full wrap-around of the data and that you're taking a comprehensive look at all of the data.

To the degree that an integrated system may be more expensive but is also easy to use, then great. If it's an expensive system that's fully integrated but not easy to use, a real pain to use, with a terrible interface, then it's no good; no one's going to use it anyway, your money is not going to good use, and you're getting bad data. If it's a fully integrated system that is highly user friendly but more expensive, you may want to consider taking a longer-term view of the investment.

April: Great feedback. Thank you so much Carrie and Tatiana, and thank you to our guests today; we had some great questions to discuss. We will be posting a copy of the recording and of the slides, and we'll send out a link to everyone who attended today. In the meantime, if any other questions come up, please feel free to reach out to Tatiana or Carrie or us at Online Tech, and we'll make sure you get connected with them. Have a great Tuesday everyone, and we'll see you on another Tuesdays at Two webinar soon. Thank you everyone. Bye.

Tatiana Melnik, Attorney
Tatiana Melnik is an attorney concentrating her practice on IT, data privacy and security, and regulatory compliance. Ms. Melnik regularly writes and speaks on IT legal issues, including HIPAA/HITECH, cloud computing, mobile device policies, telemedicine, and data breach reporting requirements, is a Managing Editor of the Nanotechnology Law and Business Journal, and a former council member of the Michigan Bar Information Technology Law Council.

Ms. Melnik holds a JD from the University of Michigan Law School, a BS in Information Systems and a BBA in International Business, both from the University of North Florida.

Carrie Nixon, President of Accountable Care Law & Policy
Carrie Nixon is the CEO of Nixon Law Group and the President of Accountable Care Law & Policy. She is a founding member of Healthcare Solutions Connection, a network of expert consultants providing integrated service solutions for the healthcare industry. As a longtime attorney for a variety of clients in the assisted living and long-term care industry, Carrie has on-the-ground experience with the unique challenges facing those who serve our aging population. She has successfully defended these clients against malpractice claims and deficiency citations, helping them to navigate the ever-changing regulatory and risk management landscape.

Carrie holds a JD from the University of Virginia Law School.
