Achieving Compliance: How to Prevent a Security Breach

Join Jason Karn of Total HIPAA Compliance and Dan Brown of Taylor English as they discuss how to prevent a security breach in this informative webinar.

Earlier in this series, we discussed How to Prepare for a HIPAA Audit, offered tips for Updating Your Plan & Training Your Staff, and reviewed the pros and cons of using Electronic Devices in Your Practice. In the final webinar of this series, we provide tips and next steps for How to Respond to a Security Breach if one occurs.



Download the Slide Deck


HIPAA Resources



Q: When do you have to report a breach to the HHS?​

A: Jason Karn: We call this the Rule of 500. If fewer than 500 people have been affected, you don't have to report to HHS immediately. Instead, you collect those names and report the breach to HHS at the end of the calendar year. Say you lost a spreadsheet with 200 names on it: that's 200 people whose information was lost, so you would report that to HHS at the end of the calendar year. You would still need to notify those people immediately. Depending on your state's notification rules, or if you've had patients sign off on a notification process, you may be able to email them; otherwise, you need to contact them or give them a phone call and let them know what's happened. If it's 500 or more people, or two different breaches that add up to 500 or more in a calendar year, then you need to report to HHS within 60 days of knowing, or reasonably being able to know, that the breach occurred.

An important caveat: make sure you know what your state notification rules are. If you're in the state of California and you have a breach of 500 or more, you also have to notify the state attorney general's office. If you don't notify their office, you could be subject to fines and penalties from the state attorney general of California. Other states have similar rules; California is just the most prominent one that I know of. If you have offices in multiple states, make sure you know the different reporting rules in each state, because you can get into trouble there as well.
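For teams writing a breach-response runbook, the timeline Jason describes can be captured as a simple decision rule. This is a toy sketch for illustration only (the function name and return strings are my own, and state-specific obligations are out of scope); it is not legal advice:

```python
def hhs_reporting_deadline(individuals_affected: int) -> str:
    """Toy sketch of the 'Rule of 500' timeline described above.
    Affected individuals must be notified promptly either way, and
    state notification rules may add further obligations."""
    if individuals_affected >= 500:
        return "Report to HHS within 60 days of discovering the breach"
    return "Log the breach and report it to HHS at the end of the calendar year"

print(hhs_reporting_deadline(200))  # the spreadsheet example: end-of-year report
print(hhs_reporting_deadline(750))  # large breach: the 60-day clock starts
```

A real runbook would also track cumulative totals across breaches in the calendar year, since Jason notes that two breaches adding up to 500 or more trigger the 60-day rule.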


Q: If you report a breach to HHS, are you automatically going to be audited?

A: Dan Brown: If you report a breach under HIPAA to the government, that does not necessarily mean you're going to get audited, though it's likely there could be an investigation of your facility or your program. Having said that, I think it's a good practice, before you notify a breach voluntarily, to make sure you understand the facts. Right now we're talking about volunteering a breach that doesn't otherwise require you to report; if it's a security breach of 500 or more, you have no choice but to notify. If we find out that we haven't done our Risk Assessment, let's say, are we obligated at that point to report ourselves to the government? The answer is no. What you're obligated to do is correct the problem as soon as you become aware of it.

If you do self-report, I believe you are opening yourself up to a much greater risk of an investigation. How great that risk is will really depend on how many resources the government has at that time in that location.

Jason Karn: To add to that: if you do have a release of information, those breach notifications to the government are mandated; it just depends on how soon you have to make them. The important thing, if you have to self-report, is also showing what you did to mitigate that breach: that you did another Risk Assessment, that you analyzed where the problem was, and what you did to then mitigate it.

Some breaches are easier to handle than others. If you mis-faxed somebody, show a process of what you've done: that you contacted the person you mis-faxed, that you did your best to retrieve that information or have it destroyed, and that you then started pre-programming fax numbers so that people wouldn't dial the wrong number. I think that will go a long way toward helping you out and protecting you.


Q: I frequently receive PHI by fax through an app on my phone, but if the app has a bypassed login, is the data still safe if my phone is encrypted?

A: Jason Karn: You definitely need to turn the authentication back on. The phone being encrypted and password protected does give you a layer of protection, and assuming the vendor has done its due diligence and the fax is being sent over SSL (Secure Sockets Layer) or TLS (Transport Layer Security), meaning the data is encrypted in transit, then yes, technically the data would be safe. From a safety standpoint, however, I would turn the app's login back on so that it serves as a secondary point of authentication. That would act as your two-factor authentication for any faxes coming in. Since you're going to have PHI in there, I think you should do everything you can to protect that information.


Q: What should I do if I've lost a thumb drive in my home that contains PHI? 

A: Jason Karn: That's a tricky one, and it brings up two things. How do you really know that you lost it in the home? I think you would have to do a Risk Analysis, figure out where you think the drive might be, and do your best to track it down. If you're unable to find it, you may need to report that as a breach. That's why I think it's so important that we start getting away from external drives and flash drives and start using web tools in the cloud: keeping things either in our EHR, or in Box or Sookasa, accompanied by a signed Business Associate Agreement.


Q: If a device with encrypted PHI is lost, why is that loss not considered a breach?

A: Jason Karn: As long as that information is password protected and encrypted, and the encryption keys and passwords are not with the device, that's basically looked at as saying, "That information is unreadable and unusable in its current format, and therefore it's not a breach." This is actually addressed in the law, and it states that if data is in an unreadable or unusable format, that it is not considered to be a breach.




Olive Lynch: Welcome again everyone, thanks for joining us today. My name is Olive Lynch and I'm the Creative Marketing Manager here at NueMD. We're happy to have Jason Karn and Dan Brown back with us for today's HIPAA presentation.

Jason Karn: Hello everybody, welcome to the webinar today. My name is Jason Karn, and I'm the Chief Compliance Officer over at Total HIPAA Compliance. I'm going to pass off to Dan here; he's going to do a little introduction. Dan?

Dan Brown: Good afternoon, and good morning to those on the West Coast. My name is Dan Brown, and I'm delighted to be participating today in our HIPAA webinar on security breaches: what they mean and how to mitigate them. Just as a little disclaimer: I am an attorney, and Jason is a HIPAA compliance professional. We'll be discussing some of the laws that affect you and your compliance with HIPAA and security breaches. Even though I am a licensed attorney, I'm not speaking today as your attorney. This program is educational only and intended just to go over some of the high points to make you familiar with what some of the laws are. With that legal disclaimer, I'm going to pitch it back to Jason, who's going to talk a bit about what a security breach is.


What is a Security Breach?

Jason Karn: Thank you so much, Dan. A security breach: what is that? It's any unauthorized release of information. We're going to talk about security breaches, and also a little bit about privacy breaches, meaning physical releases of information. This is any point at which you accidentally send a fax to the wrong person, get hacked, or somebody makes a mistake.

We're going to talk about the main sources. Breaches often occur due to theft: theft of computers, flash drives, any sort of media that may have information on it, and also paper files, anything like that. Then we have the loss of a device. Recently we've seen a lot of these, and some pretty hefty fines from them. This could be somebody leaving a device in an airport or a coffee shop and walking away without realizing it, with the information unencrypted and not password protected.

As you guys have probably heard more and more, we've got a lot of hacks happening, especially against medical professionals. There are a lot of ransomware attacks starting to happen, and there have been some really high-profile ones. The big one was out of Los Angeles, and they ended up paying, I think, about $18,000 to get their information back because they didn't have a good security plan. It's going to be interesting to see how that plays out in the next couple of months or years, whether it gets investigated by HHS, and what kind of penalties we're going to see for not having a good plan in place for something like this.

The category that captures everything else is mistakes. I wish I could say I never made any mistakes, but unfortunately we all make them, and sometimes they can be very detrimental to our HIPAA compliance. Mistakes would be, again, faxing the wrong person, or leaving your laptop somewhere. There's a whole slew of them: speaking too loudly in the hallway and other patients overhearing what's happening, for example. These are the kinds of things we need to look at, and they are breaches of a sort, because you're releasing information to somebody who's not authorized to access it.


Preventing a Breach

Jason Karn: How do you prevent a breach? I wish there were fail-safes for every situation and we could make this a hundred percent secure, but the old adage among computer programmers is that it's not if you get hacked, it's when you get hacked. What we want to do is stop this as soon as possible, and hopefully prevent anything from happening in the first place. These are what I call the low-hanging fruit: things you can do fairly effectively and easily, and if you're on top of them, they really will take care of a lot of your problems.

Updating operating systems is one of the key things you can do. You also want to make sure that any software you have is current (I've got it listed twice here, once for software generally), so if you're using an EHR, make sure that EHR is always up-to-date and that you're using the most recent version. Your operating systems need to be updated too. Security updates come out fairly regularly, and they need to be installed as soon as possible. Now, you don't want to do it in the middle of the work day; it's best to do it in the evening, maybe over a weekend. Sometimes, if you're doing a major system upgrade, say going from Windows 8 to Windows 10, you're going to want a whole weekend, because the device will re-index your whole hard drive, and that can slow your systems down. Just keep that in mind when you apply these security updates or major upgrades to your system.

It's very important that you do these; they can solve a lot of problems, because usually when a security update comes out, it's because a hack has been released. Somebody has noticed that something is happening, and that means it needs to be addressed fairly quickly. Next, put anti-malware software on your computer. You want to make sure all attachments are being scanned, and that if you're allowing people to use things like flash drives, those are scanned when they're plugged into your systems. I would actually recommend that you start migrating away from flash drives, since there are great file-sharing programs like Box and Sookasa that will encrypt that information and will sign the BA agreement with you. Keep in mind that once information goes onto an external device, you lose control of it. Your job is really to keep control of that information and know what's happening with it; if it's off your system and you can't log it, it's the same as if you don't have it at all.

Back to software: make sure you have that anti-malware software in there, and keep it up-to-date, because new definitions for it come out constantly. Then, training your staff. That is probably one of the most important items, because staff are probably the weakest link in the chain. If your software is up-to-date, your anti-malware is running, and your operating system is up-to-date, the staff become the crucial cog in that whole chain. You really want to ask: do they know not to open unexpected attachments, or random emails? Do they know what information they're allowed to talk about? Maybe make sure you have a social media policy so they know what they can release and what they can't.

We actually have seen some people get fined for posting things improperly on websites. You want to be really, really careful about that and make sure you've trained your staff on it. Staff can also be a first line of defense: if they notice screens are up that aren't supposed to be up, or something looks weird in the system, or something's changed, they can alert you and your IT staff that there's a problem. So make sure you empower them to notice and to speak out when they see changes in the systems. Probably the most important item, and I think I've harped on this in every webinar we've done, is password security, and training your staff on it. Passwords are the easiest way in, and the best way to keep somebody out.

You want a difficult password: at least ten characters long, with upper- and lowercase letters, numbers, and special characters, changed fairly regularly. It's also good practice to make sure people, if possible, are not using the same password for banking, social media, and access to your EHR; you should have distinct passwords for different items, protect them, and change them regularly. What I do, and what I recommend to a lot of people, is using a password management program. I use them, I think they're great, and they're a great way to keep control of those passwords, make sure they're constantly being updated, and use really difficult passwords. Personally, I use passwords in the twenty- to twenty-five-character range if I can; if a system won't support that, I use the maximum it will support. You basically just have to remember your master password, which should itself be very difficult, but those programs really help out.
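As a rough illustration, the policy Jason outlines (ten or more characters, mixed case, numbers, special characters) can be checked programmatically before a password is accepted. This is a minimal sketch, assuming Python's `string.punctuation` set is an acceptable definition of "special character":

```python
import string

MIN_LENGTH = 10  # the minimum length suggested in the webinar

def meets_policy(password: str) -> bool:
    """Check a candidate password against the policy described above:
    length, uppercase, lowercase, digit, and special character."""
    return (
        len(password) >= MIN_LENGTH
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

print(meets_policy("password123"))     # False: no uppercase, no special character
print(meets_policy("c0rrect-H0rse!"))  # True: satisfies all four requirements
```

Length checks like this are a floor, not a ceiling; a password manager generating long random strings, as Jason recommends, easily clears it.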

Some of those programs are LastPass and 1Password, but you want to make sure there's some pretty strict authentication on them also. Speaking of passwords and authentication: if you have the ability, turn on what we call two-factor authentication, which means you're using a secondary device to authenticate who you are. There's a program called Google Authenticator that works really nicely and is very universal, or the system might send an email with a secondary code that has to go in, so you make sure the person is who they say they are and is actually in control of the devices they say they have.
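Apps like Google Authenticator implement the TOTP standard (RFC 6238): the server and the phone share a secret, and each independently derives a short-lived code from that secret and the current time. A minimal standard-library sketch of that derivation, using an illustrative demo secret rather than a real credential:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Return the current time-based one-time password for a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # moving factor (RFC 6238)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Phone and server sharing this secret produce the same 6-digit code within
# each 30-second window, which is what lets the server verify the login.
print(totp("JBSWY3DPEHPK3PXP"))
```

This is why an attacker who steals only the password still can't log in: without the shared secret on the second device, they can't produce the current code.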

Encrypting information. Now, I know most people will say, "HIPAA doesn't explicitly say that you have to encrypt information. It's addressable rather than a required standard." But in that same sentence it says you need to do a thorough Risk Assessment and determine whether encryption is appropriate for your practice or your business. In actuality, I've done thousands of Risk Assessments for people, and I've yet to hear somebody's reason for not encrypting information and say, "Okay, that's all right." I would say encryption is your get-out-of-jail-free card, because if encrypted information is released, and there is no password and the key is not with it, that is considered not to be a breach. You don't have to report it, and it's a simple thing to do.

All operating systems include encryption now. On Windows 7, I believe you have to have the Ultimate or Enterprise edition in order to get what is called BitLocker. That's a free program; it'll encrypt your drive, and you put password protection on there. If that device is stolen or you lose it, you don't have to report that as a breach, because there is really no way somebody can get in and steal the information: it's password protected and encrypted. On the Mac operating system, it's FileVault 2 (like the number two). Again, a free program. You'll notice little to no lag on your system once you've encrypted the information; chips today are fast enough to process it without any problems.

Then, make sure you're encrypting those mobile devices. Your phones should be password protected at all times. Think of the information you have on those devices: if you're allowing email access from your EHR, you could have encrypted and unencrypted emails on there, depending on how patients are contacting you (we want to migrate them all to an encrypted email platform, but that information is there). You may have personal financial information on that phone, or other identifying information you don't want getting out, so it's really important that those devices are encrypted and password protected. iPhones are natively encrypted; you just want to make sure you turn on that password protection and keep it on. I recommend requiring a login every time. With the thumbprint authentication on a lot of these phones, it's very easy to set up, and it should be on at all times.

If you're using Android or Windows phones, you do have to set up encryption yourself. I've said this before: make sure you've backed up the phone before you do this, because if you stop the encryption process at any point, it will brick your phone, and you'll have to start from scratch and lose any information you had. Set up the encryption at night and let it run overnight; that way you won't be interrupting it. Also, encrypt any external SD cards you have, because that information can end up in the wrong hands too. Make sure all those devices are password protected and encrypted. Again, it will save you so much.

Review those Business Associates. We have an example coming up; I'm going to talk a little more about some specific cases we've seen, what the fines looked like, and what the organizations could have done to stop the breaches from happening in the first place, as a learning experience. It's really important to make sure Business Associates are doing what they say they're going to do, that you have a signed and counter-signed agreement, that you keep it in your files, and that you can access it in case there's a problem.


Penalties for a Security Breach

Jason Karn: With that, I'm going to pass back over to Dan. He's going to talk to us a little more about what the penalties look like for a breach before we go into some actual examples. Dan?

Dan Brown: Hey Jason, thank you very much. I appreciate it. I'm going to talk a little bit about the penalty process for a HIPAA violation. This goes beyond just a security breach: HIPAA requires that all covered entities meet certain privacy requirements as well as certain security requirements. If there's a breach or non-compliance by a covered entity, a healthcare provider, of any of the privacy or security regulations, you're exposed to some risk.

There are two types of risk that HIPAA will impose on folks. Somebody will call me up all the time and say, "The hospital told my pa about my ma's healthcare condition, and I'm going to sue the hospital because they breached HIPAA." I say, "Well, they may have messed up under HIPAA, but there's no private right of action under HIPAA." Basically, that means an individual cannot bring a lawsuit against a hospital or a physician for violating HIPAA. The only parties who can enforce a HIPAA violation are the Office for Civil Rights of the Department of Health and Human Services, HHS itself, and perhaps a state attorney general. If you do feel your HIPAA rights have been violated, your only remedy, in essence, is to file a complaint with the government.

It's interesting: there is some new law being made out there that says it's negligent for a physician's office to let the cat out of the bag with regard to your protected health information by not following HIPAA. You may be able to pursue your physician or hospital on a negligence claim quite apart from the HIPAA violation itself. You can say, "They were negligent because HIPAA set the standard at x and they were below the standard; I'm going to go after them for negligence." But that's something to talk about on another day.

Let's assume you're a healthcare provider and you violate HIPAA; you don't do the things HIPAA requires. For example, let's say you did not do a Risk Assessment for your privacy or security rules. You adopted policies, but you just never did that Risk Assessment. Believe it or not, that fact alone, the failure to do a Risk Assessment at the beginning, is a violation of a required regulation of HIPAA.

The consequence, if you're ever investigated, is that you will be in breach and liable for penalties. There are two types of penalties we'll talk about. One is civil monetary penalties, or CMPs. A civil monetary penalty is just what it says: a civil penalty, a fine. You have to pay cash; you write a check and you're done. The other type is criminal penalties. With a criminal penalty, not only must you write a check, but you could also lose your freedom; you could go to jail. We need to keep those two types of penalties in different buckets. We're going to talk first about the civil penalties, where you write a check only. Then we'll talk about the criminal penalties; that's when you go to jail.

What are the civil monetary penalties? There are four categories of civil monetary penalties, and they increase with your culpability and with the amount you have to pay. We're going to look at the first one here. If you do not know about a violation and you commit it, it will cost you between 100 and 50,000 dollars per occurrence. Let's break that down. This particular penalty provision relates to your individual case: even if you did not know about the violation, you can still get fined.

What does it mean that you didn't know? Does that mean you just never heard of HIPAA before? No, unfortunately that's not what it means at all. It's pretty clear that when we talk about knowing, at least for liability purposes, the requirement is not that you didn't know about HIPAA; it means you didn't know it was a violation at the time. What does that mean? It basically means your failure to perform the Risk Assessment at the beginning is, in and of itself, a knowing violation: you actually knew that you did the action, or didn't do the action. "I know that I did not do a Risk Assessment." That alone is enough to get you some culpability under HIPAA. Just because you don't know about the need to do a Risk Assessment does not excuse you, at least here, from any type of culpability.

What else does it say? It's going to cost between 100 and 50,000 dollars per occurrence. What's "per occurrence"? Does that mean every time I didn't do a Risk Assessment? No. While there's a lot of variability in the amount, it's very clear that a violation of HIPAA can be a continuing violation: "I should've done a Risk Assessment the first day and I didn't, and here it is a year later and I still haven't done it." That's a continuing violation of HIPAA. So what does it mean to pay 100 dollars per occurrence? The rules tell us that "per occurrence" means every single day during which the continuing violation occurs. Oh my gosh. That means in one year you've committed 365 separate violations of HIPAA, just because you didn't do a Risk Assessment at the beginning.

Now you've got some real money: 365 violations, times 100 up to 50,000 dollars each. Why is there this huge range between 100 and 50,000 dollars per day? The rules give us factors the government will consider: how many individuals were affected, the time period, and the nature of the harm, if there was any. Did it result in financial harm? Maybe somebody lost their job because of your HIPAA violation. Did it result in reputational harm? Your patient was made fun of because they had a particular disease, or couldn't get a job, let's say. Was there physical harm? Let's say you failed to keep the information secure, a doctor down the street needed your patient's information, and lo and behold, it wasn't there; it was gone or inaccessible, and the patient had an adverse reaction because the doctor couldn't get the information. That's physical harm. If your breach causes physical harm or financial harm, it's going to be more than 100 bucks per occurrence.

Now you can see that even though we have one slide with only 12 words on it, it bristles with all types of legal meaning. We have to understand what it means to have a violation, what "per occurrence" means, why the range is what it is, and what it means to "know."

That's the first of four categories. Next: if you're an entity and you have a violation due to reasonable cause, you can expect to pay between 1,000 and 50,000 dollars per occurrence. What is reasonable cause? It's defined to mean that you knew, or should have known, that it was a violation. It wasn't willful, but you should have known that your lack of activity would cause a violation of HIPAA. You can see the limits are a little higher there.

The next category is a violation caused by willful neglect that is corrected. That's going to cost between 10,000 and 50,000 dollars per occurrence. What's willful neglect? Willful neglect means the conscious or intentional failure, or, most importantly, reckless indifference to the obligation to comply. Let's say you attend a seminar about HIPAA, the people at the seminar indicate, "Hey, you need to have a Risk Assessment," and you say, "I'm not going to do it because it's just a pain." If there was then a breach and someone could prove that you attended a seminar or webinar where the requirement to have a Risk Assessment was discussed, then you knew about it and decided, for whatever reason, not to do it. I'm not dissuading you from attending webinars, but I am suggesting that you keep these legal obligations in mind as you go about your HIPAA compliance activity.

If you have willful neglect that you don't correct, it's going to cost you some bucks. If you look at the next slide over: a violation due to willful neglect that you don't correct. Oh my gosh, you knew you shouldn't have been doing it, and when you found out about it you said, "Oh, I guess I could correct it, but I really don't want to." In that case, you have even greater monetary liability. Each category has a maximum penalty of 1.5 million dollars in a year.
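Dan's continuing-violation arithmetic (one occurrence per day, a per-day fine within the category's range, capped at $1.5 million per category per year) can be sanity-checked in a few lines. This is a sketch of the calculation as described in the webinar, illustrative only and not a substitute for the actual regulations:

```python
ANNUAL_CAP = 1_500_000  # per-category annual maximum mentioned above

def continuing_violation_exposure(days: int, per_day: int) -> int:
    """Each day a continuing violation persists counts as a separate
    occurrence, subject to the per-category annual cap."""
    return min(days * per_day, ANNUAL_CAP)

# A year without a Risk Assessment at the $100/day floor:
print(continuing_violation_exposure(365, 100))      # 36500
# The same year at the $50,000/day ceiling hits the annual cap:
print(continuing_violation_exposure(365, 50_000))   # 1500000
```

Even at the floor, a year of inaction is real money, which is Dan's point: the clock starts the day the Risk Assessment should have been done.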

What about the criminal penalties? We'll talk about those now. These are fines for individuals, though, interestingly, with criminal activity not only individuals but also institutions and entities can be held liable. Let's say you have a physician practice and you think, "The practice is the criminal, not me; I'm just the doctor who owns it." In fact, the government likely can, and will, go after the officers and directors of the entity personally on these criminal fines.

With the criminal fines, we're talking not only about writing a check but also about possibly going to jail. The first is if you knowingly obtain or disclose protected health information: basically, you knew you were getting your hands on PHI and disclosing it in a way you know violates HIPAA.

The next one is conducting an activity involving protected health information under false pretenses. That's basically calling someone up, saying, "I'm an insurance agent, I need your PHI," getting all their protected health information, and then selling it or otherwise improperly using it. That's a pretty hefty fine: $100,000 plus five years in prison. That's not continuing; it's just $100,000 and prison time. The biggest, worst, and most egregious thing that can be done is intending to sell PHI for financial gain or harm. That would get you a $250,000 penalty plus 10 years in prison. These, again, are brought by the state as a crime against you. The worst you could possibly do is break into a hospital's medical records system and sell the information on the open market for whatever you can get. That's the very worst activity under HIPAA and the type that would get you into the most trouble.

Let's assume for a minute, after listening to my talk, that you're thinking, "Wait a minute, I have gone to work every day, I have done everything I thought I was supposed to do, and now I'm being told I could write a check for a lot of money." Before you get too scared, there is a provision in the law that says there are ways to mitigate these damages. We don't necessarily have to go to the extreme of writing a huge check, even for a violation. The government will look at a range of activities and facts that will mitigate your exposure under HIPAA.

For example, suppose it can be proved that you had a violation. (This applies to the civil monetary penalties, not the criminal ones; criminal is different.) They say, "Oh, you didn't do a Risk Assessment, and we came by and did an audit, and you don't have it." You say, "Well, nobody's been harmed," and you can prove it wasn't due to willful neglect: "I just didn't know; I thought I got some bad information." If you can show that you corrected the failure within 30 days of becoming aware of it, you pretty much get off free.

The rule there is: "Wait a minute, I didn't know I had this problem, though I probably should have known. It's a civil monetary penalty. There wasn't really any harm; it was just a few people in my small organization. Once I found out I was not in compliance, I took immediate steps to correct it." If you can show all of those facts to an investigator, now or later, if one ever shows up ("Yes, back in 2013 we didn't do the Risk Assessment; I found out I should have, and within 30 days we actually had one accomplished"), that should go very far in mitigating, if not absolutely extinguishing, your civil monetary penalty, depending on a bunch of other facts.

That gives you an opportunity to think about some of the risks in failing to adopt a HIPAA compliance plan and have it be vibrant and living in your practice. Jason, you have some examples?


Breach Examples and What Went Wrong

Jason Karn: Yes, thank you so much, Dan, I really appreciate that. I'm going to talk a little bit about some examples, and these are most of the fines we've seen in the last 6 months. We've seen a definite uptick in fines and in the settlement agreements that are coming out, and in what we're seeing from HHS as they move forward.

The first one just came out on the 21st. New York Presbyterian was fined $2.2 million for filming patients without permission. This was for a TV show on ABC; they didn't get the patients' permission beforehand, 2 patients were in distress, and they ended up being fined for that. They tried to get the camera crews to stop.

The reason I bring this up is that it may seem like a weird example to use, but I think it's really important, because as physicians you probably see local celebrities, or other people who are known throughout the community. You need to be very careful about information that you're releasing to news organizations, and you need to make sure you're protecting that information.

There are 2 other examples that go with this; if you're a football fan, both of these are football examples. You had Jason Pierre-Paul of the New York Giants: after his fireworks accident, it turns out he's now suing one of the people from ESPN because of an improper disclosure of information. We might see some fines come from that. About 2 years ago Cam Newton had ankle surgery, and it turns out the husband of one of the nurses involved in the surgery called a local sports station and disclosed that he'd had surgery on his ankle.

It's things like that: you want to make sure you're training your staff properly so that information doesn't get out to the public. It's really important that you keep that information private for your patients. It is your job to protect them, not just to treat them.

Moving forward, this is another one that just came out, on April 14th. Raleigh Orthopedic, a small orthopedic clinic with, I think, about 16 to 20 sites in the North Carolina area, was fined $750,000 for not having BAAs. They had contracted with a company that was supposed to transfer all their x-rays into digital format. They went through the whole process, but they didn't properly vet the Business Associate. They gave the x-rays to the Business Associate, who then turned around and mined the x-rays; it turns out there was somewhere between $130,000 and $250,000 worth of silver in those films.

That was a major breach, and the lesson we learn here is: vet your Business Associates. Ask to see HIPAA compliance plans. You need to view training records and make sure the Business Associate is actually training their people. You want to make sure you have a countersigned agreement, that you're keeping it in your files, and that you can easily access it. That can be in either digital or physical format; you just need to make sure you have it.

Business Associates: we're seeing more and more of them becoming the issue, or becoming the reason why these breaches are happening. It's very important because we all need Business Associates. We all need our file sharing. We need our EHR companies. We need all these companies to help us do business, because we don't have time to do it any other way. It's up to you to make sure that you're properly vetting them, that you're signing up people who are valid and real Business Associates, and that they do what they say they're going to do.

Going on: on December 14th of last year, so not too long ago, we had a $750,000 fine. This was a case where an employee downloaded malware into a system, which affected about 90,000 patients. What we saw was that they had a very limited Risk Assessment; they had essentially only done a Risk Assessment on the EHR system. The OCR Director, Jocelyn Samuels, actually came out and said this was an example of too limited a Risk Assessment.

If you've done a Risk Assessment on just your electronic systems, you really need to look at everything. We need to look at the physical side: alarm systems, fire suppression systems if you have them, how you physically secure your servers if you have a server on site, how you secure workstations, any of those things. And we need to know administratively what's going on: who to contact when there's a problem, who the Privacy Officer is, and who the Security Officer is. Those can be the same person if you're a small practice, but you need to have both roles filled.

The key message we got here was that everybody and everything needs to be looked at. That Risk Assessment really needs to be thorough, which means comprehensive and effective: you've looked at practically everything you can think of to look at. That may mean you contract an external company to help do that for you, or, if you have a good IT person, they might be able to look at least at the security systems, but you may need to find another way to cover both the privacy side and the physical and administrative items.

This next one goes back to what I talked about at the beginning, about password-protecting and encrypting your laptops. This was an $850,000 fine for a stolen laptop that held the information of 599 individuals. There was a whole list of things that went wrong here, and I think we can learn from it. They didn't do a thorough Risk Assessment, which we all know we have to do.

They didn't physically secure or safeguard the workstation that was stolen. They failed to implement policies and procedures for safeguarding it. They also did not have unique usernames and passwords for tracking who was accessing the information on that workstation. It's important that everybody has a unique username and a unique password so that you can track when somebody accesses different information.

When we look at this, we really say: okay, we need to make sure that we are password-protecting our workstations, and that everybody has a unique identifier, meaning a username and password, and hopefully two-factor authentication, so maybe they're logging in with their password plus a code from their cell phone. Whatever it is, we need to make sure that those devices are protected and that we can log what's happening. There was a lot to learn from this specific example.
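To see why unique credentials matter for logging, here is a minimal sketch of an access audit trail. This is illustrative only: the function and field names are hypothetical, not from any real EHR system, and a production audit log would live in tamper-resistant storage rather than a Python list.

```python
import datetime

# Append-only log of who touched which record, when, and how.
# Structure is a hypothetical illustration, not a real EHR schema.
access_log = []

def log_phi_access(user_id, patient_record_id, action):
    """Record a PHI access event keyed to a unique per-person user ID."""
    entry = {
        "user": user_id,              # unique per staff member, never shared
        "record": patient_record_id,
        "action": action,             # e.g. "view", "edit", "export"
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    access_log.append(entry)
    return entry

log_phi_access("jsmith", "patient-1042", "view")
log_phi_access("mlee", "patient-1042", "edit")
```

The design point is the `user` field: with a shared login, every entry would name the same account and the trail couldn't tell you who actually accessed a record, which is exactly the failure cited in the example above.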

Again, I think every one of these examples tells you that you need to make sure you've done that Risk Assessment. That's your blueprint for everything, and it will really help open your eyes to potential issues that you might run into along the way. Let's say you don't have a policy for updating your operating systems. The Risk Assessment makes you sit down and say: wait a second, we need to know who's in charge of doing that, when it's going to happen, and how that process is going to go forward. Having that plan written out so that you know what to follow is key, not only for instances where HHS might audit you, but also because it tells people: this is what my job is, and this is how I'm supposed to do it.

Just having something oral, or saying, "Hey, this is what we might do," doesn't really cover that. We really want to make sure that we've analyzed those risks, that we've looked at them, and that we've decided what we're going to mitigate and what we're going to change in our process.

We find that when we go through these processes with people, they're very eye-opening. There are a lot of really easy things we can take care of that are major red flags. It doesn't have to be something that's going to cost you thousands and thousands of dollars. It's something you can build up to, and a lot of the items I'm talking about, when it comes to logging in, unique usernames, and those sorts of things, are built into the systems you already have. It's using the tools that you already have and realizing you're not using them to their full extent.

Going forward, obviously we want to try to keep things from getting out in the first place. It's making sure we have documented these processes and implemented them moving forward, so that we know not only how we're trying to stop a breach from happening, but also what we're going to do when a breach happens.


We appreciate your interest and know that maintaining compliance with HIPAA can be a big task. If you're still a bit behind schedule, our partners at Total HIPAA Compliance and Taylor English are available to provide expert HIPAA compliance training and consultation.