Cook County Clerk on Election Security

Enigma Forensics’ CEO interviews Cook County, Illinois Clerk Karen Yarbrough on election security. The two discuss progress made in securing the vote against cyber attacks over the last several years.

Clerk Yarbrough has been working to streamline and improve the efficiency of the Clerk’s office while ensuring that the 2020 election is protected against rogue nation states that may want to compromise our next election cycle.

Transcript of the interview is as follows:

Lee Neubecker: I am here today with Karen Yarbrough, who is our Recorder of Deeds and Clerk in Cook County here in Chicago.

Clerk Karen Yarbrough: Well, not quite Recorder of Deeds anymore, Lee. I am now the Cook County Clerk and will be taking over the Recorder of Deeds office in about a year. We actually went to the voters, and the voters decided that they were going to do a consolidation of the two offices, and so I will pick up the Recorder’s job in about a year.

Lee Neubecker: So you must have a lot of integration going on with technical resources.

Clerk Karen Yarbrough: You can imagine, and yes we do. I have a very capable staff, and we’re trying to get our arms around it. You know, in the Clerk’s office there are a number of duties and responsibilities: we have elections of course, we have vital records, and then we also are involved with taxes. I’ve been in this job since December, and what I’m trying to do now is get ready for 2020 and the big election for sure. But we are also absorbing the duties of the Recorder of Deeds. Big undertaking.

Lee Neubecker: So with all the talk of election hacking and whatnot by different nation states and foreign entities, what kinds of things are you involved with at Cook County to help defend against the voting system being attacked in the next election cycle?

Clerk Karen Yarbrough: Well, for starters Lee, our approach is a multi-leveled risk management approach. We know that no system is foolproof. I mean, you know, it’s not a perfect system; no system is. Knowing that, we tend to look at every aspect of our system. We have these guiding principles: Defend, Detect and Recover. What that simply means is we have a plan, a plan A and a plan B, all the way to Z.

Lee Neubecker: So it’s more than just putting your head under the covers.

Clerk Karen Yarbrough: Oh, no, no, no. I noticed when we were in the Recorder of Deeds office our systems were attacked on a daily basis, people scraping our sites and all of these kinds of things. So I am aware of this business of, you know, people trying to steal data and what have you. But the elections are absolutely, positively important. People need to understand that their vote does count and it will count. All the noise we’re hearing from Washington DC really makes people nervous.

Lee Neubecker: What kinds of things have happened to help make sure that wasn’t going to happen? Let’s say the computers all get zapped; what makes sure that votes that are cast get counted?

Clerk Karen Yarbrough: Well, first of all, I have a team of experts on staff. We’re sharing a gentleman with the City of Chicago who is at the top of the food chain when it comes to people who know about this kind of thing. Besides having those people on board working with the City of Chicago, we also have two-factor login authentication, of course, along with firewalls, VPN and dedicated private data networks. Then we’re going to be able to lock down our systems, both the hardware and the software, before and after elections. So those are the kinds of things that we’re doing. And I think we’re going to be ready come 2020.

Lee Neubecker: I understand that you’re currently doing some projects to seek outside computer forensic experts. What is your office looking for assistance with right now?

Clerk Karen Yarbrough: I think we’re putting something together right now. I might want to defer to John Mirkovic, who’s with me here today, on how that’s going. John’s been with me since I was actually in Springfield as a legislator, and he has been working on the Blockchain Initiative and certainly this. So, if you would, could you defer to him, so he can talk about what we’re doing there, because John keeps up with this more than I do.

Lee Neubecker: Sure, absolutely. In the event that a data breach were to happen, what kinds of things are in place to make sure that you can recover and get back?

Clerk Karen Yarbrough: Sure. Okay, having those plans certainly is important. But you know, Cook County just spent 32 million dollars on new voting equipment. The voting equipment that we have, it’s almost like going back to the future. You know, all the talk about voting on the internet and all these kinds of things may come at some point in the future, but today we need to know that those votes are safe. So, with the system that we have now... I don’t know if you remember, but you would have a system where you have on the side this kind of ticker tape thing that would show you how you voted.

Lee Neubecker: Paper audit trail.

Clerk Karen Yarbrough: Okay, yeah, well, nobody noticed it. I mean, I shouldn’t say nobody, but many people didn’t notice it. With the new equipment, and we piloted it actually in your suburb and a couple of others, we ran it through, and people loved it. It was so simple. You vote the same way you vote now, so you can use your stylus or what have you. You place your vote, but then it’s going to shoot your ballot out to you. You’ll be able to hold that in your hand. You’ll be able to see if everything you voted for is there. And then you, not somebody else, but you will be able to post and cast your ballot.

Lee Neubecker: So the key thing is, while the votes are being stored electronically, they’re also being printed and verified in a printout that people can see. And then they can take it over, feed it in and scan it, so you have another level of detection, and you’ve got the paper vote locked up in a box.

Clerk Karen Yarbrough: Exactly. And you mentioned something about the whole system blowing up. Okay, so if the whole system blows up, we still have that paper ballot locked away. So if we have to go back, let’s say everything blew up and people are running all around with what have you, we can go and retrieve those documents and, by hand, we can actually count those votes. So people should feel confident.

Lee Neubecker: It’s a great improvement.

Clerk Karen Yarbrough: It is.

Lee Neubecker: I was brought in to consider bidding on the forensic work for the suburban voter audit project. At the time, what I was concerned about is that there wasn’t a simultaneous printout, and at certain points in time the votes only existed electronically on storage media. They would be transferred to a consolidator that would transmit them. There was a potential at the time that someone could have a USB device preloaded with 118 votes but in a different distribution. They could swap that device out and put it in the consolidator. But that doesn’t exist now with the new equipment.

Clerk Karen Yarbrough: Not at all. So we’re happy about that. Let me tell you, we’re happy about that. The voters who voted in the last election, both the voters and our folks who run the elections, the judges and what have you, just absolutely love the new system. They liked the fact that they were going to have that ballot in their hand. We shared with them what happens now. I said, well, your votes are going to be counted. They said, well, what if? That’s the same question that you ask. Well, what if? Well, we’ve taken all those precautions. But, Lee, I know, like you know, while you have a better mousetrap today, you always have to stay on your Ps and Qs. The young man I was talking about, Raoul is his name, we share with the City of Chicago; every day he’s checking our system. Right now, we’re just about ready to go. I think if we had to have an election today, we could have that election with the confidence that we need: it’s going to be a good election, it’s going to be safe, and people are going to feel good about how they’re going to be able to cast their ballot. I’m just excited about the whole thing.

Lee Neubecker: I appreciate everything you’re doing to help secure the vote in Cook County and all your effort to streamline the government.

Clerk Karen Yarbrough: Well, thank you so much for the invitation to come on. I’m just thrilled, and I know that you’re a real geek and you know all of this stuff. But thank you so very much for having me on.

Lee Neubecker: Thank you Karen Yarbrough!

Watch the second part in this two part series on Cook County Election Security here.


Office 365 Chameleon Spearfish Malware Attacking Microsoft Users

Enigma Forensics cyber security and computer forensics expert Lee Neubecker discovered a morphing piece of malware, named Chameleon Spearfish, that targets Microsoft Office 365 users. This notice is an effort to help Microsoft Exchange administrators running Office 365 identify the malware and protect their users from compromise. Microsoft issued an advisory last week alleging that Iranian hackers have been targeting Office 365 accounts.

Characteristics of the malware

The malware is spread when an Office 365 end user clicks on an emailed pdf attachment. Users who do not open the attachment but reply to the compromised sender may receive an auto reply directing them to a sharepoint.com subdomain website. The page appears to be the compromised organization’s download site and displays a “protected by Norton” logo.

Be Aware of Spearfish Malware

We have observed both the original inbound attachment and the outbound attachment that gets sent onward to the compromised user’s address book. Thus far, only users of Office 365 appear to be targeted. It appears that the malware checks the compromised user’s contacts and performs an MX record query to determine which contacts in the compromised user’s address book are hosting their email with Microsoft.
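
The MX record check described above is straightforward to reproduce for defensive purposes. Below is a minimal sketch, using the dnspython library, of how a script might determine whether a contact’s domain routes mail through Microsoft; the contact addresses and the outlook.com suffix test are illustrative assumptions, not code recovered from the malware.

```python
# Minimal sketch (dnspython 2.x): check whether a contact's domain
# publishes an MX record pointing at Microsoft's Office 365 mail service.
import dns.exception
import dns.resolver

def hosted_by_microsoft(email_address: str) -> bool:
    domain = email_address.rsplit("@", 1)[-1]
    try:
        answers = dns.resolver.resolve(domain, "MX")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, dns.exception.Timeout):
        return False
    return any(
        str(rr.exchange).rstrip(".").endswith("mail.protection.outlook.com")
        for rr in answers
    )

# Hypothetical contact list for illustration.
contacts = ["alice@example.com", "bob@contoso.com"]
print([c for c in contacts if hosted_by_microsoft(c)])
```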

The inbound pdf conforms to an identifiable schema.

  1. The message uses the compromised user’s signature at the bottom of the email.
  2. The file attachment has a name similar to the following:
    “Proposal Invitation 10-7-2019.pdf”, “Proposal Note 10-8-2019.pdf”
  3. The hash values of the file attachment are unique and not reported as problematic at the time the malware is morphed (a hash-computation sketch for responders follows this list).
  4. The body content of the message varies, but is designed to induce the user to click on the pdf suggesting it is a proposal for business.
  5. Users clicking the pdf are directed to a website where the user is asked to provide their Office 365 Exchange credentials.
  6.  One of the samples directed the user to a specific url on the following domain, https://adswbellc-my.sharepoint.com (Pinging this address resolves to 40.108.203.33, an Akamai IP address which may vary depending on the source computer performing the ping).
  7. Another of the samples when clicked directed the user to a link on the following subdomain https://netorgft2768825-my.sharepoint.com (Pinging this address resolves to 13.107.136.9 a microsoft.com IP address).
  8. Future instances of this malware may upload further documents to other compromised Office 365 SharePoint websites.
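
For item 3 above, responders can compute and record attachment hash values locally before checking them against reputation services. The snippet below is a minimal sketch with a hypothetical file name matching the observed pattern; it is not tooling recovered from the incident.

```python
# Sketch: compute common hash values for a suspicious attachment.
import hashlib
from pathlib import Path

def file_hashes(path: Path) -> dict:
    data = path.read_bytes()
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha1": hashlib.sha1(data).hexdigest(),
        "sha256": hashlib.sha256(data).hexdigest(),
    }

# Hypothetical attachment name matching the observed naming pattern.
print(file_hashes(Path("Proposal Invitation 10-7-2019.pdf")))
```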

Once the pdf attachment is clicked on, the malware appears to morph itself, making it undetectable by common antivirus solutions, and begins further distribution and propagation.

Analysis of email headers on inbound and outbound messages containing the compromised pdf indicates the MAPI protocol is used to relay the message onwards to the compromised user’s contacts. Only Outlook.com and Office 365 users appear to be targeted by Chameleon Spearfish. Analysis of the malware code is in progress, but it appears that the emails are distributed from software running on the compromised end user’s machine using the MAPI protocol to connect to Office 365.

Items in the compromised user’s sent folder are purged by the malware, making it difficult to understand who received the morphed copy of the malware. Organizations using Office 365 Compliance functions should be able to determine any outbound messages sent by a compromised account by searching their enterprise.

Protective Recommended Measures

  1. Make a local DNS entry or local machine HOSTS file entry to sandbox adswbellc-my.sharepoint.com to 0.0.0.0.
  2. Consider blocking all sharepoint.com traffic outbound with an exception for your internal sharepoint.com subdomain if applicable.
  3. Search your mailboxes and Office 365 Compliance for “Proposal*10-*-2019.pdf”
  4. Search firewall traffic logs for users visiting any sharepoint.com website, but especially adswbellc-my.sharepoint.com.
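
For the firewall log review in item 4, the pattern matching can be as simple as the sketch below. The log file name, its line format and the flagged domain list are assumptions that would need to be adapted to your environment.

```python
# Sketch: flag outbound log lines referencing sharepoint.com, highlighting
# the specific subdomain observed in this campaign.
import re

SUSPECT = re.compile(r"([a-z0-9-]+\.)*sharepoint\.com", re.IGNORECASE)
KNOWN_BAD = {"adswbellc-my.sharepoint.com"}

def scan_log(path: str) -> None:
    with open(path, encoding="utf-8", errors="replace") as log:
        for lineno, line in enumerate(log, 1):
            match = SUSPECT.search(line)
            if match:
                host = match.group(0).lower()
                label = "KNOWN-BAD" if host in KNOWN_BAD else "review"
                print(f"{label}\tline {lineno}\t{host}")

scan_log("firewall_outbound.log")  # hypothetical log file name
```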

What to do if you are compromised?

  • Rotate end user passwords for any user that clicked on the pdf and do this from a machine that is secure.
  • Back up data from the compromised computer and deploy a fresh image of the operating system and programs.
  • Notify any downstream impacted users about the compromise by sending them a link to this article if you or anyone in your organization was compromised.
  • Consider hiring our firm to assist you if you have a severe outbreak.


Frederick Lane on Youth Cybertraps

Author, privacy expert and computer forensics expert Frederick Lane sat down with me recently to discuss his book, “Cybertraps for the Young.” Lane has published three Cybertrap books thus far. He discusses the risks that arise when parents give their children devices that may put them at risk of committing crimes. Lane shares insights from the book and expresses concern that applications and games on phones are being used to harvest information about kids. He recommends that parents delay giving children electronic communications devices as long as possible. Society presses kids to get online, but that may not be best for children.

The transcript of the video interview follows:


Robocall Legislative Update

Are robocalls driving you nuts?

Cyber Security & Computer Forensics Expert Lee Neubecker and Data Privacy Expert Debbie Reynolds discuss recent efforts to pass legislation in the House and Senate that would hold telecommunication providers responsible for addressing the ever-growing tide of robocalls disrupting consumers and businesses. Existing laws such as the Telephone Consumer Protection Act (TCPA) have proven ineffective at blocking offshore robocalls. VOIP technology allows robocall centers to systematically dial U.S. consumers and businesses from beyond the legal reach of our court system. Popular spoofing techniques such as Neighborhood Calling often impersonate the first six digits of the call receiver’s phone number in the hope of enticing that call receiver to answer. Neubecker and Reynolds both share their frustrations with the current situation and are hopeful the U.S. Senate and the President will take immediate action to pass updated privacy legislation protecting us all from spam robocalls.

The transcript of the video follows:

Lee Neubecker: I’m here today with Debbie Reynolds. We’re going to be talking a little bit about robocalls and some new legislation coming our way, those annoying phone calls we all get on our cellphones.

Debbie Reynolds: That’s right.

Lee Neubecker: Have you gotten any calls where it’s the first six digits of your phone number?

Debbie Reynolds: Yes!

Lee Neubecker: That’s called “neighborhood calling.” Basically, what the bad guys are doing is using VOIP technology to spoof the caller ID, and they can plug in any number, so they can actually impersonate people you know. They do this because they think it increases the likelihood that you’ll answer the phone. In fact, for me, when I see those first six digits, I’m not even going to answer it.
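
The pattern Neubecker describes is easy to screen for programmatically. The sketch below flags a caller ID whose first six digits match your own number; the phone numbers are made up for illustration.

```python
# Sketch: flag likely "neighborhood calling" spoofs, where the caller ID
# shares the area code and exchange (first six digits) of your own number.
def last_ten_digits(number: str) -> str:
    return "".join(ch for ch in number if ch.isdigit())[-10:]

def looks_like_neighborhood_spoof(my_number: str, caller_id: str) -> bool:
    return last_ten_digits(caller_id)[:6] == last_ten_digits(my_number)[:6]

print(looks_like_neighborhood_spoof("(312) 555-0142", "312-555-7789"))  # True
print(looks_like_neighborhood_spoof("(312) 555-0142", "773-555-0100"))  # False
```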

Debbie Reynolds: Oh, absolutely. Absolutely. It’s wrong or what now?

Lee Neubecker: One of the big problems we have is no one’s taking accountability for this. I heard AT&T is trying to force some authentication mechanisms, but there needs to be some more teeth on this so that people can’t just impersonate phone numbers, or we’ll never get through this.

Debbie Reynolds: Absolutely, absolutely. Actually, so, thankfully this law passed, right?

Lee Neubecker: Well, it’s going through. It passed in the House, overwhelmingly.

Debbie Reynolds: Overwhelming, yeah.

Lee Neubecker: They’re hoping that… It said it could happen by 2020, perhaps?

Debbie Reynolds: Okay, that’d be good.

Lee Neubecker: But it’s got to… I think they have to reconcile the two bills, the House and Senate versions, and then the President has to sign it. But by the show of votes, I think everyone’s in favor of tackling all these annoying robocalls.

Debbie Reynolds: Absolutely. So the FCC, they really made a lot of headway many years ago on the Do Not Call Registry, so this will be sort of another layer to that that the FCC is looking at. I don’t know about you, but I’m very annoyed when I get robocalls, so I’m not happy about the current situation. Maybe it will happen after the election, because during the election people like to robocall.

Lee Neubecker: I get tons of calls from people wanting to lend me money. They will ring my phone once and then it will hit my voicemail. This woman keeps calling, saying, I want to speak to you. And it’s not even a real person; it’s all automated. It’s annoying.

Debbie Reynolds: Oh, my goodness. Well, one interesting thing about the law, or the one that they’re anticipating, or trying to pass, that I haven’t seen in other laws like this: they’re trying to force companies to create technology to be able to tell a robocall from a normal call.

Lee Neubecker: The carriers need to enforce it. The carriers have to stop allowing unsecured VOIP to impersonate calls.

Debbie Reynolds: Right. The House bill does not allow it, but they specifically said the carriers have to create, if it doesn’t already exist, some technology to make sure they can tell a robocall from a normal call.

Lee Neubecker: It’s basically like, we’re going to block any call that isn’t using a means of identity verification. Right now, it’s about a bust.

Debbie Reynolds: And they can’t charge for it, so it’s not like an extra fee. I’m sure what’ll happen is they’ll charge you another fee and call it something else, but it’ll probably just be for the robocall protection.

Lee Neubecker: The act also increases the penalties. Current legislation, the TCPA, the Telephone Consumer Protection Act, dealt with spam faxes, calls and what-not, but the robocall act is going to impose penalties of, I think, up to ten thousand dollars each.

Debbie Reynolds: Per incident.

Lee Neubecker: Per incident.

Debbie Reynolds: So that’s a lot.

Lee Neubecker: So that’s going to drive my TCPA consulting business, because that’s work.

Debbie Reynolds: Yeah, absolutely. Well, if it actually makes it, I’m sure the thing about the $10,000 per incident and also, forcing companies to create technology to be able to tell what’s a robocall, corporations or the carriers are probably going to fight that. So, we’ll see.

Lee Neubecker: Yeah. So Debbie, what are the likely impacts on the litigation environment, as you see it? If this legislation goes through?

Debbie Reynolds: Well, first of all, there will be companies that will, I’m sure there will be consumer groups that want to bundle together consumer complaints and probably go after these carriers to try to get these big fines or whatever. So this could tie up litigation for a while. Once the lawyers get their fees, you’ll probably want to get the $10,000 per incident.

Lee Neubecker: In my opinion, it’s going to make it much easier to actually identify who’s behind these calls, because right now people are using proxy phone numbers to call, and many of them are just total scams run out of the country. You can’t… a Nigerian spam call center, we can’t really go after. But if our carriers block these rogue, foreign VOIP connections, then it will be more secure. Ultimately, you’ll probably have people who opt in to the insecure network, and people who want a secure-only platform where it’s no use calling them.

Debbie Reynolds: I agree.

Lee Neubecker: Thank you for being on the show today. It was great to have you on again. I love your scarf.

Debbie Reynolds: Thank you.

Lee Neubecker: You always have interesting scarves.

Debbie Reynolds: Thank you. A pleasure.

Lee Neubecker: We’ll see you soon.

Debbie Reynolds: Okay, bye bye.

Debbie Reynolds Contact Info

datadiva at debbiereynoldsconsulting dot com
312-513-3665
https://www.linkedin.com/in/debbieareynolds/
https://debbiereynoldsconsulting.com/


Computer Forensics in Medical Malpractice

Importance of Computer Forensics in Medical Malpractice Litigation by revealing patient electronic medical records.

Computer Forensics Wins Litigation

Enigma Forensics CEO & President Lee Neubecker interviews James Meyer, a personal injury attorney from Ialongo and Meyer. Computer forensics uncovers answers to important questions, such as what orders may or may not have been entered as a result of a medical test. In this video, Lee and Jim share some of the changes that have occurred that impact medical malpractice litigation. Tune in to find out how using computer forensics can make or break a case.

The transcript of the video interview follows:

Lee Neubecker: Hi this is Lee Neubecker, I’m here with Jim Meyer from Ialongo and Meyer, and we’re here today talking about patient medical records, specifically electronic medical records. Some of the changes that have happened that impact medical malpractice litigation. So Jim, can you tell me a little bit about EMR and how computer forensics plays a role in cases that you’re litigating, where you’re trying to get a result for your client?

Jim Meyer: Well, EMR has changed everything in regards to medical records. HIPAA requires that electronic medical records be both secure and private, and that requirement means that a lot of metadata is collected with every electronic medical record. That metadata itself is very important. Capturing information about where, when, how and by whom the medical record was made can be crucial in any medical investigation.

Lee Neubecker: Can you give me an example of what type of metadata you might be asking for, and why it would be relevant to the outcome of litigation?

Jim Meyer: Well, the metadata that is most interesting in most cases is when certain events occurred in a medical record. When a test was ordered, when it was performed, when the results were placed in the patient’s medical record, when the physician saw those results, what orders may or may not have been entered as a result of that medical test. When medication is prescribed, when it’s administered, who administered the medication. Many of these details are now electronically captured, as opposed to being physically noted, as they were in old written medical records. It can make a big difference in trying to determine when events occurred in a case.

Lee Neubecker: I know in one of the cases I was involved in, I discovered that many of the default reports that are provided with these medical software packages don’t necessarily show all available metadata. In fact, on one of the cases we had to work through discovery to get the schema of the database. We then discovered in one instance that there was something known as a sticky note that the nurses and physicians could type little comments into, but there was a presumption that it would never get printed because it’s not in any of the default reports. So what we actually had to do was find the table that had these notes, and then work to get the data dumped. As soon as we found that, the case quickly settled, because obviously the hospitals don’t want everyone knowing what’s going on.
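
Once a hidden table like the “sticky note” store is identified from the database schema, the production request often reduces to a simple query. The sketch below is purely illustrative; the table and column names are invented, since every EMR vendor’s schema differs.

```python
# Sketch: dump free-text note entries for one patient from a hypothetical
# EMR table identified via the database schema produced in discovery.
import sqlite3  # stand-in for whatever DB-API driver the EMR system uses

def dump_sticky_notes(conn, patient_id: str) -> None:
    query = """
        SELECT note_id, author, created_at, note_text
        FROM clinical_sticky_notes        -- hypothetical table name
        WHERE patient_id = ?
        ORDER BY created_at
    """
    for row in conn.execute(query, (patient_id,)):
        print(row)

conn = sqlite3.connect("emr_extract.db")  # hypothetical database extract
dump_sticky_notes(conn, "MRN-0001")
```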

Jim Meyer: That’s a disadvantage that a plaintiff in a case may have. Hospitals oftentimes have entire departments in medical informatics, departments in which they have experts who know the ins and outs of the EMR and the metadata collected. Oftentimes plaintiffs do not, but they should be aware of the fact that that metadata exists. Extracting it from the record oftentimes requires an expert, a computer forensics expert or an IT expert. But it’s important that all attorneys, defense attorneys and plaintiffs’ attorneys, realize that that information exists as metadata in these records and that it can be obtained. It takes a great deal of effort to obtain it, but it’s there.

Lee Neubecker: And Jim and I co-authored a paper, along with another attorney, that appeared in the Illinois State Bar Association journal on EMR patient medical records, the audit trail and other things impacting HIPAA and medical malpractice regulations. We’ll put that up here too so that you can check it out. Anything else you’d like to add about your practice, Jim?

Jim Meyer: No, we’re happy practicing attorneys in Chicago, Illinois. I would recommend that any attorney who is involved in any issue similar to this take a look at the article that Lee was kind enough to co-author with me and John Tomes. It really has a lot of information, detailed information that attorneys should know.

Lee Neubecker: Great, thank you.

Jim Meyer: You’re welcome.

To Learn More about Computer Forensics and Patient Electronic Medical Records

Read the Illinois State Bar Article co-authored by the interviewed subjects on Patient Medical Records.


Computer Fraud & Abuse Act Charges Filed

Capital One Data Breach

Capital One Data Breach – Interview of Data Privacy & eDiscovery expert on the fallout

Cyber Security & Computer Forensics Expert Lee Neubecker interviews Data Privacy Expert Debbie Reynolds on the fallout from the recently disclosed Capital One data breach, which occurred following the alleged hacking of the company’s data stored in the cloud. Issues discussed include an assessment of how the CEO of Capital One managed the crisis, the pending charges filed against Paige Thompson, and the Computer Fraud and Abuse Act claims in the government’s complaint filed earlier this week.

Transcript of video follows

Lee Neubecker: Hi, I’m here today with Debbie Reynolds from Debbie Reynolds Consulting, and we’re going to be talking today about the recent news involving the Capital One data breach. Thank you for being on the show, Debbie.

Debbie Reynolds: Thank you for inviting me. It’s such a thrill; you’re such a joy to be around and talk to, so it’s great to do this.

Lee Neubecker: Well it’s great to have you here. So, trial’s expected this Thursday in the case. Can you tell everyone a little bit about what happened this week?

Debbie Reynolds: So this week it’s in the news that Capital One had a data breach. There was a woman who used to work at Amazon, if I’m not mistaken, who had found a vulnerability in Capital One’s cloud system and was able to obtain private or digital information on over a hundred million customers or potential customers of Capital One. As far as I can tell, they say that she may have gathered social security numbers and other private information about individuals who may not even be customers of Capital One, who had applied for a Capital One credit card as far back as 2005.

Lee Neubecker: Yep.

Debbie Reynolds: So the vulnerability that was discovered and part of the reason why it was discovered was because she had apparently bragged about it on Twitter and she used her real name and so they were able to pull this stuff together. And I think the SWAT team went to her house?

Lee Neubecker: Yeah, so she was using iPredator, which is supposed to anonymize and protect you. When she was using that, she created her online GitHub accounts and other accounts, and they had the iPredator IP address range in her profile linked to her name. So she wasn’t really being smart about it.

Debbie Reynolds: No. So yeah, I think that she was bragging about what she had; I guess she was proud of what she had done, and apparently someone who had seen something she had posted on some forum contacted Capital One. This wasn’t a breach that Capital One found out about on its own; someone from the outside said, “Hey, this girl says that she has your data,” and now it’s a really big thing.

Lee Neubecker: Yeah, so now she’s charged under the Computer Fraud and Abuse Act, which I think she’ll probably end up …

Debbie Reynolds: Yeah.

Lee Neubecker: Do you think she’ll get a plea?

Debbie Reynolds: She’s probably going to go to the slammer. It seems like, especially when the SWAT team showed up at her house, they’re definitely going to make an example out of her with this. It’s pretty bad, because I think right now the reports and what’s coming out from Capital One are different than what she said or what other people said they have. At one point Capital One’s statement said that certain people’s social security numbers weren’t breached, but then we know that they did get people’s social security numbers.

Lee Neubecker: It was mostly Canadian social security numbers, around a million–

Debbie Reynolds: Right.

Lee Neubecker: And then I think it was somewhere around 100,000 or so U.S. citizens.

Debbie Reynolds: Right, exactly.

Lee Neubecker: So it doesn’t necessarily impact the entirety of U.S. customers, but it still is–

Debbie Reynolds: It doesn’t, it doesn’t make you feel good. Yeah so basically over a hundred million people were touched in some way, shape or form. Even though not everyone’s personal data was taken to the same extent as everyone else, but I think this incident illustrates for us a couple of different things. First of all, they were saying that they had credit card information or information on people who had applied for credit cards going back as far as 2005. I’m not sure if they can make a justification for why they even had some of that stuff.

Debbie Reynolds: In the first place, I wonder what rights someone would have if they didn’t actually end up being a customer of Capital One. The law’s kind of murky about how they should handle that. I guess that’s the same issue with Equifax, where not everyone who was touched by Equifax is a customer of Equifax; they just happened to have their data.

Lee Neubecker: What would, how would you have advised Capital One had you gotten in there before the data breach?

Lee Neubecker: You think you might have been able to–

Debbie Reynolds: Well, you know–

Lee Neubecker: Get them in a better situation?

Debbie Reynolds: I think a lot of corporations, my view is that a lot of corporations have this mindset, or business has this mindset, of: does it work? Does the computer work? Can I do the thing I need to do on a computer? The question that they’re not asking is, is it secure? So a lot of them have a blind spot in terms of securing things, because as long as it doesn’t impact their ability to work, they don’t really care how it works. So now companies have to ask, how does it work? Is it secure? A lot of companies have these issues where they’re moving from internal infrastructure to the cloud, and we know that cloud infrastructure would typically be more secure, quote unquote, than someone’s on-premises infrastructure, but that all depends on how it was configured. The vulnerability that this woman was able to exploit at Capital One had to do with how the permissions and things were configured on a cloud infrastructure.
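
Part of what Reynolds describes can be checked with routine tooling. The sketch below uses the boto3 AWS SDK to list S3 buckets and report whether each has a public access block configured; it is a simplified illustration of configuration review, not a description of Capital One’s actual environment, and it assumes AWS credentials are already configured.

```python
# Sketch: report S3 buckets that lack a full public access block.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)[
            "PublicAccessBlockConfiguration"
        ]
        status = "blocked" if all(config.values()) else "PARTIALLY OPEN"
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            status = "NO PUBLIC ACCESS BLOCK"
        else:
            raise
    print(f"{name}: {status}")
```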

Lee Neubecker: And she had worked in that environment.

Debbie Reynolds: Right. So she had a little bit of extra insight–

Debbie Reynolds: Exactly.

Lee Neubecker: In this process.

Debbie Reynolds: Exactly. But I don’t know if you run into the same thing, where you have clients that have cloud issues and they may feel more secure in themselves: okay, we think our native environment is safer than the cloud. Not to say that the cloud is not safe, but if we have someone who doesn’t know how to fill those gaps and stop those vulnerabilities, it could be a huge problem.

Lee Neubecker: What do you think of the CEO’s response from Capital One?

Debbie Reynolds: I saw the CEO’s response. I don’t know, someone needs to do a series about this where you compare all the response letters from these data breaches.

Lee Neubecker: That’s a great idea.

Debbie Reynolds: It was not a bad response at all. I think the danger, though, is there may be an issue with consumer confidence, obviously, because no one wants their data breached. But if it becomes evident that the things being said by the CEO or other leadership are different than what actually happened, that’s going to be a problem.

Lee Neubecker: Yeah, cool.

Debbie Reynolds: I think the desire is to rush, to put out as much information as you possibly can, but already the news reports are contradicting what the company is saying about what was actually breached.

Lee Neubecker: Well, the complaint is available; I’ll post that on my website as well. I read the complaint and there’s a lot of detail in there, and you’re right: in the news stories they’re talking about the Amazon cloud, and in the complaint they talk about a company that presumably is a subsidiary of Amazon.

Debbie Reynolds: Right.

Lee Neubecker: But they didn’t specifically mention Amazon in the complaint.

Debbie Reynolds: No, no. So when customers feel like their data has been breached, there’s a tension that has to be managed: the company wants to be as forthright and forthcoming as possible about what’s happened, but the facts may still be rolling out.

Lee Neubecker: Yeah.

Debbie Reynolds: The drip, drip, drip of it all may be tough I think.

Lee Neubecker: But I thought at least it was good that they publicly acknowledged it. It didn’t take forever to acknowledge it.

Debbie Reynolds: Oh, right exactly.

Lee Neubecker: And apologize, I mean–

Debbie Reynolds: Oh, absolutely. It does go a long way–

Lee Neubecker: They just did that so I applaud them for not–

Debbie Reynolds: Absolutely.

Lee Neubecker: Sitting on it like Equifax.

Debbie Reynolds: Right. They didn’t say, “Well, I’m sorry that you were hurt or you felt hurt,” or something like that. There is harm there, so you might as well acknowledge it and try to at least be forthright about what you know.

Lee Neubecker: And from what I read too, not all of the data, some of the data was tokenized, but there were birth dates, there were some socials.

Debbie Reynolds: Right.

Lee Neubecker: And some other information that certainly if that were you or me, well we’re kind of becoming used to this all the time. It’s sad, but.

Debbie Reynolds: Right, well I mean and what we’re seeing, what I’m seeing, what companies are trying to argue in the U.S. having to do with data privacy is if you put, let’s say you’re on Facebook and you say, “Hey, today’s my birthday!” You know so if Lee puts his birthday on Facebook, is Lee’s birthday private? So let’s say you’re a Capital One customer, they could argue you know your birthday is not private because you put it on Facebook. That’s going to be an interesting theme.

Lee Neubecker: Well thanks so much for being on the show today.

Debbie Reynolds: It was fantastic, thank you.

Debbie Reynolds Contact Info

datadiva at debbiereynoldsconsulting dot com
312-513-3665
https://www.linkedin.com/in/debbieareynolds/
https://debbiereynoldsconsulting.com/


Crypto Currency Lending

Crypto currency lending and cyber security issues

Cyber Security Forensics Expert Lee Neubecker and DrawBridge Lending CEO Jason Urban discuss cryptocurrency lending and the security issues as they relate to Bitcoin and blockchain.

The transcript of the interview follows:

Lee Neubecker: Hi, I have Jason Urban on the show today. He’s the President and CEO of DrawBridge Lending. Thanks for being on the show Jason.

Jason Urban: Thanks for having me, Lee. This is great, glad to be here today.

Lee Neubecker: Jason, I’ve known you for a while. You’ve been doing some innovative things in the lending industry as it relates to bitcoin and blockchain. Tell us a little bit about that.

Jason Urban: Sure, so what we do is we’re a lender against secured digital asset holdings, and what we are providing is the drawbridge, or the bridge, from these traditional lending sources, or pools of liquidity, into this new ecosystem where everybody is trying to figure out how that landscape works.

Lee Neubecker: What type of people would have a need for your service?

Jason Urban: I think there are a wide variety of people. People who have these digital assets and, because of the way they’re categorized here in the States from the IRS perspective, when you spend them, when you use them, you encounter a taxable situation. But to the extent that you might need to pay your power bill or go on a vacation or buy that boat you always wanted, you need fiat, you need US dollars, and what we provide is a mechanism or platform for people to borrow against their digital asset holdings.

Lee Neubecker: So, if someone’s sitting on, say, 100 bitcoin, which is quite a bit of money, you’d allow them to take out a loan against that bitcoin and use that for short-term cash expenses or whatever?

Jason Urban: Yes

Lee Neubecker: What is the duration of your loans typically?

Jason Urban: We typically focus on one to six months. It’s a very volatile asset, and our backgrounds are in managing that volatility, but there’s only so much you can do when something moves as rapidly as that does, which is an advantage of the asset, but it’s also difficult from a lending capacity. So our loans are one to six months in duration, and we offer renewal options, so you can re-up and renew and adjust the strike price of that loan-to-value. Think about your home moving 50% in a six-month period: you might want to refi or you might need to put more money up. We try to mitigate a lot of those risks by offering the durations we do.

Lee Neubecker: So, your clients actually give you their cryptocurrency and you escrow it for them?

Jason Urban: Yes, so what we do is we don’t like to take possession of their currency. What we like to do is use a qualified third party custodian so that their digital assets are resting there, so they know they’re there, and I can’t take them unless they default on a loan or something unfortunate happens. All we want to do is provide a mechanism or a platform for someone to monetize their holdings. We don’t want to take possession of them. We don’t want their private keys. We’ll only take those in the event that they default or want us to satisfy their loan.

Lee Neubecker: So in this business, what measures do you take to help ensure that these digital assets are safe from a cyber attack perspective?

Jason Urban: Well, part of it, the key for us, is cold storage. And cold storage is basically storing these things on a server or computer where it’s not connected to the internet. It can’t be taken, so we require that all our custodians deploy a cold storage method as opposed to a warm storage or a hot storage. That way we know that the gold is in the vault so to speak but that it’s not going to be readily accessible to anybody out there.

Lee Neubecker: Have you had a situation where a customer gets angry because the price fluctuates and they feel that they were cheated out of their value?

Jason Urban: Interestingly, we don’t have that problem because of the mechanisms that we deploy on the back end. All our loans are no-margin-call and non-recourse, unlike a lot of people in the business who will have you re-top. Think about it this way: if I issue you a loan on an asset that’s worth $10,000, and I give you 50% of that asset in cash, and the value of that asset goes from 10,000 to 5,000, I now need to create that cushion again, so you need to pay me more money, or re-up, or figure something out. What we’ve developed in our methodology is a way to never have to worry about that, and we use the financial markets. We’re markets experts, and we’re risk managers, so we have mechanisms by which we can ensure that you don’t have to worry about topping off your loan.
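
Urban’s example works out as follows in loan-to-value (LTV) terms. The numbers mirror his example; the 80% margin-call threshold is an assumed figure used only to contrast a conventional margin-call lender with the no-margin-call approach he describes.

```python
# Sketch: loan-to-value arithmetic for a crypto-collateralized loan.
def loan_to_value(loan_amount: float, collateral_value: float) -> float:
    return loan_amount / collateral_value

collateral = 10_000.0      # initial collateral value from Urban's example
loan = 0.5 * collateral    # 50% advance = $5,000 in cash

print(loan_to_value(loan, collateral))      # 0.50 at origination
print(loan_to_value(loan, collateral / 2))  # 1.00 after the collateral halves

# A conventional margin-call lender might demand more collateral once LTV
# exceeds some threshold (0.80 here is an assumed figure); a no-margin-call,
# non-recourse structure hedges that risk instead of calling the borrower.
THRESHOLD = 0.80
print(loan_to_value(loan, collateral / 2) > THRESHOLD)  # True
```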

Lee Neubecker: Are there any restrictions on the type of customers you can have based on what the SEC imposes on you?

Jason Urban: We are very compliant, so we are registered with the CFTC, and we follow all the rules and regs imposed on us by them. We have to do AML/KYC, anti-money laundering and know-your-customer checks. We’re registered as a non-bank lender in 31 states and operate in all 50 states, so we’re following not only consumer lending laws but also securities laws and commodities laws.

Lee Neubecker: Are there any requirements you have on customers before you can take them on as a client?

Jason Urban: Well, one, we have to do the AML/KYC on them. Right now, our products are geared towards accredited investors. Because of the way we do the hedging on the back end, we need to make sure that those customers are sophisticated enough to understand what we’re doing, and so, in order to do that, we need to put that accredited investor cap on things. It’s a little different under the CFTC umbrella; they call them eligible contract participants, or ECPs, so there are a couple of different buckets. It’s a little different than the SEC’s accredited investor, but effectively it’s the same thing.

Lee Neubecker: Is there a minimum net worth that your customers have to have?

Jason Urban: And that’s part of it: a minimum net worth of a million dollars, or an entity that’s worth a million dollars, is what we require.

Lee Neubecker: What sectors do you see that this type of lending is getting the most interest in terms of where your clients are coming from?

Jason Urban: A wide variety. If you really think about it, bitcoin, or digital assets as a whole, can be held by anyone. It isn’t a single group that says, “Hey, I’m really into this.” So we see funds, miners, people who were early adopters of the technology; they’ve all kind of stepped forward. Additionally, we’ve got a product that’s geared towards people who would like to buy bitcoin and want to employ some of the same methodologies that we’re employing right now.

Lee Neubecker: Do you have any closing thoughts you’d like to share?

Jason Urban: I think that people often confuse blockchain and decentralized ledgers with bitcoin. I think the blockchain technology is interesting on so many levels. As the world becomes more tokenized, and I think you’re going to see more and more of that, everything from the artwork that you see on the walls to buildings to physical assets like gold, silver and oil, the world is moving towards that technology and that methodology, and I think that being an early adopter and understanding it is so important. If you want to make a parallel, this is the internet in 1990 or 1995. The difference is the world moves much faster today than it did back then.

Lee Neubecker: So are you taking investors?

Jason Urban: We’re always willing to have strategic investors come into the space, and we’re not opposed to that. We’re very well capitalized, but we do recognize the value in being partners with people. And part of being partners is financial as well.

Lee Neubecker: Well thanks again for being on the show.

Jason Urban: Thank you very much.


Neubecker Presents at Chicago Science Writers

How secure are consumer IoT devices?

Lee Neubecker, Enigma Forensics President & CEO, will present on the potential impact of vulnerable consumer IoT devices as it relates to the security of the U.S. Power Grid.

The event will take place at the Medill School of Journalism Chicago Newsroom, 303 East Upper Wacker Drive Suite 1600, Chicago, IL 60601.
Date: Thursday, January 10th, 2019, from 5:30PM – 7:00PM.

The Chicago Science Writers (CSW) organization is composed of writers who report on more technical topics. The group provides a forum for people in the Chicago area who communicate science to the public, and it organizes professional development programs and social gatherings. CSW provides a point of contact for national science organizations and local science groups interested in connecting with science writers in the Chicago area.

The public may register for this event at the following link:
https://www.eventbrite.com/e/chicago-science-writers-presents-hacking-the-power-grid-tickets-54182573536?aff=mcivte


Lee Neubecker to present at CyberSecurity International Symposium

Enigma Forensics’ CEO, Lee Neubecker will be presenting on Infrastructure Vulnerabilities relating to the potential for power outages to be caused by indirect cyber attacks on the power grid.
The Second CyberSecurity International Symposium will take place all day on Tuesday, November 13th, at Conference Chicago, located at 525 South State Street, Chicago, Illinois 60605. Neubecker will be presenting the topic "Hacking the Power Grid, Why We Should All Be Concerned About IoT Security" from 11:30 to noon. A 40% discount code is available to Enigma Forensics clients wishing to attend. Please call Lee Neubecker for details.

The complete conference agenda is available at http://www.cybersecurity-symposium.com/agenda.htm1


Patient Medical Records: Metadata as Evidence in Litigation

ELECTRONIC MEDICAL RECORDS:

Metadata As Evidence in Litigation

By James G. Meyer,* Jonathan P. Tomes,** and Lee Neubecker***
As published: Vol. 101 #8, August 2013. Copyright by the Illinois State Bar Association www.isba.org

Doctor and hospital records are changing. The paper medical records that we have been familiar with, along with the rest of the “written” world, are becoming electronic, that is, written, maintained, and retrieved as digital data.

Because of many emerging “after entry” benefits, federal and state governments, insurance companies, and medical institutions are heavily promoting the adoption of Electronic Medical Records (“EMR”).[1] For example, the HITECH Act (American Recovery and Reinvestment Act of 2009[2]) includes both incentives and penalties in its calculations to encourage adoption of electronic records, versus continued use of paper records. The Act allows benefits of up to $44K per physician under Medicare or up to $65K over six years under Medicaid for adoption of electronic records. Additionally, Congress decreased Medicare/Medicaid reimbursements to doctors who fail to use electronic medical records by 2015 for covered patients.

This change in medical record keeping and changes in the laws and regulations associated with electronic medical record keeping are creating significant changes in what and how information may become evidence in litigation.

Attorneys who deal with medical records in any type of litigation should be aware of the changes in the following areas:

I. Electronic Medical Records and HIPAA

II. PHI as Electronically Stored Information

III. What is Discoverable: Metadata and Computer Forensics

IV. A Word about Encryption

V. Discoverability and Admissibility of Electronic Medical Records and Metadata

I. ELECTRONIC MEDICAL RECORDS AND HIPAA

Before the advent of electronic medical records, the Illinois Administrative Code itemized the minimum requirements for the content, management, and administration of medical records.[3]

The Health Insurance Portability and Accountability Act of 1996 (“HIPAA”)[4] sets out a comprehensive set of rules, safeguards, and definitions that are, effectively, applicable to most health care providers that use computers and electronic storage devices to store or transmit patient medical records. Excepted from the statute are institutions that do not transmit billing transmissions to and from Medicare/Medicaid or other health plans, an uncommon circumstance. With the HITECH Act’s incentives to use electronic health records, more and more providers will do so.

What we have understood to be doctor and hospital medical records, HIPAA defines more comprehensively as health information: “any information, whether oral or recorded in any form or medium, that:

i. Is created or received by a health care provider, health plan, public health authority, employer, life insurer, school or university, or health care clearinghouse; and

ii. Relates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual.”[5]

Under HIPAA, Protected Health Information (“PHI”) is “individually identifiable health information” that is:

i. Transmitted by electronic media;

ii. Maintained in electronic media; or

iii. Transmitted or maintained in any other form or medium.”[6]

II. PHI AS ELECTRONICALLY STORED INFORMATION

To understand where and how EMR systems “transmit” and “maintain” PHI, it is helpful to use the terminology of computer experts. From their viewpoint, HIPAA’s PHI is Electronically Stored Information (“ESI”).

ESI is data stored, processed, retrieved or transferred by “Electronic Storage Devices.”[7] Electronic Storage Devices – a subclass of Electronic Media – are commonly known as diskettes, Flash Drives and CD/DVD Disk media. Both Electronic Storage Devices and Electronic Media are capable of containing ESI (thus PHI).

Electronic Storage Devices capable of storing ESI can be classified into two main categories – Non-Volatile Electronic Storage Devices and Volatile Electronic Storage Devices.

Non-Volatile Electronic Storage Devices store data on a more or less permanent basis, but can often be deleted or destroyed. These can be grouped into several categories – Primary Storage Devices, Secondary Storage Devices, Offline Backup/Archival, and “In the Cloud.” Examples of each are:

Primary Storage Devices

(1) Hard Disk Drives

(2) Disk Media

(3) ROM / PROM / EPROM

(4) Solid State Drives (Flash Storage)

(5) SIM Cards

(6) Multi Media Cards (SD, SDHC, SDXC, SDIO, and Others)

(7) Smart Cards, Chip Cards or Integrated Circuit Card

(8) Paper Based Storage (Punch Cards, Bar Codes, Scantron)

Secondary Storage Devices

(1) USB Thumb Drives / Flash Drives

(2) External Hard Disk Drives

(3) Disk Media (Floppy Disk, CD, DVD, Blue Ray)

(4) Radio-Frequency Identification (RFID) Tags

Offline Backup / Archival

(1) Magnetic Tape

(2) Disk Media (Floppy / CD / DVD / Blue Ray)

(3) Bar Code Paper Records

(4) CD / DVD Disk Media

In the Cloud (Utilizes all types of Storage)[8]

Volatile[9] Electronic Storage Devices retain a good deal of ESI for a discrete period of time, e.g. until such time that the Volatile source loses power. The RAM in a computer is an example of Volatile Electronic Storage Devices.

ESI may be transmitted between Electronic Storage Device sources via the internet, extranets, infrared, radio, Wi-Fi, Satellite, Cable, Broadband, cellular, leased lines, barcode, dial-up telephone lines, private networks, connected external devices, and devices that are physically moved from one location to another using magnetic tape, disc, or compact disc media.[10]

A patient’s PHI maintained in any of these Electronic Storage Devices or transmitted by any of these means of electronic transmission are potential sources of discoverable information. Smart phones and PDAs are increasingly used in association with electronic health data. Industry sources estimate that “in 2010, more than 50 percent of physicians were using smartphones or PDAs on a regular basis in clinical decision making.”[11] As an indication of how important mobile devices have become in healthcare, the Healthcare Information and Management Systems Society (“HIMSS”), a leading non-profit industry group, has formed a separate entity, mHIMSS, to focus exclusively on the use of mobile and wireless technologies in healthcare.[12]

III. WHAT IS DISCOVERABLE: METADATA AND COMPUTER FORENSICS

The Department of Health and Human Services (“DHHS”) regulations implementing HIPAA govern PHI with both a Privacy Rule[13] and a Security Rule[14]. As their names imply, the rules require adoption of enumerated standards and safeguards so that covered entities protect a patient’s electronic (and paper) medical records from unauthorized access,[15] tampering, or destruction[16].

Attorneys that have been involved with medical records in litigation since the enactment of HIPAA and the implementation of the DHHS regulations are generally aware that the Privacy Rule enumerates the ways to obtain PHI from health care providers during discovery by the use of written authorization or subpoena.[17]

In addition to delineating how to obtain PHI, HIPAA’s Privacy Rule also requires that covered entities have procedures in place to give individuals an accurate accounting of disclosures of their PHI in cases in which an accounting is required.[18]

HIPAA’s Security Rule requires that a covered entity “ensure the confidentiality, integrity and availability of all electronic PHI the covered entity creates, receives, maintains or transmits”.[19] The standard specifically defines “confidentiality” as “the property that data or information is not made available or disclosed to unauthorized persons or processes” and “integrity” as “the property that data or information have not been altered or destroyed in an unauthorized manner.”[20]

In order to implement the Privacy and Security Rules, HIPAA requires covered entities to use “audit controls,” such as “hardware, software, and/or procedural mechanisms that record and examine activity in information systems that contain or use electronic protected health information”[21] and to “implement procedures to regularly review records of information system activity, such as audit logs, access reports and security tracking reports.”[22] The Metadata generated by these audit control systems, about the access and use of a patient’s records and the use and operation of the computer device maintaining or transmitting the records, is typically not part of the formal medical record. But it can often be a gold-mine of important information that would not otherwise be obtainable in discovery.[23]

For example, Metadata in the form of an audit log or audit trail may be helpful with faulty or incomplete memories. An audit trail is a record of who, when, where, how and sometimes why a person used a computer program or accessed a patient’s medical record. Typically, the identity of the user who accesses the patient’s record, the time of access, the terminal or device used for access, the action taken by the user (i.e., viewing the record, changing the record), and the substance of anything added to the record and any changes or corrections made by the user are recorded in the Metadata which can be reproduced in the form of an audit trail or log. In a case known to the authors, a hospital audit trail produced during discovery, showing the “terminal identifier” for an EMR entry (the unique number assigned to each computer terminal in the EMR system) resulted in a nurse changing her testimony when it disclosed she was using a computer terminal in another part of the hospital, and was not with the patient, as she had testified.

Metadata, such as in an audit trail, is captured automatically by the EMR system. As a result, the audit trail should correspond, entry by entry, to the patient’s medical chart or record. If an entry in the audit trail shows data was added, changed or deleted, a corresponding entry should appear in the patient’s chart, and vice versa.
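
Once both the audit trail and the chart entries have been exported, the entry-by-entry reconciliation described above can be automated. The sketch below compares the two exports as sets of (timestamp, user, action) tuples; the column names and CSV layout are hypothetical, since each EMR vendor exports audit data differently.

```python
# Sketch: reconcile an exported audit trail against exported chart entries.
import csv

def load_events(path: str) -> set:
    with open(path, newline="", encoding="utf-8") as fh:
        return {
            (row["timestamp"], row["user_id"], row["action"])
            for row in csv.DictReader(fh)
        }

audit_events = load_events("audit_trail_export.csv")    # hypothetical export
chart_events = load_events("chart_entries_export.csv")  # hypothetical export

print("In audit trail only:", sorted(audit_events - chart_events))
print("In chart only:", sorted(chart_events - audit_events))
```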

Metadata found in a forensic image of a medical record may be even more helpful. A “forensic image” is not simply a copy of the electronic record; it is a bit-for-bit copy of all sectors of the media involved and must be done properly.[24] In a case known to the authors, the analysis of the Metadata on a video disk of a surgical procedure produced during discovery showed that several of the video clip files in the series of video files generated during the procedure were deleted, with the remaining video clips renumbered in an apparent attempt to conceal what transpired during the missing video clips. An analysis of the Metadata embedded within each of the DICOM video files revealed that the original clip sequence numbers were different for the last few video clips. The file Metadata compared to the embedded DICOM Metadata implied an intentional manipulation of the data in order to alter the record of the events that actually occurred.
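
The sequence-number comparison described in that matter can be reproduced with standard DICOM tooling. The sketch below, using the pydicom library, reads each clip’s embedded InstanceNumber and compares it with the position implied by the file order; the folder name and the choice of attribute are assumptions, since the relevant tags vary by modality and vendor.

```python
# Sketch: compare file ordering against DICOM-embedded sequence numbers.
from pathlib import Path

import pydicom

clip_files = sorted(Path("surgery_video_clips").glob("*.dcm"))  # hypothetical folder

for position, clip in enumerate(clip_files, start=1):
    ds = pydicom.dcmread(clip, stop_before_pixels=True)
    embedded = getattr(ds, "InstanceNumber", None)
    note = "" if embedded == position else "  <-- mismatch: possible renumbering"
    print(f"{clip.name}: file position {position}, InstanceNumber {embedded}{note}")
```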

IV. A WORD ABOUT DATA ENCRYPTION

Data encryption, by itself, does not ensure the confidentiality or integrity of PHI. HIPAA’s data encryption standards allow health care providers, health insurance companies and business associates that transmit, store or access protected health information in electronic form to utilize a standardized level of data encryption when encryption is reasonable and appropriate. The Advanced Encryption Standard (AES) is a Federal Information Processing Standards (FIPS)-approved cryptographic algorithm used to protect electronic data and is widely used in the healthcare industry to secure data-at-rest, data-in-motion and data-in-transit.[25]
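
By way of illustration only, the sketch below shows AES in an authenticated mode (AES-GCM) encrypting and decrypting a small record using the Python “cryptography” package. The sample record, key handling and nonce handling are assumptions made for the example, not a description of any provider’s actual implementation; in practice, key management and nonce uniqueness matter as much as the choice of algorithm.

# Minimal AES-GCM sketch using the "cryptography" package; illustrative only.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # unique nonce per message

phi_record = b"Patient: Jane Doe; DOB: 01/01/1970; Dx: example"  # hypothetical PHI
ciphertext = aesgcm.encrypt(nonce, phi_record, None)   # encrypt and authenticate
plaintext = aesgcm.decrypt(nonce, ciphertext, None)    # decrypt and verify
assert plaintext == phi_record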

PHI data is vulnerable when actively used and stored in volatile memory. Much of a patient’s information is held unencrypted in volatile memory while a computer device is actively working with the patient’s record, and it remains there after the record is accessed until the data is automatically discarded or the device shuts off. Anyone with physical access to the device, or with network access and a strong hacker skill set, would have a reasonable opportunity to capture the unencrypted information stored in volatile memory.

Another area of risk is when PHI is in transit without the appropriate encryption safeguards. ESI encrypted using today’s standards is unlikely to be compromised while in a data-at-rest, data-in-motion or data-in-transit state. But ESI containing PHI is unencrypted at the point of service on a portable or fixed computing device, and these devices are sometimes not secured with the appropriate physical and network protections, providing an opportunity to manipulate the unencrypted data.

V. DISCOVERABILITY AND ADMISSIBILITY OF ELECTRONIC MEDICAL RECORDS AND METADATA

Illinois Supreme Court Rules make electronic data discoverable. Under Rule 201, “General Discovery Provisions,” discoverable “documents” include “all retrievable information in computer storage.”[26] Rule 214, “Discovery of Documents, Objects, and Tangible Things,” specifically requires production of “all retrievable information in computer storage in printed form.”[27]

Medical records have long been admissible as an exception to the hearsay rule. Before adoption of the Illinois Rules of Evidence (effective January 1, 2011), Illinois Supreme Court Rule 236(b), as amended in 1992, was generally accepted as permitting the admission into evidence of medical and hospital treatment records, in written or computer form, as business records. That rule is silent, however, as to computer generated “data” or “data compilations.” Any confusion in that regard seems resolved in the new Rules of Evidence.

In the first instance, much of the Metadata recorded in an electronic medical record may not be hearsay at all. Rule 801 defines a hearsay “statement” as the oral or written assertion or conduct of a “person.”[28] Automatically imprinted Metadata is not the assertion or conduct of a person. See, People v. Holowko, 109 Ill. 2d 187, 486 N.E.2d 877 (1985) (recognizing the difference between computer-stored information, which may be hearsay, and computer-generated information, which is not hearsay). Recorded Metadata in an EMR system is similar to images recorded on surveillance cameras, which are not hearsay. People v. Tharpe-Williams, 286 Ill. App. 3d 605, 676 N.E.2d 717 (1997). Because Metadata involves no human input in its creation, other than the actions taken by the user in creating or manipulating the file or record referenced by the Metadata, it is non-hearsay evidence.[29]

To the extent that Metadata does include human input, the new rules provide a hearsay exception for “a memorandum, report, record, or data compilation, in any form, of acts, events, conditions, opinions, or diagnoses” kept as part of a regularly conducted business activity.[30] In addition, the new rules make “writings” and “recordings,” defined to include “numbers . . . set down by . . . magnetic impulse, mechanical or electronic recording, or other form of data compilation,”[31] admissible as “duplicates”[32] or when offered “in the form of a chart, summary, or calculation.”[33]

Although Illinois decisions on the admission of electronic data are not as common as cases in the federal courts, Illinois cases predating the new rules have approved its admission. See, for example, Bachman v. General Motors, 332 Ill. App. 3d 760, 776 N.E.2d 262, 267 Ill. Dec. 125 (2002) (approving admission of data retrieved from an automobile crash sensor in a personal injury case).

CONCLUSION

Medical records are in a state of transition from paper records to electronic data. Awareness of the changes made by HIPAA, the HITECH Act, and the DHHS Privacy and Security Rules, and of the capabilities of computer forensics, is necessary in dealing with electronic medical records as evidence.

*James G. Meyer is an attorney who practices in the law firm of Ialongo & Meyer in Chicago.

**Jonathan P. Tomes is an attorney admitted in Illinois, Missouri, Kansas, and Oklahoma who practices in the law firm of Tomes & Dvorak, Chartered, in Overland Park, Kansas, and consults around the country on HIPAA and the HITECH Act. He has also served as an expert witness on HIPAA, medical records, and the Federal Tort Claims Act in cases in Illinois, Washington, DC, and Colorado.

***Lee Neubecker is a computer forensics expert and the principal of Enigma Forensics, a Chicago-based computer forensics and expert witness consulting firm.

Notes

[1] We mean “EMR” to include Electronic Medical Records (digital information created, gathered, managed and consulted by clinicians and staff within one health care organization), Electronic Health Records (“EHR”) (digital information that may be operated by clinicians and staff across more than one healthcare organization – sometimes referred to as “interoperability”) and Personal Health Records (“PHR”) (digital information that can be accessed and created by patients themselves). See, http://www.healthit.gov/providers-professionals/faqs/what-difference-between-personal-health-record-electronic-health-record

[2] U.S. Department of Health and Human Services Centers for Medicare & Medicaid Services, 42 C.F.R. Parts 412, 413, 422, et seq., Medicare and Medicaid Programs; Electronic Health Records Incentive Program; Final Rule; Title XIII of the American Recovery and Reinvestment Act of 2009, the Health Information Technology for Economic and Clinical Health Act, Subtitle A, Part 2, Subtitle C (hereinafter “HITECH Act”).

[3] 77 Ill. Admin. Code § 250.1510(b)(2).

[4] Public Law 104-191, 110 Stat. 1396 (1996).

[5] 45 C.F.R. § 160.103.

[6] Id. (Note that PHI may also consist of paper records and oral communications).

[7] storage media

[8] The National Institute of Standards and Technology (“NIST”) of the U.S. Department of Commerce has defined cloud computing as follows:

Cloud computing has been defined by NIST as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or cloud provider interaction.

Peter Mell, Tim Grance, The NIST Definition of Cloud Computing, Version 15, October 7, 2009 at http://csrc.nist.gov/groups/SNS/cloud-computing. More and more large health care providers are hiring outside hosts to maintain their electronic health records “in the cloud,” using large companies like Google, Microsoft, or Amazon or smaller companies that provide hosting only for medical records.

[9] http://en.wikipedia.org/wiki/Volatile_storage

[10] Id.

[11] Putzer, J. MD, Park, Y, Are Physicians Likely to Adopt Emerging Mobile Technologies? Attitudes and Innovation Factors Affecting Smartphone Use in the Southeastern United States, Perspectives in Health Information Management, Spring 2012, p. 2, at http://www.perspectives.ahima.org/attachments/article/241/ArePhysiciansLikelyToAdoptEmergingMobileTechnologies_final.pdf (last visited January 14, 2013).

[12] http://www.mhimss.org/about-us (last visited February 25, 2013).

[13] 45 C.F.R. § 164.500, Subpart E, Privacy of Individually Identifiable Health Information. (The Privacy Rule applies to both paper and electronic medical records.)

[14] 45 C.F.R. § 164.302, Subpart C, Security Standards for Protection of Electronic Protected Health Information.

[15] 45 C.F.R. § 164.502, Uses and disclosures of protected health information: general rules.

“(a) Standard. A covered entity may not use or disclose protected health information, except as permitted or required by this subpart or by subpart C of part 160 of this subchapter.”

[16] 45 C.F.R. § 164.306, Security standards: general rules.

“(a) General requirements. Covered entities must do the following:

(1) Ensure the confidentiality, integrity, and availability of all electronic protected health information the covered entity creates, receives, maintains, or transmits.”

[17] See generally, 45 C.F.R. §§ 164.506, 164.508, 164.510, 164.512.

[18] 45 C.F.R. § 164.528.

[19] 45 C.F.R. § 164.306(a)(1).

[20] 45 C.F.R. § 164.304.

[21] 45 C.F.R. § 164.312(b), Standard: Audit controls.

[22] 45 C.F.R. § 164.308(a)(1)(D).

[23] See Thomas R. McLean, EMR Metadata Use and E-Discovery, 18 Ann. of Health Law 75 (2009).

[24] hard drive imaging

[25] http://www.hipaacompliancejournal.com/2011/03/knowing-about-advanced-encryption-standard-aes/

[26] Ill. Sup. Ct. Rule 201(b)(1).

[27] Ill. Sup. Ct. Rule 214. The Committee Comments to Rule 214 further clarify. “The first paragraph has also been amended to require a party to include in that party’s production response all responsive information in computer storage in printed form. This change is intended to prevent parties producing information from computer storage or computer discs or in any other manner that tends to frustrate the party requesting discovery from being able to access the information produced. Rule 201(b) has also been amended to include in the definition of ‘documents’ all retrievable information in computer storage, so that there can be no question but that a producing party must search its computer storage when responding to a request to produce documents pursuant to this rule.”

[28] Illinois Rule of Evidence 801(a).

[29] See generally, The Sedona Conference Commentary on ESI Evidence & Admissibility 10 (2008).

[30] Illinois Rule of Evidence 803(6) “Records of Regularly Conducted Activity.”

[31] Illinois Rule of Evidence 1001.

[32] Illinois Rule of Evidence 1003.

[33] Illinois Rule of Evidence 1006.

Reprinted with permission of the Illinois Bar Journal, Vol. 101 #8, August 2013. Copyright by the Illinois State Bar Association, www.isba.org.
