Robocall Legislative Update

Are robocalls driving you nuts?

Cyber Security & Computer Forensics Expert Lee Neubecker and Data Privacy Expert Debbie Reynolds discuss recent efforts to pass legislation in the House and Senate that would hold telecommunication providers responsible for addressing the ever-growing tide of robocalls disrupting consumers and businesses. Existing laws such as the Telephone Consumer Protection Act (TCPA) have proven ineffective in blocking offshore robocalls. VOIP technology allows robocall centers to systematically dial U.S. consumers and businesses from beyond the legal reach of our court system. Popular spoofing techniques such as Neighborhood Calling impersonate the first six digits of the call recipient's phone number in the hope of enticing the recipient to answer. Neubecker and Reynolds both share their frustrations with the current situation and are hopeful the U.S. Senate and the President will take immediate action to pass updated privacy legislation protecting us all from spam robocalls.
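
To make the "neighborhood calling" pattern concrete, here is a minimal Python sketch (an illustration added for this post, not something discussed in the interview) that flags an incoming caller ID sharing the recipient's area code and exchange; the phone numbers and function name are hypothetical.

```python
def is_neighbor_spoof_suspect(incoming: str, own_number: str) -> bool:
    """Flag a call whose caller ID shares the subscriber's area code and
    exchange (the first six digits) without being the subscriber's own number."""
    digits_in = "".join(ch for ch in incoming if ch.isdigit())[-10:]
    digits_own = "".join(ch for ch in own_number if ch.isdigit())[-10:]
    return digits_in[:6] == digits_own[:6] and digits_in != digits_own

# Hypothetical numbers for illustration only
print(is_neighbor_spoof_suspect("(312) 555-0147", "312-555-0199"))  # True
print(is_neighbor_spoof_suspect("(773) 555-0147", "312-555-0199"))  # False
```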

The transcript of the video follows:

Lee Neubecker: I’m here today with Debbie Reynolds. We’re going to be talking a little bit about robocalls and some new legislation coming our way, those annoying phone calls we all get on our cellphones.

Debbie Reynolds: That’s right.

Lee Neubecker: Have you gotten any calls where it’s the first six digits of your phone number?

Debbie Reynolds: Yes!

Lee Neubecker: That’s called “neighborhood calling”. And basically, what the bad guys are doing is that they’re using VOIP technology to spoof, and they’re plugging in any number. So they can actually impersonate people you know. But they do this because they think that it increases the likelihood that you’ll answer the phone. In fact, for me, when I see those first six digits, I’m not even going to answer it.

Debbie Reynolds: Oh, absolutely. Absolutely. It’s wrong or what now?

Lee Neubecker: One of the big problems we have is no one’s taking accountability for this. I heard AT&T is trying to force some authentication mechanisms, but there needs to be some more teeth on this so that people can’t just impersonate phone numbers, or we’ll never get through this.

Debbie Reynolds: Absolutely, absolutely. Actually, so, thankfully this law passed, right?

Lee Neubecker: Well, it’s going through. It passed in the House, overwhelmingly.

Debbie Reynolds: Overwhelming, yeah.

Lee Neubecker: They’re hoping that… It said it could happen by 2020, perhaps?

Debbie Reynolds: Okay, that’d be good.

Lee Neubecker: But it’s got to… I think they have to reconcile the two bills, the House and Senate versions, and then the President has to sign it. But by the show of votes, I think everyone’s in favor of tackling all these annoying robocalls.

Debbie Reynolds: Absolutely. So the FCC really made a lot of headway years ago with the Do Not Call Registry, so this will be sort of another layer to that that the FCC is looking at. I don’t know about you, but I’m very annoyed when I get robocalls, so I’m not happy about the current situation. Maybe it will happen after the election, because during an election, people like to robocall.

Lee Neubecker: I get tons of calls from people wanting to lend me money. They will ring my phone once and then it will hit my voicemail. This woman keeps calling, saying she wants to speak to me. And it’s not even a real person; it’s all automated. It’s annoying.

Debbie Reynolds: Oh, my goodness. Well, one interesting thing about the law, or the one that they’re anticipating, or trying to pass, that I haven’t seen in other laws like this: they’re trying to force companies to create technology to be able to tell a robocall from a normal call.

Lee Neubecker: The carriers need to enforce it. The carriers have to stop allowing unsecured VOIP to impersonate calls.

Debbie Reynolds: Right. The House bill does not allow it, and they specifically said that if the technology doesn’t already exist, the carriers have to create some technology to make sure they can tell a robocall from a normal call.

Lee Neubecker: It’s basically like, we’re going to block any call that isn’t using a means of identity verification. Right now, it’s pretty much a bust.

Debbie Reynolds: And they can’t charge for it, so it’s not like an extra fee. I’m sure what’ll happen is they’ll add another fee and call it something else, but it’ll probably just be for robocalls.

Lee Neubecker: The act also increases the penalties. Current legislation, the TCPA, the Telephone Consumer Protection Act, dealt with spam faxes, calls, and what-not, but the robocall act is going to raise penalties to, I think, ten thousand dollars each.

Debbie Reynolds: Per incident.

Lee Neubecker: Per incident.

Debbie Reynolds: So that’s a lot.

Lee Neubecker: So that’s going to drive my TCPA consulting business, because that’s work.

Debbie Reynolds: Yeah, absolutely. Well, if it actually makes it through, between the $10,000 per incident and forcing companies to create technology to be able to tell what’s a robocall, the corporations or the carriers are probably going to fight it. So, we’ll see.

Lee Neubecker: Yeah. So Debbie, what are the likely impacts on the litigation environment, as you see it? If this legislation goes through?

Debbie Reynolds: Well, first of all, I’m sure there will be consumer groups that want to bundle consumer complaints together and go after these carriers to try to get those big fines or whatever. So this could be tied up in litigation for a while. Once the lawyers get their fees, everyone will probably want to get the $10,000 per incident.

Lee Neubecker: In my opinion, it’s going to make it much easier to actually identify who’s behind these calls, because right now people are using proxy phone numbers to call, and many of them are total scams run out of the country. A Nigerian spam call center is something we can’t really go after, but if our carriers block these rogue, foreign VOIP connections, it will make things more secure. Ultimately, you’ll probably have people who opt in to the insecure network, and people who want a secure-only platform where those callers can’t reach them.

Debbie Reynolds: I agree.

Lee Neubecker: Thank you for being on the show today. It was great to have you on again. I love your scarf.

Debbie Reynolds: Thank you.

Lee Neubecker: You always have interesting scarves.

Debbie Reynolds: Thank you. A pleasure.

Lee Neubecker: We’ll see you soon.

Debbie Reynolds: Okay, bye bye.

Debbie Reynolds Contact Info

datadiva at debbiereynoldsconsulting dot com
312-513-3665
https://www.linkedin.com/in/debbieareynolds/
https://debbiereynoldsconsulting.com/

Computer Forensics in Medical Malpractice

Computer forensics plays an important role in medical malpractice litigation by revealing patient electronic medical records.

Computer Forensics Wins Litigation

Enigma Forensics CEO & President Lee Neubecker interviews James Meyer, a personal injury attorney from Ialongo and Meyer. Computer forensics uncovers answers to important questions, such as what orders may or may not have been entered as a result of a medical test. In this video, Lee and Jim share some of the changes that have occurred that impact medical malpractice litigation. Tune in to find out how using computer forensics can make or break a case.

The transcript of the video interview follows:

Lee Neubecker: Hi this is Lee Neubecker, I’m here with Jim Meyer from Ialongo and Meyer, and we’re here today talking about patient medical records, specifically electronic medical records. Some of the changes that have happened that impact medical malpractice litigation. So Jim, can you tell me a little bit about EMR and how computer forensics plays a role in cases that you’re litigating, where you’re trying to get a result for your client?

Jim Meyer: Well, EMR has changed everything in regards to medical records. HIPAA requires that electronic medical records be both secure and private, and that requirement means that a lot of metadata is collected with every electronic medical record. That metadata itself is very important. Capturing information about where, when, how, and by whom the medical record was made can be crucial in any medical investigation.

Lee Neubecker: So, can you give me an example of what type of metadata you might be asking for, and why it would be relevant to the outcome of litigation?

Jim Meyer: Well… The metadata that is most interesting in most cases is, when certain events occurred in a medical record. When a test was ordered, when it was performed, when the results were placed in the patient’s medical record, when the physician saw those results, what orders may or may not have been entered as a result of that medical test. When medication is prescribed, when it’s administered, who administered the medication. Many of these details are now electronically captured, as opposed to being physically noted, as they were in old written medical records. It can make a… Big difference in trying to determine when events occurred in a case.

Lee Neubecker: I know in one of the cases I was involved in, I discovered that many of the default reports that are provided with these medical software packages don’t necessarily show all available metadata. In fact, on one of the cases we had to work through discovery to try to get the schema of the database. And then we discovered in one instance that there was something known as a sticky note that the nurses and physicians could type little comments in, but there was a presumption that it would never get printed because it’s not in any of the default reports. So what we actually had to do is find the table that had these notes, and then work to get the data dumped. And as soon as we found that, the case quickly settled, because obviously, the hospitals don’t want everyone knowing what’s going on.

Jim Meyer: That’s a disadvantage that a plaintiff in a case may have. Hospitals oftentimes have entire departments in medical informatics, departments in which they have experts that know the ins and outs of the EMR and the metadata collected. Oftentimes plaintiffs do not, but they should be aware of the fact that that metadata exists. Extracting it from the record oftentimes requires a computer forensics expert, an IT expert. But it’s important that all attorneys, defense attorneys and plaintiffs’ attorneys, realize that that information exists as metadata in these records and that it can be obtained. It takes a great deal of effort to obtain it, but it’s there.

Lee Neubecker: And Jim and I co-authored a paper, along with another attorney, that appeared in the Illinois State Bar Association’s journal on EMR patient medical records, the audit trail, and other things impacting HIPAA and medical malpractice litigation. We’ll put that up here too so that you can check that out. Anything else you’d like to add about your practice, Jim?

Jim Meyer: No, we’re happy practicing attorneys in Chicago, Illinois. I would recommend that any attorney who is involved in any issue similar to this take a look at the article that Lee was kind enough to co-author with me and Jonathan Tomes. It really has a lot of information, detailed information that attorneys should know.

Lee Neubecker: Great, thank you.

Jim Meyer: You’re welcome.

To Learn More about Computer Forensics and Patient Electronic Medical Records

Read the Illinois State Bar Article co-authored by the interviewed subjects on Patient Medical Records.

Computer Fraud & Abuse Act Charges Filed

Capital One Data Breach

Capital One Data Breach – Interview of Data Privacy & eDiscovery expert on the fallout

Cyber Security &  Computer Forensics Expert Lee Neubecker interviews Data Privacy Expert Debbie Reynolds on the fallout from the recently disclosed Capital One Data Breach that occurred following alleged hacking of the company’s data stored in the cloud.  Issues discussed include an assessment of how the CEO of Capital One managed the crisis, pending charges filed against Paige Thompson and the Computer Fraud and Abuse Act in the government’s complaint filed earlier this week.

Transcript of video follows

Lee Neubecker: Hi, I’m here today with Debbie Reynolds from Debbie Reynolds Consulting, and we’re going to be talking today about the recent news involving the Capital One data breach. Thank you for being on the show, Debbie.

Debbie Reynolds: Thank you for inviting me. It’s such a thrill. You’re such a joy to be around and talk to, so it’s great to do this.

Lee Neubecker: Well it’s great to have you here. So, trial’s expected this Thursday in the case. Can you tell everyone a little bit about what happened this week?

Debbie Reynolds: So this week it’s in the news that Capital One had a data breach. There was a woman who, I believe, used to work at Amazon, who had found a vulnerability in Capital One’s cloud system and was able to obtain private or digital information on over a hundred million customers or potential customers of Capital One. As far as I can tell, they say that she may have gathered social security numbers and other private information about individuals who may not even be customers of Capital One, who had simply applied for a Capital One credit card as far back as 2005.

Lee Neubecker: Yep.

Debbie Reynolds: So the vulnerability was discovered, and part of the reason it was discovered was because she had apparently bragged about it on Twitter, and she used her real name, so they were able to pull this stuff together. And I think the SWAT team went to her house?

Lee Neubecker: Yeah, so she was using IPredator, a VPN service which is supposed to anonymize and protect you. When she was using that, she created her online GitHub account and other accounts, and they had that iPredator IP address range in her profile linked to her name. So she wasn’t really being smart about it.

Debbie Reynolds: No. So yeah, I think that she was bragging about what she had; I guess she was proud of what she had done, and apparently someone who had seen something she had posted on some forum contacted Capital One. This wasn’t a breach that Capital One found out about on its own; someone from the outside said, “Hey, this girl says that she has your data,” and now it’s a really big thing.

Lee Neubecker: Yeah, so now she’s charged under the Computer Fraud and Abuse Act, which I think she’ll probably end up…

Debbie Reynolds: Yeah.

Lee Neubecker: Do you think she’ll get a plea?

Debbie Reynolds: She’s probably going to go to the slammer. It seems like especially when the SWAT team showed up at her house, they’re definitely going to make an example out of her with this. It’s pretty bad because I think right now the reports and what’s coming out from Capital One are different than what she said or what other people said they have. Because at one point they were saying that Capital One in their statement said that certain people’s social security numbers weren’t breached but then we know that they did get people’s social security numbers.

Lee Neubecker: It was mostly Canadian social security numbers, around a million–

Debbie Reynolds: Right.

Lee Neubecker: And then I think it was somewhere around 100,000 or so U.S. citizens.

Debbie Reynolds: Right, exactly.

Lee Neubecker: So it doesn’t necessarily impact the entirety of U.S. customers, but it still is–

Debbie Reynolds: It doesn’t, it doesn’t make you feel good. Yeah so basically over a hundred million people were touched in some way, shape or form. Even though not everyone’s personal data was taken to the same extent as everyone else, but I think this incident illustrates for us a couple of different things. First of all, they were saying that they had credit card information or information on people who had applied for credit cards going back as far as 2005. I’m not sure if they can make a justification for why they even had some of that stuff.

Debbie Reynolds: In the first place. Especially, and I wonder what rights someone would have if they never actually became a customer of Capital One. The law’s kind of murky about how they should handle that. I guess that’s the same issue with Equifax, where not everyone who was touched by Equifax is a customer of Equifax; they just happened to have their data.

Lee Neubecker: What would, how would you have advised Capital One had you gotten in there before the data breach?

Lee Neubecker: You think you might have been able to–

Debbie Reynolds: Well, you know–

Lee Neubecker: Get them in a better situation?

Debbie Reynolds: I think a lot of corporations, my view is that a lot of corporations or businesses have this mindset of: does it work? Does the computer work? Can I do the thing I need to do on a computer? The question that they’re not asking is: is it secure? So a lot of them have a blind spot in terms of securing things, because as long as it doesn’t impact their ability to work, they don’t really care how it works. So now companies have to ask: how does it work? Is it secure? A lot of companies have these issues where they’re moving from internal infrastructure to the cloud, and we know that cloud infrastructure would typically be more “secure” than someone’s on-premise infrastructure, but that all depends on how it was configured. The vulnerability that this woman was able to exploit at Capital One had to do with how the permissions and things were configured on the cloud infrastructure.

Lee Neubecker: And she had worked in that environment.

Debbie Reynolds: Right. So she had a little bit of extra insight–

Debbie Reynolds: Exactly.

Lee Neubecker: In this process.

Debbie Reynolds: Exactly. But I don’t know, you probably run into the same thing, where you have clients that have cloud issues and they may feel more secure keeping things in-house: “We think our native infrastructure is safer than the cloud.” Not to say that the cloud is not safe, but if you have someone who doesn’t know how to fill those gaps and stop those vulnerabilities, it could be a huge problem.

Lee Neubecker: What do you think of the CEO’s response from Capital One?

Debbie Reynolds: I saw the CEO’s response. I don’t know, someone needs to do a series about this where you compare all the response letters from these data breaches or whatever.

Lee Neubecker: That’s a great idea.

Debbie Reynolds: Not a bad response at all. I think the danger, though, is there may be an issue with consumer confidence, obviously, because no one wants their data breached. But if it becomes evident that the things being said by the CEO or other leadership are different than what actually happened, that’s going to be a problem.

Lee Neubecker: Yeah, cool.

Debbie Reynolds: I think the desire is to rush, to put out as much information as you possibly can, but already the news reports are contradicting what the company is saying about what was actually breached.

Lee Neubecker: Well, the complaint is available; I’ll post that on my website as well. I read the complaint and there’s a lot of detail in there, and you’re right: in the news stories they’re talking about the Amazon cloud, while in the complaint they talk about a company that presumably is a subsidiary of Amazon.

Debbie Reynolds: Right.

Lee Neubecker: But they didn’t specifically mention Amazon in the complaint.

Debbie Reynolds: No, no. So with customers, when they feel like they’ve had a data breach, they definitely want answers. There’s a tension that has to be managed, where the company wants to be as forthright and forthcoming as possible about what’s happened, but the facts may still be rolling out.

Lee Neubecker: Yeah.

Debbie Reynolds: The drip, drip, drip of it all may be tough I think.

Lee Neubecker: But I thought at least it was good that they publicly acknowledged it. It didn’t take forever to acknowledge it.

Debbie Reynolds: Oh, right exactly.

Lee Neubecker: And apologize, I mean–

Debbie Reynolds: Oh, absolutely. It does go a long way–

Lee Neubecker: They just did that so I applaud them for not–

Debbie Reynolds: Absolutely.

Lee Neubecker: Sitting on it like Equifax.

Debbie Reynolds: Right. They didn’t say, “Well, I’m sorry that you were hurt or you felt hurt,” or something like that. There is harm there, so you might as well acknowledge it and try to at least be forthright about what you know.

Lee Neubecker: And from what I read too, not all of the data, some of the data was tokenized, but there were birth dates, there were some socials.

Debbie Reynolds: Right.

Lee Neubecker: And some other information that certainly if that were you or me, well we’re kind of becoming used to this all the time. It’s sad, but.

Debbie Reynolds: Right, well I mean and what we’re seeing, what I’m seeing, what companies are trying to argue in the U.S. having to do with data privacy is if you put, let’s say you’re on Facebook and you say, “Hey, today’s my birthday!” You know so if Lee puts his birthday on Facebook, is Lee’s birthday private? So let’s say you’re a Capital One customer, they could argue you know your birthday is not private because you put it on Facebook. That’s going to be an interesting theme.

Lee Neubecker: Well thanks so much for being on the show today.

Debbie Reynolds: It was fantastic, thank you.

Debbie Reynolds Contact Info

datadiva at debbiereynoldsconsulting dot com
312-513-3665
https://www.linkedin.com/in/debbieareynolds/
https://debbiereynoldsconsulting.com/

Crypto Currency Lending

Crypto currency lending and cyber security issues

Cyber Security & Computer Forensics Expert Lee Neubecker and DrawBridge Lending CEO Jason Urban discuss cryptocurrency lending and the security issues as they relate to Bitcoin and blockchain.

The transcript of the interview follows:

Lee Neubecker: Hi, I have Jason Urban on the show today. He’s the President and CEO of DrawBridge Lending. Thanks for being on the show Jason.

Jason Urban: Thanks for having me, Lee. This is great, glad to be here today.

Lee Neubecker: Jason, I’ve known you for a while. You’ve been doing some innovative things in the lending industry as it relates to bitcoin and blockchain. Tell us a little bit about that.

Jason Urban: Sure, so what we do is we’re a lender against secured digital asset holdings, and what we are providing is the drawbridge, or the bridge, from these traditional lending sources, or pools of liquidity, into this new ecosystem where everybody is trying to figure out how that landscape works.

Lee Neubecker: What type of people would have a need for your service?

Jason Urban: I think there are a wide variety of people. People who have these digital assets and, because of the way they’re categorized here in the States from the IRS perspective, when you spend them, when you use them, you encounter a taxable situation. But to the extent that you might need to pay your power bill or to go on a vacation or buy that boat you always wanted, you need fiat, you need US dollars, and what we provide is a mechanism or platform for people to borrow against their digital asset holdings.

Lee Neubecker: So, if someone’s sitting on, say, 100 bitcoin, which is quite a bit of money, you’d allow them to take out a loan against that bitcoin and use it for short-term cash expenses or whatever?

Jason Urban: Yes

Lee Neubecker: What is the duration of your loans typically?

Jason Urban: We typically focus on one to six months. It’s a very volatile asset, and our backgrounds are in managing that volatility, but there’s only so much you can do when something moves as rapidly as that does, which is an advantage of the asset, but it’s also difficult from a lending capacity. So our loans are one to six months in duration, and we offer renewal options, so you can re-up and renew, and adjust the strike price of that loan-to-value. Think about your home moving 50% in a six-month period: you might want to refi or you might need to put more money up. We try to mitigate a lot of those risks by offering the durations we do.

Lee Neubecker: So, your clients actually give you their cryptocurrency and you escrow it for them?

Jason Urban: Yes, so what we do is we don’t like to take possession of their currency. What we like to do is use a qualified third party custodian so that their digital assets are resting there, so they know they’re there, and I can’t take them unless they default on a loan or something unfortunate happens. All we want to do is provide a mechanism or a platform for someone to monetize their holdings. We don’t want to take possession of them. We don’t want their private keys. We’ll only take those in the event that they default or want us to satisfy their loan.

Lee Neubecker: So in this business, what measures do you take to help ensure that these digital assets are safe from a cyber attack perspective?

Jason Urban: Well, part of it, the key for us, is cold storage. And cold storage is basically storing these things on a server or computer where it’s not connected to the internet. It can’t be taken, so we require that all our custodians deploy a cold storage method as opposed to a warm storage or a hot storage. That way we know that the gold is in the vault so to speak but that it’s not going to be readily accessible to anybody out there.

Lee Neubecker: Have you had a situation where a customer gets angry because the price fluctuates and they feel that they were cheated out of their value?

Jason Urban: Interestingly, we don’t have that problem because of the mechanisms that we deploy on the back end. All our loans are no-margin-call and non-recourse, unlike a lot of people in the business that will have you re-top. Think about it this way: if I issue you a loan on an asset that’s worth $10,000, and I give you 50% of that asset in cash, and the value of that asset goes from 10,000 to 5,000, I now need to create that cushion again, so you need to pay me more money or re-up or figure something out. What we’ve developed, our methodology, is a way to never have to worry about that, and we use the financial markets. We’re markets experts, and we’re risk managers, so we have mechanisms by which we can ensure that you don’t have to worry about topping off your loan.

Lee Neubecker: Are there any restrictions on the type of customers you can have based on what the SEC imposes on you?

Jason Urban: We are very compliant, so we are registered with the CFTC, and we follow all the rules and regs imposed on us by them. We have to do AML/KYC, anti-money laundering and know-your-customer. We’re registered as a non-bank lender in 31 states and operate in all 50 states, so we’re following not only consumer lending laws but also securities laws and commodities laws.

Lee Neubecker: Are there any requirements you have on customers before you can take them as a client?

Jason Urban: Well, one, we have to do the AML/KYC on them. Right now, our products are geared towards accredited investors. Because of the way we do the hedging on the back end, we need to make sure that those customers are sophisticated enough to understand what we’re doing. And so, in order to do that, we need to put that accredited-investor cap on things. It’s a little different under the CFTC umbrella; they call them eligible contract participants, or ECPs, so there are a couple of different buckets, and it’s a little different than the SEC’s accredited investor, but effectively it’s the same thing.

Lee Neubecker: Is there a minimum net worth that your customers have to have?

Jason Urban: Yes, that’s part of it: a minimum net worth of a million dollars, or an entity that’s worth a million dollars. That’s what we require.

Lee Neubecker: What sectors do you see that this type of lending is getting the most interest in terms of where your clients are coming from?

Jason Urban: A wide variety. If you really think about it, bitcoin, or digital assets as a whole, can be held by anyone. It isn’t a single group that says, “Hey, I’m really into this.” So we see funds, miners, people who were early adopters of the technology; they’ve all kind of stepped forward. Additionally, we’ve got a product that’s geared towards people who would like to buy bitcoin and want to employ some of the same methodologies that we’re employing right now.

Lee Neubecker: Do you have any closing thoughts you’d like to share?

Jason Urban: I think that people often confuse blockchain and decentralized ledgers with bitcoin. I think the blockchain technology is interesting on so many levels. As the world becomes more tokenized, and I think you’re going to see more and more of that, everything from the artwork that you see on the walls to buildings to physical assets like gold, silver, and oil will move towards that technology and that methodology, and I think that being an early adopter and understanding it is so important. If you want to make a parallel, this is the internet in 1990 or 1995. The difference is the world moves much faster today than it did back then.

Lee Neubecker: So are you taking investors?

Jason Urban: We’re always willing to have strategic investors come into the space, and we’re not opposed to that. We’re very well capitalized, but we do recognize the value in being partners with people. And part of being partners is financial as well.

Lee Neubecker: Well thanks again for being on the show.

Jason Urban: Thank you very much.

Neubecker Presents at Chicago Science Writers

How secure are consumer IoT devices?

Lee Neubecker, Enigma Forensics President & CEO, will present on the potential impact of vulnerable consumer IoT devices as it relates to the security of the U.S. Power Grid.

The event will take place at the Medill School of Journalism Chicago Newsroom, 303 East Upper Wacker Drive Suite 1600, Chicago, IL 60601.
Date: Thursday, January 10th, 2019, from 5:30PM – 7:00PM.

Chicago Science Writers is an organization of writers who report on science and other technical topics. The group provides a forum for people in the Chicago area who communicate science to the public and organizes professional development programs and social gatherings. CSW also provides a point of contact for national science organizations and local science groups interested in connecting with science writers in the Chicago area.

The public may register for this event at the following link:
https://www.eventbrite.com/e/chicago-science-writers-presents-hacking-the-power-grid-tickets-54182573536?aff=mcivte

Lee Neubecker to present at CyberSecurity International Symposium

Enigma Forensics’ CEO, Lee Neubecker will be presenting on Infrastructure Vulnerabilities relating to the potential for power outages to be caused by indirect cyber attacks on the power grid.
The Second CyberSecurity International Symposium will take place all day on Tuesday, November 13th, at Conference Chicago, located at 525 South State Street, Chicago, Illinois 60605. Neubecker will be presenting the topic "Hacking the Power Grid: Why We Should All Be Concerned About IoT Security" from 11:30 AM to noon. A 40% discount code is available to Enigma Forensics clients wishing to attend. Please call Lee Neubecker for details.

The complete conference agenda is available at http://www.cybersecurity-symposium.com/agenda.htm1

Patient Medical Records: Metadata as Evidence in Litigation

ELECTRONIC MEDICAL RECORDS:

Metadata As Evidence in Litigation

By James G. Meyer,* Jonathan P. Tomes,** and Lee Neubecker***
As published: Vol. 101 #8, August 2013. Copyright by the Illinois State Bar Association www.isba.org

Doctor and hospital records are changing. The paper medical records that we have been familiar with, along with the rest of the “written” world, are becoming electronic —that is, written, maintained, and retrieved as digital data.

Because of many emerging “after entry” benefits, federal and state governments, insurance companies, and medical institutions are heavily promoting the adoption of Electronic Medical Records (“EMR”).[1] For example, the HITECH Act (American Recovery and Reinvestment Act of 2009[2]) includes both incentives and penalties in its calculations to encourage adoption of electronic records, versus continued use of paper records. The Act allows benefits of up to $44K per physician under Medicare or up to $65K over six years under Medicaid for adoption of electronic records. Additionally, Congress decreased Medicare/Medicaid reimbursements to doctors who fail to use electronic medical records by 2015 for covered patients.

This change in medical record keeping and changes in the laws and regulations associated with electronic medical record keeping are creating significant changes in what and how information may become evidence in litigation.

Attorneys who deal with medical records in any type of litigation should be aware of the changes in the following areas:

I. Electronic Medical Records and HIPAA

II. PHI as Electronically Stored Information

III. What is Discoverable: Metadata and Computer Forensics

IV. A Word about Encryption

V. Discoverability and Admissibility of Electronic Medical Records and Metadata

I. ELECTRONIC MEDICAL RECORDS AND HIPAA

Before the advent of electronic medical records, The Illinois Administrative Code itemized the minimum requirements for the content, management, and administration of medical records.[3]

The Health Insurance Portability and Accountability Act of 1996 (“HIPAA”)[4] sets out a comprehensive set of rules, safeguards, and definitions that are, effectively, applicable to most health care providers that use computers and electronic storage devices to store or transmit patient medical records. Excepted from the statute are institutions that do not transmit billing transmissions to and from Medicare/Medicaid or other health plans, an uncommon circumstance. With the HITECH Act’s incentives to use electronic health records, more and more providers will do so.

What we have understood to be doctor and hospital medical records, HIPAA defines more comprehensively as health information: “any information, whether oral or recorded in any form or medium, that:

i. Is created or received by a health care provider, health plan, public health authority, employer, life insurer, school or university, or health care clearinghouse; and

ii. Relates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual.”[5]

Under HIPAA, Protected Health Information (“PHI”) is “individually identifiable health information” that is:

i. Transmitted by electronic media;

ii. Maintained in electronic media; or

iii. Transmitted or maintained in any other form or medium.”[6]

II. PHI AS ELECTRONICALLY STORED INFORMATION

To understand where and how EMR systems “transmit” and “maintain” PHI, it is helpful to use the terminology of computer experts. From their viewpoint, HIPAA’s PHI is Electronically Stored Information (“ESI”).

ESI is data stored, processed, retrieved or transferred by “Electronic Storage Devices.”[7] Electronic Storage Devices – a subclass of Electronic Media – are commonly known as diskettes, Flash Drives and CD/DVD Disk media. Both Electronic Storage Devices and Electronic Media are capable of containing ESI (thus PHI).

Electronic Storage Devices capable of storing ESI can be classified into two main categories – Non-Volatile Electronic Storage Devices and Volatile Electronic Storage Devices.

Non-Volatile Electronic Storage Devices store data on a more or less permanent basis, but can often be deleted or destroyed. These can be grouped into several categories – Primary Storage Devices, Secondary Storage Devices, Offline Backup/Archival, and “In the Cloud.” Examples of each are:

Primary Storage Devices

(1) Hard Disk Drives

(2) Disk Media

(3) ROM / PROM / EPROM

(4) Solid State Drives (Flash Storage)

(5) SIM Cards

(6) Multi Media Cards (SD, SDHC, SDXC, SDIO, and Others)

(7) Smart Cards, Chip Cards or Integrated Circuit Card

(8) Paper Based Storage (Punch Cards, Bar Codes, Scantron)

Secondary Storage Devices

(1) USB Thumb Drives / Flash Drives

(2) External Hard Disk Drives

(3) Disk Media (Floppy Disk, CD, DVD, Blue Ray)

(4) Radio-Frequency Identification (RFID) Tags

Offline Backup / Archival

(1) Magnetic Tape

(2) Disk Media (Floppy / CD / DVD / Blue Ray)

(3) Bar Code Paper Records

(4) CD / DVD Disk Media

In the Cloud (Utilizes all types of Storage)[8]

Volatile[9] Electronic Storage Devices retain a good deal of ESI for a discrete period of time, e.g. until such time that the Volatile source loses power. The RAM in a computer is an example of Volatile Electronic Storage Devices.

ESI may be transmitted between Electronic Storage Device sources via the internet, extranets, infrared, radio, Wi-Fi, Satellite, Cable, Broadband, cellular, leased lines, barcode, dial-up telephone lines, private networks, connected external devices, and devices that are physically moved from one location to another using magnetic tape, disc, or compact disc media.[10]

A patient’s PHI maintained in any of these Electronic Storage Devices or transmitted by any of these means of electronic transmission is a potential source of discoverable information. Smart phones and PDAs are increasingly used in association with electronic health data. Industry sources estimate that “in 2010, more than 50 percent of physicians were using smartphones or PDAs on a regular basis in clinical decision making.”[11] As an indication of how important mobile devices have become in healthcare, the Healthcare Information and Management Systems Society (“HIMSS”), a leading non-profit industry group, has formed a separate entity, mHIMSS, to focus exclusively on the use of mobile and wireless technologies in healthcare.[12]

III. WHAT IS DISCOVERABLE: METADATA AND COMPUTER FORENSICS

The Department of Health and Human Services (“DHHS”) regulations implementing HIPAA govern PHI with both a Privacy Rule[13] and a Security Rule[14]. As their names imply, the rules require adoption of enumerated standards and safeguards so that covered entities protect a patient’s electronic (and paper) medical records from unauthorized access,[15] tampering, or destruction[16].

Attorneys that have been involved with medical records in litigation since the enactment of HIPAA and the implementation of the DHHS regulations are generally aware that the Privacy Rule enumerates the ways to obtain PHI from health care providers during discovery by the use of written authorization or subpoena.[17]

In addition to delineating how to obtain PHI, HIPAA’s Privacy Rule also requires that covered entities have procedures in place to give individuals an accurate accounting of disclosures of their PHI in cases in which an accounting is required.[18]

HIPAA’s Security Rule requires that a covered entity “ensure the confidentiality, integrity and availability of all electronic PHI the covered entity creates, receives, maintains or transmits”.[19] The standard specifically defines “confidentiality” as “the property that data or information is not made available or disclosed to unauthorized persons or processes” and “integrity” as “the property that data or information have not been altered or destroyed in an unauthorized manner.”[20]

In order to implement the Privacy and Security Rules, HIPAA requires covered entities to use “audit controls,” such as “hardware, software, and/or procedural mechanisms that record and examine activity in information systems that contain or use electronic protected health information”[21] and to “implement procedures to regularly review records of information system activity, such as audit logs, access reports and security tracking reports.”[22] The Metadata generated by these audit control systems, about the access and use of a patient’s records and the use and operation of the computer device maintaining or transmitting the records, is typically not part of the formal medical record. But it can often be a gold-mine of important information that would not otherwise be obtainable in discovery.[23]

For example, Metadata in the form of an audit log or audit trail may be helpful with faulty or incomplete memories. An audit trail is a record of who, when, where, how and sometimes why a person used a computer program or accessed a patient’s medical record. Typically, the identity of the user who accesses the patient’s record, the time of access, the terminal or device used for access, the action taken by the user (i.e., viewing the record, changing the record), and the substance of anything added to the record and any changes or corrections made by the user are recorded in the Metadata which can be reproduced in the form of an audit trail or log. In a case known to the authors, a hospital audit trail produced during discovery, showing the “terminal identifier” for an EMR entry (the unique number assigned to each computer terminal in the EMR system) resulted in a nurse changing her testimony when it disclosed she was using a computer terminal in another part of the hospital, and was not with the patient, as she had testified.

Metadata, such as in an audit trail, is captured automatically by the EMR system. As a result, the audit trail should correspond, entry by entry, to the patient’s medical chart or record. If an entry in the audit trail shows data was added, changed or deleted, a corresponding entry should appear in the patient’s chart, and vice versa.
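
To illustrate that entry-by-entry correspondence, the following Python sketch (a hypothetical illustration added for this reprint, not part of the published article or any vendor's export format) reconciles a simplified audit trail against the entry identifiers that appear in the produced chart; the field names are assumptions, as real EMR systems vary by vendor.

```python
from dataclasses import dataclass
from typing import Iterable, List, Set

@dataclass(frozen=True)
class AuditEntry:
    user_id: str      # who accessed the record
    timestamp: str    # when the access occurred
    terminal_id: str  # which terminal or device was used
    action: str       # e.g. "VIEW", "ADD", "MODIFY", "DELETE"
    entry_id: str     # identifier of the chart entry that was touched

def unmatched_changes(audit_trail: Iterable[AuditEntry],
                      chart_entry_ids: Set[str]) -> List[AuditEntry]:
    """Return audit-trail entries that added, modified, or deleted data
    but have no corresponding entry in the produced patient chart."""
    changes = [e for e in audit_trail if e.action in {"ADD", "MODIFY", "DELETE"}]
    return [e for e in changes if e.entry_id not in chart_entry_ids]
```

Any entries returned by such a reconciliation would warrant a closer look in discovery.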

Metadata found in a forensic image of a medical record may be even more helpful. A “forensic image” is not simply a copy of the electronic record; it is a bit-for-bit copy of all sectors of the media involved and must be done properly.[24] In a case known to the authors, the analysis of the Metadata on a video disk of a surgical procedure produced during discovery showed that several of the video clip files in the series generated during the procedure were deleted, with the remaining video clips renumbered in an apparent attempt to conceal what transpired during the missing video clips. An analysis of the Metadata embedded within the contents of each of the DICOM video files revealed that the original clip sequence numbers were different for the last few video clips. The file system Metadata, compared to the embedded DICOM Metadata, implied an intentional manipulation of the data in order to alter the record of the events that actually occurred.
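
As a hedged sketch of the kind of comparison described above (not the authors' actual workflow), the open-source pydicom package can read the InstanceNumber embedded in each clip so it can be checked against the clip's position in the renamed file sequence; the directory path below is a placeholder.

```python
# Minimal sketch, assuming the third-party pydicom package and a folder of
# exported DICOM clips whose file names sort in their renumbered order.
from pathlib import Path
import pydicom

def sequence_mismatches(clip_dir: str):
    """Compare each clip's position in file-name order against the
    InstanceNumber (0020,0013) embedded in its DICOM metadata."""
    mismatches = []
    for position, path in enumerate(sorted(Path(clip_dir).glob("*.dcm")), start=1):
        ds = pydicom.dcmread(path, stop_before_pixels=True)  # skip pixel data
        embedded = int(ds.get("InstanceNumber", -1))
        if embedded != position:
            mismatches.append((path.name, position, embedded))
    return mismatches  # non-empty output suggests clips were removed or renumbered

print(sequence_mismatches("surgery_clips/"))  # placeholder directory
```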

IV. A WORD ABOUT DATA ENCRYPTION

Data encryption alone does not ensure the confidentiality or integrity of PHI. HIPAA’s data encryption standards allow health care providers, health insurance companies, and business associates who transmit, store, or access protected health information in electronic form to utilize a standardized level of data encryption when encryption is reasonable and appropriate. The Advanced Encryption Standard (AES) is a Federal Information Processing Standards (FIPS) approved cryptographic algorithm used to protect electronic data and is quite prevalent in the healthcare industry to secure data-at-rest, data-in-motion, and data-in-transit.[25]
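
As a hedged illustration of AES in practice (a generic sketch, not any particular EMR vendor's implementation), the following Python example uses the widely available cryptography package to encrypt a record at rest with AES-256-GCM; the sample record is fictional and key management is deliberately simplified.

```python
# Minimal sketch, assuming the third-party "cryptography" package.
# In practice the key itself must be protected (e.g., in a hardware module);
# storing it alongside the data defeats the purpose.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # AES-256 key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # must be unique per encryption

phi_record = b"Patient: Jane Doe | DOB: 01/01/1970 | Dx: example"  # fictional PHI
ciphertext = aesgcm.encrypt(nonce, phi_record, None)  # encrypted and authenticated
assert aesgcm.decrypt(nonce, ciphertext, None) == phi_record
```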

PHI data is vulnerable when actively used and stored in volatile memory. Much of a patient’s information is stored unencrypted in volatile memory when a computer device is actively working with a patient’s record or following the access of a patient’s record until such time that the data is discarded automatically or the computer device shuts off. Anyone with physical or network access to the device or a strong hacker skill set would have a reasonable opportunity to capture the non-encrypted information stored in volatile memory.

Another vulnerable area of risk is when PHI is in transit without the appropriate encryption safeguards. Encrypted ESI using today’s standards is unlikely to be compromised while in a data-at-rest, data-in-motion and data-in-transit state. But, ESI containing PHI is unencrypted at the point of service on a portable or fixed computing device. These devices are sometimes not properly secured with the appropriate physical and network security protections required, providing an opportunity to manipulate the unencrypted data.

V. Discoverability and Admissibility of Electronic Medical Records and Metadata

Illinois Supreme Court Rules make electronic data discoverable. Under Rule 201, “General Discovery Provisions,” discoverable “documents” include “all retrievable information in computer storage.”[26] Rule 214, “Discovery of Documents, Objects, and Tangible Things,” specifically requires production of “all retrievable information in computer storage in printed form.”[27]

Medical records have long been admissible as an exception to the hearsay rule. Before adoption of the Illinois Rules of Evidence (effective January 1, 2011), Illinois Supreme Court Rule 236(b), as amended in 1992, was generally accepted as permitting the admission into evidence of medical and hospital treatment records, in written or computer form, as business records. That rule is silent, however, as to computer generated “data” or “data compilations.” Any confusion in that regard seems resolved in the new Rules of Evidence.

In the first instance, much of the Metadata recorded in an electronic medical record may not be hearsay at all. Rule 801 defines a hearsay “statement” as the oral or written assertion or conduct of a “person.”[28] Automatically imprinted Metadata is not the assertion or conduct of a person. See, People v. Holowko, 486 N.E.2d 877, 109 Ill. 187 (1985) (recognizing the difference between computer stored information, which may be hearsay, and computer generated information, which is not hearsay). Recorded Metadata in an EMR system is similar to images recorded on surveillance cameras, which are not hearsay. People v. Tharpe-Williams, 676 N.E. 2d 717, 286 Ill. App. 3d 605 (1997). Because Metadata involves no human input in its creation, other than the actions taken by the user in creating or manipulating the file or record referenced by the Metadata, it is non-hearsay evidence.[29]

To the extent that Metadata does include human input, the new rules provide a hearsay exception for “a memorandum, report, record, or data compilation, in any form, of acts, events, conditions, opinions, or diagnoses” kept as part of a regularly conducted business activity.[30] In addition, the new rules make “writings” and “recordings,” defined to include “numbers . . . set down by . . . magnetic impulse, mechanical or electronic recording, or other form of data compilation,”[31] admissible as “duplicates”[32] or when offered “in the form of a chart, summary, or calculation.”[33]

Although Illinois decisions on the admission of electronic data are not as common as cases in the federal courts, Illinois cases predating the new rules have approved its admission. See, for example, Bachman v. General Motors, 776 N.E.2d 262, 332 Ill.App.3d 760, 267 Ill. Dec. 125 (2002), (approving admission of data retrieved from an automobile crash sensor in a personal injury case).

CONCLUSION

Medical records are in a state of transition from paper records to electronic data. Being aware of the changes to HIPAA, the HITECH Act, the DHHS Privacy Rule and Security Rule, and the capabilities of computer forensics, are necessary in dealing with electronic medical records as evidence.

*James G. Meyer is an attorney who practices in the law firm of Ialongo & Meyer in Chicago.

**Jonathan P. Tomes is an attorney admitted in Illinois, Missouri, Kansas, and Oklahoma who practices in the law firm of Tomes & Dvorak, Chartered, in Overland Park, Kansas, and consults around the country on HIPAA and the HITECH Act. He has also served as an expert witness on HIPAA, medical records, and the Federal Tort Claims Act in cases in Illinois, Washington, DC, and Colorado.

***Lee Neubecker is a computer forensics expert and the principal of Enigma Forensics, a Chicago based computer forensics & expert witness consulting firm.

Notes

[1] We mean “EMR” to include Electronic Medical Records (digital information created, gathered, managed and consulted by clinicians and staff within one health care organization), Electronic Health Records (“EHR”) (digital information that may be operated by clinicians and staff across more than one healthcare organization – sometimes referred to as “interoperability”) and Personal Health Records (“PHR”) (digital information that can be accessed and created by patients themselves). See, http://www.healthit.gov/providers-professionals/faqs/what-difference-between-personal-health-record-electronic-health-record

[2] U.S. Department of Health and Human Services Centers for Medicare & Medicaid Services, 42 C.F.R. Parts 412, 413, 422, et seq., Medicare and Medicaid Programs; Electronic Health Records Incentive Program; Final Rule; Title XIII of the American Recovery and Reinvestment Act of 2009, the Health Information Technology for Economic and Clinical Health Act, Subtitle A, Part 2, Subtitle C (hereinafter “HITECH Act”).

[3] 77 Ill. Admin. Code § 250.1510(b)(2).

[4] Public Law 104-191, 110 Stat. 1396 (1996).

[5] 45 C.F.R. §160.103.

[6] Id. (Note that PHI may also consist of paper records and oral communications).

[7] storage media

[8] The National Institute of Standards and Technology (“NIST”) of the U.S. Department of Commerce has defined cloud computing as follows:

Cloud computing has been defined by NIST as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or cloud provider interaction.

Peter Mell, Tim Grance, The NIST Definition of Cloud Computing, Version 15, October 7, 2009 at http://csrc.nist.gov/groups/SNS/cloud-computing. More and more large health care providers are hiring outside hosts to maintain their electronic health records “in the cloud,” using large companies like Google, Microsoft, or Amazon or smaller companies that provide hosting only for medical records.

[9] http://en.wikipedia.org/wiki/Volatile_storage

[10] Id.

[11] Putzer, J. MD, Park, Y, Are Physicians Likely to Adopt Emerging Mobile Technologies? Attitudes and Innovation Factors Affecting Smartphone Use in the Southeastern United States, Perspectives in Health Information Management, Spring 2012. p. 2, at http://www.perspectives.ahima.org/attachments/article/241/ArePhysiciansLikelyToAdoptEmergingMobileTechnologies_final.pdf (last visited January 14, 2013).

[12] http://www.mhimss.org/about-us (last visited February 25, 2013).

[13] 45 CFR §164.500, Subpart E, Privacy of Individually Identifiable Health Information. (The Privacy Rule applies to both paper and electronic medical records.)

[14] 45 CFR §164.302, Subpart C, Security Standards for Protection of Electronic Protected Health Information.

[15] 45 CFR §164.502 Uses and disclosures of protected health information: general rules.

“(a) Standard. A covered entity may not use or disclose protected health information, except as permitted or required by this subpart or by subpart C of part 160 of this subchapter.”

[16] 45 CFR §164.306 Security standards: general rules.

“(a) General requirements. Covered entities must do the following:

(1) Ensure the confidentiality, integrity, and availability of all electronic protected health information the covered entity creates, receives, maintains, or transmits.”

[17] See generally, 45 CFR §§ 164.506, 164.508, 164.510, 164.512.

[18] 45 C.F.R. § 164.528.

[19] 45 CFR §164.306(a)(1).

[20] 45 CFR §164.304.

[21] 45 C.F.R. § 164.312 (b) Standard: Audit controls.

[22] 45 C.F.R. § 164.308(a)(1)(D).

[23] See Thomas R. McLean, EMR Metadata Use and E-Discovery, 18 Ann. Of Health Law 75 (2009).

[24] hard drive imaging

[25] http://www.hipaacompliancejournal.com/2011/03/knowing-about-advanced-encryption-standard-aes/

[26] Ill. Sup. Ct. Rule 201 (b)(1).

[27] Ill. Sup. Ct. Rule 214. The Committee Comments to Rule 214 further clarify. “The first paragraph has also been amended to require a party to include in that party’s production response all responsive information in computer storage in printed form. This change is intended to prevent parties producing information from computer storage or computer discs or in any other manner that tends to frustrate the party requesting discovery from being able to access the information produced. Rule 201(b) has also been amended to include in the definition of ‘documents’ all retrievable information in computer storage, so that there can be no question but that a producing party must search its computer storage when responding to a request to produce documents pursuant to this rule.”

[28] Illinois Rule of Evidence 801(a).

[29] See generally, The Sedona Conference Commentary on ESI Evidence & Admissibility 10 (2008).

[30] Illinois Rule of Evidence 803(6) “Records of Regularly Conducted Activity.”

[31] Illinois Rule of Evidence 1001.

[32] Illinois Rule of Evidence 1003.

[33] Illinois Rule of Evidence 1006.

Reprinted with permission of the Illinois Bar Journal, Vol. 101 #8, August 2013. Copyright by the Illinois State Bar Association www.isba.org

Cyber Security Summit in Chicago

WGN Midday News Steve Sanders interviews Chicago Cyber Security Expert Lee Neubecker.

Chicago’s Enigma Forensics CEO & President Lee Neubecker Video Interview with WGN on Cyber Security

WGN’s Midday News reporter Steve Sanders interviewed Enigma Forensics CEO Lee Neubecker and Cyber Security Chicago Conference Event Director David Juniper today. The conference debuted last year and was successful. Chicago is becoming a national cyber security and technology hub.

Tomorrow’s event is taking place at McCormick Place on Sept. 26 and 27 featuring 90+ speakers and 4,000+ attendees.

Watch the interview on video by clicking below:

More on Cyber Security

Top Ways to Protect Your Home from Cyber Attacks

Top 10 Ways to Secure your Home from Cyber Attack

  1. Make sure you have a firewall that blocks outsiders from getting into your home network
  2. Patch your computers and devices at least monthly
  3. Buy IoT devices from vendors that build in security by default
  4. Purchase IoT devices that auto-update or can easily be patched
  5. Don’t purchase computing devices that ship with a default username (such as admin) and a static default password
  6. Consider carefully if you really need a WiFi enabled toilet (or other appliance)
  7. Segregate your IoT devices by putting them on the guest network that many routers offer
  8. Purchase devices from manufacturers that publish firmware updates online with a verifying hash value (see the hash-check sketch after this list)
  9. Don’t buy devices from manufacturers that lack https secure encryption on their own website
  10. Discard outdated IoT devices that no longer have patch updates available
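
Item 8 above can be checked with a few lines of Python; this is a generic sketch, and the file name and published hash below are placeholders for the values your device manufacturer actually publishes.

```python
# Verify a downloaded firmware image against the manufacturer's published
# SHA-256 hash (both values below are placeholders).
import hashlib

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

published_hash = "<hash value published by the manufacturer>"
actual_hash = sha256_of("firmware_update.bin")  # placeholder file name
print("OK to install" if actual_hash == published_hash else "MISMATCH - do not install")
```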

Top Online Resources for Securing your Home Against Cyber Attacks

USA Department of Homeland Security CISA on Securing your Home Network

USA Department of the Navy on Securing your Home Against Cyber Attacks

WGN Cyber Security Chicago Conference 2018

WGN on Cyber Security Conference

WGN Cyber Security Chicago Conference 2018 Video Interview 

WGN News is running a midday news segment promoting the Cyber Security Chicago Conference happening this Wednesday and Thursday (September 26th & 27th, 2018) at the McCormick Convention Center.  Neubecker will be sharing a preview of the featured presentation he is giving this Wednesday at the Conference on IoT security.

Tune in tomorrow for the 11AM – 12PM live broadcast.

More details on the conference available at https://leeneubecker.com/chicago-cyber-conference-2018/

Read More about Cyber Security Expert Lee Neubecker

Neubecker is also the founder of the IT security blog leeneubecker.com. Before starting Great Lakes Forensics, Neubecker served as CISO for HaystackID, and following the acquisition of Envision Discovery and Inspired Review by HaystackID, he was promoted to serve as CIO over the combined entities. Neubecker was named one of the top global computer forensics and cyber security experts by Who's Who Legal in 2019 and for many years prior.