Contact Tracing Apps: Are They Ethical?

Are contact tracing apps ethical? Are you willing to give up your private data to help slow the spread of the Coronavirus? Check out what these experts have to say!

Contact Tracing: Is It Ethical?

Apple and Google have built a capability that allows cell phones to communicate with each other. Contact tracing apps use this capability and have been developed to find and alert the contacts of people infected with the Coronavirus / COVID-19. As soon as someone gets sick with the Coronavirus, the app could alert you if that is someone you have been in contact with, shortening the time it takes a real, live contact tracer to do the same work. In effect, this is widespread tracking of people's movements and contacts, and it presents many privacy issues involving potential data breaches, information storage, and the sharing of sensitive personal data. Should sensitive medical information and individual locations be available through an app? Do you believe this type of electronic contact tracing is ethical?
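To make the underlying idea concrete, here is a minimal Python sketch of how a proximity-beacon system can work: each phone broadcasts short-lived random identifiers and quietly logs the identifiers it hears nearby, and when someone tests positive, their recent identifiers are published so other phones can check their own logs for a match. This is only an illustration of the concept; the names, identifier scheme, and matching logic are simplified assumptions, not the actual Apple/Google Exposure Notification implementation.

# A minimal sketch of the proximity-beacon idea behind exposure notification.
# Illustration only: identifier rotation, cryptography, and the Bluetooth
# transport are all simplified away.
import secrets
import time


class Phone:
    def __init__(self, owner):
        self.owner = owner
        self.my_ids = []   # random identifiers this phone has broadcast
        self.heard = []    # (timestamp, identifier) pairs heard nearby

    def current_id(self):
        # In the real protocol identifiers rotate every few minutes and are
        # derived from daily keys; here we just mint a fresh random token.
        token = secrets.token_hex(8)
        self.my_ids.append(token)
        return token

    def hear(self, token):
        self.heard.append((time.time(), token))


def exposed(phone, published_positive_ids):
    """Check the local encounter log against identifiers published by
    people who later tested positive."""
    return any(token in published_positive_ids for _, token in phone.heard)


# Two phones pass each other and exchange beacons.
alice, bob = Phone("alice"), Phone("bob")
bob.hear(alice.current_id())

# Alice later tests positive and uploads her recent identifiers.
published = set(alice.my_ids)
print("Bob exposed?", exposed(bob, published))   # True

Note that in this design the match happens on the phone itself, which is the main privacy argument made for the beacon approach over centralized GPS-style tracking.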

Check out this video to listen in on experts as they consider the amount of data that is being collected and what it means for your privacy when you download a contact tracing app.

Video Transcripts Follow

Lee Neubecker (LN): Hi this is Lee Neubecker from Enigma Forensics and I have Debbie Reynolds back on the show, thanks for coming back Debbie.

Debbie Reynolds (DR): Thank you for having me, very nice to be here.

LN: So I’m very interested to hear more of what your research is regarding contact tracing apps, and what you think that means for individuals that might put these apps in their phone. Tell me a little bit about what’s happening right now with the industry and how contact tracing apps are working.

DR: Yeah, so Apple and Google created a capability so that phones can communicate with each other via beacon. So they can store information on phones, or have phones bounce off of one another, so that if someone downloads a contact tracing app or registers there, and anyone else also has the app, it will be able to trace back, y’know, how long they spent with certain people and tell them whether they may have been exposed in some way, and tell them either to quarantine, go seek treatment in some way, or get tested. So it’s pretty controversial, the contact tracing app, for a couple of different reasons. One is, people are very concerned about privacy, like giving their potential medical information to a company that’s not a medical provider, meaning that they’re not protecting the data the same way. Also, as you know, Bluetooth technology isn’t exactly super accurate in terms of the distance that you are from someone, so the delta, in terms of how accurate it can be, may be way off. It may be several meters off; the phone can’t tell if you’re six feet apart or whatever. I think that they’ve tried to tune that up with this new API that they created, but still, based on the science, we don’t know whether it’s actually accurate or not.

LN: So you could still have a situation where, if you put one of these apps on and you’re outside biking, and you bike within 8 to 10 feet of someone who later turns out to have it, you’re getting notified that you have to quarantine on a false basis. That’s a potential outcome of using an app like that, correct?

DR: Yeah, but I think that the way they have it now is that it’s supposed to register that you spent more than 15 minutes near that person, so, y’know.

LN: Okay, that’s good to know.

DR: But let’s say you’re parked in your car and someone’s parked next to your car, so you aren’t physically near, y’know, you aren’t in any danger from that person but you wouldn’t know, just because your phone says you’re close to them. They don’t understand the circumstance that you’re in, to be able to tell that, so. I think people are concerned about, a lot about privacy, them taking the data or how the app is actually going to work, and it’s going to work differently in different countries. So what they’ve done is create this API, this capability that’s put on everyone’s phone, and then if you download the app, the app which you use will use that API to actually do this beacon exchange on people’s phones. So, that’s kind of what’s happening right now, is different countries and different places are implementing it in different ways, and some are really pushing back on them because they don’t have really any good guarantees about privacy, or data breach, data breach is a huge issue.
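To illustrate the kind of threshold Debbie describes, here is a rough Python sketch of an exposure check that only flags an encounter when the phones were close enough for long enough. The 15-minute figure comes from the conversation above; the 2-meter cutoff and the data structure are illustrative assumptions, and the distance estimate derived from Bluetooth signal strength is, as noted, imprecise.

# A hedged sketch of a duration/proximity threshold: an encounter only counts
# as an "exposure" if the phones were close enough for long enough. The
# 15-minute and 2-meter values are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Encounter:
    duration_minutes: float
    estimated_distance_m: float   # derived from Bluetooth attenuation, so noisy


def is_reportable_exposure(e: Encounter,
                           min_minutes: float = 15.0,
                           max_distance_m: float = 2.0) -> bool:
    return e.duration_minutes >= min_minutes and e.estimated_distance_m <= max_distance_m


# A cyclist passing someone for a moment would not trigger a notification...
print(is_reportable_exposure(Encounter(duration_minutes=0.5, estimated_distance_m=2.5)))  # False
# ...but two people parked next to each other for 20 minutes would, even if a
# car door actually separated them: the false-positive case discussed above.
print(is_reportable_exposure(Encounter(duration_minutes=20, estimated_distance_m=1.5)))   # True

The thresholds are essentially policy knobs, which is part of why, as Debbie notes, implementations end up differing from country to country.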

LN: Yeah, I mean, our government’s never had data in their custody compromised ever, right? Wink, wink.

DR: Right, that never happened, exactly, so-

LN: You’re having your maps of where you’re walking, your GPS records-

DR: Yeah.

LN: …time of day, your movement, and that is going to Google and Apple, and under certain conditions they’re passing that data on to the CDC or other entities, law enforcement groups.

DR: Well their concern is that data, because it’s at a private company, will get merged with other things, like let’s say your insurance carrier, or your medical, y’know, you get dropped from your insurance because you have this app-

LN: You drive too fast.

DR: No because you have this app, and they think that you may have been exposed, or you’re a higher risk, or a bank doesn’t want to give you a loan or something, because you have this app on your phone. I’ve been hearing a lot of different scenarios people are concerned about. But I’m curious, from your perspective, in terms of how certain things are stored on phones. I know beacons is a really big idea, but maybe you can explain a little bit about how Bluetooth actually works?

LN: Yeah, well Bluetooth is a short-range wireless protocol that allows for peer-to-peer networking. Bluetooth has been exploited in the past to take over devices, so a lot of people don’t like to have their Bluetooth on continuously, because you’re opening your phone up to potential cyber attacks via Bluetooth. When you have Bluetooth on you’re also broadcasting your MAC address identifier, your Bluetooth unique address, and there have already been issues where retailers in London at one time had kiosks outside that would track the shoppers, and they’d know how long shoppers were at certain stores, and they’d use that information to serve custom video ads to people as they’re shopping and walking by.
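As a rough illustration of the retail tracking Lee mentions, the sketch below shows how little is needed once a device broadcasts a stable Bluetooth address: a kiosk just logs timestamped sightings and computes how long each address lingered. The observations and addresses are invented for illustration.

# A minimal sketch of retail-style dwell tracking from Bluetooth sightings.
# (seconds since start, broadcast address) as a scanner near a storefront might log
observations = [
    (0,   "AA:BB:CC:11:22:33"),
    (30,  "AA:BB:CC:11:22:33"),
    (45,  "DE:AD:BE:EF:00:01"),
    (300, "AA:BB:CC:11:22:33"),
]

first_seen, last_seen = {}, {}
for ts, addr in observations:
    first_seen.setdefault(addr, ts)   # remember the first sighting of each address
    last_seen[addr] = ts              # keep overwriting with the latest sighting

for addr in first_seen:
    dwell = last_seen[addr] - first_seen[addr]
    print(f"{addr} lingered ~{dwell} seconds near the kiosk")

Concerns like this are a large part of why modern phones increasingly randomize the addresses they broadcast over Bluetooth.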

DR: Right.

LN: So there’s privacy implications and security implications of having Bluetooth on all the time.

DR: Yeah, and that’s a big concern. So I know when I first heard about them doing this contact tracing, I was wondering how exactly they would get the proximity right, and because we have no visibility into that, we really don’t know, right?

LN: No.

DR: So we just have to sort of trust the black box and see what happens, to some extent. But for me, my opinion is that contact tracing is a profession, it’s not an app. There are people who do this as a profession. Only, let’s see, 55% of people in the world don’t even have smartphones, so you’re talking about a capability that’s only for 45% of people, and not all of those people are going to actually volunteer to get these apps.

LN: Yeah.

DR: So it doesn’t really help the people who do contact tracing, except it adds another layer that they have to work with, because they still have to track people whether they have cell phones or not.

LN: It’s interesting stuff, thanks for bringing that to our viewers’ attention and thanks for being on the show again.

DR: All right, thank you so much, I really appreciate it.

LN: Okay.

Check out these related Blogs

Security Risks When Working From Home

Working from home? Have you been transferring files between work and personal computers? Be aware of the security risks that are out there. Experts talk about how to protect your company’s private data. Where should you start to make sure your remote workforce is secure? Listen to these experts!

Using Your Personal Computer to Work From Home

What are the implications of working from home?

Let’s face it, these are weird times! Never before have we had the bulk of the country’s work force sheltering in place and working from home. We’re going on four months battling the spread of COVID-19. Workers have resigned, been terminated, and been furloughed, and many have sensitive trade secrets loaded on their personal computers. Experts Lee Neubecker and the Data Diva Debbie Reynolds discuss current situations and different audits they have performed for companies to retrieve intellectual property and company data. Check out this blog with transcripts.

Video Transcript Follows

Lee Neubecker(LN): Hi, this is Lee Neubecker from Enigma Forensics. And I have Debbie Reynolds, the Data Diva, back on the show from Reynolds Consulting. Thanks for being on.

Debbie Reynolds(DR): Thank you so much for having me, Lee.

(LN) So what are your thoughts about the shift and changes that have happened over the last couple of months with everyone being stuck at home with their computers?

(DR) I think it’s an interesting issue now, because as you know, even before the pandemic, there were people working at home. But now, since there are so many more people at home, it’s bringing up other security risks, especially with devices. And I’m sure you can explain more from your experience, especially doing forensics with people who are remote, and some of the challenges with those machines. The same people, they’re either working from home, getting furloughed, or losing jobs, so they’re not in the office, but they still have equipment. So I’m curious to see what you think about all that in terms of the devices, the equipment, and some of the risks that come with that.

(LN) We’ve had a number of projects happen during this period where workers either have resigned, they’ve been terminated, or they’ve been furloughed, and there’s a need to get the company data back. And sometimes that data is on their personal computers. Other times the data is on a company-issued laptop, but companies are just starting to get back to work. And there’s a whole host of issues. If you have sensitive trade secrets and confidential electronic data on an employee’s personal or work computer, and you don’t have physical custody of that, there’s a real risk of that data getting disseminated to a new employer, maybe leaked online to the web, or maybe even, you know, someone’s kid at home installs a game that opens up malware that puts those trade secrets at risk.

(DR) You know, we know a lot of people are working from home, and I think the statistics said the majority of people, maybe a slight majority, are using their own computers to, you know, tunnel in via VPN or whatever. But we all know that under a lot of circumstances people still, let’s say they’re printing, or they have a file they want to, you know, leave locally or something. What is your advice from a forensic perspective? ‘Cause we always see a lot of data commingled together, unfortunately, where people’s personal and business stuff may be, you know, together in some way, so what is kind of your advice for people working at home for stuff like that?

(LN) If an employee is being asked to work from home, they should ask for a work-issued computer.

(DR) Right

(LN) Also you should be using a virtual desktop of sorts.

(DR) Right. Yeah, exactly. But I’m sure you’ve seen a lot of situations where you’re asked to do forensic work and there is a lot of personal stuff, even on a company machine.

(LN) Yeah, we’ve had situations where people, despite having work-issued computers, have still connected their personal computer up to corporate resources, Office 365. I’ve seen situations where there are drives that are syncing to former employees’ personal computers, and even though the accounts are severed so they can’t continue to sync, all that data might still reside there. So we’re doing audits right now for clients to look for, you know, what devices are synchronizing with corporate data stores. There really needs to be an accounting and audit to match up those devices, to ensure that only accounts of active employees are syncing and that those devices are company-issued devices, not personal devices, because it poses a real risk. It’s a problem that could be preempted by issuing, you know, work equipment, and not commingling work and home stuff.
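A simplified sketch of that kind of sync audit is shown below: compare the devices seen synchronizing with corporate data stores against the list of active employees and the list of company-issued hardware, and flag everything else for follow-up. The account names, device IDs, and log format are invented for illustration; a real audit would pull this from the organization's actual sync and device-management logs.

# A simplified sync audit: flag sync activity tied to former employees or
# to hardware that is not company-issued. All records below are invented.
sync_log = [
    {"user": "jsmith",  "device_id": "LAPTOP-0042"},   # active employee, company laptop
    {"user": "jsmith",  "device_id": "HOME-PC-7"},     # active employee, personal machine
    {"user": "bformer", "device_id": "LAPTOP-0017"},   # account should have been disabled
]

active_employees = {"jsmith"}
company_devices = {"LAPTOP-0042", "LAPTOP-0017"}

for entry in sync_log:
    problems = []
    if entry["user"] not in active_employees:
        problems.append("account belongs to a former employee")
    if entry["device_id"] not in company_devices:
        problems.append("device is not company-issued")
    if problems:
        print(f'{entry["user"]} / {entry["device_id"]}: ' + "; ".join(problems))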

(DR) Are you seeing problems where people have, let’s say, a phone? For example, let’s say they have an Apple phone and an iCloud account, and the phone belongs to the company, but the iCloud account is their own personal account. Do you have problems getting those passwords?

(LN) Yeah, for the most part we’ve had compliance, and I’ve worked to try to help solve the problem; you know, the employee might have stuff they need. In most cases where we have commingled data, we’re giving the employee or former employee the opportunity to put all their personal stuff onto a drive that we will then search against, and then we’ll completely wipe the original device. They’ll sign a certification of sorts, and then they’ll only copy back the stuff that they copied off that we verified didn’t contain trade secrets, and they’ll pull that back down to the computer. But that relies on some level of trust that if the employee or former employee signs a declaration or affidavit saying that they returned everything, they’re being honest.

(DR) Do you have people that are concerned, especially in the legal field about people doing remote document review, and having sensitive documents viewed on their computers at home?

(LN) Well, I think that’s a legitimate question. And you know, if companies are outsourcing document review, they should be asking the provider questions about, you know, what steps they’re taking to make sure that those endpoint reviewers aren’t using computers that are compromised. In many cases, companies are using independent contractors as their reviewers and they’re not issuing corporate equipment. So that’s a real risk that the whole eDiscovery industry really needs to grapple with, because someone’s going to get burned at some point in time, especially during this pandemic, with, you know, resources taxed and people working from home.

(DR) I have one more burning question for you, actually. And this is about BYOD. What do you think? Because of the pandemic, do you think companies will start to do more or less bring-your-own-device as a result?

(LN) I think we’re going to see a lot of problems come out of BYOD devices, where companies see the problem of losing control of their data. And, at least with the larger companies, I think you’re going to see probably stricter enforcement of using corporate resources. I mean, there were many companies, right before the Illinois shutdown went into effect, that were ordering laptops, running out to retail stores to quickly grab whatever they could so they could issue laptops to their employees. So I think you’re going to see a movement away from BYOD in the future.

(DR) I agree with that. I think it’s been a long time coming. I don’t know if you remember when they were first doing this; at first companies were giving people devices, then they decided, well, we’ll save money, we’ll do BYOD. Now it seems like a pain in the neck to deal with, and it’s all these risk issues. So I really feel that they’re going to start to go back the other way.

(LN) Now, well, there’s a cost associated with BYOD. And now people are furloughed and all your sensitive data is on former employees’ personal computers. So then you’ve got to hire a forensic expert like me to try to work through getting the data back and solving that problem, when it might have been much easier to issue a $500 laptop to the employee than to have them synchronize that, because they’re going to pay more than $500 to try to solve the problem of getting their data back. So after we get through this next bump in the business cycle, where companies are paying out to retrieve their data, I think you’ll see that most CFOs will see it makes sense to issue corporate laptops and to block access from BYOD devices. But thanks for the question. It was a good one.

(DR) Thank you. Fascinating. Thank you for sharing.

(LN) Thanks

Related Articles

Check out our COVID-19 Statistics – Track your county!

Social Media and Cell Phone Forensics

Social media and cell phone forensics can play an important role in thwarting criminal activity. Check out this conversation between Cyber Forensic Expert Lee Neubecker and Data Diva, Debbie Reynolds. You will be so much smarter afterwards!

Snapchat, Twitter, Facebook: Social Media and the Importance of Cell Phone Forensics

Lee Neubecker and Debbie Reynolds, the Data Diva, discuss the role of law enforcement in capturing social media posts when trying to thwart the bad guys coordinating a riot or the more recent looting incidents in Chicago. During this difficult time in our nation, what is the role that cell phone forensics should take? Did you know that Apple phones have the ability to automatically shut down when stolen, and have a beacon that will report the location of the phone, making it easy for law enforcement to come knocking on the thief’s door? Check out this video to learn more about the role of social media and cell phone forensics.

Transcript of Video Follows

Lee Neubecker (LN): Hi, it’s Lee Neubecker, and I have Debbie Reynolds back on the show, Debbie thanks for being on remotely.

Debbie Reynolds (DR): Thank you for having me.

LN: So I asked you to come on so that we could talk a little bit about some of the recent lootings that have happened in Chicago and other areas across the country. And what could be happening, as it relates to cell phone forensics and how law enforcement can be using that to get to the bottom of how these coordinated attacks are being planned and who might be involved.

DR: Most of what I know about this is basically what you told me so, why don’t you just sort of share what your experience has been so far in the current environment, and then we can talk from there?

LN: Sure. Well, right now, I know that some of the looters that were apprehended had cell phones on them. We don’t know exactly how the information is being used by law enforcement, but technically, examples of things that could happen include doing forensics on the cell phone, identifying Snapchat handles they have communicated with, looking at text messages, and looking for Twitter accounts and postings. And potentially, what I saw happening during the last week, at least in one instance, there was a post made to Twitter by a user that made a reference to doing a gig at Urban Outfitters on the West Side, referencing Urban Outfitters, Nikes, liquor, and other things. Around four hours after that post went out, looting went on at that store, so that handle that posted, and anyone else that reacted to that post, could certainly have been alerted to the potential for mass looting in a coordinated way via social media.

DR: Yeah, I think even though the police do have capabilities to do that type of tracking and tracing, they do heat maps of certain things, the problem is that these incidents, if they are coordinated, happen pretty quickly, so it’s sort of hard for them to kind of preempt it. But as you said, they always have capabilities, right? To do anything with cell phones that they capture, but they also have capabilities to do things like geofencing, about who was in the area at a certain time. So a lot of what they’re doing is not necessarily preemptive or pre-crime; it’s more that if something is happening or has happened, they can go back and try to backtrack or trace, or… If there are people on the scene they can apprehend whoever is there that’s doing whatever, and they sort of build it out from there, right?
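The geofencing idea Debbie mentions boils down to a distance check: given a location ping and an incident location, was the device within some radius during the relevant window? Here is a back-of-the-envelope Python sketch using the haversine formula; the coordinates, device names, and 250-meter radius are made-up values for illustration.

# A back-of-the-envelope geofence check: which pings fall within a radius
# of an incident location? Coordinates and radius are invented.
from math import radians, sin, cos, asin, sqrt


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))


incident = {"lat": 41.8781, "lon": -87.6298, "radius_m": 250}
pings = [
    {"device": "phone-A", "lat": 41.8785, "lon": -87.6300},
    {"device": "phone-B", "lat": 41.9000, "lon": -87.7000},
]

for p in pings:
    d = haversine_m(p["lat"], p["lon"], incident["lat"], incident["lon"])
    inside = d <= incident["radius_m"]
    print(f'{p["device"]}: {d:.0f} m away, inside geofence: {inside}')

As the rest of the conversation makes clear, being inside the circle is a lead, not proof of involvement.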

LN: Yeah, but just the other day, someone was captured and apprehended. They got caught because they were posting their raid via social media, and there was a live view of them threatening to bomb the place and looting it, taking cash registers. This was someone who was not from Chicago, I think from downstate somewhere, who came in with a goal to create problems and had a past history of that, but the person had the audacity to post it to Facebook, and the FBI just busted them and they’re indicted now.

DR: I don’t know why people share such things on social media. Because yeah, they do track and trace that. But, a lot of the things especially as I saw, it seemed like a lot of stores that have things like mobile phones have been attacked. And as you know those things are pretty easy to trace back. So I don’t know how far people–

LN: Apple had LoJack-style tracking in all their phones at the retail store, and so people who took those phones, those phones likely got located, but-

DR: Oh yeah, definitely.

LN: I don’t know that that’s happening at the cheap cell phone stores, the burner phones.

DR: Well, yeah, those are… No, I mean, they probably… If anything, obviously may have serial numbers and stuff like that but, once you… Whether it’s broken, or people change sims or whatever, it’s harder to track that stuff down. But yeah, the Apple phones, yes. They wouldn’t have very much problem. I think as I heard, I read that what Apple had done is for all the phones that were stolen from them, they were able to lock those down. And then it had a screen on there so that you actually couldn’t use it. So, that’s what I heard was happening with Apple.

LN: Yeah, well, they also have the ability to beacon out and send GPS location so-

DR: Oh, absolutely.

LN: People who are buying stolen Apple phones might find someone knocking on their door, law enforcement.

DR: Yeah, it’s probably not a good idea to buy one off the street at this point. So yeah.

LN: Yeah. Well, any thoughts or concerns on the privacy issues that might relate to mass surveillance of people, tracking social media posts, and actually getting in and subpoenaing phone numbers that were texted, to help try to prevent looting from happening?

DR: Well, okay. I guess that’s a couple of different things rolled up into one. So, obviously I’m concerned with mass surveillance, especially if it is capturing information inaccurately or targeting people who may not have even been involved. So for example, a cell phone can’t tell, let’s say for instance, you’re standing at a corner and I’m at the stoplight. It says we’re next to each other, but we’re not together. Cell phone tracking can’t really tell that, so for people who aren’t involved, who are innocent, who are, especially in this regard, peacefully protesting, having them be adjacent to other people doesn’t mean that they were involved, so-

LN: Let’s just say, though, for instance, that they found that there was a string of businesses hit: the Foot Locker, then Denny’s Liquor, CVS, and Walgreens.

DR: Yeah.

LN: There were a group of 20 people that all pinged off the four cell phone towers at the same times and were in close proximity to that, and a few other people were ID’d. Would that be enough to justify surveillance on people, where there were four cell phone towers in common across a range that put them all in the vicinity of where looting took place?

DR: I’m not sure if it would justify surveillance, so to speak, but I think that if they have other evidence, it may help them target those people more closely. In terms of sweeping people up in a surveillance exercise, I don’t think that’s going to happen unless they have additional information. So, let’s say they have information just like you said, like, okay, these people were in the vicinity, and then they posted a picture on Facebook with some loot that they got; that would be enough, I think, to justify surveillance. But just the fact of being in the vicinity, that’s probably not enough to go on, I don’t think.
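Lee's hypothetical about the four towers is, at its core, a set-intersection question: which devices appear in the ping records of all four sites during the relevant windows? Here is a minimal sketch with invented tower and device identifiers.

# Find the devices common to all four tower ping sets. All IDs are hypothetical.
tower_pings = {
    "tower-1": {"dev-01", "dev-02", "dev-03", "dev-09"},
    "tower-2": {"dev-01", "dev-02", "dev-04"},
    "tower-3": {"dev-01", "dev-02", "dev-05"},
    "tower-4": {"dev-01", "dev-02", "dev-06"},
}

common = set.intersection(*tower_pings.values())
print("Devices that pinged all four towers:", sorted(common))
# As Debbie points out, this overlap narrows a list of leads; on its own it
# does not establish that any of these devices' owners were involved.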

LN: I appreciate your opinions and thoughts on this. It’s a difficult time right now and hopefully we’ll have stability and we’ll have people held accountable on all fronts, not just the leaders.

DR: Yep, I agree.

LN: Yeah, thanks Debbie.

DR: You’re welcome.

See Similar Posts

Cell Phone Privacy: San Bernardino

Computer Forensic Experts Lee Neubecker and Debbie Reynolds discuss the problem that involves government versus cell phone privacy.

Cell phone privacy played an important role in the San Bernardino attacks. On December 2, 2015, Syed Rizwan Farook and his wife, Tashfeen Malik, opened fire on San Bernardino County workers at a holiday party, killing 14 and injuring 22 others. The FBI wanted Apple to give them access to the perpetrator’s phone.

Apple states, “We built strong security into the iPhone because people carry so much personal information on our phones today, and there are new data breaches every week affecting individuals, companies, and governments.” Apple continued…”We feel strongly that if we were to do what the government has asked of us — to create a backdoor to our products — not only is it unlawful, but it puts the vast majority of good and law-abiding citizens, who rely on iPhone to protect their most personal and important data, at risk.”

Leading computer forensics expert Lee Neubecker, CEO & President of Enigma Forensics discusses the issues relating to cell phone privacy and the government’s desire to have a back door into your smartphone with the Data Diva, Debbie Reynolds of Debbie Reynolds Consulting. These experts have an interesting perspective.

Cell Phone Privacy: Part 2 of 4

The Video Transcript follows.

Lee Neubecker (LN): Hi, I’m back again with Debbie Reynolds. Thanks again for being on the show.

Debbie Reynolds: Thank you, Lee.

LN: So, we’re continuing with this multi-part series talking about cell phone forensics.

DR: Right.

LN: Specifically, in this section we’re going to talk about the December 2015 San Bernardino attacker who unleashed terror, Syed Farook, and at the time when that happened, the FBI went to Apple and claimed that they needed assistance with unlocking the phone.

DR: Right, so I remember this very well. This was maddening to me, because a lot of the news reports, I don’t think any of them correctly stated how cell phones actually work, and they sort of bungled the information about the cell phone. So, a lot of the articles were trying to say that the only way they could unlock the cell phone is with Apple’s help-

LN: That wasn’t true. We knew that wasn’t true.

DR: No, you know that wasn’t true.

LN: You know, I thought when they were doing that, that they might have said that to put out misinformation so that other people who were communicating with the terrorists might have thought that they were safe. I was wondering if they might have done that on purpose so that people would keep their phones so that they could track and follow other people.

DR: I don’t know, my feeling was that you know, the FBI or whoever was making this request was trying to create a precedent to be able to have people like Apple give them, create vulnerabilities in phones so they don’t have to do this one-on-one unlock feature, but why would Apple or any other company who’s in the business to make money create a vulnerability that possibly could be the antithesis of their invention. I wouldn’t use a cell phone if I thought it was unsafe, right, or insecure.

LN: Well, I just assume they’re all insecure.

DR: Well, as secure as it can be

LN: As secure as it can be, but you know, Microsoft, Apple, they issue patches and updates for security flaws every month, so there are still bugs out there that can be exploited. But when that happened, right away I was wondering why they didn’t call Cellebrite, and ultimately Cellebrite, an Israeli firm, is likely the one that actually got the contract to unlock that phone.

DR: Yeah, right, exactly.

LN: But the whole notion of having a common key that law enforcement can quickly unlock any device without any judicial intervention, it’s a little concerning.

DR: It’s very much concerning. It’s like you’re trying to boil the ocean to solve one problem.

LN: Well, then if you have one key, someone in the FBI leaves, and they take that key with them, then they go and they leak it on the Dark Web, and this is the type of thing that’s happened with contractors to various cyber agencies and the government. These keys get out there, or weapons get out there, and everyone’s getting exploited, and it takes the government a long time to report it to Microsoft, to Apple, and everyone’s getting hacked in the meantime.

DR: Well, and there are a lot of other ways to get stuff off of a phone, so I think of a phone as a gateway to other things. You know, even if you do banking on your phone, if you lose your phone, that doesn’t mean that the information’s lost. You can go to the bank, companies can serve affidavits on different entities that have other information. If a person was communicating with someone else, you may be able to crack their phone, so there are a lot of different ways to solve this problem that don’t require creating a back door for a complete product.

LN: Yeah, and you know, to your point about the issue: when then-director James Comey testified saying that they needed help, apparently the FBI’s own remote phone specialization group hadn’t been tasked with trying to get into the phone, so they hadn’t fully explored their own capabilities before they went to ask Apple, because, like you said, they wanted to establish precedent, and they wanted to change how it worked. And I think we’ve consistently seen and heard that the FBI wants full access anytime so that they can protect people, and there are some issues with that, because if it’s simply full access, it’s going to make everyone less secure.

DR: Absolutely, absolutely, so I think all of us, there was quite a bit of eye-rolling when these reports were coming out about them not being able to do the cell phone, and it was like a lower version, too, so it wasn’t like the super– With every cell phone they get more secure, the OS–

LN: You know, it’s like give me the cell phone, DR: Exactly! LN: I’ll get into it. DR: Exactly!

DR: You know, even when they were interviewing people in the press, they weren’t really interviewing the forensic people who do this for a living, so I’m like who are they talking to?

LN: All the computer forensic people I know, we talked about this. The most plausible explanation I could think of, again, was that they were trying to create a false narrative so that they could break up other people who were collaborating, but in fact, the Inspector General’s report from the FBI revealed that they just hadn’t fully done everything. And it sounds like it was two-part: part of it was that they wanted the power and the access, but second was the operational component. You know, there’s a more recent case that we’ll talk about in a later series, and the question becomes, again, have they used their own internal resources fully before they go to Apple?

DR: Or even have they leveraged people like Lee, who do this for a living. It was funny, because when they were, when this case was going on, I had another case at the same time, had the same cell phone, and literally I sent it out and got it cracked like within a day. I couldn’t understand what the issue was, exactly.

LN: Hey, what can I say, I’m good.

DR: Exactly!

LN: Well, tune in for our next segment, where we’ll be talking more about some privacy issues related to having a back door, and some better solutions so that, you know, if Congress and the Senate want to pass legislation, there are some ways we can still allow the FBI to get in that don’t require a common back door key and don’t undermine security.

DR: Exactly.

LN: Thanks for watching. DR: Thank you.

To review the first video in this series please read below.

Click here to view Apple’s comments.

https://www.apple.com/customer-letter/answers/

Cell Phone Forensics

Personal cell phone forensics includes social media, business and personal messages, photos, emails, and GPS.

Leading computer forensics expert Lee Neubecker discusses the complexities of cell phone forensics with Debbie Reynolds from Debbie Reynolds Consulting. Both agree that litigation involving cell phones becomes personal, and gaining possession of the devices proves difficult. Personal and business text messages, social media posts, photos, GPS records, and emails are all woven together and become part of the discovery equation. eDiscovery in today’s era is incomplete without including data from smartphones, including text messages, Skype, WhatsApp, Slack, Signal, and other messaging platforms. Learn more about eDiscovery as it relates to personal cell phone messaging systems by watching Reynolds and Neubecker discuss the topic in today’s blog video interview.

The video interview transcript follows:

Lee Neubecker: Hi, I’m here today again with Debbie Reynolds, and we’re going to talk about something interesting, which every piece of litigation now is getting into. We’re talking about cell phone forensics. What’s been your experience with litigation involving cell phones and discovery?

Debbie Reynolds: Well, whenever there are cell phones involved, the eye-rolling begins, because people take their cell phones very personally. As opposed to someone’s laptop, which maybe they don’t want to give up, they will fight tooth and nail not to give up their cell phones. And obviously, people mix work with pleasure and they’re doing different things they may not want you to see; even if there’s nothing criminal going on, people just feel very tied to their cell phone. The hardest thing is actually getting possession of it and letting them know that you’re not going to look through their juicy texts or their photographs, especially if it’s not an issue in the case.

Lee Neubecker: I know that whenever you need to get into text messages, it becomes a sensitive topic for people. But there are ways to get effective discovery without totally trampling over someone’s privacy. In many issues involving contract disputes or other civil litigation, what’s important is to identify the relevant custodians. Let’s say we have your cell phone and the conversation with mine; we can take that, create a single PDF document showing each conversation thread, and then you could quickly go through it, if it’s your phone, with your attorney and identify relevant, not relevant, and then only take the ones that are between the relevant parties and load those up into the review platform.

Debbie Reynolds: Right. And one very effective thing that people are doing now, and that’s something that you do, Lee, is where someone doesn’t want the other side to see their whole cell phone, so they’ll have a forensic company collect the phone and say, only give them X. That’s actually a very secure way. It gives people peace of mind knowing that they’re not giving everything over, and that the forensic folks can actually do some of this pre-work before people actually start looking at things.

Lee Neubecker: Yeah. And like what I’ve done is, they’re not going to pay me to spend time looking at their photos, nor do I want to look at that stuff.

Debbie Reynolds: No. No one cares. I think that’s what people don’t understand. We’ve been working on cases for over 20 years and I really don’t care what’s on the phone, or what you said, or what videos are on there. It really makes little difference to us.

Lee Neubecker: What I try to do is I try to quickly create almost a summary index of okay, these are the conversation threads. Tell me which phone numbers are relevant, aren’t relevant, who are the relevant parties, and then we can just pull those specific threads out, put them up into the review platform.
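A simplified sketch of that thread-index workflow might look like the following: group the extracted messages by the other party's number, let counsel designate which numbers are relevant, and export only those threads for review. The message records and phone numbers are invented for illustration.

# Group messages into threads by the other party's number and export only
# the threads counsel has marked relevant. All data below is invented.
from collections import defaultdict

messages = [
    {"with": "+13125550100", "ts": "2020-03-02 09:14", "text": "Contract draft attached"},
    {"with": "+13125550100", "ts": "2020-03-02 09:30", "text": "Looks fine, send it"},
    {"with": "+17735550123", "ts": "2020-03-05 18:02", "text": "Dinner Friday?"},
]

threads = defaultdict(list)
for m in messages:
    threads[m["with"]].append(m)

# Counsel reviews the summary index and designates relevant custodial numbers.
relevant_numbers = {"+13125550100"}

for number, msgs in threads.items():
    if number in relevant_numbers:
        print(f"Thread with {number}: {len(msgs)} messages exported for review")
    else:
        print(f"Thread with {number}: withheld as personal / not relevant")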

Debbie Reynolds: Exactly.

Lee Neubecker: Now, sometimes there are issues where photos are specifically relevant, if it’s important that you know the whereabouts of someone on a given date and time. Photos often can establish whether or not someone was really at home sick or out on vacation somewhere. There’s embedded GPS data that is recorded into most photos that are taken with smartphones.

Debbie Reynolds: Unless someone decides to strip it out. I think if you don’t do anything to it, it will collect that data. But there are ways to strip that information out. And also, people can turn off GPS tracking on their phone.
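For readers curious what that embedded location data looks like, here is a hedged Python sketch that reads GPS coordinates out of a photo's EXIF metadata. It assumes the Pillow library is installed and that the image is a JPEG whose location tags have not been stripped; "photo.jpg" is a placeholder path.

# A hedged sketch of pulling GPS coordinates out of JPEG EXIF metadata with
# Pillow. Assumes the photo still carries its location tags.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS


def gps_coordinates(path):
    exif = Image.open(path)._getexif() or {}           # common pattern for JPEG EXIF
    named = {TAGS.get(tag, tag): value for tag, value in exif.items()}
    gps_raw = named.get("GPSInfo")
    if not gps_raw:
        return None   # no location data (stripped, or GPS tagging was off)
    gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_raw.items()}

    def to_degrees(dms, ref):
        # EXIF stores degrees/minutes/seconds; south and west are negative.
        degrees = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -degrees if ref in ("S", "W") else degrees

    return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
            to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))


print(gps_coordinates("photo.jpg"))   # e.g. (41.8789, -87.6359), or None

If the phone's camera had location services turned off, or the metadata was stripped on export, the function simply returns None, which is exactly the point Debbie makes above.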

Lee Neubecker: Yeah. Well, thanks for being on the show again today.

Debbie Reynolds: Well, thank you for having me.

When to Select A Computer Forensic Expert

Selecting A Forensic Expert

Data Diva Debbie Reynolds and Enigma Forensics’ CEO Lee Neubecker discuss what to look for in selecting a computer forensics expert to assist with preservation, litigation and eDiscovery.

The transcript of the video follows

Lee Neubecker: Debbie, thanks for being on the show again today. I’m here with Debbie Reynolds, she is Eimer Stahl’s data protection officer and she also is the director of their eDiscovery subsidiary. Thank you for coming in and being on the show.

Debbie Reynolds: Thank you, it’s always a pleasure, Lee.

Lee Neubecker: So, today we’re going to talk a little bit about the differences between eDiscovery and computer forensics and when it’s necessary to bring in an expert to actually be the testifying expert or to handle more sensitive issues, and what you look for when you’re pulling in a computer forensic expert to assist one of your projects?

Debbie Reynolds: Well, it’s never a bad idea to bring in a forensic person, so I try to get someone who’s a professional in forensics on every case that we have. It just depends. Some big corporations actually have people, ’cause they do so much litigation, they have people who are captive to their organization that do it. More times than not, they either farm out that work to a company like Lee’s company, or they come to me and ask me for recommendations. It just depends on where they are, what their ability is, who’s available. For me, it’s really important that I work with people that I trust, smart people like Lee, who know what they’re doing. I tell people, I don’t chase company names, I chase the talent. I’ve had situations where I’ve had an investigator or forensic person go from one company to the next, and as a stipulation of us working with them, that case went with them, ’cause they had the knowledge. So for me, the thing that I look for is, again, people that I know and trust, people that I know are smart and know what they’re doing, people who can really present themselves, ’cause a lot of times you’re going into a situation where you’ve not met these people, you’re going in there touching their data, people are very sensitive about it, IT people can be very territorial. So having someone who can really put people at ease and be very professional in a situation where it’s semi-hostile, where you know that the IT guy takes pride in what he’s doing and thinks he’s the expert, you have to kind of disarm that person.

Lee Neubecker: How often are IT people hostile?

Debbie Reynolds: Oh, 1000% of the time. They’re always hostile in some way; some are more passive-aggressive than others. But you know, this is their baby, you have to work with them to get access to the data, and a lot of times they feel like, well, why can’t I do this?

Lee Neubecker: And part of the problem, when I’ve worked with the IT people, usually they’re defensive because they have extra work to do.

Debbie Reynolds: Oh, absolutely.

Lee Neubecker: And they’re involved in litigation, so what I try to do is I try to sit down with them and say, “Hey look, this is my role. I need to understand enough of your stuff so that you don’t have to talk to the attorneys, and then I can buffer you from that so that you can do your daily work,” and when they hear that, it helps them understand, okay, you’re here to save me from a deposition.

Debbie Reynolds: Oh, absolutely.

Lee Neubecker: Then they’re more relieved, more willing to work with you.

Debbie Reynolds: Absolutely. I think the challenge is that when you start a litigation, companies, in order to try to save money, that’s where they want to save money. They don’t want to spend money on a forensic person. But if I compare cases against one another, two cases that are very similar, one where they had a forensic person and one where they didn’t, the one that has a forensic person, down the line, their case goes more smoothly, ’cause we don’t have a lot of questions about who did what, what is where; we don’t have a question about who needs to sign affidavits, who needs to go to court, all that stuff. So all that headache down the line is eliminated when we bring in someone. And I’ve had people on our cases who decided that they didn’t want to bring in someone tell me it was a bad decision, that we should have really brought in someone.

Lee Neubecker: In my opinion, I think it’s important to know the person responsible for that data. If they’ve never testified in court before, that’s a potential problem, and a lot of times people don’t ask those questions. Other things like: do they have some type of certification that shows they mastered the field of computer forensics? And did they have to take an exam that was proctored by some independent party to assess that, so that you know your person truly has the knowledge and didn’t just attend a class and get a certificate? Because that’s a little bit of a difference. There are many people I’ve encountered that haven’t had the formal certifications and are very bright, but when you’re putting people up, they’ve got to survive a challenge against their admissibility as an expert, and if they don’t have cases of record, if none of the judges know who the person is, those things are definitely problems.

Oftentimes, I’ve seen new experts get up and make basic beginner mistakes, where they let the attorney override what their report is, they let the attorney write the affidavit for them, and then it gets stretched too far, and then there might have been many good things that they had to say, but all of it goes out the window because they didn’t know how to manage the hard-nosed, driven litigator who wants that report to be aggressive. So you have to listen to and understand those driven litigators, but you also have to protect them from killing the case, and they assume that whatever expert you put there has those skills, and a lot of them don’t know when they’re getting into trouble, and they need to be able to stand up for themselves, and do it professionally and objectively.

Debbie Reynolds: Absolutely, absolutely. A lot of times, they don’t know what they don’t know. We had a person that actually went out and got a cell phone for a case, and we were like, we don’t want anyone to touch it, we want the forensic people to look at it, or whatever, he thought, oh well you know, I’m smart, I know how to do this stuff. Not that he wasn’t smart, but this was not his area of expertise, and he turned this phone on, and basically, the person who had the data on the phone, had sent a command to the phone to be erased, so when they turned it on, it wiped out all the stuff.

Lee Neubecker: So they didn’t put it in a Faraday bag?

Debbie Reynolds: No, they didn’t put it in a Faraday bag, they didn’t put it in airplane mode, they went to Walgreens, got cords, stuck the cord in the thing and turned it on, and that was it.

Lee Neubecker: So then that becomes some spoliation claim against–

Debbie Reynolds: It was spoliation, yeah. Everyone thinks, oh I have a cell phone, so I can do this, and it’s like no. I think people need to understand that what you guys do is very different than what we do in eDiscovery and what a normal person who’s doing IT can do, ’cause you have a different aim in my mind, and you understand spoliation of evidence, and how to get data in the right formats, where another person would not know that ’cause that’s not their background, that’s not their training and that’s not the purpose of what they’re handling data for.

Lee Neubecker: Well I really thank you for being on the show, again, to talk about this, it’s great. I look forward to seeing you again soon.

Debbie Reynolds: Fantastic, thank you!

Lee Neubecker: Thank you.

Do You Suspect Your Company Has Been Hacked?

Electronic Discovery Wins Litigation

Cell Phone Forensics for Use in Litigation

Computer “bots” Used by Insurance Companies


Are Computer “Bots” Making Your Healthcare Decisions?

Enigma Forensics CEO Lee Neubecker and David Bryant from Bryant Legal Group discuss computer “bots” used by insurance companies as a way to underwrite policies and make insurance claims decisions. Bots are now determining how a given claim should be scored. See how eDiscovery plays a role in getting success for your client.

The transcript of the video follows

Lee Neubecker: I’m here today with David Bryant from the Bryant Legal Group and we’re going to talk a little bit about health insurance claims in his work, helping people get the coverage they deserve.

David Bryant: Nice to be here, Lee, thanks for taking the time to stop by. We’re seeing a very significant shift in the insurance industry with respect to claims adjudication and claims determinations. One way of looking at how this change is happening is to look at the dollar volume that’s being invested into underwriting insurance policies and making claims decisions. The first metric I’d like to share with you is that there is a company out of Europe that did some research on money flowing into what’s now called insurance tech, and approximately two billion dollars went into the insurance tech arena in 2016. This money is being deployed into not only underwriting, but how claims are made, and I think everyone out there is familiar with Watson and the new term artificial intelligence. How that’s playing out in the insurance industry is that a lot of claims decision-making is being taken out of the hands of individuals and being given to what we’ll call “bots”, robots, or what’s termed a “bot” in tech speak. So these algorithms, which are designed by very bright people, such as yourself, determine how a given claim should be scored. And if there’s a certain score, then a claims individual will be required to deny that claim. This is problematic for some of the insurance companies, because if it’s discovered through the discovery process, it can wind up hurting them in litigation for bad-faith denial of a claim.

Lee Neubecker: So, David, can you tell me a little bit about what you do at the onset of one of your case matters to help make sure that you could argue your case in court?

David Bryant: So there are really two phases to insurance claims. There’s the appeal process and then there is court. If your claim is denied, I can always sue an insurance company in court. Typically that’s in Federal Court. I primarily practice in Federal Court, but I do State Court as well. So once I wind up in a court setting, I will send a litigation hold letter to the general counsel of the insurance company, and that letter ensures that all of the data in its electronic format is preserved. So if I want the emails on a particular claims individual’s hard drive, that information should be present when I request it by way of that litigation hold letter. When I do discovery in Federal Court, we’re looking for electronically stored information. I’m not looking for paper any longer, because we’re looking to get the metadata that’s embedded in that electronic information so we can find out who looked at it, when it was looked at, when it was altered. So, Enigma Forensics having the skill set to be able to determine who touches electronic files, who views electronic files, we will bring in your firm in those circumstances when we want that type of information in litigation.

Lee Neubecker: So can you give me an example of when you’ve had to rely upon our computer forensic services to help you out with a matter, and how that played a role in getting success for your client?

David Bryant: So we handle primarily health insurance and disability insurance claims on behalf of individuals and physician groups. One of the matters that you handled for us dealt with a disability insurance claim, and we were looking for certain key words and key word phrases that were on the server or hard drives of particular individuals at the insurance company. Being able to cull through all this data is a Herculean task and would be extremely expensive for the defendants. So the defendants will typically go to the Court and say, “Judge, this is going to cost us way too much money and interrupt our normal course of business. We don’t want Mr. Bryant to have access to this information, or to put us through the trouble and cost of doing it.” I brought in your firm and your services, and you were able to explain to the judge that you could do a search of all of the information held by the insurance company, find these key words, and submit them to the Court in camera, so there were no privacy concerns, and report to the judge what your findings were. The case settled soon thereafter.
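A greatly simplified sketch of that kind of keyword sweep is shown below: walk an exported document collection, flag files containing any of the agreed key words, and capture basic metadata for a report to the court. The key words and export path are placeholders; a real review platform would also parse email containers, office documents, and other binary formats rather than treating everything as plain text.

# A simplified keyword sweep over an exported document collection, recording
# path and modified time for each hit. Terms and path are placeholders.
import os
from datetime import datetime

keywords = ["disability", "claim denial", "algorithm score"]   # illustrative terms
export_root = "./claim_file_export"                            # placeholder path

hits = []
for dirpath, _, filenames in os.walk(export_root):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            with open(path, "r", encoding="utf-8", errors="ignore") as fh:
                text = fh.read().lower()
        except OSError:
            continue
        matched = [kw for kw in keywords if kw.lower() in text]
        if matched:
            modified = datetime.fromtimestamp(os.path.getmtime(path)).isoformat()
            hits.append({"file": path, "keywords": matched, "modified": modified})

for h in hits:
    print(f'{h["file"]} (modified {h["modified"]}): {", ".join(h["keywords"])}')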

Lee Neubecker: They usually do. Well thank you for being on the show today. If you need to reach David, his info is on the screen. Thank you.

Computer Fraud & Abuse Act Charges Filed

Capital One Data Breach

Capital One Data Breach – Interview of Data Privacy & eDiscovery expert on the fallout

Cyber Security & Computer Forensics Expert Lee Neubecker interviews Data Privacy Expert Debbie Reynolds on the fallout from the recently disclosed Capital One data breach that occurred following alleged hacking of the company’s data stored in the cloud. Issues discussed include an assessment of how the CEO of Capital One managed the crisis, pending charges filed against Paige Thompson, and the Computer Fraud and Abuse Act cited in the government’s complaint filed earlier this week.

Transcript of video follows

Lee Neubecker: Hi, I’m here today with Debbie Reynolds from Debbie Reynolds Consulting, and we’re going to be talking today about the recent news involving the Capital One data breach. Thank you for being on the show, Debbie.

Debbie Reynolds: Thank you for inviting me. It’s such a thrill; you’re such a joy to be around and talk to, so it’s great to do this.

Lee Neubecker: Well it’s great to have you here. So, trial’s expected this Thursday in the case. Can you tell everyone a little bit about what happened this week?

Debbie Reynolds: So this week it’s in the news that Capital One had a data breach. There was a woman who used to work at Amazon, if I’m not mistaken, who had found a vulnerability in Capital One’s cloud system and was able to obtain private or digital information on over a hundred million customers or potential customers of Capital One. As far as I can tell, they say that she may have gathered social security numbers and other private information about individuals who may not even be customers of Capital One but who had applied for a Capital One credit card as far back as 2005.

Lee Neubecker: Yep.

Debbie Reynolds: So the vulnerability that was discovered and part of the reason why it was discovered was because she had apparently bragged about it on Twitter and she used her real name and so they were able to pull this stuff together. And I think the SWAT team went to her house?

Lee Neubecker: Yeah, so she was using iPredator, which is supposed to anonymize and protect you. When she was using that, she created her online GitHub accounts and other accounts, and it had the iPredator IP address range in her profile linked to her name. So she wasn’t really being smart about it.

Debbie Reynolds: No. So yeah, I think that she was bragging about what she had; I guess she was proud of what she had done, and apparently someone who had seen something she had posted on some forum contacted Capital One. This wasn’t a breach that Capital One found out about on its own; someone from the outside said, “Hey, this girl says that she has your data,” and now it’s a really big thing.

Lee Neubecker: Yeah, so now she’s charged under the Computer Fraud and Abuse Act, which I think she’ll probably end up …

Debbie Reynolds: Yeah.

Lee Neubecker: Do you think she’ll get a plea?

Debbie Reynolds: She’s probably going to go to the slammer. It seems like especially when the SWAT team showed up at her house, they’re definitely going to make an example out of her with this. It’s pretty bad because I think right now the reports and what’s coming out from Capital One are different than what she said or what other people said they have. Because at one point they were saying that Capital One in their statement said that certain people’s social security numbers weren’t breached but then we know that they did get people’s social security numbers.

Lee Neubecker: It was mostly Canadian social security numbers, around a million–

Debbie Reynolds: Right.

Lee Neubecker: And then I think it was somewhere around 100,000 or so U.S. citizens.

Debbie Reynolds: Right, exactly.

Lee Neubecker: So it doesn’t necessarily impact the entirety of U.S. customers, but it still is–

Debbie Reynolds: It doesn’t, it doesn’t make you feel good. Yeah, so basically over a hundred million people were touched in some way, shape, or form. Even though not everyone’s personal data was taken to the same extent as everyone else’s, I think this incident illustrates a couple of different things. First of all, they were saying that they had credit card information, or information on people who had applied for credit cards, going back as far as 2005. I’m not sure if they can make a justification for why they even had some of that stuff in the first place. And I wonder what rights someone would have if they never actually became a customer of Capital One. The law’s kind of murky about how they should handle that. I guess that’s the same issue with Equifax, where not everyone who was touched by Equifax is a customer of Equifax; they just happened to have their data.

Lee Neubecker: How would you have advised Capital One had you gotten in there before the data breach?

Lee Neubecker: You think you might have been able to–

Debbie Reynolds: Well, you know–

Lee Neubecker: Get them in a better situation?

Debbie Reynolds: I think a lot of corporations, my view is that a lot of corporations, or businesses, have this mindset of: does it work? Does the computer work? Can I do the thing I need to do on a computer? The question that they’re not asking is: is it secure? So a lot of them have a blind spot in terms of securing things, because as long as it doesn’t impact their ability to work, they don’t really care how it works. So now companies have to ask: how does it work? Is it secure? A lot of companies have these issues where they’re moving from internal infrastructure to the cloud, and we know that cloud infrastructure would typically be more secure, quote unquote, than someone’s on-premises infrastructure, but that all depends on how it was configured. The vulnerability that this woman was able to exploit at Capital One had to do with how the permissions and things were configured on a cloud infrastructure.

Lee Neubecker: And she had worked in that environment.

Debbie Reynolds: Right.

Lee Neubecker: So she had a little bit of extra insight–

Debbie Reynolds: Exactly.

Lee Neubecker: In this process.

Debbie Reynolds: Exactly. But I don’t know if you run into the same thing, where you have clients that have cloud issues and they may feel more secure in themselves: okay, we think our native environment is safer than the cloud. Not to say that the cloud is not safe, but if you have someone who doesn’t know how to fill those gaps and stop those vulnerabilities, it could be a huge problem.
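As one concrete (and deliberately narrow) example of asking "is it secure?" rather than just "does it work?", the sketch below checks an AWS account for S3 buckets left open to the public. It assumes the boto3 library and read-only AWS credentials; note that the Capital One incident reportedly involved a misconfigured web application firewall and over-broad IAM permissions, which a bucket check like this would not catch.

# A hedged sketch of a basic cloud-configuration check: flag S3 buckets with
# no public access block or with ACL grants to everyone. Assumes boto3 and
# AWS credentials with read-only permissions.
import boto3
from botocore.exceptions import ClientError

ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"
s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    findings = []

    try:
        block = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        if not all(block.values()):
            findings.append("public access block is not fully enabled")
    except ClientError:
        findings.append("no public access block configured")

    acl = s3.get_bucket_acl(Bucket=name)
    if any(g["Grantee"].get("URI") == ALL_USERS for g in acl["Grants"]):
        findings.append("ACL grants access to AllUsers")

    if findings:
        print(f"{name}: " + "; ".join(findings))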

Lee Neubecker: What do you think of the CEO’s response from Capital One?

Debbie Reynolds: I saw the CEO’s response. I don’t know, someone needs to do a series about this where you compare all the response letters from these data breaches.

Lee Neubecker: That’s a great idea.

Debbie Reynolds: Not a bad response at all. I think the danger, though, is there may be an issue with consumer confidence, obviously, because no one wants their data breached, but if it becomes evident that the things being said by the CEO or other leadership are different than what actually happened, that’s going to be a problem.

Lee Neubecker: Yeah, cool.

Debbie Reynolds: I think rushing, the desire is to rush. To put out as much information as you possibly can but already the news reports are contradicting what the company is saying about what was actually breached.

Lee Neubecker: Well the complaint is available, I’ll post that on my website as well. I read the complaint and there’s a lot of detail in there and you’re right, in the news story they’re talking about Amazon cloud, they talk about a company that presumably is a subsidiary of Amazon inside the complaint.

Debbie Reynolds: Right.

Lee Neubecker: But they didn’t specifically mention Amazon in the complaint.

Debbie Reynolds: No, no. So for customers, when they feel like they’ve had a data breach, they definitely want… you know, there’s a tension that has to happen, where the company wants to be as forthright and forthcoming as possible about what’s happened, but the facts may still be rolling out.

Lee Neubecker: Yeah.

Debbie Reynolds: The drip, drip, drip of it all may be tough I think.

Lee Neubecker: But I thought at least it was good that they publicly acknowledged it. It didn’t take forever to acknowledge it.

Debbie Reynolds: Oh, right exactly.

Lee Neubecker: And apologize, I mean–

Debbie Reynolds: Oh, absolutely. It does go a long way–

Lee Neubecker: They just did that so I applaud them for not–

Debbie Reynolds: Absolutely.

Lee Neubecker: Sitting on it like Equifax.

Debbie Reynolds: Right. They didn’t say, “Well I’m sorry that you were hurt or you felt hurt,” or something where it’s like oh yeah, you know there is harm there so you might as well acknowledge it and try to at least be forthright about what you know and we know it.

Lee Neubecker: And from what I read too, not all of the data… some of the data was tokenized, but there were birth dates, there were some socials.

Debbie Reynolds: Right.

Lee Neubecker: And some other information that certainly if that were you or me, well we’re kind of becoming used to this all the time. It’s sad, but.

Debbie Reynolds: Right. Well, I mean, what I’m seeing companies trying to argue in the U.S. having to do with data privacy is: let’s say you’re on Facebook and you say, “Hey, today’s my birthday!” So if Lee puts his birthday on Facebook, is Lee’s birthday private? So let’s say you’re a Capital One customer; they could argue, you know, your birthday is not private because you put it on Facebook. That’s going to be an interesting theme.

Lee Neubecker: Well thanks so much for being on the show today.

Debbie Reynolds: It was fantastic, thank you.

Debbie Reynolds Contact Info

datadiva at debbiereynoldsconsulting dot com
312-513-3665
https://www.linkedin.com/in/debbieareynolds/
https://debbiereynoldsconsulting.com/