Interview with Julie McEwen on why privacy is key
Interviewer: Cameron Boozarjomehri
Welcome to the latest installment of the Knowledge-Driven Podcast. In this series, Software Systems Engineer Cameron Boozarjomehri interviews technical leaders at MITRE who have made knowledge sharing and collaboration an integral part of their practice.
Privacy engineering involves injecting legal, policy, and ethical requirements into technology. It takes perspective to effectively manage privacy risk while keeping the big picture in focus. Fortunately, Julie McEwen, MITRE’s Privacy Engineering Capability Area Lead, is on the case. She and the privacy engineering team provide policy and technical privacy support to MITRE’s sponsors and to MITRE’s Chief Privacy Officer. Today, we learn what being a privacy engineer really means, and how she guides sponsors through the technical and legislative minefield that is privacy risk management.
Cameron: | 00:15 | Hello everyone, and welcome to MITRE’s Knowledge-Driven Podcast, a show where I, your host, Cameron Boozarjomehri, get the good fortune of interviewing brilliant minds across MITRE. Today I’m joined by the very talented head of our privacy capability, Julie McEwen. Julie, would you like to introduce yourself? |
Julie: | 00:31 | Sure. I am the privacy engineering capability area lead in the Cyber Solutions Tech Center at MITRE, and I’m also a principal cybersecurity and privacy engineer. I’ve been working at MITRE for 20 years. |
Cameron: | 00:43 | I understand you have some very fascinating things you’d like to share with us about the privacy capability, starting with what I think is the most important question: what is it? |
Julie: | 00:50 | So, the privacy team is a pretty diverse team, and it’s definitely a team effort. We have people with backgrounds in law and policy, others with backgrounds in social science like I do, and others who are systems engineers, and we all work together to help protect privacy. What we do is go in and help our government sponsors, as well as MITRE’s own internal systems, make sure that privacy considerations are addressed for the use of information about individuals. That means making sure the law and policy requirements for protecting that information are followed, but also looking at the ethical and social considerations for how information about people is used. |
Cameron: | 01:39 | And to build on that team effort point, just to point out to everyone: a capability at MITRE is essentially a team effort. It’s when enough people with enough backgrounds come together to formalize their knowledge into a specific group so that when other people, say sponsors or other agencies, come to MITRE and say, “We need your help with this privacy problem,” they know they can draw on a plethora of knowledge from all sorts of people with different backgrounds to answer the questions from a technology standpoint, a legal standpoint, a government policy standpoint, all those pieces playing together. Is that about right? |
Julie: | 02:16 | Yes, it is. I have a team that works with me in particular to lead the capability. I have to give a shout-out to Stuart Shapiro, Cathy Petrozzino, and Julie Snyder, who all help me lead the capability, as well as Michael Eisenberg, who’s one of our privacy policy and legal experts, and we all come together and look at what’s going on in the privacy space. |
Julie: | 02:40 | We have a number of tools that we’ve developed within MITRE to help our privacy SMEs do their work. And then we also monitor what types of tools are being used outside MITRE and work with academia to look at the space and figure out what would be appropriate for us to use in our work. |
Cameron: | 02:57 | And for those of you who don’t know, an SME is a subject matter expert, and MITRE is just full of them. But it seems we have a specific advantage in the privacy space because, as I understand it, our privacy capability has been around for quite a while. |
Julie: | 03:10 | Yeah. The capability is actually going to have its 20th anniversary in January, so we’re pretty excited about that and about what we’ve done over the years. In the beginning we were going in and helping a number of [inaudible 00:03:22] agencies actually develop their privacy programs. They didn’t have any, and we came in and helped them start them. And then over the years, as agencies all established privacy programs, we have been going in and helping them enhance their programs, to make them better and in some cases mature them. |
Julie: | 03:39 | And we have a privacy maturity model that we use to help agencies look at each of the different aspects of their programs and determine whether they should enhance that part of the program to make it more formal. We typically like to see agencies have a formal privacy program that’s documented and followed, so it’s consistent every time. |
Julie: | 04:03 | And then, in order to mature it beyond that, they would want it to be better integrated with other parts of their agency. Another type of work that we’ve been doing more frequently as the years have gone by is risk assessments of different technologies. So we have different risk models we’ve developed to go in and look at technologies, how they’re being implemented, and how to address privacy risks better. |
Julie: | 04:30 | So, a couple of examples there. We’ve done work with connected vehicles, one of the technologies that we know is going to impact pretty much all of us later on down the line. It’s coming, and we’ve looked at some of the privacy implications of the technology used to connect vehicles. All kinds of information is collected in your car as you’re driving in a connected vehicle environment, so how is that information protected, how is it shared, things like that. So that’s one area. |
Julie: | 05:01 | Another area that we’ve been involved with is privacy for the use of drones, the unmanned aircraft systems, another hot technology whose use we see expanding. And, of course, it’s another technology where there are sensors in those devices, in the drones, that can collect information about individuals, whether it’s audio or video. And then how is that information used, and how do people know that they’re being recorded by a drone, for example? Things like that. |
Julie: | 05:30 | And then we’ve also looked at other technologies that are pretty widely used right now, like social media and voice assistants. So that’s really an up-and-coming part of our work. It’s really expanding right now, that whole technology assessment and privacy assessment area. |
Cameron: | 05:48 | That’s actually something I’d like to learn a little more about. It seems that MITRE, and this capability specifically, has developed a lot of tools as a product of its interactions with these different agencies and its investment in these new technologies. I was curious if you could speak to how you would approach a new technology when there’s really no precedent or paradigm to map it to. |
Julie: | 06:12 | Yeah. Of course, we’re not typically looking at a technology in isolation; we’re usually called in to look at the implementation of a technology in a particular environment. So we will go in and look at that environment, and we’ll look at what laws and what policy might apply there. So for example, if it’s a healthcare environment, there are specific laws, policy, and guidance that apply to privacy for the use of a technology in a healthcare environment. |
Julie: | 06:41 | We’ll go and look at that first. We’ll develop a risk model, or use one of our existing risk models, to look at what the privacy implications are of the different uses of that technology. How does it impact the use of personally identifiable information? For example, if you’re using a medical device or a voice assistant, how does that impact privacy? |
Julie: | 07:11 | So we will look at all of that and then go in and do the assessment and give our recommendations as to what types of actions can be taken to mitigate some of the privacy risk. Can you maybe use the technology differently? Are there certain environments where you don’t want to use some of these technologies? And if you do use them, do you have limits on how the information they collect will be used, for example? So there are a lot of different ways that we do that kind of work. |
Cameron: | 07:41 | And so as you continue to develop these technologies and take on these projects, where can people go to learn more about MITRE’s work, either internally at the company or externally, if they’re curious about getting MITRE involved? |
Julie: | 07:52 | If you go to our external site, mitre.org/privacy, there’s all kinds of information there about the types of work we do. There are sample papers and presentations to give you an idea of what our impact has been and how we do our work. And internally at MITRE, if you go to the FastJump for privacy engineering, you’ll find all kinds of information about our capability and some of the meetings that we have internally. |
Cameron: | 08:18 | Yeah, not to totally out myself, but I am an aspiring privacy engineer here at MITRE. I have been a member of the privacy capability since I started, and I have had the good fortune of learning from you, Julie Snyder, and currently Cathy Petrozzino. So I was hoping that maybe you could tell our listeners a little more about what it really takes to become a privacy engineer here at MITRE, and what it was like for you as someone who started in the space all those years ago, back when I imagine most people were treating privacy as an afterthought. |
Julie: | 08:49 | Yeah, it’s changing. Privacy is becoming more integrated, and that’s really the goal of privacy by design: that we make sure privacy is addressed at the beginning, when you develop and design systems. When I started in the field, I actually started working in cybersecurity. That’s how I started my career, and I was in the public sector, so I was working in the federal government. |
Julie: | 09:13 | And I was actually working with systems where there was a lot of HR data that I needed to handle, and I started looking a lot more at the privacy aspects of that. I’ve always been interested in privacy, so working in cybersecurity, it was an easy move to get more into the privacy area, because the two are very related disciplines. They’re distinct, but they do rely on each other, and in order to be effective, they really need to work together. You could have a system that’s got awesome cybersecurity on it, but at the same time, it may not have good privacy protection. |
Julie: | 09:48 | For privacy, we want to pay attention to things like notice: making sure individuals know about the collection of information about them and how it’s being used, as well as giving them access to that information and the ability to update it if it’s incorrect, and also the opportunity to consent to that collection and use. And we try to limit what’s collected and used. That’s a really important part of privacy, just making sure the minimum is collected for the particular use. |
Julie: | 10:19 | So I kept working in that area and learning more about it, and I became certified; I’ve got several privacy certifications. Getting certified is a great way to become a privacy SME, but at MITRE, in addition, we also want people to be well trained and very knowledgeable about the operational environments of our sponsors and of MITRE. |
Julie: | 10:47 | So when people go on their first project, we have them work closely with a privacy SME and get the one-on-one mentoring and training they need. That’s how we develop our privacy SMEs here, and over the years, that’s what I’ve been through as well. |
Cameron: | 11:03 | And I can safely say the experience has been nothing short of eye-opening. There’s so much going on. Earlier, when you mentioned we have people from all backgrounds, that it’s a team effort, I cannot stress this enough: it is truly a team effort. We have legal experts who help us understand and navigate not just existing laws, but also the new policies and guidance the government comes up with for how it wants to treat privacy in these domains. |
Cameron: | 11:31 | They have to be able to plan for and anticipate those laws. And then also, as these technologies advance, we’re not just looking at the typical internet-connected devices; we’re looking at how pieces of software enable those different connected devices to talk in new ways or discern information that people really couldn’t even have imagined a few years ago, or even last year. |
Julie: | 11:53 | Yeah. The other thing to be aware of is just that it’s growing; we continue to grow the capability. I think that awareness of privacy issues is really important, even if you’re not a privacy SME. I’ve taught classes in that area. I’ve actually trained privacy professionals as part of the IAPP’s privacy program, but even within MITRE, at the MITRE Institute, we’ve had privacy classes in the past and we will have those in the future. |
Julie: | 12:22 | So I’m really passionate about trying to get the word out. I’ve even done presentations in my neighborhood about privacy, and I get questions no matter where I go. Years ago, when my son was a kid, I’d be at the bus stop and the parents would come over and ask me privacy questions. |
Julie: | 12:40 | So it’s the kind of thing people are always interested in, and it’s so important to them. I’m really trying to make those connections with everyone and help them protect their own information and be knowledgeable about what they can do, as well as helping our sponsors and MITRE do privacy protection appropriately. |
Cameron: | 13:04 | And I think this leads me to the last major thing I want to discuss with you, which is that privacy and security are often treated as, in the best case, opposite sides of the same coin. In the worst case, privacy is considered some type of afterthought to security: you went ahead and made your system super secure with all this encryption and all these controls, and now it’s time for privacy. But I want to take a moment to dispel those myths and talk about how privacy is its own distinct thing. |
Julie: | 13:31 | Yeah. Privacy and security are mutually supportive. Privacy can help in that, when you look at privacy, you’re identifying the particular data elements that are personally identifiable information, and that input can go over to the cybersecurity side. Then on the cybersecurity side, you’re looking at what mechanisms can be used to protect that information, for example, encryption. |
Julie: | 13:58 | So the privacy side says: here’s the PII, the personally identifiable information, that you need to protect; the cybersecurity side then puts some mechanisms on it. So they help each other out; they’re mutually supportive. But at the same time, there are some parts of privacy that are unique to privacy, that you won’t find on the cybersecurity side: things like the notice, the choice, the access to information, the things I mentioned earlier, and the limitation of its collection and use. |
Julie: | 14:27 | Those are all unique to privacy. There’s more of a focus on data on the privacy side and more of a focus on systems on the cybersecurity side. So just having adequate cybersecurity isn’t enough; you do need to have the privacy protection as well. We’re also always concerned about the use of the data, and more and more we see cases where data in a database has been de-identified, meaning certain data elements that might identify people have been removed. |
Julie: | 14:57 | But when they go and combine that database with another database, they could potentially re-identify people. So we’re looking at this whole idea of big data, lots of databases being used together, and how you protect privacy in that space and work closely with cybersecurity to make sure that’s done properly. |
Cameron: | 15:18 | And I think that’s an excellent point, especially considering that a lot of people, when they think about their own privacy at an individual level, are thinking more in the context of “I don’t want a data breach to result in my data being released.” They might not appreciate that they themselves are giving out their data in a way that, even if it isn’t identifiable at that moment, as you just pointed out, can be combined with all sorts of other data that’s already out there and made re-identifiable. |
Julie: | 15:44 | Right. Absolutely. |
Cameron: | 15:45 | Thank you so much for taking the time to come and talk with us today. I know we had some minor technical issues trying to get this conversation going before, but thank you so much for your patience. Is there anything you’d like to share with us about things that you’re excited about, or that we should keep an eye out for, or just other ways that people can get involved with MITRE’s privacy capability? |
Julie: | 16:05 | Yeah, we’re pretty excited about the tools being put on our cyber platform within MITRE for our privacy SMEs to use. There’ll be a lot more information about that, and we’ll be sharing those with our sponsors as well. We’re also very excited about some of the upcoming activities that we have. We’re going to have a privacy engineering capability meeting internally in September to talk about face recognition and privacy, and with our 20th anniversary coming up in January, we’re starting to plan some activities for that that might be of interest to people within MITRE. |
Julie: | 16:41 | At the MITRE Institute, we have taught training classes regarding privacy in the past, and those are often open to our government sponsors as well. We are also planning our privacy series for next year. So in addition to just being able to do more privacy work and expand our knowledge, we’ve got a really mature capability with a lot of tested tools, and it’s something that we want to continue to grow. |
Cameron: | 17:11 | All right, and thank you so much for your time, Julie. I’d also like to extend a quick thank-you to MITRE and the Knowledge-Driven Enterprise for taking the time to make the show possible. And again, a big thank-you to you, Julie, for taking the time to come and share all the incredible goings-on in the privacy capability, and also your story at MITRE. |
Julie: | 17:29 | Thanks for the opportunity, Cameron. |
Cameron Boozarjomehri is a Software Engineer and a member of MITRE’s Privacy Capability. His passion is exploring the applications and implications of emerging technologies and finding new ways to make those technologies accessible to the public.
© 2019 The MITRE Corporation. All rights reserved. Approved for public release. Distribution unlimited. Case number 19-3088.
MITRE’s mission-driven team is dedicated to solving problems for a safer world. Learn more about MITRE.
See also:
The person on the other end of the data
A social scientist examines the role of technology in our lives
Interview with Awais Sheikh on Deciphering Business Process Innovation
Interview with Jackie Morin on her journey from intern to senior engineer
Interview with Jay Crossler on why passion is the key to success
Interview with Dan Ward, Debra Zides, and Lorna Tedder on streamlining acquisitions
Interview with Dr. Philip Barry on blending AI and education
Interview with Ali Zaidi on designing lessons in artificial intelligence
Interview with Dan Ward, Rachel Gregorio, and Jessica Yu on MITRE’s Innovation Toolkit
Interview with Tammy Freeman on Redefining Innovation
Interview with Jesse Buonanno on Blockchain