Security / Privacy Journalist Sam Pfeifle Interview

By: Dan Gelinas, Published on May 24, 2019

Sam Pfeifle is best known as the outspoken former Editor of Security Systems News. After that, he was publications director at the International Association of Privacy Professionals, giving him insights into both the security and privacy worlds.

In this interview, Pfeifle shares lessons he has learned in security, cybersecurity, and privacy, as well as how those areas have affected each other and where they will continue to overlap, evolve and drive societal change, including:

  • How pushback on public cameras is no longer an issue
  • How false positives render face recognition and ALPR dangerous
  • How it may be impossible to totally trust today's "incredibly complex organizations" with your data
  • Why the Terms & Conditions you may or may not read before clicking are not for you
  • Why companies that use and sell your data are not evil
  • Why it is someone's job to "see what their employer can get away with"
  • How people did not understand the risks of "putting everything on the Internet"
  • How many are not ready for the CCPA
  • Why many people already ignore lots of different privacy laws

We have included the printed interview below, plus the full audio recording / podcast.

IPVM: What exactly does the IAPP do?

SP: The International Association of Privacy Professionals is just the privacy version of the National Burglar and Fire Alarm Association (NBFAA, now the Electronic Security Association (ESA)). So it is the membership association for people who do the work of privacy in organizations all around the world, both government and private sector. Similar to the NBFAA, they focus on providing tools for people to use to do their jobs better. And they have gatherings where you all get together and talk about best practices, and how you’re going to do your job, and upcoming legislation and technology. But importantly, because they are a 501(c)(6), same as any other membership organization, they actually can’t do advocacy. It’s kind of an interesting dynamic in the industry because the IAPP is really just this exploding field—when I got there in 2013 there were 12,000 members and they just hit 50,000 members this week—so massive growth. Inside those privacy professionals, you do have some advocates. Advocates are privacy professionals. They do privacy for a living. But then you also have people whose job it is to figure out what their company can get away with. Most of them are sort of just compliance people—the same kind of people who make sure your company doesn’t pollute. There are also compliance people who make sure that you do HR correctly. There are a lot of compliance people in privacy. Lots of lawyers. But then you also now are having this explosion of privacy engineers and operational privacy people whose job it is to look at code and look at the configuration of the security world. If you’re doing a camera array for video surveillance—especially in Europe this is important—you need to make sure that your cameras aren’t in the public sphere. They really have to be for the purpose that you say they are, which is protecting your business, say. One of the very first enforcement actions for the General Data Protection Regulation (GDPR) (see our GDPR for Video Surveillance Guide) was against a company that had their cameras pointed too much at the sidewalk and they were picking up random people. So a privacy professional probably doesn’t work for a tiny retail store, but they might work for an airport to make sure that you don’t have cameras in places where they don’t need to be. You have things like data minimization, where the privacy professional would ask things like: “Do we need that data? Are we collecting that data for a real purpose, or just because we feel like it?” Privacy professionals would adhere to those principles and say: “Hey, unless you can provide a compelling reason why that data needs to be collected, then we shouldn’t collect it at all.”

Why It Is Someone's Job To "See What Their Employer Can Get Away With"

IPVM: You mentioned that there are some privacy professionals whose job it is to “see what their employer can get away with.” That sounds a little shifty to me.

SP: No, it’s really not. It all depends on the business plan of the company, the risk appetite of the company. So say you’re a data broker. Data brokers exist. It is a business model where you collect data and sell it to other people. You need to know, for instance, what consent is attached to that data you’re buying so that you can then communicate to someone that you’re selling to what consent is attached to that data. If someone is offering you a set of data, the people whose job it is to collect as much data as possible in that company so that they have assets to sell are going to be like: “Yeah, I want this!” The privacy professional’s job is to come in and say: “Hey, hey, let’s take a look at the quality of this data. What’s the consent that’s attached to it? What are we allowed to do with this data?” And they’ll walk right up to the line and letter of the law. And they’ll say: “Oh, the law says X, Y, and Z. This part is good; this part over here, we’re going to be violating the law. Don’t buy that.” Some privacy professionals do see themselves as kind of the voice of the consumer; others have different roles. For example, if you’re in the General Counsel’s office you’re ethically bound to protect the company. You’re not ethically bound to protect the consumer in that specific circumstance. That’s why one of the most interesting roles in privacy now is the Data Protection Officer, which is a job that was created by the GDPR. One of the things it did was create a mandatory job for some companies. If, as part of your business plan, you regularly collect the personal data of Europeans or people on the European Continent, then you have to have this Data Protection Officer. Their job is to be the voice of the consumer. So, you’re supposed to protect them, to think of the consumer and be independent of the company as a whole. And so one of the big questions is: “I’m in the General Counsel’s office. I’m a lawyer who works for Company X. Can I also be the Data Protection Officer? Can I represent both of those interests? Can I ethically both protect the company from harm and the consumer from harm?”

IPVM: That seems like a conflict of interest.

SP: It is a really tough question. So people are outsourcing that role to people outside the company or they’re creating this little separate office. And then you have this question: “Well, if I create this little office and I put the Data Protection Officer in a little cube in the corner and don’t tell them anything about what’s going on, are they able to really represent the interest of the consumer, or am I just hiding everything from them until I have everything figured out and I tell them what we’re doing?” It’s a really interesting job and there are a lot of them now, because by the letter of the law companies are required to hire these people. So there are a lot of Data Protection Officers out there who are sort of groping in the dark and trying to figure out what the job really looks like and whether or not they want that job. I think it gets at the complexity of the privacy profession, because most privacy professionals—once you get into privacy, and I found this myself when I got into privacy—there’s this, I don’t know, someone’s got to come up with a name for it, but it’s sort of like that IT adoption curve that Gartner is so famous for (the Gartner Hype Cycle). You come into the job and you have this peak of aghastedness: “Oh my God! I can’t believe everybody’s privacy is being violated all the time! And they’re collecting data, and they know what my mom’s name is!” And then you go into this trough of disillusionment where privacy doesn’t matter and consumers don’t care and they give all their data to Facebook, no problem. “Why do we even bother with privacy? It’s dead! Privacy’s dead!” And then it levels off into: “Well, maybe we should try to find some sort of middle ground where people get the customized features that they want but we don’t, you know, destroy democracy.” That’s where I think most privacy professionals wind up, where their advice to their companies is: “Yeah, sure you can collect that data and maybe we’ll make a couple bucks on that, but we’re going to piss off all our customers, so maybe we can find a different way.”

IPVM: Let me rewind just a little bit. We’ll jump back to privacy in a bit, but for those who may not know who you are, how did you get your start in the security industry?

SP: I was the editor of the Portland Phoenix, which was an alternative weekly newspaper that just went out of business here in Portland, Maine.

IPVM: I wasn’t aware of that, that they went out of business. They’ve been around for quite a while, right?

SP: The Portland Phoenix is dead. Dead! We launched in 1999 and I had a wife and child who, you know, I wanted to see sometimes, and I didn’t want to be working 75 hours a week at the Phoenix and then going out every night. And so I got into business-to-business. And at the time I felt like I was selling out. I felt like: “Ah, I’ll just sell out and ruin my life and get into business-to-business journalism because I gotta pay for the mortgage and the wife and kids.” But you know, I’ve really come to love business-to-business journalism because I feel like, with the way that the journalism industry has moved, it’s actually kind of the purest form of journalism left, in that, generally speaking, in B2B you’re not just driven by clicks. You’re not constantly being ground down by CPM numbers, because you have an audience that you understand. You know exactly who they are. These people have the same basic interests. They run the same basic companies. They have the same problems. And you can actually help them. They don’t have the time to search the Internet all day long. They don’t have time to call people up and ask them questions. They don’t have time to monitor every product release out there. You really are this filter for them. You can say: “Hey, don’t worry about it. I spent all day on Twitter and I surfed the Internet and I called up all these people and I went to this event for you, and here are the things that you really need to know to make sure that your business runs well.” And the feedback is really direct. You get these questions from people: “Hey, I’m considering Sony vs. Panasonic and I looked at Sony and it seems like their product schedule has kind of gone dark. Do you know anything about that?” And you can say: “Yes, I talked with their product development guy at ISC West and he said they actually have some big new thing coming up and the only reason they went dark was X, Y, and Z.” And you can actually help those people make good decisions instead of chasing clicks and trying to come up with some sensational headline, which is the way mainstream journalism has gone. I loved working at Security Systems News. What I find—and this is my own personal feeling—is that I get a little bored writing about the same thing all the time. So I left security to write about lasers. Then I left lasers to write about privacy. And one of the things I came to a conclusion about—I mean, I’ve also covered boats and I’ve covered the music industry—is that you begin to see how the same dynamics play out in every industry. And you really begin to understand what that sales channel looks like. You see people making the same kinds of decisions over and over again and being successful, and making the same kinds of decisions over and over again and being unsuccessful, and… you know, I used to think “content” was this super-crass word. I was like: “I’m a writer! I’m a journalist! I don’t make ‘content.’” But I’ve come around on that. Content is this really broad term, and when you’re a writer, every problem seems like a 750-word article. And now that I’ve become a content creator, I realize sometimes you just need to tweet it out. Sometimes you need a white paper. Sometimes you need to write the article. Sometimes you need an infographic, sometimes you need a video, sometimes you need a podcast. And so I’ve set up my own shop called West Gray Creative.
I’m working with some freelancers and some full-timers and some designers and I’m trying to help people use content to educate their community or get their message out there or do their marketing, or whatever their goals are. I try to match content creation with solving their problems.

How People Did Not Understand The Risks Of "Putting Everything On The Internet"

IPVM: Let’s jump back into SSN a little bit. While there, you ran TechSec Solutions. Did that address in some way the confluence of physical security and cybersecurity? Did it get into privacy issues at all?

SP: Almost not at all. And now, with the understanding that I do have of privacy, I don’t think enough people understood the risk that putting everything on the internet presented. I get that we were often talking to evangelists, and that was their job. You know, to evangelize the benefits of putting things on the network: “Oh, you can check it from home, you can check it from your phone, you can check it from your laptop!” It also means that some hacker can check it from their phone and can check it from their laptop. So if you have this great, actual CCTV on closed coaxial cable, you know what can’t happen? Hacking. If you actually do have a closed system, people can’t get in and then, you know, watch people in your building. And we’ve seen that. Nannycamgate. I don’t know what the number is, but this hacker comes out and says: “Hey, I’ve got access to 1.5 million nanny cams. Hey, here’s a baby in a crib in Bethesda, Maryland. Here’s a baby in a crib in Sydney, Australia. Here’s a baby in a crib in Singapore!” And it was because they shipped a bunch of IP-enabled cameras with generic passwords, and let me tell you, parents with infants aren’t really geared up to change their passwords. It’s just not what they’re thinking about right now. They just want to make sure their kid is still breathing in the crib. Writ super large, when you are seeing an IP-enabled system as the solution for every problem, I think maybe you aren’t considering that there is some risk there. What privacy and security have in common—or what they should have in common (I sometimes worry that not enough people think this way)—is that there should be risk-based approaches. Your solutions should be commensurate with the risk. And so in privacy, what that means is you should apply an appropriate amount of security to data given the risk of it being released. If you have a big database of business emails and business phone numbers that are on everybody’s business cards, yeah, you want to protect it, but you don’t need to give it the gold-plated, Fort-Knox solution. But if that data is DNA, or if that data is a list of AIDS patients, you better take a much more serious approach to your security. Similarly, the people that you’re protecting in a security environment, the threat matrix for that facility, the employee makeup of the people who are operating that system—all those things should go into the calculation for the solution that you create. Risk is so slippery. You hear about risk scoring: “Oh, I gave that a Green, and this an Orange, and that a Red.” It’s not like that’s iron-clad. It all depends on perspective. One person’s Green is another person’s Orange, depending on their cultural approach. I think it’s an interesting dynamic because security is often the reason that privacy doesn’t matter. If I’m in a business, I can say: “Hey, I’m doing security.” And I can record video of every single person that comes into my business. And nowadays, people are using facial recognition to check against databases of shoplifters. We have as a society decided: “You’re protected against shoplifting. No problem.” And only just recently, with people being concerned about privacy, are people raising their hands and saying: “Well, is it really true that protecting against someone shoplifting a 6-pack of Natty Light is commensurate with all of us being in a database and my face being scanned all the time? Maybe that’s actually bullshit.
Maybe the convenience store can lose a couple cans of Natty Light and my privacy isn’t just violated every time I walk into the store.” And we’re starting to think about things like context. One of the tenets of privacy is notice and choice. I give you this notice of what data I’m collecting, and you can choose whether you use the service or not. Sounds fair. But if every convenience store is going to record me, what does that mean? I just choose not to use convenience stores?

IPVM: What’s convenient about that, Sam? Now it’s an inconvenient store.

SP: Right! “Sorry. If you’re offended by that, then you just can’t get gas. You just can’t Internet. Sorry! The Internet is closed to you. You have to choose to not use the Internet.” On some level that breaks down because, yeah, you’re giving me notice, but I don’t really have a choice. I really need to get gas for my car. And if I want to go in and get a Clif Bar while I’m doing it, maybe facial recognition in exchange for that isn’t fair. And I think we as a society are starting to think about that much more often, and Europe is just ahead of us. In Europe, you really need to show what the legal basis is for your data collection in a way that you don’t have to here in the United States. And that’s what we’re grappling with right now: what privacy laws do we want in place? And I really see it affecting the security industry in the monitoring part. So: “Hey, I’m going to monitor whether or not grandma took her pills.” That’s a service I can offer. Did she open the box or not open the box this morning? Well, that means some monitoring company now has data that Sally Smith has cancer and is taking cancer medication. That’s super-personal information. How is it being protected, and who has access to it, and what happens if it leaks? I’m sure the security industry has really thought deeply about that, and I think that’s coming in a big way.
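
Pfeifle’s warning that risk scoring is “not iron-clad” is easy to make concrete. Here is a minimal sketch, in Python, of the Green/Orange/Red scoring he mentions above; the 1-5 scales, thresholds, and example inputs are our own illustrative assumptions, not any industry standard, and nudging a threshold recolors the same asset, which is exactly his point about perspective.

```python
# A toy version of the risk scoring Pfeifle describes. The 1-5 scales,
# thresholds, and example inputs are illustrative assumptions only.

def risk_rating(likelihood: int, impact: int) -> str:
    """Map likelihood (1-5) and impact (1-5) to a Green/Orange/Red rating."""
    score = likelihood * impact
    if score >= 15:
        return "Red"
    if score >= 6:
        return "Orange"
    return "Green"

# Business-card emails: plausible leak, low impact -> low-grade protection.
print(risk_rating(likelihood=3, impact=1))  # Green
# A list of AIDS patients: even an unlikely leak is catastrophic.
print(risk_rating(likelihood=2, impact=5))  # Orange here; one person's
                                            # Orange is another's Red
```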

How Many Are Not Ready For The CCPA

IPVM: Here in the US, we’ve had laws like California’s CCPA—

SP: Pay attention to that, everybody listening!

IPVM: Yes, and the saying goes, “As goes California, so goes the rest of the nation.”

SP: The reason that we have privacy policies on our websites is because California passed that law.

IPVM: Right. I read a story recently about a survey—by Dimensional Research and reported by TrustArc—of companies trying to get ready and most respondents—I think it was 88%—say there’s no way they’re going to be ready to comply with the CCPA by the time it goes into effect.

SP: Correct.

IPVM: Obviously, it’s becoming a big, huge deal. What do you see happening when these companies aren’t ready for it, but they have to comply?

SP: Well, all laws are only as good as the enforcement behind them. So you know, there are all sorts of laws that you're not compliant with right now. But you don’t care about them and you’re not worried about being enforced against.

Why Many People Already Ignore Lots Of Different Privacy Laws

IPVM: Right, I don’t care because I’m not going to be caught.

SP: Right. So, there are many, many companies right now that are not compliant with laws that have been in place for a long time, like HIPAA, or the Children’s Online Privacy Protection Act, or FCRA. We have privacy laws, and companies sort of ignore them or don’t based on their risk profile. If you go look at the recent enforcement actions from the Federal Trade Commission—TikTok, really super popular app—the FTC just said: “Hey guys, you’re ignoring COPPA, so we’re going to make you stop doing that.” Some companies put that into their risk register: “We want to grow super fast, we want to collect a bunch of data, we want to make a bunch of money. We’re not really worried about the FTC, because they don’t have fining powers until you do a bad thing twice, so let’s push the envelope here and make a ton of cash.” So: “Oh yeah, we got whacked, but we have a bunch of cash in our bank account, so we don’t really care.” People are going to be making those kinds of decisions about CCPA. Those companies that value privacy, have privacy as part of their marketing, or are super customer-focused—whatever their priorities are as a company—are going to make damn sure they’re CCPA-compliant. Other companies that play a little faster or looser, or are not particularly customer-focused—have a different risk profile—are going to be less compliant with the CCPA, and there’s going to be a spectrum. And it will be up to the Attorney General’s office of California to go ahead and enforce that law. And I can tell you that privacy laws are really hard to enforce. You know, data is slippery. You don’t really know what’s in that database, you don’t really know what they’re doing with that data. One thing that CCPA enforcement is going to mean is lots of audits. CCPA is mostly about communicating with customers. It gives them rights to access, it gives them rights to deletion, it gives them rights to object to the sale of their data. It doesn’t really focus that much on internal use of the data at the company. There’s going to be: “Hey, it says in the law that you have to have a button on your website that allows people to object to the sale of the data. Where’s your button?” And they say: “There’s my button.” And it will be: “The law says that I can request you to report back to me all the categories of data you have about me. I’m making that request. What do you got?” And if you can’t produce it in the time allotted, or if you produce some sort of half-assed report, then the Attorney General is going to say: “Well, it looks like your privacy program sucks. Let’s have a bigger, longer conversation about enforcement.” We’ve seen that in Europe. They’ll do a sweep and they’ll say: “Okay, we’re going to look at children’s apps in the App Store.” And they run through and they say: “Okay, you know, 37% of these apps don’t do X, Y, and Z that they need to do. We’re putting you all on notice. You’ve got some time to go take care of it and then we’re going to get serious.” But it’s impossible to enforce the entire internet. There are millions and millions of pages. They’re going to issue guidance and they’re going to say: “This is what we expect.” And then they’re going to go after the egregious violators first.
So there are definitely some companies where their risk register says: “Well, we’re kinda breaking the law, but we’re pretty close, and we’re keeping our heads down, and I know that dude over there is way worse than us, so I think we’re probably okay.” And you know, every company has their own appetite for that kind of thing.
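
“Where’s your button?” is more literal than it sounds. As a rough sketch only (Flask, with a hypothetical route name, field name, and an in-memory set standing in for a real consent-management system), the opt-out button’s backend is trivial to build; the hard part, as Pfeifle suggests, is proving that every downstream data-sale pipeline actually honors it.

```python
# A toy sketch of a CCPA "do not sell" endpoint. Route, field names, and
# the in-memory store are hypothetical stand-ins, not a real system.
from flask import Flask, jsonify, request

app = Flask(__name__)
opt_outs = set()  # every downstream data-sale pipeline must consult this

@app.route("/do-not-sell", methods=["POST"])
def do_not_sell():
    payload = request.get_json(silent=True) or {}
    user_id = payload.get("user_id")
    if not user_id:
        return jsonify(error="user_id is required"), 400
    opt_outs.add(user_id)  # recording the opt-out is the easy part
    return jsonify(status="opted out", user_id=user_id)

if __name__ == "__main__":
    app.run()
```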

IPVM: That’s interesting. “Have their own appetite.” So it’s all about the risk: “What can I get away with?”

SP: There are some companies that are going to say: “We are 100% compliant.” That’s their thing. And they have awesome compliance departments and every single box is checked and that’s part of their culture. I would say that maybe Silicon Valley does not have that culture. They’re much more aggressive, and venture capitalists expect quick returns. And so they’re out there: “Let’s see what we can do!” Literally, Facebook’s mantra was “Move fast and break things.” Unfortunately, they broke democracy. “Oops! Oh well! But we made a crapload of money!”

IPVM: Facebook is in the news all the time these days. You bring up a good point, though. How can one regulate the Internet?

SP: It’s super hard.

IPVM: Well, because it’s not a known quantity on any given day, is it? I mean, pages get added and pages get taken away in real time, like a living thing.

SP: That’s one of the first things you learn in privacy—I mean, there are all kinds of clichés—but one of the things is that privacy doesn’t really respect borders. You want to talk about cross-border data transfer, which is a big thing in Europe. You’re not supposed to send personal information outside of the European Union unless you have a legal reason for doing so. But how the hell do you track whether data just got emailed from the UK to Maine? How are you going to do that? It’s impossible. One of the unfortunate things about privacy is that it puts a lot of the onus on the consumer. You have to kind of raise your hand and be like: “Um... I think they’re misusing my data?” Versus, imagine in the physical world if you had to constantly be like: “That guy just robbed my bank. Can someone pay attention?” That’s a lot more tangible than: “Hey, they just stole a big database of mine.”

IPVM: You bring up an interesting question that I was going to get to, and that is how much of the onus is on the end user to read contracts before clicking install?

SP: It’s a lot.

Why The Terms & Conditions You May Or May Not Read Aren't For You

IPVM: And to be aware of privacy policies before downloading and using apps or connected services? And should that shift in some way?

SP: Well, so that’s another cliché. If you go to a privacy conference someone will say something like: “And the terms and conditions for Groupon are longer than Macbeth. Hahaha.” And everybody’s going to laugh. And it’s true. Terms and conditions are fundamentally not for consumers. What they are for is regulators. So, lawyers draft privacy notices to tell the regulator: “Hey guys, we’re taking this seriously. We’re telling the consumer every single thing we’re doing with their data.” It flies in the face of human nature. When you go to the App Store to get an app, it’s because you want it right now. That’s why you went there. “I want a fitness tracker app because I have decided to keep track of how many situps I can do.”

IPVM: And you’ve got your shorts and t-shirt on and you’ve got your water bottle and you’re doing it right now.

SP: Right, you’re doing it this minute. So then you get to this point where you’re about to download it, and it’s like: “Read our privacy policy.” No, you’re not going to go take an hour to read this privacy policy and then make a decision on that app. What you do is you find an app with the features you want and you’re like: “Yeah!” And then you download it and you do your situps. And there’s this whole portion of the privacy thought leadership that says: “Let’s stop putting that onus on the consumer. Let’s focus on the use of data. Let’s not worry about that point of collection, because it’s not fair. Let’s regulate improper uses.” But then it’s back to: how do you know how a company is using your data? Is there any way to know that that fitness tracker just took your data and sold it to a data broker? There’s no way to know. That’s where things like privacy engineering come in. Can we actually come up with a way to allow consumers to track where their data goes? Is there some way to code for privacy where it alerts you in real time: “Hey, you’re giving some data that they’re going to use this way. Do you really want to do that?” And you see that. There are a lot more just-in-time notifications these days. I was out in Oregon and I needed a gas station before returning the rental vehicle. “Where’s the nearest gas station?” And Google pops up a thing that says: “Hey, we’d like to use your location to help you out. Do you want to do that?” I said: “Yeah, sure. I want you to know where I am. Because I need to figure out where the nearest gas station is. If you think that I’m in Nebraska, this isn’t going to work!” And so then I feel a little bit empowered as a consumer. I’m like: “Okay, I made that decision, I said yes.” And then I’m assuming, because they just asked, that they’ll stop tracking my location at some point in the future. And Google has just rolled out a tool this week where you can go and delete your data.

IPVM: I saw that on Adweek this morning, that you can set it…

SP: You can set it to auto-delete.

IPVM: And that’s great because it takes out that inconvenience factor where every single time you’ve got to say: “No, I don’t want you to sell, I don’t want you to use, I don’t want to do this!” Is that a big move?

SP: It’s a huge move. It absolutely is a huge move. But then you get into trust. Because how do you know that data just got deleted? I have no idea. I’m trusting Google to do what I told them to do. This came up—I’m not going to pick on Uber, but I did pick on them on Twitter, so I’m going to do it—I got this notification that said: “Hey, you have a special offer!” And I hate notifications. I turn off all my notifications because I don’t want to be interrupted. So I go to my settings in the Uber app and there’s a little setting for Offer Notifications, and I say: “Off!” And what happened, Dan? I got a notification the next day! So Uber is a liar and they gave me this false choice. I said: “I do not want these things!” And they gave them to me anyway, and that erodes my trust in Uber. And that’s where the consumer experience hits the road. If you want to develop this trust with data, you need to do what you say you’re going to do. And I think that’s where privacy and security part ways in a lot of ways. Security is about access and privacy is about use. So you can have the greatest security in the world and never have a data breach, but still violate people’s privacy all the time by not honoring the terms you agreed to with them. And so that’s where I think people really tend to focus on the data breach and that part of privacy, whereas the real sticky issues are in the use. If you’re a security operator, you’ve made this promise: “I’m monitoring your business.” If you then sold to a marketing company the amount of foot traffic that came through that business, that would be a privacy issue, but it’s not a security issue. And I think that’s going to be an interesting area: as security companies become more and more intimately involved with their customers, there’s going to be that temptation.

IPVM: If I’m honest, I’ve never read from beginning to end—I’ve glanced through, but I’ve never read from beginning to end—any one of these privacy terms and conditions where you’re supposed to read and then check the box: “Yeah, I read it.” So I’m lying, too, right? I checked it off. I read it.

SP: My poor kids know they have the dad who actually reads the Terms of Service. My daughter, who is 15 years old, is in her health class in high school, and the health teacher says: “Download this nutrition tracking app because I want you, for a project, to track all the things we eat.” And all the other kids just download it because that’s what the teacher told them to do. And my daughter knows that she under no circumstances should ever download anything without asking me first. And so she sends me: “Hey dad, I’m supposed to use this app, can I do it?” And I go and read the Terms of Service, and literally in the Terms of Service—I screengrabbed it and sent it back to her—it says: “Do not use this app if you are under 16. We do not comply with any laws for kids under 16.”

IPVM: So it’s totally illegal and the teacher just directed the entire class to download it?

SP: Yeah, the entire class. I sent it back to her and I said: “Ruby, under no circumstances are you to download that app and use it.” Would anything probably have happened? Probably not, but it says right there: don’t use it if you’re under 16. All these kids are under 16. So I think we just need to be a little more thoughtful. It’s a skill set. We have these Money Matters classes in high schools in Maine now, where we teach kids how to deal with finance and get a loan and balance a checkbook and all that stuff. I really think we need some kind of personal data navigation classes where we teach people: “Yeah, it’s a thousand-page Terms of Service; here are the 3 or 4 things to search for.” One of the things I always search for is the number 13, because that’s the age of consent for the Children’s Online Privacy Protection Act. So if people are saying that they are not COPPA compliant, they will put something in that says: “Do not use this app if you’re under the age of 13.” And what they’re saying is: “We do not comply with COPPA and if you’re a kid, we do not want you using this.” So, I’ll do that search and if I see that I’ll say: “Hey, 12-year-old son of mine, no, you can’t use that, because they don’t comply with COPPA.” And I’ll tell you, all social media does not comply with COPPA, and I see 11-year-olds on Facebook and I’m like: “Oh my God, what are you doing!?”

IPVM: But of course Facebook thinks they’re 23.

SP: No, of course, because they all lie. Instagram has made liars of every mom in this whole world, because when their kids come to them and say: “I want to be on Instagram!” They say: “Yeah, sure, I’ll just lie and say you’re 16.”
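
Pfeifle’s habit of searching a Terms of Service for the number 13 can even be roughly automated. The sketch below is illustrative only: the patterns are our own starting point, not an exhaustive or legally reliable list, and a real ToS can signal age limits in many other ways.

```python
import re

# Scan a Terms of Service for age-threshold language: 13 signals COPPA,
# 16 signals the GDPR's default age of consent. Patterns are illustrative.
AGE_WORDS = re.compile(
    r"\b(?:under|at least|age of)\b.{0,15}\b(13|16|18)\b", re.IGNORECASE)

def scan_terms(text: str) -> list[str]:
    """Return the sentences of a ToS that mention an age threshold."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if AGE_WORDS.search(s)]

terms = ("Welcome! We collect everything. Do not use this app if you "
         "are under 16. We do not comply with any laws for kids under 16.")
for hit in scan_terms(terms):
    print(hit)  # prints both "under 16" sentences
```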

How It May Be Impossible To Trust Today's "Incredibly Complex Organizations"

IPVM: But you said something earlier too, and it sort of clicked something in my head. When we were talking about the trust factor. When you deactivated your notifications from Uber, you still got the push notifications. And then Google has this new thing: “Hey, don’t worry about it. You can select to automatically delete your data, and we’ll do it for you." So here’s the question: Do you trust Google? Because I was going to say, “It’s Google! They’re a well-known mega-corporation that’s worth gazillions of dollars. What’s not to trust?”

SP: What I would say is organizations are incredibly complex and you’d be a little bit foolish to trust any of them to be doing exactly what they say they’re doing. I think Facebook added 36,000 employees just in the last quarter. I was reading their financial statement. Do you think you can train all 36,000 of those employees to handle personal data correctly in 3 months? I don’t think so.

IPVM: Get AI to do it. AI does everything now, Sam.

SP: Don’t even get me started on AI. AI is very unreliable. But I think the human factor means that things are going to be configured incorrectly, humans are going to make poor decisions. Humans are going to be inexpertly trained, and some of these things are super contextual and are really hard. And privacy also bleeds into what is becoming known as content moderation. So how does Facebook decide what to allow posted and not allow posted? And it’s AI, but it’s also people, and that’s why you get Facebook banning the Venus de Milo from being uploaded because it’s got a boob: “Oh, we’re not allowing boobs.” It’s the fricking Venus de Milo! So now their algorithm has been tuned to allow the Venus de Milo. They have these rules like: “What about breastfeeding mothers?” They literally have these things like: “Oh, if there’s less than 1/3 of a teat being—” It’s ridiculous. Like you have some person measuring the nipple. But in privacy it’s kind of similar. If your last name is Smith, it’s not a particularly personal piece of data. There are a billion Smiths out there. But if your last name is Pfeifle, like mine—Google Sam Pfeifle; there are like 3 in the United States—so my name is a little bit more personal than Jim Smith’s is, and that’s really, really hard to train AI for, for sure, but even people. How do you train them to understand that if you combine profession and zip code in Maine, there’s probably only one OBGYN in Lisbon Falls, so you know exactly who that person is? But there are 500 of them in Boston. So you know, it’s a really tough problem. There are companies I trust to make a really good effort, and there are some I don’t trust to make a really good effort, and I think that is where security and privacy overlap the most. If you’re going to be working with a security integrator, you want to be able to trust them to do what they say they’re going to do. If you’re working with a monitoring company, you trust them to alert you if a bad thing happens. It’s about focusing on those things that would cause you to lose trust and making sure they don’t happen, and that’s where the risk comes in, too.
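
The OBGYN-in-Lisbon-Falls example is what the privacy literature calls a quasi-identifier problem, usually measured as k-anonymity: how many records share the same combination of individually “harmless” fields. A toy check, with invented records, makes the point that a group of one is effectively a name.

```python
from collections import Counter

# Count how many people share each profession + town combination.
# The records are invented for illustration.
records = [
    ("OBGYN", "Lisbon Falls, ME"),
    ("OBGYN", "Boston, MA"),
    ("OBGYN", "Boston, MA"),
    ("Teacher", "Lisbon Falls, ME"),
]

for combo, k in Counter(records).items():
    status = "unique: effectively identified" if k == 1 else f"hides among {k}"
    print(f"{combo}: {status}")
# ("OBGYN", "Lisbon Falls, ME") is a group of one, even though neither
# field is a name on its own.
```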

IPVM: Let me ask you this. Sort of rewinding a little bit. You’re no longer in the security industry or the privacy industry—

SP: I’m in every industry now.

IPVM: What's driving you as you move from industry to industry?

SP: I think privacy is one of the most interesting spaces to be in, and I will definitely still do some work there, but I want to do some privacy work in security. I want to do some privacy work in lasers. One of the apocryphal privacy stories is that Google was out there doing their Google mapping which uses laser scanning and photogrammetry so you get that street view. That’s mostly done with a laser scanner and a camera. And along the way, they were like: “We’ll also pick up WiFi signals as we’re driving by, and we might as well keep those just to see what happens.” And they ended up keeping all the MAC addresses of all the WiFi routers that they rode by throughout all the United States. And they were like: “What could we do with this?” And the Federal Trade Commission found out about it and they were like: “You can’t do that.” And they got whacked with an FTC sanction. I want to start using what I know in one industry to inform another. So, actually, another industry that I’m super interested in right now is cannabis. Cannabis is exploding and just think about how that intersects with privacy and security. You’re a security guy that wants to get into cannabis. There is some super-interesting stuff there around point-of-sale. Cannabis is an all-cash business. You’re not allowed to use credit cards, you can’t do any banking. What issues does that present if you’re trying to secure that enterprise? That makes point of sale much harder. So, that’s what I’m interested in right now—how these different trends affect industries back and forth.

IPVM: We mentioned the thing with Google this week. Something else that came out recently was the Virginia ruling on ALPR.

SP: John Honovich called me up when I was 2 months on the job and I knew very little about privacy yet, and he asked me about it. And I said: “Well, I think that license plate recognition stuff is going to be a major issue, because you’re tracking all of my movements because why, exactly? I’m a criminal because I’m driving a car? I think that’s going to be a privacy issue.” And it’s starting to be a privacy issue. For good reason, in my opinion. I don’t think that just because I’ve chosen to drive a car around my state that you have the right to track my movements. I think that’s crazy. And so what privacy would say is there should be some data minimization. So maybe you capture it, but you don’t save it. Or you save it for a day, or for a commensurate amount of time. If you’re scanning for wanted criminals and the cars that they drive, that might be fair, but scanning everybody’s license plate and saving it in a big database just so that you can go back and check later, that seems creepy. I think that’s one of the places where legislation is really hard to write. How do you write a law that says you can use this technology in this very specific way and also be able to predict how that technology is going to grow? You know, license plate capture via video wasn’t possible 20 years ago. Now it’s effortless.
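
The data minimization Pfeifle describes maps to a simple retention policy. A minimal sketch, assuming a one-day window and a hotlist of wanted plates (the window, the record shape, and the plates are all placeholders of ours):

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=1)  # "you save it for a day": an assumption

def purge_reads(reads, hotlist, now):
    """Keep hotlist matches; drop every other read older than RETENTION."""
    return [r for r in reads
            if r["plate"] in hotlist or now - r["seen_at"] <= RETENTION]

now = datetime.now(timezone.utc)
reads = [
    {"plate": "ABC1234", "seen_at": now - timedelta(days=30)},  # hotlist hit
    {"plate": "XYZ9876", "seen_at": now - timedelta(days=30)},  # purged
    {"plate": "LMN4567", "seen_at": now - timedelta(hours=2)},  # still fresh
]
print(purge_reads(reads, hotlist={"ABC1234"}, now=now))
```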

How False Positives Ruin Face Recognition And ALPR

IPVM: And that relates to what we were saying earlier about scanning at the convenience store looking for people that are known to shoplift. Is that commensurate to me and everyone else who has never shoplifted having our faces recorded?

SP: Also, false positives. False positives are a major, major issue. And that’s where privacy and harm come into play. How do you quantify the harm of a privacy violation? I have cancer. You lose the fact that I have cancer to some hacker, and now it’s out there in the world that I have cancer. What’s the harm? Is there a monetary value? Am I just really sad? It’s really hard to quantify the existential dread that you feel when everybody knows that you’re a cancer patient and you didn’t want them to know that. But with false positives and LPR... Say there was a little bit of extra dirt and my 9 looks like an 8, and therefore the cops come and bust my door down because they think I’m a bad guy who did some bad thing, and I’m at gunpoint and my children are there. That’s some serious harm. I think we can quantify that. That’s emotionally damaging, and my door just got kicked down. And heaven forbid I’m a gun owner and I’m wearing a sidearm and they shoot me, because they think I’m Joe Smith the bad guy, but really I’m Sam Pfeifle, the dude who had some dirt on his license plate. That’s a major problem. What if I walk in and my face is remarkably similar to some other white guy with a beard, and they say: “Sorry, you’re not allowed to shop here, shoplifter guy. We see that you have multiple felonies on your record.” And my kids are there, or my employer is there, and everybody thinks I’m some awful shoplifter guy. That’s a big harm. So is that potential harm commensurate with the benefit? What did you save? Like .00003% of your margin by making sure that lipstick didn’t get stolen? We have to have those conversations. I’m not the one who’s going to do that math, but hopefully we’ll have some leaders in our democracy make those decisions.
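
The false-positive worry falls straight out of Bayes’ rule. A quick worked calculation, with made-up but plausible numbers, shows why even a “99% accurate” face matcher produces mostly false alarms when actual shoplifters are rare:

```python
# Illustrative assumptions: 1 shopper in 10,000 is on the watchlist, and
# the matcher is right 99% of the time in both directions.
base_rate = 1 / 10_000       # P(shoplifter)
sensitivity = 0.99           # P(match | shoplifter)
false_positive_rate = 0.01   # P(match | not a shoplifter)

p_match = (sensitivity * base_rate
           + false_positive_rate * (1 - base_rate))
p_guilty_given_match = sensitivity * base_rate / p_match

print(f"P(shoplifter | match) = {p_guilty_given_match:.1%}")
# ~1.0%: roughly 99 of every 100 matches flag an innocent shopper.
```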

IPVM: But it’s sort of a matter, too, of scale. I was going to play Devil’s Advocate and say, with the license plate or even the facial recognition: if I’m an employee at that store, and I’ve been there for years, and I see you walk in the door and I say: “Oh, that’s the guy that tried to rip me off for a 12-pack last week!”

SP: Yeah: “That’s the bad guy. Get out of here!”

IPVM: And you say: “Get out!” Isn’t that kind of the same thing, but a matter of scale?

SP: It’s a matter of scale and you know, I trust that longtime employee to correctly identify me more than I trust the algorithm. I make that argument all the time. It’s not even Devil’s Advocate, it is a matter of scale. So advocates might complain about targeted advertising. They say: “You know, this is an invasion of my privacy. Just because I went on Facebook and searched for this book, I shouldn’t see an ad for it everywhere I go around the Internet.” I can kind of sympathize with that, but we’ve been doing things like that all the time. There’s Enterprise Records in Portland. The most low-digital store ever, but the dude, Bob, who works there, knows that I love Doc Watson, and—amazingly—every time I walk in the next record that gets put on is a $55 special-issue Doc Watson record that I absolutely have to have. And then I blow $55 on an album that I didn’t even know I wanted. And that’s also targeted advertising.

IPVM: It’s just a low-tech version of it.

SP: It’s a matter of degree. And what we need to do is that harm/benefit analysis. Absolutely, if we’re protecting against major harms, then we can suffer a little privacy harm on the back end, but if it’s this little teeny tiny harm, I would expect that my privacy is only harmed in a tiny way as well. And that’s where you need some really sophisticated thinkers.

How Pushback On Public Cameras Is No Longer An Issue

IPVM: And that brings me to another question, too. This notion of a zero-sum game. To a degree, in this world we have today of ubiquitous connected technology and cameras—I just interviewed somebody the other day who owns a restaurant and has been in business for 50 years and he’s been working there since he was 12 and has a dedicated clientele and he put up a bunch of cameras and I said: “Well, did you get any pushback? I have to imagine people who have been going there for 20 years…” and he just laughed and said: “No, the thing is today, you can’t go anywhere without seeing a camera. I think that ship has sailed.”

SP: I think it has.

IPVM: So is that increasing? That zero-sum game of “If I want to be safe and secure, some things are going to have to go. Convenience is going to have to go. Some liberties are maybe going to have to go in order to make sure I’m safe?"

SP: I think that sometimes we’re chasing this imaginary safety. There’s this idea—and I think this happens especially in schools and in places where these awful, awful things happen—that if we just spent more money, if we just put in more security apparatus, if we just did more training, we can make it so these bad things won’t happen. And I think that that is a false hope. I don’t think it’s cynicism to believe that if bad people want to do bad things, they’re going to find a way to do them. And so it gets back to—I beat this drum all the time—the response should be commensurate with the risk. What are you protecting against, and is the thing you’re protecting against really not going to happen because you take these steps? Absolutely, surveillance works. I just served on a grand jury where I heard like 400 cases in Maine. You basically can’t steal things anymore. It’s impossible, because almost everybody has surveillance, and they look at the surveillance and they say: “Oh, that’s Jim Smith. I know that dude.” And they go and arrest you. But the bad things still happen, actually—you just caught the guy. After September 11th, so much money got funneled into airport security, and then I get on a bus, and all I have to do is show a minimum wage employee my ID and I get on. So if I’m on a plane, I’ve got to be physically groped and wanded, but to get on a bus with the same number of people, I have to show them an ID one time. I mean, why? I still do not to this day understand why. It’s because they’re not really doing a risk-return calculation. It’s security theater, and I just want us to be making that calculation more often.

IPVM: And so how do we make that calculation regularly and accurately? Is it the discussions you mentioned where we have to start having these discussions? Who leads those?

SP: Well, I think it’s about having people in charge of it. Right? So, the reason that the privacy industry is growing so fast is that people are realizing: “It’s actually somebody’s job to make sure we do privacy well.” We can’t just hope that we do privacy right. We need a trained professional to be that person. That’s why we have security directors. Hopefully, a director of security is making that calculation. That’s their job. And I think we need—and we are starting to see it, we’re starting to see states with Chief Privacy Officers, and Chief Information Security Officers, and we’re starting to see the federal government have Chief Technology Officers—we need more of those things. We need trained professionals helping us make policy decisions. Do you want to see something terrifying? Go watch a Senate Committee interview Zuckerberg. They don’t even know what they’re talking about, and that’s not even really a slag on them. Do I expect 80-year-old Lindsey Graham to know how Facebook works? Not really. But we do need those professionals in government, on their staffs, to be able to at least see the potential harms and protect against them, rather than only seeing what we see as the benefits. And also, I’m on the school board, and we just did a security audit in our district and we’re having to make a decision: are we going to blow 400 grand on a security system to protect against an active shooter? I don’t know. It’s really hard to tell parents: “Hey, we don’t really think that an active shooter is very likely, so we’re not going to spend all this money.” But then it’s also hard to tell parents: “Hey, we blew 400 grand on security, so we can’t hire 5 teachers. Oops.” That’s really hard.

IPVM: Yeah, that’s a difficult discussion to have.

SP: It’s a super hard decision. And I just want us to have those conversations. I want us to think about it rather than us just going: “Yeah, yeah, yeah security’s always good.” Let’s hope that we have lots of different goals and priorities and it’s not always trying to be as safe as possible and it’s not always trying to make as much money as possible. There’s this blend of really hard risk calculation.

IPVM: Let me ask you, and I’ll wrap up here, we’ve been at this for a little while. Are things headed in the right direction, do you think? Having been somewhat of a subject matter expert in both security and privacy, are things headed in the right direction with these moves, in Virginia, with GDPR, with Google doing the auto-delete?

SP: I think so. I actually am, I guess, mildly optimistic that we’re starting to have conversations that we should have had maybe 5 or 10 years ago. Unfortunately, technology always outpaces policy. It just does. You can’t write a law to protect against stuff that you don’t know exists. What I want us to do is at least have some infrastructure and political will in place so that when you do see it happening, let’s act on it. Let’s go! And hopefully, there are some universal things that we know are bad that we can try to start addressing. I don’t know how you regulate Facebook, I don’t know how you regulate the Internet the right way, but let’s try. And I think we are at least starting to have the right conversations. When this New Zealand mosque shooting happened, we had a bunch of people saying: “Hey, maybe it’s not the best idea to allow live broadcast of anything at any time to the entire planet. Maybe not.” Maybe it’s not cool to have zero filters and I can just show everyone right now this awful, horrible, gruesome thing. Maybe we should have, you know, an hour delay on the Internet. I don’t know. Maybe you should have to get a license to broadcast on the Internet. Oh, what a crazy idea! Maybe we should have something like the Federal Communications Commission that says: “Hey, if you want to broadcast things to the world, you have to follow these little rules.”

IPVM: But the problem, though, is that it’s the Internet, right? If I wanted to broadcast something on TV, live, I’d have to go to the FCC and get a license.

SP: You need to get a license. And this is a whole different rant of mine. For whatever reason, when the Internet came around we decided that it wasn’t like everything else and it shouldn’t have rules: “Oh my God, we’re trampling free speech if we try to regulate the Internet at all.” And we have the Communications Decency Act, which makes it so that Facebook is not responsible for the content on its own platform. Which is insane to me! If you’re the New York Times and someone at the New York Times writes a crazy, libelous piece of crap, they have to suffer for that. If someone on Facebook, which is somehow different because… I don’t know… writes some crazy libelous crap, Facebook is not responsible in any way. Why? It makes no sense. If you’re broadcasting video, you’re broadcasting video. If you’re publishing information, you’re publishing information. Because it’s on the Internet, it’s different than on my TV? Because it’s on a piece of paper, it’s different than it is on my computer screen? Why? We need to start having that conversation. If we’re all just going to decide that we don’t think anybody should have access to awful, crazy crap at any time, and it should be gated and put in a place where only people who are definitely over 18 can see it, let’s actually make an effort at that instead of just saying: “Eh, it’s the Internet.” Let’s try.

IPVM: Customers today, are they savvy enough? Are they becoming more savvy with these kinds of things? For example, you said you’re the one crazy dad in the world who reads all the Terms, and your kids know that. But are more people headed that way?

SP: I don’t think so, and I think it’s unfair to ask them to be. You know, this is one of my rants, and this gets into capitalism and all kinds of stuff, but we’re being asked to do more and more and more. We have this cult of busyness. Everybody you talk to—they’re just “crazy busy,” right? And they’re all stressed out. And some of that is of their own making. People feel they need to keep up with their Facebook feed, or they feel the need to make sure they understand what’s going on on Twitter. Those are decisions that they’re making. But also, it is definitely true that wages have been stagnant for 30 years and people have to work harder to make the same amount of money. And so, we’re asking them to work harder to feed their families. We know that their buying power is less. We know that they have to spend more time trying to pay all their bills, and at the same time, we want them to be able to read these new crazy terms and conditions and also be able to participate in democracy. That’s really hard, and there are only 24 hours in a day. So I do think—just imagine with any product… Imagine you’re buying a car and you had to inspect the car to make sure that it had seatbelts: “Does this actually have airbags, or does it just say it has airbags? Does it actually get 35 miles a gallon, or does it just say it does?” We actually have laws in place that say: “Hey, you have to have seatbelts. You have to have airbags. You have minimum miles per gallon standards.” The government—and we as a people, who are the government—made those decisions. We said: “Actually, you are not allowed to sell cars without seatbelts. That’s illegal.” Why can’t we also say: “Hey, you’re not allowed to collect people’s data without doing X, Y, and Z”? There’s no reason. Of course we can do that. So instead of me having to read these terms and conditions and see that they sell my data to some data broker who I don’t want to have it, how about we just say: “Hmm, you can’t do that.” We could. Or we say that you need to be licensed in some way in order to sell that data, and you need to display a little badge that says: “Hey, I’m a licensed person.” I have to get a license to cut people’s fricking hair! I don’t need a license to collect people’s data and sell it to other people?! We make these decisions all the time, and it speaks to our cultural priorities.

Why Companies That Use And Sell Your Data Are Not Evil

IPVM: The reason is probably that someone is making a lot of money off of it.

SP: Yeah, and the thing is, I don’t want to portray—I actually do think Facebook is evil, that’s a different thing—but I don’t want to portray most of these companies that are using data as evil or anything. They’re just playing by the rules as they are right now. It’s incumbent upon us as a society to change the rules. Absolutely. And I think that’s what leadership should look like. Leadership should look like: “Hey, the people that I represent are being harmed. They don’t have, as individuals, the ability to protect against that harm, so it’s my job, as their government representative to protect them by putting in some laws and policies.” That’s what I want to see happen. I think we will see that happen, actually. I think it just takes a while.

IPVM: It just takes a while to get there, and unfortunately, usually change comes after some sort of loss, event, bad thing.

SP: Right, and we have the larger issue where our democracy is not exactly functioning at the highest efficiency right now. And that’s a bigger problem than security and privacy.

IPVM: Something I want to push back on. When you were talking about having laws that make it illegal to sell a car without a seatbelt, you said that we could do the same thing with people's data and make it illegal to sell it. My pushback, to play Devil's Advocate, is: wouldn't all people's favorite apps, their social media, become not so free anymore? Isn't that the way that Facebook is free and all these other social media apps can be free? Because you agree they can access and use and perhaps sell your data?

SP: Yes, that's Facebook's argument, essentially—that they provide this glorious and liberating free service and you pay with your data. Which is fine. Facebook doesn't actually sell data; they sell access to targeted populations based on the data you give them and the data they buy from other services to combine with the data you give them. And that's what carmakers said about safety regulations: “They'll be expensive!” And that's what food service companies said about health regulations: “They'll be expensive!” That's basically the opposition, by industry, to all regulations. So, maybe some free things won't be free, but I'm not sure even that is true. Can Facebook and other free services not survive on ad sales alone, targeted at the profiles and likes that we provide as part of the setup of the service? Facebook throws off BILLIONS per quarter in free cash flow. Free newspapers are free because the ads support them. Do they need data to be free? The selling of data to whomever, based on one click of one terms of service that is thousands of words long and stands in the way of me being able to use a free service that I want and am not particularly educated to understand, strikes me as outside of the social contract for "fairness" we have made as Americans. Others might disagree. But, you say, these people are agreeing to have their data sold! Consenting adults! Well, there are lots of other things that are not allowed to be bought and sold based on the social contract we have made as Americans. I cannot pay for anything with sex. Why not? Some people might want to accept sex with me in exchange for my weekly groceries! I cannot buy opium at my corner store. Why not? Opium is fun and I want it! Nor does me signing a contract saying it's totally cool to pay me in sex or sell me opium change anything about the legality of those things. I think that as Americans come to understand more of the data ecosystem, we will come up with rules and regulations that make the data exchange with organizations much more transparent and equitable, and give consumers more power in the transaction. But, obviously, that remains to be seen.

IPVM: Let me ask you this, too, and this will be my last question. Sort of the flipside to where I asked you if customers were becoming savvy enough, are manufacturers being diligent enough?

SP: I think there are some that have decided to make it part and parcel of their business model. They see it as a marketing advantage, and that’s capitalism; writ large, they’re responding to market demand. All of a sudden, Apple is putting out really strong privacy messages. I see them on Twitter. “Apple: Your phone is locked like a super-secret safe and we’re going to encrypt all your messages and we can’t get into your device.” Do not underestimate what a big deal it was when they said: “No, FBI, we are not going to help you figure out who that terrorist is by cracking their phone.” That’s a huge deal. They’re certainly going to take a hit for that on some level. There are some people who are like: “Oh, Apple loves the terrorists!” But that was a stake they put in the ground. They said: “We’re not going to help you crack our security. Sorry. No. That’s going to ruin things.” And you're seeing tech companies fight. Australia just recently passed a law that says you have to provide a backdoor to your encryption. Well, good luck ever having encryption again, because once that key gets out, your encryption is useless. You’re telling me that the government is going to just keep that key super-safe? No employee is ever going to be bribed for it? Data is slippery. A lot of tech companies are saying: “Hey, if you want the Internet to work, if you think privacy and security will ever work, encryption has to be what it says it is. We can’t give you backdoors." So absolutely there is a wing of manufacturing and tech that is on board with privacy and security, and they are having some success. There’s a browser called Brave that has said: “Hey, we’re going to pay you for the ads you watch. You can decide to watch ads and get paid, or you can just not watch ads. You can block them all.” And I downloaded it and it works great. I’m getting paid in bitcoin to look at weird ads. I don’t care. It’s this tiny little bit, but I’m like, oh yeah, it’s worth it. I think we’re very, very slowly starting to see some inroads made. But ultimately, it’s going to be this collective, societal decision of what value you place on your personal data and your privacy and your security. What “sacrifices” are you willing to make? Who are you going to reward with your dollars? I think there are just some places where, getting back to the car analogy, if you let the market decide whether you want seatbelts or not… I just think that’s not okay. As a society, we said this is a minimum safety standard that we want. And it’s the same with pollution. Just think about it: it used to be considered fine to just dump your sewage and waste into the river.

IPVM: It’s not there anymore, it’s carried away.

SP: Right, and we all decided: “You know what? We like clean water, so let’s not do that anymore.” I think we will come around to that. But imagine if you just made every individual be like: "Oh, I’m not going to buy your products because you dumped toxic waste into the river.” No, that’s not going to work, actually. We need the government to step in and say: “Hey, we’re going to represent the people here.” And I think that’s the dynamic we’re seeing now, and if you give a crap about privacy, you should absolutely advocate to your representatives.
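Pfeifle's point about Australia's encryption law is worth making concrete. The sketch below is not from the interview; it is a minimal Python illustration, using a toy XOR cipher rather than real cryptography, of why a single escrowed "backdoor" key is a single point of failure: whoever obtains it can read every user's escrowed traffic at once.

```python
# Toy illustration (NOT real cryptography) of the key-escrow problem:
# if every message must also be sealed under one mandated "backdoor"
# key, leaking that single key exposes everyone's traffic.
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy repeating-key XOR; applying it twice with the same key decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

alice_key = os.urandom(16)   # Alice's personal key
ESCROW_KEY = os.urandom(16)  # hypothetical government master key

message = b"meet at noon"
to_bob = xor_cipher(alice_key, message)        # the normal ciphertext
escrow_copy = xor_cipher(ESCROW_KEY, message)  # the mandated backdoor copy

# One bribe or breach later, the escrow key leaks, and every user's
# escrow copy, not just Alice's, becomes readable with it.
leaked_key = ESCROW_KEY
print(xor_cipher(leaked_key, escrow_copy))  # b'meet at noon'
```

Real key-escrow proposals involve far more machinery than this, but the structural weakness Pfeifle describes is the same: one secret whose compromise breaks confidentiality for everyone at once.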
