Secure Boot Could Eliminate Botnets - But Manufacturers Ignore It

Published Dec 29, 2016, 1:13 PM

Increased cyber attacks have motivated video surveillance manufacturers to begin releasing hardening guides that instruct users on how to better secure devices against attack. These guides put the onus on the installer or user to secure internet-connected devices, but manufacturers could take another approach, one that protects devices from software exploits even if an attacker has the admin password.

"Secure Boot" support would all but eliminate exploits like Mirai, and by extension would make it less profitable for attackers to target security devices. Secure Boot is even available in some chipsets already in cameras, so why do manufacturers not enable it? In this report we provide an overview of Secure Boot pros and cons.

Bootloader ********

**** * ****** **** ** ********* system (**** * ****** ** ***/***) powers **, ** ***** ******** ********** code, ***** ** *********** *** ******* things **** ******* ******* *** ******* interfaces, ******* ** * ************** **** with ***** **********, *** **** ******* off *** ****** ********* ****** **** process. ********** **** ** **********, **** like ****** ********, *** ** **** cases * ****** **** ****** ********** code ******* *** ******* ************ ** authorization.

Secure Boot

** * ****** **** ******, *** manufacturer ***** * ****** ********** *** into *** *********, *** *** ********* will **** ******* * ********** **** a ******** ***. **** ******** ******* from ********* *** ********** **** **** malicious **** **** ***** ** **** to ********** *** ******.

** *** ** **** ******/******** **********, the ********* ****** ***** ** ********** to **** *** ** ** ********** the **********, *** ***** **** ******* any ********** ************ ** ** ********* signed. ** *********** **** *** *** properly ****** ***** *** ** ********** by *** ********* ******.

* **** ****** **** ************** ******* that *** **********, ********* ******, *** any ********** ******** (**** ** *** components **** ****** ***** ********* ** a *** *********) *** *** ******** by *** ************ ** ***** *********. This ***-**-*** ************ ** **** ** often ******** ** ** * "****** Operating ***********" (***).

***** *** ********* **** ***** *********** forms ** ****** **** *** ** SOE ** ****** ***** ** * way ** **** ****** ******* ********* their ********.

Supported ** **** *************

********** *************,*****/*********** ******** *** ********* **** **** have ***** ********* ** *** ******** market **** ******* ****** ****, ****** none **** ** ************* ***** *** feature ** ******** ** *** **.

Secure **** / *** *****

****** **** ******** * *********/*** **** proper *******, ***** **** ************* ** these ********** **** ** ***** ******, though sometimes for a slight additional cost compared to lower-end chips.

The main costs of implementation come from additional development work needed to implement Secure Boot/SOE. This can require heavily modifying open-source code, **** *** things **** ******** ***** *******, ** support *** ****** *********** - **** effort **** ****** ********** ***** ******* to *******-********* ********. ************ ** *** can **** **** ** **** ********* to ******* ***** ***** **********, ** Axis **** ********.

Malware and SOE

************ * ****** ********* *********** ***** make ** ********* ********* *** ** attacker ** **** *** ******* ******* on *** ******. ** ** ** they ***** **** ** **** *** encryption *** *** ************ **** ** sign ***** ********, *** **********, *** highly ******** *** **** ******.

Not a Fix-All

*** *** ******** *** ****** ** implementing ****** ****/***, **** ********* ***** still ***** ********* ** **** ****** to * ******, *** **** ***** be ******* ** **** ********** ********* that *** ************ *******. **** ***** still ***** * ****** ******** ** see *****, *********** ****** ************* ********, or **** ***** * ****** ** factory ********. *** **** ******, ****** passwords *** ****** ******* ******** *** still ****** **** ******* ******* ******.

* ****** ***** **** ***** **** exploitable **** ** ***** ****** ***** that ******* ********* ** ***** ***** with *** ******, *** ******* **** would ** ****** ** ****/******* ********* code *** ****** ***** ** **** more ********* **. ** ******** ***********.

Comments (47)
Marcelo Martinez
Dec 29, 2016
SATIS Argentina

100% agree!

Itamar Kerbel
Dec 29, 2016

As you wrote, secure boot is only one way to protect a camera.

I'm not sure, but I don't think the latest attack could have been prevented this way. The problem was that Mirai gained access to the cameras using default usernames and passwords.

I think the manufacturers believe they first need to close off the simple ways that malware gets hold of a camera, and only then handle the more sophisticated methods.

As all/most cameras use Linux as the camera OS, it is a little more difficult to create and maintain a secure boot, but since there are already examples of how to do it out there, I think it's worth the effort.

More on secure boot and Linux can be found here: How Secure Boot Works on Windows 8 and 10, and What It Means for Linux

(2)
Brian Karas
Dec 29, 2016
IPVM

I'm not sure, but I don't think the latest attack could have been prevented this way. The problem was that Mirai gained access to the cameras using default usernames and passwords.

Itamar -

Mirai (and generally any other botnet) does two things to take over a device:

  1. Gain root or admin-level access to a device (this could be using a shell, like Mirai did, or through web UI exploitation)
  2. Download/install the actual malware that can be used to enable remote botnet control, and to add capabilities that the device might not have by default (like the ability to create/control raw sockets)

You can make a device much more secure simply by changing default credentials, and doing other things to limit network access, but there are still risks.

Secure boot/SOE handles the second half of the above: it prevents unauthorized software from executing. So even if someone gained access to your device, they could not load and run botnet software.

Mirai could not work if EITHER of the above two security issues were addressed.

Point #1 is mostly in the hands of the user/installer to address; manufacturers can only do so much in the way of enforcing strong passwords, and doing so comes with trade-offs in increased support costs, etc.

Point #2, though, is under the control of the manufacturer: they can implement a secure operating environment that helps protect devices that are poorly configured. It also helps protect against other exploits that may be found. Hackers might find a way to gain shell access, but if they cannot do very much with that shell, it makes the device less valuable and the hack less risky to the manufacturer's customers (though the manufacturer would still have a duty to fix the discovered exploit).
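
To make point #2 concrete, here is a minimal sketch (Python, using the cryptography package) of the kind of execution gate a secure operating environment enforces: nothing runs unless its signature verifies against the manufacturer's public key. The file names and the MANUFACTURER_PUBKEY constant are hypothetical, and on a real device this check lives in the boot chain and kernel rather than in a script.

    # Minimal sketch of an SOE-style execution gate: refuse to use any image
    # whose detached signature does not verify against the manufacturer's key.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
    from cryptography.exceptions import InvalidSignature

    def load_verified(image_path: str, sig_path: str, pubkey_bytes: bytes) -> bytes:
        """Return the image contents only if its detached signature verifies."""
        public_key = Ed25519PublicKey.from_public_bytes(pubkey_bytes)
        image = open(image_path, "rb").read()
        signature = open(sig_path, "rb").read()
        try:
            public_key.verify(signature, image)  # raises InvalidSignature if tampered
        except InvalidSignature:
            raise SystemExit("refusing to run: image is not signed by the manufacturer")
        return image

    # Hypothetical usage:
    # firmware = load_verified("app.bin", "app.bin.sig", MANUFACTURER_PUBKEY)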

(1)
(1)
Undisclosed Distributor #1
Dec 29, 2016

This would be a fantastic solution if implemented properly.  There are a few flaws in doing so in our current world though.  #1 - most of the OS's and applications are just being pirated and copied by so many manufacturers that they just don't know how to safely alter them.  This is why a list of 63 username/password combinations was able to get into so many different devices: they were all using pretty much the same version of BusyBox Linux that the manufacturers didn't know how to safely modify, or just didn't want to.  #2 - "though sometimes for a slight additional cost compared to lower-end chips" this shoots it down immediately.  These companies are doing everything they can to save every penny, they refuse to do anything that will increase the cost at all.

We (the professionals that use IPVM as a great industry reference tool) would love these changes, but unfortunately the manufacturers care more about just getting every single penny and if it can be done with cheaper parts and insecure software then so be it.  They will suffer no consequences from these botnets other than an uncomfortable week or two of bad press and then it's back to normal.

(1)
(1)
Brian Karas
Dec 29, 2016
IPVM

#1 - most of the OS's and applications are just being pirated and copied by so many manufacturers that they just don't know how to safely alter them.

#2 - "though sometimes for a slight additional cost compared to lower-end chips" this shoots it down immediately.  These companies are doing everything they can to save every penny, they refuse to do anything that will increase the cost at all.

For multiple reasons, I would not expect manufacturers that are solely focused on cost to do much with secure boot. No matter how you choose to approach it, security is expensive (relative to doing nothing or bare minimums). But I think there are other manufacturers, not targeting the budget buyers, who do their own firmware and OS modifications and have the resources to implement this, and sell/market it in a way that justifies the expense.

In multiple places I have seen feedback that people think cyber security issues are going to continue to be a big concern in 2017. A manufacturer that implemented secure boot, and had a decent bug bounty program (inviting people to discover and report weaknesses) could gain a significant edge by promoting themselves as the "most secure" platform, justifying a higher price and differentiating themselves from others.

(4)
Undisclosed Distributor #1
Dec 29, 2016

I 100% agree with you on this.  Unfortunately most of the "security" industry these days just wants the cheapest and easiest products to slap into a home/small business so they can get their check and hopefully never hear from the customer again until they want to buy another $10 camera.  The true security professionals will try to guide customers and provide them with the best solution possible, but this costs more, and in the current disposable, Walmart world we live in, the vast majority just wants cheap and quick, which falls right into these manufacturers' ballparks.  Sorry for the cynicism; it hasn't been the best year in the industry, as I'm sure most of you are also feeling.

(1)
Itamar Kerbel
Dec 29, 2016

We have a branch here of a major US telecommunication company.

They are upgrading the cctv system in the building.

We offered them Tiandy cameras. They got back to us saying that their security review did not approve Chinese cameras.

We offered them Avigilon, as the new Avigilon cameras (H4) offer encrypted data communication between the camera and the NVR.

They got back to us after a few days saying that if we can close port 23 (telnet) on the Tiandy cameras, they will be approved....

Money talks....

(2)
(4)
Undisclosed End User #2
Dec 29, 2016

Interesting stuff, but I think that's more like "treat the symptoms (can be/have been cracked) instead of the cause of the symptoms (proper coding without vulnerabilities)".

John Honovich
Dec 29, 2016
IPVM

#2, would secure boot have stopped the Axis critical vulnerability?

Undisclosed End User #2
Dec 29, 2016

Don't think so, since that was exploiting an application already "approved by the manufacturer".

Brian Karas
Dec 29, 2016
IPVM

Agreed, secure boot would not address the fact that Axis offered pre-installed software that enabled back-door access to devices. It would severely limit what someone who gained access could do with that access, cutting off the potential to leverage the camera as a host for other malicious software.

It is also worth mentioning that in cases like the Axis exploit, a user could not be 100% sure that an attacker did not change the bootloader, or do other things that would allow them to get back into a camera/device after an updated firmware had been installed.  Secure boot would provide much higher confidence (I hesitate to ever say full confidence when it comes to security) that patched firmware would close the original exploit and ensure no other backdoors were added to a specific exploited device.

(1)
Undisclosed End User #2
Dec 29, 2016

That's true, it's possible to tamper with the bootloader code to first load some other hidden code somewhere in the flash and then jump back; nothing new in that behaviour. (That's one approach that has been used for ages by viruses.)

 

When it comes to the name "secure boot", my reflection is that we're talking about "securely" booting up the device, not what's running after boot in user mode. I'm thinking of how the manufacturer can/will "approve legal applications". It would be interesting to find out more about this.

When upgrading the FW, the bootloader will also be upgraded in most cases, so I don't see that as a big issue.

Josh Hendricks
Dec 30, 2016
Milestone Systems

I could be wrong but I was under the impression this technology was only effective in preventing rootkit-level infection.

Rootkits run at the BIOS level and can hide their presence in some ways, which makes it easy for them to re-infect the target in case someone removes the malware or factory defaults the unit (which probably isn't deep enough to touch/clean the BIOS).

The vast majority of botnet clients do not require that level of infection, and I imagine it's very hard to accomplish a rootkit level infection on an IoT device without physical access. Unless of course the BIOS supports kicking off updates from the OS.

If I'm not wrong (I'm only feeling 50% here), then there would be little to no value in implementing secure boot with regard to the standard level of attacks out there including Mirai.

However, if I were sourcing equipment for a government or other high security project, I would prefer to use SOE-enabled devices, as it would potentially protect against APTs (advanced persistent threats). I imagine nearly all APTs are carefully designed and implemented and are in a different classification from the "script kiddie" stuff usually seen.

If I'm wrong and the signature used by SOE is somehow based on the OS drive(s), such that any changes to key areas of the filesystem would invalidate the signature check, then this would be a beautiful way to lock down IoT devices. I think this would be fairly far-fetched though.

At the OS level, I'm sure Linux has the same ability to deny running "unsigned" code, considering Windows already does this to prevent unsigned device drivers from being installed. Linux probably had this feature first. It seems like configuring Linux to execute only code signed by a shortlist of keys would be the best bet.

If manufacturers were to do this, then the only ways to accomplish remote code execution would be to either steal the private key from the manufacturer so you can sign your infectious code, or to discover a flaw in the manufacturer's proprietary software allowing for remote code execution directly through the manufacturer's software/processes.

Josh Hendricks
Dec 30, 2016
Milestone Systems

The whole signed code angle could potentially be further secured by requiring cert transparency, such that any newly added executable, even if signed by a valid/trusted cert, would have to be audited the first time by checking with the manufacturer's cert log server. That would protect against threats where the private key is compromised, and potentially even be an early warning system for the manufacturer (if any valid certificate is audited and found never to have been issued, you may want to revoke that certificate).
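
A hedged sketch of that first-run audit idea, in Python: before trusting a newly seen signing certificate, ask the manufacturer's log whether it was ever issued. The log endpoint and the response format here are entirely hypothetical.

    # Hypothetical first-run audit against a manufacturer-run certificate log.
    import hashlib
    import json
    import urllib.request

    LOG_URL = "https://certlog.example-manufacturer.com/lookup"  # hypothetical endpoint

    def cert_was_issued(cert_der: bytes) -> bool:
        """Ask the (hypothetical) cert log whether this certificate was ever issued."""
        fingerprint = hashlib.sha256(cert_der).hexdigest()
        with urllib.request.urlopen(f"{LOG_URL}?fp={fingerprint}", timeout=5) as resp:
            record = json.load(resp)
        # A signature that verifies but whose certificate is unknown to the log is
        # exactly the "stolen private key" case worth alarming on.
        return bool(record.get("issued", False))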

Undisclosed #3
Dec 30, 2016
IPVMU Certified

Rootkits run at the BIOS level and can hide their presence in some ways, which makes it easy for them to re-infect the target in case someone removes the malware or factory defaults the unit (which probably isn't deep enough to touch/clean the BIOS).

BIOS is for PCs; do you mean bootloader?

The vast majority of botnet clients do not require that level of infection, and I imagine it's very hard to accomplish a rootkit level infection on an IoT device without physical access. Unless of course the BIOS supports kicking off updates from the OS.

Installing a rootkit is simple once you get in as root, with local access or remote access.

Josh Hendricks
Dec 30, 2016
Milestone Systems

Technically, I think PCs use UEFI and the term BIOS is becoming outdated. If hardware devices running Linux exclusively call the concept a bootloader, my mistake. PCs have bootloaders too; they come into play after the BIOS/UEFI loads (GRUB, NTLDR, etc.).

Is the term "bootloader" actually different between the HW/IoT world and the x86 world?

WRT rootkits, I was thinking more about kernel mode rootkits, and I can see those are just as easy to accomplish remotely as anything else. So one defense would be to only allow signed drivers and code, and another might be to disallow loading of additional/external drivers.

I wonder if SOE, or some similar technology, exists to hash the kernel and protect against alteration of the kernel at the filesystem level?

(1)
Undisclosed Integrator #4
Dec 30, 2016

The Linux kernel, at least, can be run directly from UEFI, which would require that it be signed if using secure boot. Kernel modules would then also need to be signed, as booting under UEFI Secure Boot typically implies module.sig_enforce. That still leaves userspace.
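
For the UEFI/Linux case described here, a small sketch of how one might check both pieces from userspace, assuming a Linux system with efivarfs mounted; availability of these paths varies by distribution and kernel configuration.

    # Check whether the platform booted with UEFI Secure Boot, and whether the
    # running kernel is enforcing module signatures. Paths are the conventional
    # Linux locations; both checks are best-effort.
    from pathlib import Path

    EFI_GLOBAL_GUID = "8be4df61-93ca-11d2-aa0d-00e098032b8c"
    SECUREBOOT_VAR = Path(f"/sys/firmware/efi/efivars/SecureBoot-{EFI_GLOBAL_GUID}")
    SIG_ENFORCE = Path("/sys/module/module/parameters/sig_enforce")

    def secure_boot_enabled() -> bool:
        try:
            data = SECUREBOOT_VAR.read_bytes()  # 4 attribute bytes, then the value byte
        except (FileNotFoundError, PermissionError):
            return False
        return len(data) >= 5 and data[4] == 1

    def module_sig_enforced() -> bool:
        try:
            return SIG_ENFORCE.read_text().strip() == "Y"
        except FileNotFoundError:
            return False

    print("UEFI Secure Boot:", secure_boot_enabled())
    print("Module signature enforcement:", module_sig_enforced())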

(1)
Undisclosed #3
Dec 30, 2016
IPVMU Certified

Is the term "bootloader" actually different between the HW/IoT world and the x86 world?

Not really, but a PC BIOS is a lot more than a bootloader; that's why I was trying to clarify what you intended by "at the BIOS level".

Undisclosed Integrator #4
Dec 30, 2016

"why do manufacturers not enable it?" severely understates the effort that would be required for this. Signed bootloaders, kernels, and even kernel modules? Sure, that's already well-developed. But userspace? In Linux, at least, that's still a whole lot more than just something to be "enabled".

See: https://wiki.gentoo.org/wiki/Integrity_Measurement_Architecture

Sure, it's clarified later with "The main costs of implementation come from additional development work needed to implement Secure Boot/SOE. This can require heavily modifying open-source code", and maybe I'm just being pedantic, but the opening just seems flippant.

Nitpicking aside, do there exist today any OSes that will run only signed code at every level? Genuinely curious.

Undisclosed Integrator #4
Dec 30, 2016

As well, even secure boot isn't a fix-all (as nothing ever can be):

https://www.youtube.com/watch?v=QDSlWa9xQuA

http://www.c7zero.info/stuff/DEFCON22-BIOSAttacks.pdf

http://www.ghacks.net/2016/08/10/secure-boot-bypass-revealed/

Those are still specifically in the UEFI world, sure, but the principle stands. I do agree that embedded device manufacturers need to keep up with the cat-and-mouse game, but I'm not hopeful that they'll ever care to, especially in the middle of a race to the bottom.

Brian Karas
Dec 30, 2016
IPVM

maybe I'm just being pedantic, but the opening just seems flippant.

 

First, I appreciate the feedback.  The opening was meant to mirror what I think is a common question that people might have upon learning about secure operating environment options: "if it's possible, why aren't manufacturers doing it?" The answer, of course, is that it is not as simple as turning something on, but that does not mean it should not be considered an option for combating cyber security issues.

The purpose of this post was to call more attention to alternate ways to secure devices, and to make the point that the task does not need to fall solely to the user or integrator.

Yes, there is effort involved in making devices more secure, and today manufacturers are putting minimal effort into true device security. Secure boot and secure environments already exist, to various degrees, in the PC world. SELinux, as one example, would provide far greater security than what we have now for the internals of most systems.

The userspace problem is somewhat easier to address in embedded devices, where you generally do not have to deal with actual users installing additional software applications. This may complicate things for ACAP or other cameras that support third-party apps, but those are a very, very small part of the broader installation base.

 

Nitpicking aside, do there exist today any OSes that will run only signed code at every level?

The capability exists, but this is where it gets harder to make comparisons between things like Apple's code-signing requirements and embedded devices. It is possible for a user to turn off signed-code requirements in desktop OSes, because there are just too many apps and use-cases out there. But an embedded camera could be set to only run manufacturer-signed code, if the full stack (Secure boot, secure kernel, etc.) was all implemented.

(1)
(1)
Undisclosed Integrator #4
Dec 30, 2016

On the note of SELinux, also check out grsecurity. The concern with userspace, though, is that the "users" are those who have gained remote access via some means (default credentials, exploits). At that point, even if the remote access/control isn't able to be made persistent because of secure boot et al., the device is still owned.

To your last point, that's where the very significant effort in implementing such restrictions comes in, as well as my concern with the phrasing. Manufacturers are "ignoring" secure boot and userspace code signing requirements in the same way that I'm "ignoring" the option of buying a Tesla.

Undisclosed End User #2
Dec 31, 2016

It's always good to bring up new ideas that look outside of the box.

> But an embedded camera could be set to only run manufacturer-signed code,
> if the full stack (Secure boot, secure kernel, etc.) was all implemented.

How about digitally signing Perl/Python/shell/etc. scripts?

Josh Hendricks
Jan 01, 2017
Milestone Systems

Scripts can be compiled and signed, and some scripts might be executed under a trusted host process like Apache/nginx/etc. (which is another risk, of course).
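
One way to extend the same verify-before-use idea to interpreted scripts, sketched in Python: keep a detached signature alongside each script and verify it before handing the file to the interpreter. The paths and key file are hypothetical, and the interpreter invoked would vary (Perl, shell, etc.).

    # Verify a script's detached signature, then run it with its interpreter.
    import subprocess
    import sys
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
    from cryptography.exceptions import InvalidSignature

    def run_signed_script(script: str, sig: str, pubkey_file: str) -> int:
        key = Ed25519PublicKey.from_public_bytes(open(pubkey_file, "rb").read())
        body = open(script, "rb").read()
        try:
            key.verify(open(sig, "rb").read(), body)
        except InvalidSignature:
            raise SystemExit(f"{script}: bad or missing signature, not executing")
        # Only reached if the signature checks out; swap sys.executable for the
        # appropriate interpreter (perl, sh, ...) as needed.
        return subprocess.run([sys.executable, script]).returncode

    # Hypothetical usage:
    # run_signed_script("maintenance.py", "maintenance.py.sig", "vendor_ed25519.pub")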

If the system were designed from the ground up with security in mind, the use of interpreted scripts could be minimized.

It would definitely be a challenge, but a worthy one to undertake. I wonder if the race to the bottom could be slowed if the market demanded more attention to security? With the US government and Genetec writing off HikVision (for various reasons), perhaps this could already be in progress?

Undisclosed #3
Jan 01, 2017
IPVMU Certified

I wonder if the race to the bottom could be slowed if the market demanded more attention to security?

Once we get rid of all the default/weak passwords, security will be so improved I wonder if it will be demanded.  

(1)
Josh Hendricks
Jan 01, 2017
Milestone Systems

Fair point. That's the low-hanging fruit and the most cost-effective measure. Broad attacks like Mirai would be greatly reduced, and consumers may consider this to be good enough.

(1)
Undisclosed #3
Dec 30, 2016
IPVMU Certified

...do there exist today any OSes that will run only signed code at every level? 

iOS

(1)
Undisclosed Integrator #4
Dec 30, 2016

Ha! An easy example that I completely missed. iOS is also a good example that even when security is a core concern, exploits are still found.

(1)
Paul Curran
Jan 03, 2017

Weren't some of the security issues found recently down to people reverse engineering the firmware to find hard-coded accounts and passwords? To what extent does secure code limit this? Would the user-set admin account (which is forcibly set at first boot to a unique identifier) be the encryption key for secure boot?

Brian Karas
Jan 04, 2017
IPVM

Paul -

With secure boot, the firmware would generally be encrypted. Granted, any encryption can (theoretically) be broken, but done right, the odds are very low that secure boot firmware could be easily examined.

Still, if that did happen, an attacker could learn default account credentials and find other exploits in the firmware, but would be more limited in what they could do with that data, as they could not load/run additional software; they could only leverage what the manufacturer had already built into the firmware.

The encryption keys are set by the manufacturer, and are not otherwise publicized or controllable by the end user, so they are not tied to admin passwords or things like that.

Part of the assumption with a secure boot implementation is that the manufacturers are actually trying to make things more secure. They would likely (hopefully) not go through all the effort of implementing secure boot, and then leave telnet open (or leave an easily launched telnet backdoor), but that is technically possible.

I do not mean to imply that secure boot is a 100% fix for things like Mirai, but it would go a long way in making devices less exploitable and making any found exploits less powerful, as well as instilling more trust/confidence in users (for the users that care; many still do not give a crap about security ;) ).

(1)
Undisclosed #3
Jan 04, 2017
IPVMU Certified

The encryption keys are set by the manufacturer, and are not otherwise publicized or controllable by the end user...

Where does the camera get the key to decrypt the firmware if it's not the manufacturer's public key?

Brian Karas
Jan 04, 2017
IPVM

It is a key defined by the manufacturer, but it is not "public". Similar to how DVD encryption/decryption works (or, worked, since the DVD keys were broken several years ago).

Undisclosed #3
Jan 04, 2017
IPVMU Certified

Ok, but would the key to decrypt the firmware be in the firmware itself?

Or in a ROM?

Undisclosed #5
Jan 04, 2017

Brian,

It's like PGP:

Manufacturers will encrypt with the camera's public key

and cameras will decrypt with the camera's private key

right?

(1)
Undisclosed #3
Jan 04, 2017
IPVMU Certified

Manufacturers will encrypt with the camera's public key and cameras will decrypt with the camera's private key

Yes, but then anyone could use that public key to encrypt their own hinky firmware, which would be accepted by the camera, and that is exactly what we are trying to avoid with secure boot.

Undisclosed Integrator #4
Jan 04, 2017

There's some confusion going on here. Secure boot and such use signing, not encryption, to verify that the kernel, modules, etc. have not been modified. On-chip encryption of that code is an entirely different matter. (Again, all to my knowledge on "big" devices and not embedded ones.) Now you're getting into tamper-resistant storage, security coprocessors, and other fun stuff. I'd be curious to know from someone familiar with such things how that would affect costs.
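
A short sketch of the distinction, using RSA from the Python cryptography package: signing uses the manufacturer's private key and anyone can verify with the public key, while encrypting to a public key says nothing about who produced the payload, which is why encryption alone cannot authenticate firmware.

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()
    firmware = b"example firmware image"

    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)

    # Authenticity: only the private-key holder can produce this signature;
    # verify() raises InvalidSignature if the image or signature is altered.
    signature = private_key.sign(firmware, pss, hashes.SHA256())
    public_key.verify(signature, firmware, pss, hashes.SHA256())

    # Confidentiality: anyone holding the public key can produce a ciphertext,
    # so a decryptable image does not tell the device who built it.
    ciphertext = public_key.encrypt(firmware, oaep)
    assert private_key.decrypt(ciphertext, oaep) == firmware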

Brian Karas
Jan 04, 2017
IPVM

You are correct: the various software components are verified for authenticity via signed code in a "common" secure boot implementation. However, it would also be common for the firmware file itself to be encrypted so that it cannot be easily extracted and reverse-engineered.

The above is specifically for the case of embedded devices, where the firmware is typically distributed as a complete image (OS, software modules, etc.), and you are not creating an environment where users are going to be loading random packages/applications on their own.

Undisclosed #3
Jan 04, 2017
IPVMU Certified

Secure boot and such use signing, not encryption...

But signing uses encryption.  

In any case, it's the same problem: either a key is hardcoded in the device or a trust chain must be established.
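
A toy sketch of the trust-chain point, in Python: only a root hash (or key) has to live in ROM/fuses; each verified stage then carries the expected hash of the next stage, so nothing else needs to be hardcoded. The stage contents here are hypothetical stand-ins for flash partitions.

    import hashlib

    def sha256(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    # Hypothetical stage images; on a real device these are flash partitions.
    kernel = b"kernel image bytes"
    bootloader = b"bootloader code || next-stage sha256: " + sha256(kernel).encode()
    ROM_TRUSTED_BOOTLOADER_HASH = sha256(bootloader)  # burned in at the factory

    def boot() -> None:
        if sha256(bootloader) != ROM_TRUSTED_BOOTLOADER_HASH:
            raise SystemExit("bootloader tampered with, halting")
        expected_kernel_hash = bootloader.rsplit(b": ", 1)[1].decode()
        if sha256(kernel) != expected_kernel_hash:
            raise SystemExit("kernel tampered with, halting")
        print("chain verified, handing off to kernel")

    boot()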

Undisclosed Integrator #4
Jan 04, 2017

But signing uses encryption.

Sure, strictly speaking, but merely signing something does not also encrypt it, which is the point I was making. And yes, you do need a root of trust, but these are all things that experts have been working on for a very long time:

http://www.uefi.org/sites/default/files/resources/UEFI%20RoT%20white%20paper_Final%208%208%2016%20(003).pdf

http://www.uefi.org/sites/default/files/resources/UEFI%20Forum%20White%20Paper%20-%20Chain%20of%20Trust%20Introduction_Final.pdf

https://www.sans.org/reading-room/whitepapers/analyst/implementing-hardware-roots-trust-trusted-platform-module-age-35070

(Edit: This didn't link to the replied-to comment for some reason, so here's that)

Undisclosed #3
Jan 04, 2017
IPVMU Certified

...but merely signing something does not also encrypt it, which is the point I was making.

My point was only that if you do encrypt the firmware, you have the same challenges of authenticity that signing has.  UEFI is a solution, but I'm not sure there is a viable Linux/BusyBox port out there yet.

Undisclosed Integrator #4
Jan 04, 2017

...the same challenges of authenticity that signing has...

With the rather critical difference that verifying a signature requires only that the public key be present on the device. Attackers can then reverse the firmware all they want--they aren't going to be able to run code on the device without either the private key, or physical access to the device (if even then).

Undisclosed #3
Jan 04, 2017
IPVMU Certified

Attackers can then reverse the firmware all they want, they aren't going to be able to run code...

They want to reverse it, even if they can't run it. It's how they scan for vulnerabilities in the "secure" version.

(1)
Undisclosed End User #2
Jan 04, 2017

Agree, and if the device can run the signed and/or encrypted code, researchers can also find a way to run it, one way or another.

This is still talking about "treat the symptoms (can be/have been cracked) instead of the cause of the symptoms (proper coding without vulnerabilities)".

Manufacturers should have (maybe some already have) their own researchers trying to break their own things; not source code researchers but binary researchers who only get the firmware image.

When you come down to disassembling binaries, you get a totally different view than checking source code, as you are down to the machine code and its disassembly.

You are not looking at code in some text file that a compiler will first translate down to assembler and then to machine code.

Undisclosed Integrator #4
Jan 04, 2017

So what is it we're discussing here? The article was about secure boot. My comments are about secure boot, with one comment correcting that signing does not have anything to do with encryption of the firmware. If an attacker were to analyze unencrypted firmware and find remotely-exploitable vulnerabilities, we're back to the point I had made about the difficulty of extending the chain of trust to userspace programs.

Seems like we're going in circles.

Undisclosed End User #2
Jan 04, 2017

Yup, I actually find the same

Brian Karas
Jan 16, 2017
IPVM

This is not an exact parallel to security cameras, but Google's custom security chips give some insight into how Google handles security and authentication on its servers. It is an interesting read for anyone interested in these topics.

(1)