
> billions of embedded systems implementing safety critical functions

I've noticed a pattern: when high-level programmers talk about security and reliability, they basically mean "hackers breaking into your system to steal your passwords". To the point that memory bugs = security vulnerabilities, and nothing else.

This, of course, has another meaning for embedded.



For application development, a memory corruption bug or a race condition is a safety issue that causes data corruption; when the same corruption can be used to compromise the system, it's a security issue. As you said, there's often no point in distinguishing them. In the embedded world, though, the distinction can be important. If a vendor says its microcontrollers have safety features, it means an elaborate system of watchdog timers, glitch-filtered inputs, and checksums. On the other hand, "a microcontroller with security features" means a crypto engine and secure key storage.


I agree with what you wrote.

> there's no point to distinguish them

I think the distinction should be made clearly when someone wants to sell me something like Rust. They come up with the daily link about 70% of bugs being memory issues, plus something along the lines of "memory vulnerabilities! think about the hackers! exploits!", when it's clear these arguments don't resonate in (many areas of) embedded. Safety and security have another meaning there. A stolen password is the least of my fears if I think a bad implementation of mine could chop off an operator's hand.


The distinction is made quite clearly: one is safety, the other is security :)

(while the terms are intermixed in general discourse, both the embedded and security worlds consistently separate the two, at least in everything I've seen)



