
ESA_Olaf


Olaf Pfeiffer studied technical computer science at the Cooperative University in Karlsruhe. He is one of the founders of the Embedded Systems Academy. Together with his partners, he wrote the books "Embedded Networking with CAN and CANopen" and "Implementing Scalable CAN Security with CANcrypt". He is a regular speaker at the international CAN Conferences and other events. Olaf is chairman of the CiA 447 standardization group, which defines how CANopen is used in special-purpose vehicles (including police cars and taxis) and for automotive accessories.

ESA_Olaf's contributions
Comments
    • Here's a devilish lightweight encryption method for “hex”-coded programs that almost makes us feel sorry for the hacker.

    • An attacker could try to replace the manufacturer’s bootloader with one that requires some specific keyword to unlock, and attempts to reinstall the original code would then be blocked by the hacker’s bootloader.

    • I am not sure where I gave the impression that this is meant as a replacement for real encryption; of course it is not. Maybe I should have been more detailed here: "However, the cryptographic methods commonly used for Internet communication can’t always be easily adapted to microcontrollers that don’t have security peripherals. A pure software solution might require too many resources (code or execution time) to fit an existing system." For years we have been in the business of helping embedded systems engineers cope with limited resources because they are locked into some ancient hardware that, for whatever reason, they still have to use. The described method would only be an option as long as no better alternative for this hardware is available. Even with Kerckhoffs’ principle out there for so long, as a consultant I still see plenty of security-through-obscurity solutions in the field today. The "fun" part of the nightmare cipher remains: even if the substitution methods are known, without the key any hacking work on this turns into debugging work. I think we should call it security through demotivation (to hack it). The general idea is sketched below.
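
To illustrate the general idea discussed in the comment above, here is a minimal sketch in C of a key-derived byte-substitution table applied to a firmware image before it is written out as a hex file. This is not the cipher from the article; the key mixing, table construction, and function names are assumptions made only for this example. It merely shows why, without the key, analysing the substituted image turns into debugging work rather than a straightforward read of the code.

/*
 * Illustrative sketch only: a key-derived byte-substitution table
 * applied to a firmware image. Not the cipher from the article and
 * not a replacement for real encryption.
 */
#include <stdint.h>
#include <stdio.h>

/* Simple key-mixed PRNG (xorshift32) - an assumption for this demo only. */
static uint32_t prng_state;

static uint32_t prng_next(void)
{
    prng_state ^= prng_state << 13;
    prng_state ^= prng_state >> 17;
    prng_state ^= prng_state << 5;
    return prng_state;
}

/* Build a 256-entry substitution table as a key-dependent permutation
 * (Fisher-Yates shuffle driven by the key-seeded PRNG). */
static void build_sbox(const uint8_t *key, size_t key_len, uint8_t sbox[256])
{
    prng_state = 0x12345678u;
    for (size_t i = 0; i < key_len; i++) {
        prng_state = (prng_state * 31u) ^ key[i];
        (void)prng_next();
    }
    if (prng_state == 0u)           /* xorshift must not start at zero */
        prng_state = 0xA5A5A5A5u;

    for (int i = 0; i < 256; i++)
        sbox[i] = (uint8_t)i;
    for (int i = 255; i > 0; i--) {
        int j = (int)(prng_next() % (uint32_t)(i + 1));
        uint8_t tmp = sbox[i];
        sbox[i] = sbox[j];
        sbox[j] = tmp;
    }
}

/* Substitute every byte of the image in place. The result still looks
 * like plausible code, but without the key it will not behave as the
 * original - analysis becomes debugging work. */
static void substitute_image(uint8_t *image, size_t len, const uint8_t sbox[256])
{
    for (size_t i = 0; i < len; i++)
        image[i] = sbox[image[i]];
}

int main(void)
{
    const uint8_t key[] = "demo-key";                          /* placeholder key */
    uint8_t image[] = { 0x02, 0x80, 0xFE, 0x75, 0xA8, 0x01 };  /* fake firmware bytes */
    uint8_t sbox[256];

    build_sbox(key, sizeof(key) - 1, sbox);
    substitute_image(image, sizeof(image), sbox);

    for (size_t i = 0; i < sizeof(image); i++)
        printf("%02X ", image[i]);
    printf("\n");
    return 0;
}

A bootloader that knows the same key can build the inverse table and restore the original bytes before programming flash, so the overhead on the target stays at a single table lookup per byte.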