Harlan Rosenthal

Firmware/Software Engineer


DutchUncle's contributions
Comments
    • Where have you been, doing an ostrich imitation? This has been going on for years, since well before the last time I got laid off (nothing wrong with half of the company; they just didn't need us here in this state anymore). My son certainly had no inclination to go to engineering school after seeing the feast and famine in my career.

    • Multiply and divide, yes; addition, no. And even multiply and divide are different if you were working (and thinking) in Pascal on another project, where the evaluation is right to left. Tell me, do you get charged extra for parentheses by your compiler? Think of them as a comment. :-)
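
      A minimal sketch of the point (the variable names are invented for illustration): both statements below compute the same thing and compile to the same code; the parenthesized one just doesn't make the next reader replay the precedence table in their head.

        #include <stdio.h>

        int main(void)
        {
            int raw = 100, gain = 3, divisor = 4, offset = 7;

            /* Same arithmetic, same generated code; the extra parentheses
               are free documentation of the intended grouping. */
            int terse       = raw * gain / divisor + offset;
            int spelled_out = ((raw * gain) / divisor) + offset;

            printf("%d %d\n", terse, spelled_out);   /* prints 82 82 */
            return 0;
        }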

    • I think this was done better in "The Art of Computer Programming", Volume 1, originally published in 1968 and currently in its third edition.

    • We also pay IAR. The debugger works just fine on optimized code; it's human brains that don't follow the optimization and, for example, get upset that breakpoints can't be told apart because the compiler merged the common code, just as assembler coders (like me) used to do. Test what you ship, and ship what you test. If the low-optimization code fits and does the job, ship it. If we can't meet space or time constraints and need higher optimization, build and test and ship it that way. I keep trying to convince management that we should do away with the debug/release build dichotomy completely.
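
      To illustrate what I mean about merged code (a hypothetical sketch; whether folding actually happens depends on the toolchain, optimization level, and linker settings): two routines with identical bodies may be emitted once, so a breakpoint "inside" one of them also trips when the other is called.

        #include <stdint.h>
        #include <stdio.h>

        /* Identical bodies: an optimizing toolchain (or a linker doing
           identical-code folding) may keep one copy and point both symbols
           at it.  A breakpoint placed in motor_checksum then also fires for
           sensor_checksum calls, which is what upsets people expecting the
           debug build's one-line-per-source-line behaviour. */
        static uint8_t motor_checksum(const uint8_t *buf, int len)
        {
            uint8_t sum = 0;
            while (len-- > 0)
                sum += *buf++;
            return (uint8_t)(~sum + 1);
        }

        static uint8_t sensor_checksum(const uint8_t *buf, int len)
        {
            uint8_t sum = 0;
            while (len-- > 0)
                sum += *buf++;
            return (uint8_t)(~sum + 1);
        }

        int main(void)
        {
            const uint8_t frame[] = { 0x01, 0x02, 0x03 };
            printf("%02x %02x\n",
                   motor_checksum(frame, 3),
                   sensor_checksum(frame, 3));
            return 0;
        }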

    • I'm assuming that this: "C++ language: const int DRIVE_SHAFT_RPM_LIMITER 5 1000" was supposed to be "const int DRIVE_SHAFT_RPM_LIMITER = 1000;". This comes from C, along with the expected optimizations. C does NOT require all definitions first; in fact, counter to some coding standards, it is *better* to declare a temporary variable within a structural block to clarify locality to the compiler and optimizer. Inlining comes from C too. If you're going to make comparisons, play fair.
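
      A small sketch of what I mean, reusing the constant above (the helper names are invented): the temporary is declared at the top of the block where it is used, which even C89 allows, and which tells both the next reader and the optimizer exactly how long it matters.

        #include <stdio.h>

        static const int DRIVE_SHAFT_RPM_LIMITER = 1000;  /* typed constant, visible to the debugger */

        static void log_overspeed(int rpm)
        {
            printf("overspeed: %d rpm\n", rpm);
        }

        static void check_rpm(const int *samples, int count)
        {
            int i;
            for (i = 0; i < count; i++) {
                if (samples[i] > DRIVE_SHAFT_RPM_LIMITER) {
                    /* Declared inside the block it belongs to: its lifetime
                       and locality are obvious to compiler and reader alike. */
                    int overshoot = samples[i] - DRIVE_SHAFT_RPM_LIMITER;
                    log_overspeed(DRIVE_SHAFT_RPM_LIMITER + overshoot);
                }
            }
        }

        int main(void)
        {
            const int samples[] = { 900, 1200, 1000, 1500 };
            check_rpm(samples, 4);
            return 0;
        }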

    • I was pointed to this article again by one of my colleagues, who seems to think it says "nobody ever needs an RTOS" and that anything more complicated than a top-level polling loop is unnecessary. If we had a complete system-level understanding of our system before starting any work on it (by the way, just how complex *is* this system under discussion?), maybe we could do this kind of analysis. If we had fewer comm channels (remember, the EEPROM and the display and everything else with latency are comm channels under the hood!), ditto. If we didn't have people changing feature specs the day before we're supposed to do releases, that would help too. And I agree with one finding: I would expect more small tasks with proper segmentation to schedule better than large, complex, multi-purpose tasks. But for flexibility (including iterative development), adaptability, and more convenient handling of asynchronous events, protocol timings, and interval timings, without investing as much time in analysis of a system spec that's going to be invalidated by change, I think I'll work with an RTOS, thanks. I don't expect the hardware designers to build their own processors out of 7400 gates, and I see no particular reason to hand-create my own scheduler every time I need one. Hey, I know, let's buy an industry-standard, well-tested, well-understood integrated package and get a head start on the project that people are actually *paying* us for.
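
      For what it's worth, here is the shape of what I'd rather write (a hypothetical sketch; FreeRTOS-style calls are used only as a stand-in for whatever industry-standard package you buy, since the point isn't any particular RTOS): each latency-prone channel gets its own small, single-purpose task instead of being hand-interleaved in one big polling loop.

        #include "FreeRTOS.h"
        #include "task.h"

        static void eeprom_task(void *arg)     /* rides out EEPROM write latency */
        {
            (void)arg;
            for (;;) {
                /* take the next pending write from a queue, kick it off,
                   then sleep instead of spinning in the main loop */
                vTaskDelay(pdMS_TO_TICKS(5));
            }
        }

        static void display_task(void *arg)    /* refreshes at its own rate */
        {
            (void)arg;
            for (;;) {
                /* redraw whatever changed since the last pass */
                vTaskDelay(pdMS_TO_TICKS(50));
            }
        }

        int main(void)
        {
            xTaskCreate(eeprom_task,  "eeprom",  128, NULL, 2, NULL);
            xTaskCreate(display_task, "display", 128, NULL, 1, NULL);
            vTaskStartScheduler();             /* does not return once tasks run */
            for (;;) { }
        }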

    • "Writing drivers for each and every one is a task too daunting ... " But that's not the task at hand. A hardware company can choose one system, one model, that they're going to support, because they have to write some level of driver in order to prove that the hardware works. The real problem is when you get a chip with a spec sheet and some "reference" code, and after a few hours it becomes clear that nobody ever TRIED the (expletive-deleted) thing using software, certainly not that so-called reference. If I have to get out a scope to debug their code... oh, wait, that doesn't even work anymore, because everything is on-chip and I can't get to the intermediate signals. No, sorry: if the reference code doesn't work, it's just as blatant a failure as the pins not being standard size. I'll find a part from someone else.

    • C deterministic? Have you been reading the articles in your own emails? C was ludicrously limited from its first release. It's a language that doesn't even know what its variables are. We were learning block structure and data-driven design and structural concepts that would become OO design, and here came a brand-new low-level language, except that its macros didn't even do as much as IBM 360 assembler's, and because it tried to avoid attachment to any particular processor, it was uncertain just how it would perform *anywhere*. C only persists because people are lazy.
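
      Here's the kind of thing I mean about the language not knowing its variables, as it stood before ANSI prototypes (a compilable sketch; the names are made up): the compiler accepts the call without the slightest idea what scale() actually takes.

        #include <stdio.h>

        int scale();                 /* K&R-era declaration: no parameter types at all */

        int main(void)
        {
            /* Nothing is checked here.  scale(3.0) or scale("oops") would be
               accepted just as readily as scale(3); first-release C simply did
               not know what the function's variables were. */
            printf("%d\n", scale(3));
            return 0;
        }

        int scale(n)                 /* old-style definition, pre-ANSI C */
        int n;
        {
            return n * 10;
        }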