DickH

Signal Engineer


DickH's contributions
Comments
    • I've never found any difficulty writing anything I can write in C in Pascal instead - but Pascal forces me to make my intention explicit from the start and express it unambiguously, so the compiler has a much simpler task and can produce *better, simpler* code. The compiler will pick up most violations of good practice at compile time, rather than producing code which blows up at run time. If it compiles, it *will* run (maybe it won't do what you wanted, but that's *your* fault). The most powerful word in Pascal is "type" - you can have any type you can imagine and nest types freely. The variant record is so expressive that it covers constructs which require the pre-processor in C (and it works without generating any extra code, since it merely expresses another view of the same data in memory) - and the C pre-processor is where most of the horrors lie. Any direct translation from C code to Pascal gives smaller, faster executable code. Mere prejudice ('not invented here') prevented Pascal from becoming the default choice. N. Wirth did an absolutely brilliant job.

    • This is common to all OOP languages - a class KNOWS which class it is, and what kind of object an instance of it is at run time. So it's an underpinning of OOP in general, not just of C++.

    • That's too convenient, and sometimes too cowardly, mate - what if someone might die or be injured if you don't speak up?

    • A friend of mine is a senior programmer - he got his first real programming job as a C programmer, though he'd only used assemblers, Fortran and Algol till then. He told me that the only reason he got the job was that he lied in his CV and he lied in the interview - the latter was convincing because he had a great short-term memory and had 'scanned' (not read) Kernighan and Ritchie on the plane on the way there - a 1-hour flight - so at least he sounded convincing. They didn't test.

    • Here in the UK, that would cost me the equivalent of 2,300 dollars a year - and people complain if they are billed half that for all the other energy they use, never mind the A/C. A school here redesigned its A/C system with the help of an architect to get one that was mostly driven by (free) air flow over the building - saving tens of thousands of pounds over the year. You might like to think about that. That's truly green, and very meaningful.

    • Jack, I know what a kW is: a measure of power, equal to 1000 joules per second. I know what a kWh is: a measure of energy, a power of 1 kW sustained for 1 hour, hence 3.6 megajoules. The article says "on the order of 1.5 billion KW/hr per year of waste, two orders of magnitudes lower than the Energystar figures. But a billion+ KW/hr is a lot of waste." What's a kW/hr??

    • wilc2010 and jb232 - you guys would have liked the original Borland definition of classes in object-oriented Pascal - the one from Turbo Pascal 5.5 to BP7, roughly 1989 to 1992 - before the new version called 'ObjectPascal' or 'Delphi' (originally BP8). Nothing happened automatically or hidden from view - it was all explicit, clear, simple, and under your control. Like C++, but with all the hidden behaviour removed: you decide where, when, and in what order constructors are called, if at all. Unfortunately, for their next iteration (ObjectPascal, or Delphi) Borland decided that programmers shouldn't be allowed to manipulate pointers easily, as was the fashionable paradigm, so they made all 'objects' accessible only through references allocated with new() - though at least, even there, the particular constructor used (if not the default do-nothing-except-allocate-space one) is called by name explicitly. But this also meant you couldn't create a temporary (local) object on the stack. Sadly, the original simple, tiny, powerful, strongly-typed, and very flexible language that was object-oriented Pascal is long dead. It would have suited embedded code very well.

    • (continued..) I would contend that the popularity of Java and C# among their proponents comes from their having been made to use C while never having been properly introduced to a good O-O Pascal (and actually, to all intents and purposes, C# really is pretty much ObjectPascal made to look like C). _They_ say these interpreted or JIT-compiled strongly-typed languages 'just let you get on with the job, no struggling with the tools, you can make a change and immediately try it out'. Just like Pascal. I wrote my first program in Algol 60 on an Elliott Automation machine at Burroughs' facility in Dundee on a day-visit from high school - not complicated or large, it solved Poisson's Equation - I had just turned 16, so it must have been early 1968. C, which appeared a few years later, should have died at birth. I hate to think of the millions of man-hours that awful language has consumed and wasted, the flaky code hiding bugs for years. But it didn't bother me that much in the end, because I became an analogue and RF man, in R&D, and programming was really something I only did for simulations, for calculations, and for myself, until much later. Algol gave us both C and Pascal. More than half the members of the Algol 58 and 60 committees were Americans, and yet somehow Americans seemed to think programming in Algol was un-American, European, foreign, and they gladly latched on to C when it came. Unfortunately, C left out all the good bits of Algol that made it really productive (simple uniform syntax, strong typing, bounds-checking, readability, etc.). C was terse; the code looked more serious. It looked 'harder', more abstruse, more arcane, so it had to be better. Shame it wasn't. Unix was a triumph, even more so because it worked in spite of the language they used. It's a pity they wrote it in C.

    • It's 21 (?) years ago and I'm sitting in front of my almost-brand-new desktop at work: a 33 MHz 386. I'm arguing with a C proponent about my insistence on using Algol's lovechild - Pascal - a beautiful tiny language carefully chiselled out of Algol by a brilliant Swiss engineer called Niklaus Wirth. I have rewritten my colleague's C Windows application and extended it. To demonstrate, I tell him I will perform a complete build of the app, about 50,000 lines in all, of which around 8,000 are non-library bespoke code. He is of the opinion that he can't hang around for this and will go and get a coffee, but I stop him with "You haven't got time for that... Watch!". I click the appropriate button in the IDE and count aloud, to 4, and then say 'Done.' He believeth not. Less than five seconds. So I run the app. He is astonished. He recognises it, and it does not crash. He knows it takes him around 15 minutes to build his app from C. But 'Ahh', he says, 'it won't be as good code, and it won't run as fast'. An hour later, he agrees it runs faster by 8-10 percent in some parts, and no slower in others. The executable is half the size. I then realise I have made an error - and I demonstrate to him that I have left 'runtime checking' ON. It could have been another few percent to 10% faster if I'd turned it off. Now, granted, C compilers have gotten better, and the code they produce is much better than that old MS C 1v5 (?) he was using - I should have used the Watcom compiler for comparison; its code ran at least 15-20% faster, though it took even longer to produce it. But no matter, my point was made. When I wrote in Pascal, I never used a debugger - I never needed to. Most bugs wouldn't even compile - and those that did were obvious in the source code if you took the time to read it. (continued..)

    • The intro said "In order to satisfy user desire for longer ranges, wireless engineers must understand key parameters that impact the range of their systems". If they don't know how to work out a link budget and the physical parameters that decide the range to be expected, they aren't entitled to call themselves "wireless engineers". And it goes on to say "and subsequently know how to design around those parameters in a very efficient way". Design around them? You design to approach the limits imposed by the laws of physics, but I hardly think you can get 'around' them! If you could, no-one would have heard of Communications Theory or Shannon's Law.

    • iniewski, I chased this up as far as I could afford the time, and came to the conclusion that ENOB is less than or equal to 3. Whether that's right or not, I can't be sure - I may be looking at the wrong Fujitsu papers (probably out of date). Would the author care to comment? And could somebody fix the link so that we can actually read the article?

    • "Real cats are essentially worthless"? Tell me that when you next have a mouse or rat infestation. Humans are essentially pointless, too. Evolution isn't about purpose, it's about Life. Our best effort machines and processes can hardly manage what an ant does in real time, at millions of times the power consumption, and come nowhere near the capabilities of a sparrow.

    • It's usually the transmit side, if you can do it, that's most useful. Even cheap receiver designs can manage a 3dB NF these days, and with the best will in the world, you won't improve that by more than 2dB - and at that, depending to some extent on the band in use, you'll be limited in fact by atmospheric and solar noise coming in the antenna.

    • Most seem to miss the essence - the difference between Engineering and Art. In pure art you can do whatever you like, whatever you can conceive: there are no CONSTRAINTS. Engineering IS art, BUT always art with constraints - as someone once said, it's the art of the possible. The thing you're engineering has to work, and work fast enough; it has to be of such a size and no more; it must use no more than a certain amount of resource (memory, for example); and so on. That means that even today, when memory is often available cheaply in huge amounts, power and size constraints mean that embedded software engineering is still 'real engineering' - you can't just do whatever you like, only what you can within externally imposed limits. Writing application software increasingly ISN'T engineering: there's so much memory, more than adequate processing speed, so much available resource in general, that you really can in effect do whatever you like within the strictly defined limits of the definition of 'software'. In the distant past I often had to program a board with 1 kB of RAM and 2 kB of ROM or less, fit everything in, and make the thing do the job quickly enough at a sub-1 MHz clock rate. COBOL, Algol and Fortran programmers in the early sixties had to work with tiny amounts of 'core'. In either case it was engineering, and embedded mostly still is.

    • An engineer is enthused and fulfilled when he sees his ideas carried forward and made something of, and when he is free to have a good idea and pursue it. In this age of 'standards, standards, standards', most are forced merely to further and implement someone else's ideas - and proposing a standard and getting it accepted is seen as far more important than simply getting it to work well, or to work at all (often not at all simple). There are no points or prizes for that from 'the management', who here at least are mostly ignorant of engineering, being accountants and bean counters. They think that if he can't get it going, he's simply no good and we need someone else - not that it was a poor, difficult or inefficient idea in the first place. When I started 30 years or so ago, any good idea was worth pursuing - but now management are always looking for confirmation of the 'next big thing' before they begin to believe it might be worth putting money in that direction.