Re: “Yea for big brother”: I don't have a problem with a “Big Brother” box. One of the best ways to encourage law-abiding behavior is to personalize it, and a black box that is an even-handed, unbiased and consistent tattletale is certainly one way to do that.
I don't even lose any sleep over the privacy risks of such a widely deployed system. Judges and juries, by and large, have consistently demonstrated their ability to resist efforts by the other branches of our government and industry to claim more than the citizenry allows them to, and I don't think a mandated electronic surveillance system will change that. Sometimes the process takes a while, and sometimes people test the system in the meantime, but We The People have managed to prevail pretty well so far.
My strong objection to broad adoption of electronic surveillance devices is purely professional. To date, the embedded systems industry has not demonstrated the capability to handle potentially evidential data in a responsible manner. Until we can reach that point, I don't think we can even consider any significant, automated approach to law enforcement.
GPS receivers and cell phone logs are routinely introduced as evidence during legal investigations, discovery and testimony. But to date, I haven't seen any implementation with the ability to quantitatively verify the trustworthiness of that data. The technology to do so exists, but nobody is using it.
Let's say a GPS log appears to put you at a certain location at a certain time, which makes you a suspect in a crime. How can an impartial observer know that the data is right? The overwhelming potential for software bugs demonstrated by our industry places data produced by any embedded system at risk of defect, which means you shouldn't be making life-changing decisions based on it without some means to verify that the information you're looking at hasn't been tampered with, fabricated outright, or just plain screwed up.
I could construct a GPS log that puts me in Antarctica, for example, but that doesn't mean I've actually been there (Midwest winters notwithstanding). And keep in mind the potential contributions of the several layers of tools, utilities, libraries and so on that the data passes through on its way from the device to the hardcopy printout provided in court.
I occasionally see telephone numbers on my bills that I didn't call, and I've rebooted just about every modern piece of electronics I own. Why should I believe anything they say?
The current solution to this problem makes the assumption that the phone company and device manufacturer don't know who I am, so they have no reason to fabricate data that might uniquely finger me in a crime. The question is: how do you prove that? We don't know if a fabrication has happened, and we couldn't spot it even if it did, so we pretend that it can't? Bah!
The real solution is the widespread application of modern cryptographic techniques for authenticating data in embedded systems. A GPS receiver with a unique electronic serial number provided by a reputable vendor (e.g., the network ID of any Dallas Semiconductor 1-Wire device) provides a key that can be used to cryptographically “sign” all the data produced by the enclosing system. If the signature verifies during evidence seizure and again during testimony, and the signature algorithm itself is proven, then you can trust that the data is what it says it is.
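To make the idea concrete, here is a minimal sketch of that signing scheme using an HMAC, one widely studied message-authentication technique. The device key, sentence format and function names are illustrative assumptions, not any vendor's actual interface; a real design would also need the key to be a provisioned secret, since a bare serial number is readable by anyone.

```python
import hmac
import hashlib

# Hypothetical device-unique key material. In practice this would be a
# secret provisioned by the vendor, not just a public serial number.
DEVICE_KEY = b"DS-1WIRE-0123456789ABCDEF"

def sign_fix(nmea_sentence: str) -> str:
    """Append an HMAC-SHA256 tag to one line of a GPS log."""
    tag = hmac.new(DEVICE_KEY, nmea_sentence.encode(), hashlib.sha256).hexdigest()
    return f"{nmea_sentence}*{tag}"

def verify_fix(signed_line: str) -> bool:
    """Recompute the tag over the sentence and compare in constant time."""
    sentence, _, tag = signed_line.rpartition("*")
    expected = hmac.new(DEVICE_KEY, sentence.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

line = sign_fix("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M")
print(verify_fix(line))                               # untouched entry passes
print(verify_fix(line.replace("4807", "4808")))       # edited position fails
```

The point isn't the particular algorithm; it's that anyone holding the key can later prove, line by line, whether the log in front of the jury is the log the device actually produced.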
Unfortunately, our industry just doesn't get it yet where data integrity is concerned. I've offered classes on cryptographic algorithms that are perfect for embedded systems, but nobody shows up. There are tons of reputable, freely available and extensively studied algorithms one or two clicks away on Google. Bruce Schneier's book, Applied Cryptography, one of several must-read titles for anyone serious about data integrity, even includes source code and documentation. But honestly, judging by our actions to date, we just don't care enough to use any of it.
And until we do, I'm against any electronic Big Brother.