PC Programming - Embedded.com


When the Danish newspaper Jyllands-Posten ran those cartoons that so enraged so many, my take was that, while the paper surely had the right to publish such material, it was in bad taste.

When the French then republished the caricatures, I felt they were purposely trying to offend a huge bloc of people. Sure, the right of free speech is an essential liberty. But that right doesn’t mean we must say every offensive thought that comes into our heads. The reaction in the Middle East was even more objectionable, and to me, appalling and inexplicable.

When someone tailgates, most of us, because we’re grownups, repress the urge to respond with the single-fingered salute. The Wall Street Journal could run porn, but chooses not to. CBS could run naked breasts during the Super Bowl but… well, hmmm…

In real life we do filter our speech in the interest of building a tolerant society. That’s an essential component of maturity. The screaming two-year-old demands the supermarket candy bar at full volume. Adults only voice a quiet wish.

This can go too far. This morning I read in the New York Times that a member of the Religious Coalition for Reproductive Choice, a pro-abortion group, complained that searches using the word “abortion” on Amazon returned, among other things, suggestions for books advocating adoption. I respect that the person was able to exercise his or her right of free speech to complain.

But Amazon responded. They’ve changed the search algorithm, for this particular query, to display a page full of anti- and pro-abortion books, with no suggestions that don’t contain the word used in the search.

In the law, a corporation is a person and largely has the rights of any person. Amazon surely has the right to have their search engine return any old result they’d like. Type in “abortion” and pop out “adoption,” “pro-life,” or “tomato.” It’s up to them.

But the implications for Internet commerce in general are chilling. Consider the code: there’s some sort of database whose output goes into a bit of software that figures out what other people were looking for when they ran the same search. Finally, now, there’s the Politically Correct Filter, a new module that ensures no one is offended by the results.
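To make that architecture concrete, here is a minimal Python sketch of such a pipeline. Every name and data structure in it is invented for illustration (the real Amazon code is of course unknown); the filter stage mimics the behavior described above, dropping any suggestion that doesn’t contain the word used in the search:

```python
# Hypothetical sketch of the three-stage pipeline: database lookup,
# related-search suggestions, then the "Politically Correct Filter".
# All names here are invented for illustration.

def db_lookup(query, catalog):
    """Return catalog titles containing the query term."""
    q = query.lower()
    return [title for title in catalog if q in title.lower()]

def related_suggestions(query, search_log):
    """Return what other people searched for alongside this query."""
    return search_log.get(query.lower(), [])

def pc_filter(query, suggestions):
    """The filter module: keep only suggestions that literally
    contain the search term, so nothing 'off-topic' appears."""
    q = query.lower()
    return [s for s in suggestions if q in s.lower()]

def search(query, catalog, search_log):
    """Full pipeline: results plus filtered suggestions."""
    results = db_lookup(query, catalog)
    suggestions = pc_filter(query, related_suggestions(query, search_log))
    return results, suggestions
```

With a toy catalog and search log, a query for “abortion” would return matching titles, while a related search such as “adoption” gets silently filtered out of the suggestions.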

Who writes that code? Where do the algorithms come from? The PC landscape changes constantly; half a century ago divorce was forbidden, in 1959 few dreamed a Catholic could be president, and twenty years ago South Park would have been banned. Some poor programmer at Amazon must be stuck with the frustrating task of conforming to the latest social norm. Talk about working with unstable requirements!

Ironically, the software filter is in response to someone using the World Wide Web, the one information resource that connects everyone to every idea. The easily offended should stick to narrow publications that represent their views, not a medium in which one mistyped letter can vector you from a site promoting a particular political viewpoint to raw uncensored porn.

Instead of a PC Filter which will never satisfy everyone, Amazon should add a preference setting that displays no search results: “Click here if your sensibilities are so easily bruised that you can’t ignore our suggestions, which are simply responses automatically selected from a vast dataset by imperfect software which neither knows nor cares about your particular viewpoint.”

What do you think? Should database searches be restricted by matters of political correctness? If so… how?

Jack G. Ganssle is a lecturer and consultant on embedded development issues. Join him to learn how to develop better firmware faster in Dallas and Denver on April 26. Contact him at . His website is .


Of course not, database searches should not be restricted by Political Correctness!! What an absurd question to ask. I want raw information, not filtered information. If I discover that a particular search engine I'm using is filtered, then I'll “STOP” using that search engine and find one that is not filtered.

Personally, I think every search engine should give the user the ability to select Raw Output Information or Filtered Output Information. If you are “sensitive” then you can select “filtered”, otherwise select “raw output”!

– Steve King


The first problem is who decides what is politically correct? What plays in Peoria may not play in New York City.

What I find to be in bad taste, someone else won't. The fellows who wrote the Constitution got it right: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

– tom mazowiesky


I had to chuckle about your “one mistyped letter” comment. That is so literally true. I was once trying to go to xilinx.com and missed the first 'i'. Suffice it to say that the xlinx.com site that popped up would have been more appropriately named xxxlinx.com!

The pro-abortion folks need to realize that there are many sides to each issue. I think it is completely appropriate for books about adoption to come up in a search about abortion – the topics are intertwined. I'm surprised Amazon caved on this one.

– Warren Sande


“In real life we do filter our speech in the interest of building a tolerant society. That’s an essential component of maturity.”

Exactly.

Ever since Einstein brought the concept of relativity to us, the world view has changed dramatically. Obviously, that stresses out people who have Grand Inquisitor-like minds. They want to “protect” us from U2 Bono's juvenile comments at the Grammys (2003) with an umbrella-like set of PC rules.

How many Giordano Brunos burning at the stake will humanity need to see, how many Solzhenitsyns will be “silenced” before we collectively grow up and mature? Can technology aid this process? Well, the PC filters in databases indicate that we have a long way to go. I call it the Google-in-China (GiC) syndrome(TM). :-)

Reply to Steve: What if there were no unfiltered search engines at all? What would you use? And what if you did not know that the results you got were filtered? What would you use then?

We live in an age where history is being rewritten faster than it's being formed. C'est la vie!

– Roger Lynx


Response to tom mazowiesky:

I agree… who decides what is PC, offensive or not offensive? I believe that a true database search should be immune from PC regulation. Adults are mature and can filter out those items which are irrelevant. Contextual filtering is tough; leave that up to the human, who does a better job of it anyway.

– Steve King


The search engines should have a “Filter Potentially Offensive Information” check box. When selected it should return a blank page. No single selection is going to apply to everyone. There are many filtering packages available that filter all content (not just the search results). If you would be offended enough to not want to see (or have a child who shouldn't see) certain sites, then that is the tool for the job. Most of these allow you to configure what is and isn't objectionable.

– Paul Glaubitz


You slightly misrepresent the issue; Amazon didn't return adoption books when searching on “abortion”, it suggested you may have mistakenly searched for abortion when you meant to search for “adoption”. (“Did you mean 'adoption'?”)

The difference isn't one of filtering results, but one caused by the automated suggestion/affinity/misspelling software (in this case). Given that the public doesn't understand how Amazon's system decided to offer “Did you mean 'adoption'?”, and that (if you're not Amazon) you don't know how the suggestions are made, it's understandable how such a response would be perceived.
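For readers curious what such guess-what-the-user-meant code can look like, here is a toy “Did you mean?” check based on Levenshtein edit distance; the vocabulary, threshold, and function names are invented for this sketch and have nothing to do with Amazon's actual system:

```python
# Illustrative only: a minimal "Did you mean ...?" suggester built on
# Levenshtein edit distance. All names and values here are invented.

def edit_distance(a, b):
    """Classic Levenshtein distance via a single-row dynamic program."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(
                dp[j] + 1,            # delete ca
                dp[j - 1] + 1,        # insert cb
                prev + (ca != cb),    # substitute (free if equal)
            )
    return dp[-1]

def did_you_mean(query, vocabulary, max_distance=3):
    """Return the closest vocabulary word within max_distance, else None."""
    best = min(vocabulary, key=lambda w: edit_distance(query, w))
    return best if edit_distance(query, best) <= max_distance else None
```

Note that “abortion” and “adoption” are only two substitutions apart, so a purely mechanical proximity check like this one could plausibly generate the very suggestion that caused the complaint, with no human editorializing involved.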

The public (in this case retired minister) doesn't know whether the suggestion was automated, was programmed in, was a paid-for listing, was an 'editorial' position, etc. Given the actual suggestion given in this case, it's easy to understand how he would assume it was an editorial position, decided on by a human. It so happens he was wrong, apparently – but that doesn't matter to perceptions.

“Adoption” still shows up under “Related searches”, it's just not suggested as a mis-spelling or a “you should have searched for …” type of suggestion.

It does still fall under your overall point: it was an automated system that now has been tweaked (though I suspect they put in the tweaking ability long ago for other reasons). However, the differences above do affect your argument. A search engine (Google, etc.) is expected to return the results you asked for. Presenting the output of a pseudo-AI, guess-what-the-user-meant program in a way that people don't perceive as an “engine” will cause people to react to it not as a machine, but as if a human suggested it. I suspect this perception/reaction is even stronger in something like Amazon's search, compared to Google.

– Randell Jesup


Response to R. Jesup

Thanks for being particular! A lot of opinions get formed on incomplete or sound-bite-ized (i.e., context free) data. The picture changes considerably when one contemplates the whole of the context.

I agree with others who have said that a blank page is the correct response to “filtered output, please”. I really get irked by people who want to impose their idea of sanitization on *my* world. If they want to live in a sterile room, it is totally their right to move to one. They have no right to try to change mine.

/rant

This is more of a technology issue than it might appear to some – we, as practitioners of the art, are the people who implement these things. As one of my wisest instructors pointed out: engineers generally approach a problem with “how can it be done?” and often forget to ask “should it be done?”.

So let me pose this question to your audience, Jack: do engineers have a responsibility to resist things like this? Or is that just imposing our worldview (that it should be unfiltered) on others?

– Daniel Singer
