Ethics for Programmers: Primum non Nocere
This post isn't about Divmod, exactly.

I've been mulling over these ideas for quite a while, and I think I may still have more thinking to do, but recent events have gotten me thinking again about the increasing urgency of the need for a professional code of conduct for computer programmers. Mark Russinovich reported on Sony BMG's criminal contempt for the integrity of their customers' computers, and some days later CNET reported on Sony BMG's halfhearted, temporary retraction of their crime. A day later, CNET's front page has news of Apple trying to institutionalize, as well as patent, a similar technique. While the debate over DRM continues to rage, there are larger issues and principles at stake here that it doesn't seem like anyone is talking about: when you run a program on your computer, who is really in charge?

I posit that, in no uncertain terms, it is a strong ethical obligation on the part of the programmer to make sure that programs do, always, and only, what the user asks them to. "The user" may in some cases be an ambiguous term, such as on a web-based system where customers interact with a system owned by someone else, and in these cases the programmer should strive to balance those concerns as exactly as possible: the administrator of the system should have no unnecessary access to the user's personal information, and the user should have no unnecessary control over the system's operation. All interactions with the system should faithfully represent both the intent and authority of the operator.

Participants in the DRM debate implicitly hold the view that the ownership of your operating system, your personal information, and your media is a complex, joint relationship between you, your operating system vendor, the authors of the applications you run, and the owners of any media that pass through that application. Prevailing wisdom is that the way any given software behaves should be jointly determined by all these parties, factoring in all their interests, and that the argument is simply a matter of degree: who should be given how much control, and by what mechanism.

I don't like to think of myself as an extremist, but on this issue, I can find no other position to take. When I hear lawmakers, commercial software developers, and even other open source programmers, asking questions like, "how much control should we afford to content producers in media playback programs?", I cannot help but think of Charles Babbage.
On two occasions I have been asked [by members of Parliament!], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
The "you don't own your computer" paradigm is not merely wrong. It is violently, disastrously wrong, and the consequences of this error are likely to be felt for generations to come, unless steps are taken to prevent it.

Computer programmers need a socially and legally recognized code of professional ethics, to which we can be held accountable. There have been some efforts in this direction, the most widely-known one being the Software Engineering Code of Ethics and Professional Practice. As long as I'm being extreme: this code of conduct is completely inadequate. It's sophomoric. It's confused about its own purpose. It sounds like it was written by a committee more interested in promoting "software engineering" techniques, as defined by the ACM, than in ethics. I'll write a bit about exactly what's wrong with it after I describe some similarities in existing professional codes of conduct which themselves have legal ramifications.

Although there are many different codes of ethics for medical doctors, a principle which echoes through them all is one which was formulated in ancient history, originally by Hippocrates but distilled into a catch-phrase by Galen: "First, do no harm."

The idea is that, if you are going to be someone's doctor, you have to help them, or at least, you shouldn't ever harm them. Doctors generally regard this as a sacred responsibility. This basic tenet of the doctor-patient relationship typically overrides all other considerations: the doctor's payment, the good or harm that the patient has done or may do, and the advancement of medical science all take a back seat to the welfare of the patient.

In this modern day and age, when doctors routinely administer general anesthesia to their patients to prepare them for surgery, this understanding is critical to the credibility of the medical profession as it stands. Who would knowingly submit themselves to a doctor who might give them a secondary, curable disease, just to ensure the doctor got paid?

Lawyers have a similar, although slightly more nuanced, principle. Anybody who has watched a few episodes of Law and Order knows about it. A slightly more authoritative source than NBC, though, is the American Bar Association, who in their Model Code of Professional Responsibility (the basis for the professional responsibility codes of most states' Bar associations in the United States) declare:
The professional judgement of a lawyer should be exercised, within the bounds of the law, solely for the benefit of his client and free of compromising influences and loyalties. Neither his personal interests, nor the interests of other clients, nor the desires of third persons should be permitted to dilute his loyalty to his client.
(emphasis mine)
For criminal defense lawyers, these "compromising influences and loyalties" may include a basic commitment to the public good. A lawyer who represents a serial murderer who privately admits to having committed heinous crimes must, to the best of their ability, represent the sociopath's interests and try to get them exonerated, or, failing that, the lightest sentence possible. Low as we as a society might consider a lawyer who defends rapists and murderers, we would think even more poorly of one who gave intentionally bad advice to people he personally didn't like, or sold out his clients' interests to the highest bidder.

A doctor's responsibility is somewhat the same. If a doctor is treating a deeply evil person, they are still obligated by the aforementioned sacred patient/doctor pact to honestly treat that person, not use their position as a doctor to proclaim a death sentence, or cripple them. They are obligated to treat that person equitably, even if that person's evil extends to not paying their medical bills.

This pattern isn't confined to professional trades. Catholic priests have the concept of the "seal of confession". If you confess your sins to a Catholic priest, they are not to reveal those sins under any circumstances, regardless of the possible harm to others. A priest certainly shouldn't threaten their flock with knowledge of their confessed sins to increase contributions to the collection plate, even if one of them has confessed a murder.

In each case, society calls upon a specialist to navigate a system too complex for laymen to understand: the body, the law, and the soul. In each case, both society at large and individuals privately put their trust completely into someone allegedly capable of navigating that system. Finally, in each case, the trust of that relationship is considered paramount, above the practitioner's idea of the public good, above the practitioner's (and others') financial considerations.

There is a good reason for these restrictions. Society has systems in place to make these judgements. Criminal defense lawyers are not allowed to judge their clients because that's the judge's job. Doctors aren't allowed to pass sentences on their patients because that's the legal system's job. Catholic priests don't judge their penitents because that's God's job. More importantly, each of these functions may only be performed with the trust of the "client" - and it is important for the client to know that their trust will not be abused, even for an otherwise laudable goal, such as social welfare, because notions of social welfare differ.

I believe that computer programmers are a fourth such function.

Global telecommunications and digital recording are new enough that I think this is likely to be considered a radical idea. However, think of the importance of computer systems in our society today. Critical functions such as banking, mass transit, law enforcement, and commerce would not be able to take place on the scale they do today without the help of computer systems. Apropos of my prior descriptions, every lawyer and doctor's office has a computer, and they rely on the information provided by their computer systems to do their jobs.

More importantly, computers increasingly play a central role in our individual lives. Many of us pay our bills on a computer, do our taxes on a computer, do our school work or our jobs on computers. Sometimes all of these things even happen on one computer. Today, in 2005, most of those tasks can be accomplished without a computer (with the exception, for those of us with technical professions, of our jobs), but as the public systems we need to interact with are increasingly computerized, it may not be reasonable to expect that it will be possible to lead an average modern life in 100 years without the aid of a personal computing device of some kind.

If that sounds like an extreme time frame, consider the relative importance of the automobile, or the telephone, in today's society versus 1905. It's not simply a matter of convenience. Today it is considered a basic right for accused criminals to make a phone call. Where was that right when there were no telephones?

Another way to think of this relationship with technology is not that we do a lot of things with computers, but that our computers do a lot of things on our behalf. They buy things. They play movies. They make legal claims about our incomes to the federal government. Most protocol specifications refer to a program which acts on your behalf (such as a web browser) as a user agent to reflect this responsibility. You are not buying a book on Amazon with your computer; you click on some links, you enter some information, and you trust that your computer has taken this information and performed a purchase on your behalf. Your computer could do this without your help, if someone has installed a malicious program on it. It could also pretend to have made a purchase, but actually do nothing at all.
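As a rough illustration of that "user agent" terminology (a hypothetical sketch, not anything described in the post above): every HTTP request that a browser, or any other program acting for you, sends to a server carries a User-Agent header naming the software that is acting on your behalf. In Python that might look like the following; the User-Agent string here is made up, not any real browser's.

import urllib.request

# Every HTTP request carries a User-Agent header identifying the program
# that is acting on the user's behalf. The value below is an invented example.
request = urllib.request.Request(
    "http://example.com/",
    headers={"User-Agent": "ExampleAgent/1.0 (acting on behalf of a human user)"},
)
with urllib.request.urlopen(request) as response:
    print(response.status, response.reason)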

Here is where we approach the intersection between programming and ethical obligation. Every time a user sits down to perform a task with a computer, they are, indirectly, trusting the programmers who wrote the code they will be using to accomplish that task. Users hand over not only the responsibility of performing a specific task; they also trust those programs (and thereby their programmers) with intensely personal information: usernames, passwords, social security numbers, credit card numbers - the list goes on and on.

There may be a technological solution to this problem, a way to limit the amount of information that each program needs, and provide users with more control over what different programs can say to each other on their own computer. Some very smart people are working on this, and you can read about some of that work on Ka-Ping Yee's "Usable Security" blog. Still, one of the experts there contemplates that perhaps, given the abysmal state of software today, the general public shouldn't even use the internet.
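To make the "limit what each program needs" idea concrete, here is a minimal sketch in Python of the least-authority style that the usable-security researchers advocate; the TaxFormPrinter class and the file name are hypothetical illustrations of the principle, not anyone's real API.

# A minimal sketch of "least authority": hand a component only the narrow
# capability it needs (here, one already-open output stream), rather than
# ambient access to the whole filesystem, the network, or stored passwords.
class TaxFormPrinter:
    """Formats a tax summary onto the one stream it was explicitly handed."""

    def __init__(self, output_stream):
        self._out = output_stream  # the only resource this object can touch

    def print_summary(self, income, tax_owed):
        self._out.write("Reported income: %s\n" % income)
        self._out.write("Tax owed: %s\n" % tax_owed)


if __name__ == "__main__":
    # The code acting directly for the user chooses the single file the
    # printer may write to; the printer never gets any broader access.
    with open("tax_summary.txt", "w") as report:
        TaxFormPrinter(report).print_summary(income=52000, tax_owed=7800)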

DRM is definitely a problem, but the real problem is that it's the top of a very long, very slippery slope. Its advocates point at the top of that slope and say "See, it's not so bad!" - but where will it end? While I am annoyed, I'm not really that concerned with the use of this kind of technology to prevent copyright violations. It's when we start using it to prevent other sorts of crimes that the real fear sets in.

Today, it's considered almost (but not quite) acceptable that Sony installs the digital equivalent of a car-bomb on my computer to prevent me from copying music. As I said at the beginning of this article - they don't think that the practice is inherently wrong, simply that there are some flaws in its implementation. Where will this stop? Assuming they can perfect the technology, and given that my computer has all the information necessary to do it, will future versions of Sony's music player simply install themselves and lie in wait, monitoring every download and automatically billing me for anything that looks unauthorized, not telling me about it until I get my credit card statement?

Whether unauthorized copying should be a crime or not, preventing it by these means is blatantly wrong. Let me be blunt here. It is simply using a technique to wring more money out of users because the technique is there. Much like the doctor who cuts off your nose and won't reattach it until he gets paid for his other (completely legitimate) services, this is an abuse of trust of the worst order. It doesn't matter how much money you actually owe the doctor, or Sony: in any case, they don't have the right to do violence to you or to your computer because of it.

What of "terrorism"? Will mandatory anti-terrorism software, provided to Microsoft by the federal government, monitor and report my computerized activities to the Department of Homeland Security for review? From here, I'll let you fill in the rest of the paranoid ravings. I don't see this particular outcome happening soon, but the concern is real. There is no system in place to prevent such an occurrence, no legal or ethical restriction incumbent upon software developers which would prevent it.

This social dilemma is the reason I termed the IEEE/ACM ethics code "sophomoric". With the directionless enthusiasm of a college freshman majoring in philosophy, it commands "software engineers" to "Moderate the interests of [themselves], the employer, the client and the users with the public good", to "Disclose to appropriate persons or authorities any actual or potential danger to the user, the public, or the environment", and to "Obey all laws governing their work, unless, in exceptional circumstances, such compliance is inconsistent with the public interest." These are all things that a good person should do, surely, but they are almost vague enough to be completely meaningless. These tenets also have effectively nothing to do with software in particular, let alone software engineering. They are in fact opposed to certain things that software should do, if it's written properly. If the government needs to get information about me, they need a warrant, and that's for good reason. I don't want them taking it off my computer without even asking a judge first, simply because a helpful software engineer thought it might be a "potential danger to the public".

Software developers should start considering that accurately reflecting the user's desires is not just a good design principle, it is a sacred duty. Just as it is not the criminal defense lawyer's place to judge their client regardless of how guilty they are, not the doctor's place to force experimental treatment upon a patient regardless of how badly the research is needed, and not the priest's place to pass worldly judgement on their flock, it is not the programmer's place to try to decide whether the user is using the software in a "good" way or not.

I fear that we will proceed down this slippery slope for many years yet. I imagine that a highly public event will happen at some point, a hundred times worse than this minor scandal with Sony BMG, and users the world over will angrily demand change. Even then, there will need to be a movement from within the industry to provide some direction for that change, and some sense of responsibility for the future of software.

I hope that some of these ideas can provide direction for those people, when the world is ready, but personally I already write my code this way.

I wrote about this a couple of years ago, and I think there's more to the issue, but I feel like this key point of accurately relaying the user's intent is the first step to anything more interesting. I don't really know if a large group of people even agree on that yet.

So, like I said, this post isn't about Divmod - exactly - but when we say "your data is your data"... we mean it.

Current Mood: quixotic

Comments
From: deeptape Date: November 13th, 2005 08:27 pm (UTC)
Wonderful essay! What you have expressed here is critically important.

We had a glimmering of this working at Origin, where a respected leader advocated 'fascist' management policies for our network services, the virtual worlds. In a world where virtual and physical entities have increasing interplay and influence on each other, the policies governing the crafting and operations of the virtual will also increasingly impact the real, for good or ill.

Examples: Cell phones reporting user locations, unauthorized RFID scanning, traffic cameras IDing license plates, zombie PCs, Magic Lantern, Carnivore, autodeleting Tivos, and that's just what we know about.

Please develop this idea further. We're going to need it.



From: puzzlement Date: November 13th, 2005 10:28 pm (UTC)
traffic cameras IDing license plates

In what sense do you feel that this falls within the scope of the essay? Should it? The connection isn't clear to me: I as a driver am neither owner nor user of the traffic cameras (at least, not if you mean ones like the ones in Australia, which are automatically taking photographs for the purpose of charging people with traffic offenses). The author of their software does not therefore seem obliged to serve my interest in not being tracked and fined; they seem obliged to serve their user's interest in harming me.

There's probably an open question in this particular essay about the extent to which inflicting harm is a sacred duty for programmers, because an analogy is drawn to two professions with opposed ethical requirements on that front.

For doctors, the situation is clearer, because it's difficult to directly harm a third party by giving medical treatment (even though indirect harm, such as that patient using their healthy body to kill someone, may result). There are some grey areas, such as allowing a patient with HIV to actively deceive their sexual partners (who may also be that same doctor's patients) about their status, since that is often the only way to get that person to continue to seek their advice, and these are actively debated. But essentially, doctors do not harm patients or third parties.

For lawyers, it is required to inflict harm, if it is harm inflictable through legal processes and is likely to benefit their client. Their role is, in fact, to harm their client's opponents as directly as they professionally can: to destroy their reputations, to confuse them or induce panic attacks in cross-examination, to have them convicted and their livelihoods destroyed. Lawyers do not harm clients, but they are compelled to harm third parties as a matter of professional ethics.

So, for computer programmers, where the user of the program and the owner of the equipment want to inflict harm (for example, fining people for traffic offenses) and where they are not intending to use data and equipment that belongs to others to inflict that harm, this essay tends to suggest that the programmer should aid them. But that's me inferring from Glyph's analogies; I'd certainly be interested in hearing whether he thinks the analogy to lawyers holds that far, or if not, what the difference is, and what the ethics are of writing a program intended to harm people, but not by using or abusing their private data or equipment.
From: glyf Date: November 14th, 2005 07:33 am (UTC)

I remember it well

The first draft of this essay specifically mentioned MMPs, in the section on the 100-years-hence role of a computer.

Since you brought it up, I'll paste that paragraph back in: There are already examples today of interactions which are impossible unmediated by technology. You can't engage in an online game via postal mail. While many players say that the important aspects of the game are really the social aspects, those social aspects are impossible without the technological scaffolding around them.

The "respected leader" you mention is a pretty good example of why the issue is not as straightforward as "be a good person and you'll be a good programmer". I think that his *personal* integrity was great, and he was very honest and forthright, but certainly his operational policies fell outside the bounds of this framework I'm setting up.

I do plan on doing some more writing on this topic. But don't expect to see it on a regular basis :).
From: oubiwann Date: November 14th, 2005 03:35 am (UTC)

YES!

Holy schnikies. Thank you, glyph. That was so well put together. This needs to get passed around until everyone's read a copy of it.
From: glyf Date: November 14th, 2005 11:33 pm (UTC)

Re: YES!

Thanks :).

I've just read the mefi commentary though, and it definitely doesn't sound like people are getting it. I think I'll have to refine the ideas here a bit more, post some clarifications, and maybe outline some kind of an actual plan, so that people can have an idea of what I'm proposing. (Hint: it isn't "professionalize programming", or "arrest the F4I programmers")
From: _king_ghidorah_ Date: November 14th, 2005 01:30 pm (UTC)
>.> programmers don't need licenses. people don't usually die from bad programming.
From: glyf Date: November 14th, 2005 02:24 pm (UTC)
I never said that they did. Are you extrapolating from something?
From: burdges Date: November 16th, 2005 12:02 am (UTC)

Hmm

Interesting, but various other less code-centric business practices may have just as much impact. Not sure how the line should really be drawn.

Copyrights and patents were historically meant to force disclosure of inventions. Why not just require all source code to be published for a copyright to be valid? A less invasive requirement, for a less invasive profession.
From: glyf Date: November 16th, 2005 09:39 am (UTC)

Re: Hmm

That would definitely be a good start. I'm thinking more about the basis for such laws at the moment though, rather than a specific remedy.
From: hyades Date: April 3rd, 2006 05:24 am (UTC)
I ABSOLUTELY agree with this excellently written piece. I think such a code is needed ASAP.

Trouble is, the United States is hellbent on using all our technology to massacre the world's poor and abrogate our rights to speak out against it.

I've thought of a technician labor union that would get involved in political causes and organize mass strikes. That would hit them in the wallet...
From: ex_worldmak Date: April 3rd, 2006 06:16 am (UTC)

There is a Professional Code of Ethics...

You found the Software Engineering Code of Ethics, and mentioned the ACM, but you apparently did not find the ACM Code of Ethics. As a member of the ACM I am bound to uphold it (afaik the same does not apply to the SE Code of Ethics you reference), and it's just as binding and similar in shape to the Code of Ethics for any of the other engineering professional societies.

The biggest concern is not that it exists; it's that it isn't well enough known, and that right now there aren't enough companies that know the difference between Codemonkey and Professional Software Engineer, nor that organizations like the ACM exist to help determine that distinction. That is a market failure (too many jobs for too few true Professionals), not an ethical one. In the same vein, a good University is going to have an Engineering Ethics class of some sort (we had a focused Computer Engineering Ethics course), and again, right now companies aren't exactly choosy about whether or not to look specifically for an Ethics course in a Software Engineer's background.

Hope that was informative.
From: spierepf Date: April 3rd, 2006 01:20 pm (UTC)

Re: There is a Professional Code of Ethics...

AFAIK, one does not need ACM certification to be a computer professional in the same way as one needs a license to become a medical professional. In what way is the ACM Code of Ethics binding?
From: dcell59 Date: April 3rd, 2006 06:55 pm (UTC)

Control and protection

While I agree with your basic premise, I find it somewhat idealistic. In most cases, programmers are paid employees of a company where other people control what the software ultimately does. I can't imagine that the decision to add DRM to Sony's music CDs was made by a programmer, or even someone familiar with software development. It was almost certainly made by an executive who simply said "Get this done". Software companies constantly release software that is not ready to be published, with no intention of fixing the problems, not because the programmers are incompetent or unethical, but because the person who controls the ship date cares more about making that date than shipping working software.

A code of ethics for programmers is a great idea, but it doesn't address the consequences. As it stands, if I don't like what my company produces, the only choice I really have is to leave the company, and figure out some other way to pay for my mortgage. How many of us are in a position to do that the moment an assignment comes in that we disagree with?

That said, I think that there is great need for change. As a user, I am constantly disappointed by the quality of software, which has gone down in direct proportion to the price of computer hardware and software. I am tired of buying products that technically work as advertised, but are full of little bugs and silly restrictions. Executives need to learn that software isn't done just because it's time to ship it to make the quarterly revenues.
From: notivago Date: April 4th, 2006 05:43 pm (UTC)

Re: Control and protection


I think that is the difference that separates an ethical person from an unethical one, and that's why we need ethics in our profession. Sometimes I tell my coworkers we are worse than whores, because the good ladies at least deliver to the user what he wanted in the first place.

We don't do that. We are unscrupulous mercenaries; we excuse ourselves by shifting the blame to the ones making financial decisions, but we are as responsible for our failures as anyone else.

When a doctor is asked to cut off his patient's hand just because he is a robber, does he do so? If he does, is it well looked upon by the medical society or his peers? I don't think so, because they abide by an ethical standard.

We don't abide (as a group) by any ethical standard, and each professional goes by what he thinks is right or wrong. To add insult to injury, some ill-conceived ideas have entered and installed themselves in our day-to-day thinking like a plague. For example, "the good is the enemy of excellence" or "nobody dies due to computer programs" and such. Some professionals recite them like mantras, casting even more darkness on our profession's reputation.

Of course we lack legal support to refuse to do certain jobs or things... Yet each of us is responsible for what he does, and by agreeing to do something you are endorsing what is being done.
From: retiqlum Date: April 4th, 2006 06:06 pm (UTC)

Excellently stated.

Very well put, and quite true.

FWIW: I learned of your blog through www.userfriendly.org so your voice has been heard by a great number of people who are in the industry.
From: skjalm Date: April 5th, 2006 08:54 am (UTC)
One problem I see arising from this is when one particular piece of software (or other product for that matter) has several users with different goals.

In the following I'm giving examples and trying to be objective so please don't extrapolate from it to try and guess my meaning - cause we all know (hopefully) that extrapolation can lead to extremely bad conclusions ;-)

Is the user of Sony's copy protection software Sony or the person who listens to the music on her/his computer?

Is the user of a car tracking module (can't remember the exact name?) the owner of the car? The driver of the car? The car thief who drives the car after stealing it? The police trying to track the car? The non-licensed entity using the signals emitted from the tracking device to gain information about the car's position?

I don't see any easy and simple code to adhere to because the concept of "good engineering practice" is either too vague or far too detailed to be applied in practice. If it's too vague you'll soon find yourself in an uncovered grey area. If it's too detailed you'll soon find yourself outside its defined "world".

Anyways, just my 2 cents and while they're Euro cents I don't really believe they're worth more than anyone else's cents. The opposite is more likely the case ;-)
From: skjalm Date: April 5th, 2006 08:55 am (UTC)
and guess my meaning

Erm, make that "and guess my opinion"
From: thecunningbison Date: May 3rd, 2006 08:57 am (UTC)
I lecture a module in Information Systems Practice.

It's an excellent post for opening up a discussion in one of my classes. I hope you don't mind me emailing this post to my students.
From: glyf Date: May 3rd, 2006 05:36 pm (UTC)
Not at all! In fact, I'd be honored.
From: julian_morrison Date: May 22nd, 2007 01:29 am (UTC)

Property

I think you have to split this into two halves.

1. I will never program a computer to disobey its owner.

2. I will never program a computer to harm its user.

Private property solves the complications. A computer must obey its owner, no question about that, but need only avoid harming its users. You aren't obliged to help them. This gets around questions like: what if I don't want to let the users of my online service upload porn? You're allowed to write code to refuse the upload, but you're forbidden to write code that reports them to the police.
From: moldy_crouton Date: May 22nd, 2007 04:24 am (UTC)

Thank you.

I am a graduate student in Information Studies at UT-Austin. Recently, the field has started to shift from strictly books (i.e. Librarianship) to Digital Media. I have been, as a young student, very vocal about the need to start implementing a code of Ethics in ALL disciplines. The simple fact is, we have been training people to just do jobs but have not, at all, made any attempt to make them think on a philosophical, moral, or ethical level. What comes from this is an army of workers with no regard for the impact of what they do. To them, the ends justify the means. I am SO relieved to see a person willing to say "we need a return to ethics".
From: roodman Date: May 31st, 2007 07:09 am (UTC)

Debian

Anybody who takes the time to read and understand the Debian software distribution system documents ("Social Contract", "Policy and Procedures") would find that we have already been considering it our sacred duty for at least a decade and arguably much longer. It is more a question of education and popularization, but there are already many thousands of people signed up in the cryptographically secure, grassroots Debian Web of Trust that is made possible by the public-key cryptography we use in our procedures.

http://debian.org/
From: punkyfee Date: September 11th, 2007 01:57 pm (UTC)

Discussion help

1. Other than an ethic to “obey the law”, there is no need for computer professionals to hold any other ethical values associated with their knowledge and skills in computer and Information Technology.

If anyone could help shed some light on this question, I would really appreciate it :). I really enjoyed the essay as I am having to do a unit on Ethics in IT and Multimedia and it has certainly opened up a lot of questions.
From: cratermoon Date: December 22nd, 2007 05:30 pm (UTC)

protecting the individuals in the profession

Greetings. I found your essay while googling for software developer ethics and conduct, and I thank you for writing it. Although it's been a while since you wrote it, not much has changed, sadly.

One aspect of a code of conduct I would more directly address is the ability of individuals within the profession to be able to say they have a call to standards higher than their employer's wishes when they are asked to do something unethical. In the same sense that doctors, lawyers, and many other professions can appeal to their responsibility to their licensing organizations if they are asked to do something unethical, programmers need the same safety net. Right now, if you or I try to respond to our employer's unethical request with reasons to refuse, they are completely able to terminate our employment and replace us with someone who will implement their wishes with no consequences, real or threatened, at all.

When broken or malicious software gets shipped, it might be the incompetence of programmers, but how often is it the demands of the employer to cut corners, ignore possible consequences, and meet the deadline that result in bad software? Programmers have little to stand on when the ethical requirement to do the right thing comes up against the employer's desire to wring profits out of small margins.

This situation will continue until we, like a lawyer asked to violate attorney/client privilege, can firmly stand up for what's right.
From: jolie_hope Date: June 21st, 2008 12:12 pm (UTC)

Re: protecting the individuals in the profession

When I saw this document, I was shocked. This article is very useful for me; I'm waiting for a continuation of this, and for more articles like it.
