Sunday, September 29, 2013
Who would have thought that a formal code of ethics would be interesting and thought-provoking? Having been bored to delirium by some books on the foundations of formal ethical principles, I didn't expect to get sucked in by reading the ACM Code of Ethics. Surprise!
What really got my attention is that there is a very active - nay, proactive - tone to the whole thing. In contrast to tomes (and I mean TOMES) I have read in the past, which are thorough and informative if you can stay awake, this page and a half kept me sitting upright in my chair. The Code suggests - nay, instructs - ACM members to take charge in a variety of societal and environmental ways.
Here are several excerpts, with emphasis added by me in each case. Think about what each one might mean to you in your work:
"Therefore, computing professionals who design and develop systems must be alert to, and make others aware of, any potential damage to the local or global environment." (from section 1.1)
"Well-intended actions, including those that accomplish assigned duties, may lead to harm unexpectedly. ... the responsible person or persons are obligated to undo or mitigate the negative consequences as much as possible." (from section 1.2)
"...obligation to report any signs of system dangers that might result in serious personal or social damage. If one's superiors do not act...it may be necessary to "blow the whistle"..." (from section 1.2)
"...provide full disclosure of all pertinent system limitations and problems." (from section 1.3)
"...with the recognition that sometimes existing laws and rules may be immoral or inappropriate and, therefore, must be challenged." (from section 2.3)
"...must consider the personal and professional development, physical safety, and human dignity of all workers." (from section 3.2)
That last one especially hit home.
Many years ago I was given an assignment to work on a scheduling system that was seriously opposed by the home health care providers who were going to be forced to use it. The system was going to provide each health care worker with a daily schedule of who to see, what order to see them in, and what time to arrive at and depart from each home. A device in their vehicles would track where they were at any point in the day. Management loved the idea: (supposed) efficiency and (short-term) cost savings were their concerns. There was much talk about how helpful this system would be for everyone. However, in prototype trials, the health care workers found themselves driving back and forth, sometimes recrossing parts of town, cutting short their time with patients who needed additional assistance, and feeling as if they had lost control of the all-important human interaction part of the healing process.
As I gradually found all this out and realized what "my" system was going to do, it sucked. That is the only word for it. I felt a huge ethical dilemma because I was putting my skills as a developer to work on a project that would, in my estimation, degrade the lives of the health care workers and the patients.
I lucked out that time. The system imploded of its own accord for technical reasons before going live, thus removing the problem from my plate. However, I couldn't forget about it, and I decided that I'd never again knowingly work on a project that caused harm to others. Not only that, I decided that from then on I'd pay more attention to learning about the impact of my projects, and not leave it to chance that I'd find out.
One of the things I really like about the ACM Code of Ethics is that if you read it, and think about it in the context of your professional activities, it will likely get you thinking in new ways - before you find yourself faced with an ethical dilemma at work.