Computer Ethics - History

The concept of computer ethics originated in 1950, when Norbert Wiener, an MIT professor and founder of the field of cybernetics (the study of information feedback systems), published a book called "The Human Use of Human Beings". The book laid out the basic foundations of computer ethics and earned Wiener recognition as the father of the field.

Later, in 1966, another MIT professor, Joseph Weizenbaum, published ELIZA, a simple program that performed natural language processing. In essence, the program functioned like a psychotherapist: it used only open-ended questions to encourage patients to respond, applying pattern-matching rules to human statements to construct its replies.

Later that same year, the world's first computer crime was committed: a programmer used a bit of code to stop his bank account from being flagged as overdrawn. Because there were no laws in place at the time to stop him, he was not charged. To discourage others from following suit, an ethics code for computing was needed.

Later in the 1960s, Donn Parker, an author on computer crime, led the development of the first code of ethics in the field of computer technology.

In 1970, Walter Maner, a medical teacher and researcher, noticed that ethical decisions become much harder to make when computers are involved. Seeing the need for a separate branch of ethics for dealing with computers, he coined the term "computer ethics".

During the same period, the ACM (Association for Computing Machinery) adopted a professional code of ethics, and by the mid-1970s new privacy and computer crime laws had been put in place in the United States as well as in Europe.

In 1976, Joseph Weizenbaum made his second significant contribution to the field of computer ethics. He published a book titled "Computer Power and Human Reason", which argued that while artificial intelligence may benefit the world, it should never be allowed to make the most important decisions, because it lacks human qualities such as wisdom. By far the most important point he makes in the book is the distinction between deciding and choosing. He argued that deciding is a computational activity, whereas making choices is not, and that the ability to make choices is what makes us human.

Later that same year, Abbe Mowshowitz, a professor of computer science at the City College of New York, published an article titled "On approaches to the study of social issues in computing", which identified and analyzed technical and non-technical biases in research on social issues in computing.

In 1978, the Right to Financial Privacy Act was adopted, drastically limiting the government's ability to search bank records.

During the same year, Terrell Ward Bynum, a professor of philosophy at Southern Connecticut State University and director of its Research Center on Computing and Society, developed the first curriculum for a university course on computer ethics. To keep students' interest in computer ethics alive, he launched an essay contest on the subject. In 1985, he published a special journal issue entitled "Computers and Ethics", which became his most famous publication to date.

In 1984, the Small Business Computer Security and Education Act was adopted, which informed Congress on matters related to computer crimes against small businesses.

In 1985, James Moor, a professor of philosophy at Dartmouth College in New Hampshire, published an essay called "What Is Computer Ethics?". In this essay Moor states that computer ethics includes the following: "(1) identification of computer-generated policy vacuums, (2) clarification of conceptual muddles, (3) formulation of policies for the use of computer technology, and (4) ethical justification of such policies."

During the same year, Deborah Johnson, Professor of Applied Ethics and Chair of the Department of Science, Technology, and Society in the School of Engineering and Applied Sciences of the University of Virginia, published the first major computer ethics textbook. It not only became the standard-setting textbook for computer ethics but also set the research agenda for the next ten years.

In 1988, Robert Hauptman, a librarian at St. Cloud State University, coined the term "information ethics" to describe the storage, production, access, and dissemination of information. Around the same time, the Computer Matching and Privacy Protection Act was adopted, restricting the government's use of computer-matching programs to verify eligibility for federal benefit programs and to identify debtors.

In the 1990s, the growing power of computers and their convergence with telecommunications, the internet, and other media raised many new ethical issues.

In 1992, the ACM adopted a new set of ethical rules, the "ACM Code of Ethics and Professional Conduct", which consisted of 24 statements of personal responsibility.

Three years later, in 1995, Krystyna Górniak-Kocikowska, a professor of philosophy at Southern Connecticut State University, coordinator of its religious studies program, and a senior research associate in the Research Center on Computing and Society, proposed that computer ethics would eventually become a global ethical system and, soon after, would replace ethics altogether as the standard ethics of the information age.

In 1999, Deborah Johnson put forward a contrary view, arguing that computer ethics would not evolve into something new but would instead remain our old ethics with a slight twist.
