Do We Really Need To "Teach" Ethics?

I always find it curious when universities feel compelled to introduce ethics courses to their student body.

This is because ethics essentially reflect, or at least are supposed to reflect, our core human values. If that is the case, then don't we already carry an awareness of such "human" values deep within each of us?

We should, that is, if we are card-carrying human beings. The idea that ethics, and hence core human values, need to be "taught" to human beings is a sign that something has gone awry.

Have we forgotten our basic human values, or even worse, have we stopped being human? When you look around this planet at the multitude of "inhuman," i.e. "unethical," acts, you might conclude that human beings have disappeared from the face of the earth. If you've ever seen the 1950s classic film "Invasion of the Body Snatchers," I think you'll see what I mean.

It is my feeling that there are still human beings on this planet, albeit in a deeply unconscious state. By unconscious I mean that they are unaware of who they truly are and of what they truly feel about their own choices.