One thing I find very interesting about today’s technology is that it’s basically built around the idea of making people even lazier than they already are. Now, some people may not view this as a bad thing, because in reality it saves them precious time and energy on tasks that could plausibly be done by a robot or a machine. But I personally think that if technology keeps progressing at the rate it’s going (JAPAN), we will just be left sitting in a chair, ordering around robots to do everything for us… Where have I heard that before?
Another subject that’s been in the back of my mind is what exactly defines one’s actions as “moral” or not. I guess you could say that an action is moral if everyone involved in it learns something valuable and can take a life lesson from it. But what if someone does something wrong, and everyone EXCEPT that person learns a valuable lesson… does that make the person’s actions moral?