First, I have to (pun intended) recognize your time and intellectual effort in making your case. I don't have much time at the moment, so I will be brief. If you feel I have slighted some particular point you would like me to address -- please tell me! I will do my best to address it.

I understand the assumption that free will is a prerequisite to morality. I don't think it is, but I believe I understand the perspective. I have just posted a new essay, at Stacker's unwitting behest, which shows how my perspective plays out in the moral realm. You may not agree with my position, but it should at least map out where I stand on morality in more detail.

You say at one point: "When I act or don’t act in a certain way I cannot do differently than I do, but when it goes well or badly I can adjust my morality so that in the future I act differently. I can place myself in situations where I could not act differently than I did and so act morally."

I do not disagree with this statement as I believe you intended it. I only disagree with the idea that free will exists as an uncaused causal event that is the source of your moral correction. As conscious beings, everything we experience has the potential to contribute to our future decisions. Thus, to use a noxious but useful metaphor, our software can get better over time (see the toy sketch at the end of this reply).

We differ, I think, in that you believe the essence of "you" is a super-physical (if not super-natural) decision-making entity -- whereas I believe that what separates "me" from the average mechanical process in the rest of nature is consciousness. It is consciousness that makes the construction of the complex, but wholly physical, algorithms of decision-making possible.
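Since I leaned on the software metaphor above, here is a minimal, purely illustrative sketch (in Python; the names and numbers are mine, invented for the example, and nothing in my argument hangs on the details) of a fully deterministic decision-maker that still corrects itself over time:

    # A toy, fully deterministic "agent": every choice is fixed by its
    # current state, yet outcomes feed back so future choices can differ.
    class Agent:
        def __init__(self):
            # History-shaped weights for each available action.
            self.weights = {"act": 0.0, "refrain": 0.0}

        def choose(self):
            # Given the same weights, the agent "cannot do differently
            # than it does" -- the choice is fully determined.
            return max(self.weights, key=self.weights.get)

        def learn(self, action, outcome):
            # Experience (good or bad) is folded back into the weights.
            self.weights[action] += outcome

    agent = Agent()
    first = agent.choose()    # determined by its initial state: "act"
    agent.learn(first, -1.0)  # it went badly; the experience adjusts it
    second = agent.choose()   # now determined to choose "refrain"
    print(first, second)      # act refrain

No step in that loop is uncaused, yet the system's behavior improves with experience -- which is all I am claiming our conscious, wholly physical decision-making needs.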