Give the responsibility for children back to parents! As an educator, I realize how much power and influence teachers have over children's lives. If educators are immoral or hold harmful worldviews, those views will rub off on the children and turn them away from the way their parents would have them grow up. Educators need to be taught ethics, morals, and respect for others in college, along with the subject matter they will teach. Parents need to step up and take charge of their children – not expect school to be the only discipline they get. Can we go back to the basics, please? Take responsibility for what you birthed…