Is Christianity Bad for Women?
Christianity teaches that men and women equally bear the image of God. In many quarters of Christendom, however, the roles assigned to men and women fall far short of equality. In contemporary American society, women have gained ever-greater opportunities for leadership and stronger legal protections against gender-based harassment and violence. Yet Christianity has often been seen as an obstacle to such progress. What do the diverse traditions within Christendom really teach about the place and treatment of women in the church, the family, and society? And do the church's actual practices match that teaching?