Why does a state have any right to dictate how you treat your body? Something that important shouldn't be left up to individual states but decided for the entire country, because what happens if you cross state lines to get an abortion? A lot of states want to make that illegal. If it affects 50% of the population, then it should be left to the federal government and not the states.