America was once a proud nation. A moral nation. One Nation Under God, where harlots, adulteresses and femi-boot goosesteppers dared not show their faces lest they be branded crimson with sin.
But as time has progressed, so has the temerity of the female sex. American women were once chaste doves of righteousness, vigorously flapping about to ensure the happiness of their husbands and children. But where have these women gone?
Throughout the nation, in food courts, overweight children with low self-esteem fill up on Taco Bell and Mountain Dew, because mother is not home to take care of them. She may be out wildly partying with her 'liberated' friends at drug-orgy raves, or engaging in unknown acts of fornication with countless men at the office, because she long ago betrayed the man who would have been her husband.
Scholars ponder what is causing the quick and tragic demise of America. Is it our politicians, or our commitment to war to ensure the safety of the lesser nations? Is capitalism somehow flawed? While liberals will point to all these things, there is one truth that reigns paramount.
Until mothers instill proper values in their daughters, America will continue to decline.