USA: Is College Worth It?
The answer to this question once seemed like a no-brainer. During the postwar boom, college was sold as a normal stage of life for young Americans, and a degree from a public university was a reliable route to a higher salary. It was often quite affordable as well, thanks to Pell Grants and far more generous public funding. Those days seem far away now.