The Depressing Truth About Why Women Need College Degrees

Posted by Liz Dwyer

Since the early 1980s, more women than men have enrolled in college, and since 1996, women have earned more bachelor's degrees. Now, a new survey from the Pew Research Center may reveal why women are more likely to go to college: they need the degree to overcome the sexism of the working world.

Although the majority of Americans still believe going to college is essential for anyone who wants to "get ahead in life," even more people believe it's important for women. Seventy-seven percent of respondents believe women need a college degree to "get ahead," compared to only 68 percent who think men need a degree.

The results are a tacit admission of the role gender inequality still plays in our society. There's so much working against women looking for jobs—as this 2010 Foreign Policy index makes clear—that they have to be better educated than their male counterparts just to level the playing field. An otherwise capable man without a college degree is simply more likely to get a well-paying job than a woman with the same qualifications. The Pew results also back up the findings from the latest report from Georgetown's Center on Education and the Workforce, which found that women and people of color consistently earn less than white males, even when they are more educated or work longer hours. 

Given that women see a college degree as more essential to getting ahead, it's not surprising that more women than men gave America's higher education system a good grade. Fifty percent of female college graduates said college is a good value, compared to just 37 percent of male grads.

The belief that men don't need college hurts society as well as the individual men who forgo the increased earning potential that comes with higher education. College grads are also less likely to commit crimes and more likely to volunteer, vote, and be civically engaged, so a generation of less-educated men isn't something to celebrate.

And while it's encouraging to see that more women are being educated than ever before and that people of both sexes believe that it's beneficial for women to go to college, it's troubling that our commitment to educated women is driven by the understanding that it's a sexist world out there. The right result, perhaps, but for all the wrong reasons.

photo (cc) via Flickr user Tulane Public Relations