Women earn college degrees at a higher rate than men, and they are also more likely to think their education was worth it.
A recent survey by the Pew Research Center found that about 50 percent of women who graduated from a four-year college gave the higher education system excellent or good marks for its value relative to the cost.
Only 37 percent of men surveyed gave it the same high marks.
More women than men also said they felt their college education helped them grow professionally and intellectually. Could this have anything to do with the fact that society tends to think women need an education more than men do?
"While a majority of Americans believe that a college education is necessary in order to get ahead in life these days, the public is somewhat more inclined to see this credential as a necessity for a woman than for a man. Some 77 percent of respondents say this about women, while just 68 percent say it about men," researchers reported.
What do you think? Is a college education more beneficial for a woman?