First, for your review, the articles in question:
- Employers and Community College Students Aren't Sold on Online Degrees, Survey Finds written by Hannah Winston (other Chronicle articles by Winston)
- Not Yet Sold: What Employers and Community College Students Think about Online Education from Public Agenda and communicated through Carolin Hagelskamp, Ph.D.
Now that you've read them both, perhaps you can follow along with a cursory critique of the research being touted across two of my Chronicle email lists. I'm only going to look at a few glaring problems that debunk the entire study and the article that followed.
If you were to read the article without reading the study, you should find two glaring problems. The first is sample size. The article (and the study report) describe the population surveyed:
- 200 Community College Students as reported by the Chronicle, 215 as reported by the researchers
- More than 600 Employers as reported by the Chronicle, 656 as reported by the researchers
I really have two issues here. The first (call it 1A) is that literary license is taken for the sake of making the article more palatable, but the imprecision is lazy and leads the reader to think there is more weight behind the report than there really is. The second (1B) is aimed at the study itself. The sample is minute and certainly not representative of either employers or community college students. Let's do the math.
- 215 Community College Students equates to 0.0027% of the Community College enrollment reported in early 2013 (0.0028% of 2012 enrollment), according to a March 2013 IBISWorld Industry Report (61121 - Community Colleges in the US). This can't be confused with a representative sample, nor does the report describe how the sample might be extrapolated to be indicative of the whole Community College Student population.
- The student pool came from a single community college. The same IBISWorld report puts the number of institutions at 1,738, making the institutional pool 0.058% of the entire industry. Again, hardly representative.
I have a question about the student population too.
- What ages, majors, progress toward degree, and GPAs describe the students surveyed?
- 656 employers in 4 MSAs, each with over 50 employees, is hardly a representative sampling. A precise percentage of the whole is difficult to provide without extensive calculation, but with some 900 industry sectors and national employment of over 136 million according to the BLS, this is clearly not an encompassing or representative sample.
- The 4 MSAs are spread across the West Coast, Southwest, Northeast, and northern Midwest. This not only leaves out two regions of the country; it also covers just 1.05% of the 381 MSAs in the United States.
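The arithmetic above is easy to verify. Here is a minimal sketch; the enrollment figure is an assumption (roughly the 8 million implied by the 0.0027% claim), since the IBISWorld report itself isn't reproduced here:

```python
# Back-of-the-envelope check of the sample fractions quoted above.
# cc_enrollment is an assumed round figure implied by the 0.0027% claim,
# not a number taken directly from the IBISWorld report.
cc_enrollment = 8_000_000   # approx. early-2013 community college enrollment (assumed)
students_sampled = 215
institutions = 1738          # IBISWorld count cited in the post
msas_sampled, msas_total = 4, 381

print(f"students:     {students_sampled / cc_enrollment:.4%}")   # ~0.0027%
print(f"institutions: {1 / institutions:.3%}")                   # ~0.058%
print(f"MSAs:         {msas_sampled / msas_total:.2%}")          # ~1.05%
```

Each fraction matches the percentage cited in the list above, so the complaint is about representativeness, not the author's arithmetic.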
We also don't know much about the sample. Beyond the sample-size and representativeness concerns above, here are the questions that should be asked.
- What industries describe the employers surveyed?
- What disciplines were needed by the employers (medical, engineering, etc.)?
The article and study also show some clear bias. Winston begins the report by citing the for-profits and MOOCs, both of which have been on the negative side of the news recently. She goes so far as to name the largest for-profit, University of Phoenix, just to make certain the appeal-to-emotion fallacy sets in firmly. Full disclosure: at one point I worked for University of Phoenix and graduated twice from the school. I'm no fanboy, though. Those who know me know I am openly critical of the school's business practices and candid about its academics.
Public Agenda's study does the same thing with MOOCs but, to its credit, leaves out the for-profit bashing the Chronicle article includes so obviously.
Winston's Chronicle article also reports:

> And of the 200 community-college students surveyed, 42 percent said they had learned less from online courses than they had from learning in the classroom.
It would be easy to say that 58% of the students indicated the contrary, but we can't be certain until we read the report from Public Agenda, where the results indicate 53% learned about the same and 3% learned more than in a face-to-face class; Didn't Answer and Don't Know made up the other 2%. So a majority of students (56%) indicated that online was as good as or better than their traditional classes. Reporting the contrary shows the bias in the Chronicle.
Compare that to Public Agenda's other graph on difficulty and one sees that 38% found online classes more difficult, 39% about the same, and only 18% easier than traditional classes. Why couldn't that be reported?
Instead, Winston writes:

> According to the survey, students said not only were the online classes harder but they learned less.
That simply isn't the case, and the report shows it. Public Agenda's own discussion indicates as much; the data are being misinterpreted.
What Would Make this Report Better?
A Toastmasters mentor of mine always asked that question of his speeches. A few things could make this whole thing better.
Public Agenda Report
- The sample size needs to be increased significantly in order to support generalizations across the online education sector.
- The obvious bias against distance education needs to be removed.
- The specifications about industry and student demographics need to be reported and the report should limit itself to representing only those specific areas.
Chronicle Article
- Journalistic bias is a terrible thing, and it is no wonder the Chronicle let it past its editors. The Chronicle bashes distance education frequently, especially when articles (rightly or wrongly) loop in the for-profits.
- Winston should take a deeper and more critical look at the report, the study, the research, and the discussion.
- The article should accurately report the study findings even if the study doesn't do the same.
Now I can come to bed (see the cartoon at the top).