February 21, 2013

THE WEEKLY BLAB

Volume 7, Issue 22 – February 21, 2013

 

Online Blues

I’m sure you’ve all seen an email that Meg Dillon sent on Tuesday, regarding an editorial in the New York Times entitled “The Trouble with Online College”.  You can see a copy of the full editorial here.  I’ve written a bit about MOOCs lately, but haven’t had a chance to talk about online learning in general, so this is a good excuse to do so.  I’ll start by analyzing the editorial.  I apologize in advance, because this will be a bit lengthy.  Please bear with me ’cause this is important.

The Times editorial began by talking about MOOCs and reminded us that Stanford’s course on artificial intelligence, taught by two “celebrity professors”, attracted 150,000 students.  They then gave a warning: “This development, though, says very little about what role online courses could have as part of standard college instruction. College administrators who dream of emulating this strategy for classes like freshman English would be irresponsible not to consider two serious issues.” 

Hmm…I talk with lots of college administrators (at least in Georgia), and I’m unaware of a single one who thinks that a MOOC is an acceptable (or even rational) way to teach freshman English.  My own view on MOOCs was written up in detail in last week’s BLAB (as well as in several earlier issues), and it’s not very positive.  It’s barely possible that college administrators think differently in New York, but I think it’s far more likely that the Times is setting up a straw man here—not the best way to begin an editorial.

The editorial goes on to address a real issue—withdrawal rates.  It says that huge online courses have a withdrawal rate of around 90% and that student attrition appears to be a problem even in small-scale online classes compared to face-to-face classes.  Hmm…The editorial doesn’t say that small-scale online classes have sky-high withdrawal rates, but since the 90% figure is the only number in the piece, it certainly implies that’s the case.  On our own campus, there is indeed a difference in withdrawal rates: face-to-face courses have an average withdrawal rate of 8% and online courses have an average withdrawal rate of 13%.  Neither of these numbers, of course, is anywhere near 90%.

Moving on, the editorial says that online courses “may be fine” for highly skilled, well-motivated students, but “are inappropriate” for students who are struggling and need close contact with instructors.  Note the loaded language here:  for the well-motivated, online courses may be fine, but for struggling students, online courses are inappropriate—apparently, no maybe about it.

Where does the editorial get its data?  Links are provided to a series of studies conducted by Columbia University’s Community College Research Center.  All of them are the work of Shanna Smith Jaggars, and most were co-authored with her research associate, Di Xu (a doctoral student).  Jaggars is the Assistant Director of the Research Center and the manager of a suite of studies, sponsored by the Bill and Melinda Gates Foundation, on community college student success.

The editorial reported that the studies indicated: “about seven million students — about a third of all those enrolled in college — are enrolled in what the center describes as traditional online courses. These typically have about 25 students and are run by professors who often have little interaction with students.” I wanted to see where this “little interaction with students” claim came from, so I looked at the studies.  The editorial seems to be referring to two of them—a 2011 study called “Online and Hybrid Course Enrollment and Performance in Washington State Community and Technical Colleges” and a 2013 study called “Adaptability to Online Learning: Differences Across Types of Students and Academic Subject Areas”, both available online in their entirety.  Both looked at large numbers of courses (more than 500,000 in the latter) taken by large numbers of students (40,000+) in community colleges in Washington State.  I couldn’t find anything about the amount of interaction between professors and students.  Perhaps this was a bit of editorializing by the Times, though it’s possible it showed up in one of the other studies that isn’t available online (and hence, that I didn’t read).

What did the studies show?  The more recent one showed a gap between persistence rates in online and face-to-face courses, with students completing 94.45% of face-to-face courses and 91.19% of online courses.  While this difference is statistically significant according to the researchers, it hardly seems earth-shaking.  Students also did a bit less well grade-wise in the online courses, with an average quality point level of 2.98 (just below a “B”) for face-to-face compared to 2.77 (a bit above a “B-”) for online.  Again, not an earth-shaking difference.  The study also found that some subgroups of students fared differently from the average: male students, black students, and students with less academic preparation did worse, while female students, older students, and students from various other ethnic groups did better.  The earlier study found similar results.

What were the studies’ conclusions?  Quoting from the latter: “Overall, our findings indicate that the typical student has some difficulty adapting to online courses, but that some students adapt relatively well while others adapt very poorly. To improve student performance in online courses, colleges could take at least four distinct approaches: screening [i.e., restricting online courses to better students], scaffolding [teaching online skills within the online courses], early warning [catching students in trouble early on], and wholesale improvement [improving the quality of online courses in general].

“Although many students face challenges in adapting to online learning, online coursework represents an indispensable strategy in postsecondary education, as it improves flexibility for both students and institutions and expands educational opportunities among students who are balancing school with work and family demands. Our results may help stakeholders involved in the planning, teaching, or supervision of online courses to consider strategies that will improve student outcomes in these courses.”

No disagreement here—the suggested changes are pretty reasonable stuff.  Scaffolding, early warning, and course improvement are obviously good things, and good online programs do them all.  Screening makes sense too—the most vulnerable students were those in remedial math and English courses, areas which most would agree are less amenable to online instruction.  The overall conclusion, that online coursework is an “indispensable strategy” that “improves flexibility” and “expands opportunity”, is right on the money as well.

The Times editorial is more inflammatory and goes on to say:  “Many students, for example, show up at college (or junior college) unprepared to learn, unable to manage time and having failed to master basics like math and English.  Lacking confidence as well as competence, these students need engagement with their teachers to feel comfortable and to succeed. What they often get online is estrangement from the instructor who rarely can get to know them directly. Colleges need to improve online courses before they deploy them widely. Moreover, schools with high numbers of students needing remedial education should consider requiring at least some students to demonstrate success in traditional classes before allowing them to take online courses.”  This seems to be editorializing by the Times again, since the two studies never use the word “estrangement” at all, and never claim that faculty in online courses rarely get to know their students.

The editorial concludes:  “The online revolution offers intriguing opportunities for broadening access to education. But, so far, the evidence shows that poorly designed courses can seriously shortchange the most vulnerable students.”

Overall, in my opinion, this was a pretty sucky editorial.  It alleged things about a lack of faculty-student engagement in online courses that the underlying studies never said; it made a ridiculous assertion about college administrators wanting to use MOOCs in freshman English; and it seemingly didn’t realize that online courses are already widely deployed, despite itself reporting that seven million students (about a third of all those enrolled in college) are taking online courses.

Let’s examine the very last line of the editorial again, because it’s quite striking: “But, so far, the evidence shows that poorly designed courses can seriously shortchange the most vulnerable students.”  There’s a certain “duh” aspect to the sentence—of course poorly designed courses shortchange students—and it’s all students who are shortchanged, not just the most vulnerable ones (though the most vulnerable are presumably in the worst position to overcome poor course design).  The more serious flaw in the sentence is the seeming lack of awareness that poorly designed courses of any type (online, face-to-face, whatever) shortchange students of all types.

What I like about the Community College Research Center’s papers is that they look at courses against several criteria that really matter:  are students able to successfully complete these courses?  Are students able to get decent grades in them?  Are there any subgroups of students having unusual success or unusual problems?  In other words, they’re looking at student success rates.  Despite the Times editorial’s general suckiness, at least its heart is in the right place—it too is concerned about student success, especially that of disadvantaged students.  Faculty should be concerned about student success rates for courses—we’re here to help students succeed while maintaining appropriate standards and meeting appropriate outcomes.  If any course (online or face-to-face or whatever) has a success rate lower than reasonable, we need to do what the CCRC’s research papers are doing:  (1) research why that is the case, and (2) suggest steps to try to fix the problem.

What I don’t like, in both the Times editorial and in some of the emails that have been going back and forth on the subject, is the way they ignore today’s and tomorrow’s reality.  Huge numbers of students are taking online courses successfully now.  The rate of increase is accelerating.  There are very real reasons for this increase—the increased flexibility and the expanded opportunities for students that the CCRC identifies as the major elements making online courses an “indispensable strategy”.  We can dig in our heels and try to hold the world back, but it’s a losing proposition.  There’s a heavy price for ignoring reality—increasing irrelevance.  Our focus should be on how best to improve our students’ success in all modalities: face-to-face, hybrid, and online.

 

 

Last Time’s Trivia Contest

Last time’s questions focused on words that rhyme with “MOOC”.  Our winner was Bob Harbort, who got all five within two minutes.  Here are the correct answers:

  1. What you call someone who acts crazy.  A Kook.
  2. John Wayne.  The Duke.
  3. Darth Vader was his father.  Luke Skywalker.
  4. Wealthy king of Egypt who was overthrown by Gamal Abdul Nasser.  King Farouk.
  5. The first selective one was introduced in 1928 by the Automated Musical Instrument Company.  Jukebox.

 

This Week’s Trivia Challenge

In honor of this weekend’s big event, today’s quiz will focus on “The Oscars”.  As usual, the first with the most takes the prize.  No looking up the answers now!  SEND ALL ENTRIES BY EMAIL TO zszafran@spsu.edu, since if you put them as a response on the BLOG, everyone will be able to see them!

  1. Not Felix—the sloppy one.
  2. South African bladerunner, arrested for allegedly killing his girlfriend.
  3. Broadway lyricist, partnered with Richard Rodgers.
  4. Canadian Jazz pianist.
  5. That is what I truly wanna be…

3 Responses to February 21, 2013

  1. Rich Halstead-Nussloch says:

    Hi Zvi,
    Well researched. Well put. Would you consider re-posting in the specific BLOG Russ Hunt put up for this topic? Thanks.
    Rich H-N

  2. I think that we’re entering a time when we’ll be asked, as institutions, to describe the value we provide to students beyond awarding individual course credits. As an institution, I believe we should articulate what we do that sets us apart from other academic credit providers (in whatever form they may take in the future). This will require a deeper understanding of how students progress (or don’t) through our programs, and defining goals for students that go beyond the learning outcomes of individual courses, perhaps encompassing the broader development of our students.

    One of the best models for taking into account the educational experience of students is the Community of Inquiry Model ( http://communitiesofinquiry.com/model ). It brings into focus the roles played by student-to-student interaction, student-to-faculty interaction, and the environment in which students learn. Others may disagree with the applicability of a constructivist philosophy in undergraduate courses, but I believe that our courses should provide more value to students’ personal development than the mere delivery of content.

  3. Bri Morrison says:

    Zvi,
    I agree with your assessment of the editorial, and I also believe that it took some liberties in presenting the data acquired through the studies. I am particularly intrigued by the suggestions made to improve student success in online classes: screening, scaffolding, early warning, and wholesale improvement. Currently, I believe that SPSU does an excellent job with one of these (wholesale improvement, based on TADL), a fair job with two of them (screening and early warning), and, as far as I am aware, nothing for the last one (scaffolding).
    I would love to see much more screening done for students who wish to enroll in online classes, including a required “orientation” to online courses (scaffolding). One of the things I struggle with is the early warning system…if an online student hasn’t “appeared” within the first two weeks of the online class, what mechanism do we have to draw them in? Other than reporting them as a no-show or not engaged, there isn’t much; it certainly doesn’t bode well for the rest of the course, and by this time we have already accepted their tuition. Perhaps the screening could include whether or not they have attempted a previous online course (and were unsuccessful, specifically for not being engaged).
    For the record, I agree that online courses are here to stay and that we *must* find a way to make them work so that students can be successful. Personally, I have seen a noticeable difference in the students who take and succeed in online courses today versus 3-5 years ago. The students are adapting to online courses, and many of them learn as much as (or more than) they do in traditional classes; however, I don’t yet feel that this describes the majority. Perhaps with some additional work on our part we can move toward that goal.
    Briana
