Expanding our Academic Challenge Distinction beyond the First Year

Since 2011, two national studies of successful learning outcome improvement through educational assessment have highlighted our efforts at Augustana College.  First, the National Institute of Learning Outcomes Assessment (NILOA) published a report detailing the ways that a small group of uniquely successful institutions developed and maintained a positive culture of assessment and improvement.  Second, the National Survey of Student Engagement (NSSE) conducted an in-depth study of eight institutions, chosen from an original pool of 534 colleges and universities that had made significant gains on various NSSE benchmark scores, to identify some of the organizational values and practices that allow these institutions to make such clearly demonstrable improvements in their educational environments.

The data point that most clearly jumped out to both research teams involved the degree to which our first-year scores on the NSSE Academic Challenge benchmark increased between 2003 and 2009.  This benchmark scale asks a series of questions about the amount of time and effort students must put into their coursework to meet academic expectations, and it has been a staple of both NSSE and the Wabash National Study.  As many of you know, we can trace our own improved Academic Challenge scores to the overhaul of our general education and LSFY programs about seven years ago, when a preponderance of earlier data simply didn’t comport with the kind of institution we wanted to be.  And even though we continue to note, discuss, and tweak perceived weaknesses that have emerged since implementing AGES, we shouldn’t let these more recently identified concerns detract from the fact that our earlier efforts were thoroughly successful in improving the educational quality of Augustana’s first-year experience.

Yet the evidence of an improved educational environment (as represented by an increase in the academic challenge experienced by our students) did not seem to extend beyond the first year.  In our 2009 NSSE report, despite a significant difference in first-year Academic Challenge scores between Augustana and a group of 30 similar residential liberal arts colleges, our fourth-year scores were no different from those of other institutions.  Many of us were troubled by the possibility that the distinction in academic quality that we might have established in the first year could have eroded entirely by the end of the fourth year.  Although Senior Inquiry was intended to help us increase our level of academic challenge in the fourth year, the 2009 NSSE report did not reflect any impact of that effort (likely because SI was not fully implemented until 2010 or 2011).  So when we received our Wabash National Study four-year summary report a few weeks ago, I specifically wanted to examine our seniors’ overall score on the Academic Challenge scale to see if we’d made any progress on this rather important measure of educational quality.

(At this point, the empathetic side of my brain/soul/elven spirit/gaseous particles has guilted me into offering a pre-emptive apology.  I am going to talk about some numbers without giving you all the detailed context behind those numbers.  If you want more context, you know where to find me.  Otherwise, try to hang in there and trust that the changes these numbers represent are substantial and worth discussing.)

The Wabash National Study evidence suggests that, once again, our efforts to respond to assessment data with changes that improve Augustana’s educational quality have borne fruit.  Between 2009 and 2012, our seniors’ Academic Challenge score jumped from 62.6 to 64.3 – a statistically significant increase.  Moreover, the difference between our mean score and the average Academic Challenge score of the 32 similar institutions that participated in the Wabash National Study (61.0) was statistically significant – suggesting that something we are doing during the fourth year distinguishes the academic quality we provide from that of those institutions.  For my own information and confidence in this conclusion, I also looked at the 2012 NSSE annual report to see if these Wabash Study numbers differed in any meaningful way from the much larger sample of institutions that participated in NSSE.  Again, our Academic Challenge scores placed us above the NSSE average of similar liberal arts institutions (62.5) and well above the overall NSSE average (58.4).
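(For readers curious about what “statistically significant” means operationally here: at bottom, these comparisons are tests of whether two mean scores differ by more than chance would allow.  A minimal sketch of that kind of test – using synthetic stand-in numbers, not the actual Wabash or NSSE microdata – might look like this.)

```python
import numpy as np
from scipy import stats

# Synthetic stand-ins for student-level Academic Challenge scores;
# the means mirror the figures quoted above, but the samples are invented.
rng = np.random.default_rng(1)
ours = rng.normal(64.3, 10.0, size=190)    # our senior respondents
peers = rng.normal(61.0, 10.0, size=3000)  # pooled comparison institutions

# Welch's two-sample t-test: is the difference in means reliable?
t_stat, p_value = stats.ttest_ind(ours, peers, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```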

All of this evidence seems to point toward a familiar and heartening – if not downright exciting – conclusion.  Our efforts to improve the educational quality of an Augustana experience are working (or as the famous line goes, “I love it when a plan comes together!” . . . yes, I just quoted Hannibal Smith from the 1980s TV show “The A-Team” in a blog about institutional research.  I’m fired up – deal with it.).  The academic challenge our students experience in their fourth year appears to have increased.  And while we don’t have comparative data on the degree to which this effort has increased our students’ learning outcome gains (because we don’t have identical pretest-posttest outcomes data from 2009), it is clear from the Wabash National Study data that our 2012 participants repeatedly made larger learning outcome gains than students at the 32 similar institutions participating in the same study.

Later this year we will receive the full Wabash study dataset that will allow us to examine the responses to each individual question in this scale.  I am looking forward to digging deeper into that data.  But for the time being, I think we deserve to take a moment and congratulate ourselves as a community of educators dedicated to the success of our students.  Although we continually hear critics of higher education lament that institutions refuse to collect the kind of data necessary to meaningfully assess themselves, or that faculty perpetually resist making the kind of changes that might substantively improve an institution’s educational quality, we now have multiple sources of evidence to demonstrate that, while we might not be without reproach, we have living, breathing evidence of our successful efforts to improve the Augustana education.

Are we there yet?  No.  Will we ever be there?  Of course, not.  But are we genuinely walking the walk of an institution committed to its students and its educational mission?  Absolutely.

Make it a good day,

Mark


Do students’ GPAs suffer when they take more classes?

One claim (given as advice) that I’ve heard ever since I was a plump, pimple-faced college freshman is that taking a heavier academic load in a given term (no matter the calendar) increases the likelihood that one’s grades will suffer.  It seems intuitive:

more classes (and thus more homework) ÷ the same number of hours in a week = less study time to allocate to each class, and therefore potentially lower grades

At Augustana we are understandably sympathetic to this concern because of the degree to which we often try to pack an extensive amount of learning into our shortened academic terms while maintaining the comparatively higher number of in-class hours that we require for a credit hour.  Many of us can weave a harrowing tale of students swamped by the academic requirements of a four-course term, but it would be wise to wonder whether our individual anecdotes actually represent the experiences of most students.  So a few weeks ago, we decided to examine this widespread belief empirically.  Since this concern is often raised by faculty and administrators when discussing the merits of potential policy changes, the hypothesis seemed especially worth testing.

So we examined our students’ term-by-term GPAs over the last three years (nine terms from the fall of 2009 to the spring of 2012), comparing the GPAs of students who attempted between 8 and 11 credits – fewer than four three-credit courses – with the GPAs of students who attempted 12 or more credits – four three-credit courses or more.  We conducted this analysis in two stages.  In the first analysis we only tested whether the number of credits attempted significantly affected students’ end-of-term GPA.  In the second analysis, we accounted for two potentially confounding factors – (1) a student’s pre-college academic ability and (2) a student’s year in school – to make sure that any statistically significant effect we might find wasn’t a function of another plausible explanation.
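(For the methodologically curious, here is a minimal sketch of that two-stage design: stage one is a simple two-group comparison of term GPA, and stage two is an ordinary least-squares model holding ACT score and year in school constant.  The data below are synthetic stand-ins – the real analysis used nine terms of registrar records, and the variable names and effect sizes here are invented purely for illustration.)

```python
import numpy as np
from scipy import stats

# Hypothetical records: credits attempted, ACT score, year in school, term GPA.
# Synthetic data only -- NOT the actual registrar dataset described above.
rng = np.random.default_rng(0)
n = 500
credits = rng.choice([9, 10, 11, 12, 13], size=n)
act = rng.normal(25, 3, size=n)
year = rng.integers(1, 5, size=n)
gpa = np.clip(2.8 + 0.04 * (act - 25) + 0.05 * (credits >= 12)
              + rng.normal(0, 0.3, size=n), 0.0, 4.0)

heavy = credits >= 12  # four or more three-credit courses

# Stage 1: simple two-group comparison of end-of-term GPA.
t_stat, p_value = stats.ttest_ind(gpa[heavy], gpa[~heavy], equal_var=False)

# Stage 2: OLS holding ACT and year in school constant; the coefficient
# on `heavy` is the credit-load effect net of those two confounders.
X = np.column_stack([np.ones(n), heavy.astype(float), act, year])
beta, *_ = np.linalg.lstsq(X, gpa, rcond=None)
print(f"unadjusted t = {t_stat:.2f}; adjusted heavy-load effect = {beta[1]:.3f}")
```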

Our first set of analyses surprised us, because we thought we’d find one of two possible outcomes: either the reigning hypothesis would hold true or we would find no significant difference between the two groups.  So we were pretty shocked to find that in every academic term from the fall of 2009 through the spring of 2012, students who attempted 12 or more credits, on average, earned a HIGHER GPA (between .05 and .12 points) than those who attempted 8-11 credits.  Huh?

In the second stage of our analyses, we held constant students’ incoming ACT score and year in school.  At this point, I was sure that we’d end up with insignificant findings.  Instead, the finding from our first analyses held throughout.  Not only did students taking a heavier load not suffer a lower GPA for that term, but their GPAs (no matter their year in school or their incoming academic ability) were marginally higher.  Huh.

So what does this mean?  Certainly, the obligations of a heavier credit load can adversely affect a student’s stress level or sleep patterns even if they don’t necessarily affect grades.  And unfortunately, the only data we have readily accessible are term-by-term GPA and term-by-term credits attempted.  In addition, the findings might be different if we looked at each student’s term-by-term GPAs longitudinally instead of comparing all students cross-sectionally within a given term.  However, students must pay overage fees to take more than 33 credits a year, so the chances that a substantial portion of students consistently took 12 or more credits, earned strong grades, and thereby skewed this finding are pretty low.  In the end it seems that a heavier credit load doesn’t affect students’ grades in the way that we might have thought.

I wonder if this finding exemplifies a disconnect between the way that we tend to think students engage with college and the way that they actually manage their college experience.  For years we have lamented the difference between the amount of time we think our students should study and the amount of time our survey data suggests that they actually study.  Yet these same students graduate with an average GPA of 3.3, an increasing number of them graduate with honors, and many of them go on to successful, challenging professional lives.  And lest some want to resurrect the allegation that this is further evidence of the corrosive effects of grade inflation: (1) we have multiple sources of evidence suggesting that our students make more than respectable gains on various learning outcomes, and (2) we tested the grade inflation claim last year and found it to be explained by increases in our students’ incoming ACT scores over the past two decades.

I wonder if this is an indication that students are more capable of prioritizing their time and effort than we sometimes give them credit for.  And while I’m not suggesting that this finding should be used to require a heavier academic load every term, I wonder if we take our feet off of the academic gas pedal a little too easily sometimes – which is easy to do in the face of a roomful of scowling students to whom you have just assigned additional work.  One student experience measured in the Wabash National Study that was particularly predictive of learning gains was the degree to which students were challenged to work harder than they thought they could to meet their instructors’ expectations.  Our finding regarding grades and course load suggests a similar conclusion.  If we push our students, they might surprise us.

Make it a good day,

Mark


Applied Learning Opportunities and Perceptions of Worth

In 2007 the Association of American Colleges and Universities (AAC&U) published College Learning for the New Global Century to launch the Liberal Education and America’s Promise (LEAP) initiative.  This document asserted a new way of conceptualizing the primary learning outcomes of a college education, focusing on four categories of transferable knowledge, skills, and dispositions:

  • Knowledge of Human Cultures and the Physical and Natural World
  • Intellectual and Practical Skills
  • Personal and Social Responsibility
  • Integrative Learning

It wasn’t as if the shift from a focus on content knowledge acquisition to an emphasis on transferable skills and dispositions was a brand new idea.  But the public nature of this assertion from one of the major associations of colleges and universities – if not the major association – made a powerful statement to postsecondary institutions of all kinds that the cafeteria-style content acquisition that had dominated most college curricula was no longer sufficient to prepare students for post-graduate life.

Throughout College Learning for the New Global Century, AAC&U urged colleges and universities to find ways for students to apply their learning in experiential settings.  They repeatedly cited the substantial body of research supporting the educational importance of application for deep and transformative learning.

At Augustana we’ve put a high value on these kinds of experiences, and our survey of seniors last spring directly asked about the degree to which students’ out-of-class experiences helped them connect what they learned in the classroom with real-life events.

Our seniors’ responses looked like this:

Response              N      %
Strongly Disagree     2     0%
Disagree             13     3%
Neutral              77    15%
Agree               271    53%
Strongly Agree      141    28%

It is certainly heartening to see that more than 80% of our seniors indicated “agree” or “strongly agree.”  Moreover, this data confirms that many of the experiential opportunities that we provide for our students seem to be functioning in an educational capacity rather than simply serving as a respite from academic pursuits.  Analyses of other data from our participation in the Wabash National Study demonstrate that our students who engage in applied learning experiences make greater gains on a variety of learning outcomes than our students who do not.

But I want to point out another side of this finding that I think is worth considering.  I think that this data may be instructive as many of us – faculty, staff, administrators, and board members – continually try to make the case to prospective students and their parents that an Augustana education is worth the price they are asked to pay.  Moreover, not only does this data point help us focus our assertion that Augustana provides an education that is worth the cost, but I believe it should point us toward the way we need to think about the important yet slippery (and sometimes even a little bit uncomfortable) concept of “value proposition.”

At the end of the summer we analyzed our senior survey data to see if we could identify specific student experiences that increased the likelihood that our seniors would, if given the chance to relive their college decision, definitely choose Augustana again.  I think this is an important outcome question because it suggests the degree to which our seniors think that the money they spent to attend Augustana was worth it.  Since without tuition revenue we are out of business, this is an aspect of our work that we simply can’t ignore.

Our analyses revealed that the degree to which our seniors’ out-of-class experiences helped them connect their classroom learning with real-life events significantly increased the likelihood that they would definitely choose Augustana again.  I’d like to emphasize that we were testing whether students would DEFINITELY choose Augustana again – not “maybe” or “probably.”  In essence, in addition to being an important driver of student learning, I think our seniors explicitly recognized the educational value of these experiences.  As such, they were more than able to connect this educational value with the long-term benefits of the financial investment they had made.

I would suggest that this finding can guide the way that we talk about the value or worth of an Augustana education AND the way that we think about the admittedly amorphous notion of a value proposition.  At its essence, “value proposition” is supposed to represent the maximum synergy between the value promised by an institution and the student’s perception that this value will be fully delivered.  The difficulty – and the temptation, and sometimes the suspicion – is that the folks who concentrate on establishing and strengthening a value proposition tend to focus more on the glitz of the marketing than on the quality of the product.  Nonetheless, whatever your opinion of the phrase, it’s hard to deny the concept’s importance.

In the context of this notion of value proposition, the data point I’ve described above puts me in mind of the famous line from the movie Field of Dreams: “If you build it, he will come.”  (No, it’s not “they” . . . and yes, I was surprised too.)  Every college in the country right now is pulling out all the stops to create the most persuasive marketing campaign.  While we have admittedly been doing the same thing, we have also been concentrating on building an educational experience that is as fundamentally effective as it is precisely interwoven.  We may not have perfected our product, but we have developed an educational experience that is consistently producing robust evidence of strong learning outcomes.  I would humbly suggest that the key to maximizing our value proposition lies in the product we build.  More than simply listing all of the experiential learning opportunities in which students can participate, when we can explain to students how each of these experiences is designed to help them apply and solidify an important aspect of their learning and development toward the person they aspire to be, we make a case for an Augustana education that is substantially more nuanced, adaptable, and compelling than the argument that prospective students hear from most other institutions.

I believe this is a way that we can ultimately communicate distinctiveness in a manner that is both powerful and personal.  More importantly, it allows us to live a story that never stops getting better.  And at the end of the day, that sure feels like we are doing what we were meant to do.

Make it a good day,

Mark


Hey, . . . how did we do that???

Welcome back!  I hope your engine is recharged for the spring term.

You might remember that about this time last year I was talking to anyone who would listen about the importance of the final round of data collection for the Wabash National Study of Liberal Arts Education (WNS).  The WNS was designed to combine learning outcome measures with student experience and pre-college characteristics data so that institutions could (1) assess student change over time on specific learning outcomes and (2) begin to identify the experiences that influenced that progress.  Augustana joined the third and final iteration of the WNS in 2008, so 2012/13 was our make-or-break year to get data from as many seniors as possible.  Since the study measured change over time, participation in the study would have been a giant waste of time without senior-year data.  After a nearly herculean effort and a paper bag full of gift cards to the Augie bookstore, we were able to entice about 190 seniors to participate – 120 of whom had also provided data during their freshman year.  Altogether, this dataset gives us a chance to thoroughly analyze the learning experience of a fairly representative sample of our 2012 graduates and make some generalizations about our overall educational effectiveness.

Last week we received the first of several long-awaited reports outlining our students’ results on the learning outcomes measured by the WNS.  I’d like to share one particular finding (I’ll share others with you over the course of the spring term) and ask your help in thinking about what might be behind it.  It’s not quite “a riddle wrapped in a mystery inside an enigma” (thank you, Winston Churchill), but it’s got me flummoxed.

One outcome of particular importance to religiously affiliated liberal arts colleges is moral and ethical judgment.  For a lot of reasons we hope that our students develop a sophisticated sense of the principles and values that shape their understanding of right and wrong.  Moreover, we hope that our graduates act as principled citizens who stand up for those values even in the face of pressure to conform or fear of reprisal.

It turns out that Augustana students made remarkable gains on the WNS measure of moral judgment.  In fact, our students’ gains were on average 50% larger than the average gains made by students at the 32 other small colleges that participated in the WNS.  Digging a little deeper, virtually all of that advantage (i.e., the 50% larger gain noted above) occurred during the first year.  After making substantially larger first-year gains than students at comparable institutions, our students’ growth from the sophomore through the senior year did not differ substantially from that of students at other institutions in the study.  In other words, our students raced out to a big lead during the first year and held it through to graduation.

This finding is both exciting and, to be honest, a little troubling.  First, it is exciting that we now have some hard evidence to support our claim that Augustana graduates develop deeper and more sophisticated moral and ethical judgment.  One of the major criticisms of higher education institutions is that we make bold claims with very little proof to back them up.  Now we can say with some degree of certainty that we do what we say we do.

However, there is something about this finding that troubles me – and is the issue that I’d like your help with.  The findings from the WNS suggest that the bulk of our students’ growth in moral judgment happens during their first year.  Since we would like to think that we have intentionally designed the educational experience of our students, then we should be able to point to the program or combination of programs that likely produce this remarkable gain in moral judgment.  This is from whence my flummox cometh.

Now if we were only interested in proving our educational value, this data would make me think something along the lines of “game, set, match Vikings.”  But our interest in assessing student learning shouldn’t be merely about validating claims that we’ve already made – that is a dangerous game to play, to be sure.  Rather, I want to know how we can do what we do just a little bit better.  Instead of merely proving our worth, I’m interested in improving our quality.

And I don’t think I can pinpoint any particular program that is designed to influence this outcome.  Our only curricular mandate for first-year students is the LSFY sequence.  Are there other courses to which we might attribute these gains, such as the Christian Traditions course?  I know the faculty who teach those courses do wonderful things, but I’m not sure the focus of that course is developing moral judgment.  Is there a program designed for first-year students that is run by residence life or student activities?  I just don’t know.

The reason it seems important to identify the experiences driving this gain is that we should want to build on this finding and figure out ways to extend something that we are already doing well.  And this is where I’m stuck.  What are we doing that is working?  Is this just luck?  Coincidence?  I’d like to think not.

It seems pretty likely that there is something going on here that sets us apart from the other schools in the WNS.  The number of participants in the study and the size of the difference in gains is just too large for this to be a function of random chance.  So if you have an idea of what might be influencing our students’ gains in moral judgment, please post it in the comments section.  For us to be best able to (1) make our case as an institution to prospective students and families, and (2) maximize what we do in a way that takes full advantage of our talents and resources, we need to figure out what is driving these gains.

Make it a good day,

Mark