Taking advantage of a culture of student involvement

Some of you have expressed concerns about our students’ high level of involvement in extra-curricular activities and its potential effect on academic engagement as well as mental health.  There seems to be plenty of evidence to support this concern – some of the most poignant being the writings of our students themselves in the Observer and on the Augie Blog.

Yet there is one lesser known data point regarding our students’ social behaviors that reflects well on a cultural norm within the student community.  In addition, I’d like to suggest that we might learn something from our students’ social behaviors that could be a powerful lever in deepening academic engagement.

A question on the National Survey of Student Engagement asks students how often in the last year they “attended an art exhibit, play, dance, music, theatre, or other performance.”  The response options are 1=Never, 2=Sometimes, 3=Often, and 4=Very Often.  I won’t focus on the absolute numbers here, mostly because there is no consensus ‘window’ within which we want our students’ responses to sit.  Rather, I want to present this data in the context of a comparison with comparable private liberal arts colleges and consider what that comparison might suggest.

It turns out that both our freshmen and seniors attend these kinds of performances substantially more often than students at comparable private liberal arts colleges. In fact, the difference in average response between our students (freshmen – 2.70, seniors – 2.60) and students at comparable colleges (freshmen – 2.44, seniors – 2.36) is highly statistically significant, meaning that it is very unlikely to be due to chance and is probably attributable to something happening here.
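If you’re curious how a claim like that gets tested, here is a bare-bones sketch in Python.  The means are the ones quoted above, but the standard deviations and sample sizes are purely hypothetical placeholders (I’m not reproducing the actual NSSE files in a blog post), so treat it as an illustration of the method rather than the analysis itself.

    # Sketch of the freshman comparison described above.
    # Means (2.70 vs. 2.44) come from the text; the standard deviations and
    # sample sizes below are hypothetical, not our actual NSSE data.
    from scipy import stats

    t, p = stats.ttest_ind_from_stats(
        mean1=2.70, std1=0.85, nobs1=550,    # our freshmen (std and n assumed)
        mean2=2.44, std2=0.85, nobs2=5000,   # comparison-group freshmen (std and n assumed)
        equal_var=False,
    )
    print(f"t = {t:.2f}, p = {p:.4g}")       # a very small p-value is what "highly significant" means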

This suggests to me that there is something in the student culture that encourages and values supporting the arts.  Our students place a relatively high value on attending these performances and supporting the friends involved in them.  This finding corroborates independently gathered anecdotal evidence from the Office of Student Services.

What does this have to do with deepening academic engagement? If students are in the habit of supporting their friends’ co-curricular accomplishments, I would suggest that this apparent cultural norm provides a real opportunity to increase the relative value of and interest in students’ academic accomplishments. Public presentations of student scholarship can serve as a spark to inspire informal conversations among students about intellectual ideas and their application to the world around them.  Although public presentations are traditionally associated with the fine arts, there is plenty of evidence to suggest that placing a greater value on public presentations or exhibits of scholarship can deepen academic engagement outside of class across many disciplines.

By the way, our NSSE data also indicates that our students don’t talk about ideas from readings or classes with others outside of class at the same rate as students at comparable institutions. Hmmm . . .

Make it a good day,

Mark

Getting a handle on academic rigor

Like most colleges and universities, we believe that we should establish an educationally rigorous environment.  Unlike a lot of colleges and universities, we have a healthy body of quantitative and qualitative evidence from which we can explore 1) whether this is in fact the case, and 2) whether appropriate academic rigor is experienced by students across the board or only in certain situations.

As you may know, for well over a decade we have been using various assessment mechanisms to measure student learning and academic rigor.  It seems that our efforts to increase our educational effectiveness and academic rigor have borne some fruit – especially on our National Survey of Student Engagement (NSSE) Academic Challenge scores among freshmen.  Those numbers have jumped markedly since we first used NSSE back in 2003.

Yet one of the hallmarks of a college that is truly focused on continual improvement is a perpetual inclination to ask questions, to compare findings with what we might already suspect or know by different means, to face what we uncover, and to take action.

With that in mind – and in light of my perpetual effort to help us all embrace a formative spirit – I’d like to present two data points from the 2006 and 2009 NSSE surveys that seem especially worthy of further consideration.

In both 2006 and 2009, Augie students were asked how often they “come to class without completing readings or assignments.” (The response options are 1=Never, 2=Sometimes, 3=Often, and 4=Very Often.)  I would propose that the one thing we would not want to see is seniors coming to class unprepared more often than freshmen.  Unfortunately, that is exactly what appears to be happening.  And if you were wondering, the difference between the average freshman and senior responses is large enough to be statistically significant.

NSSE Year   Freshmen   Seniors
2006        1.91       2.13
2009        1.84       2.20

To add insult to injury, the change from the freshman to the senior year looks even worse when comparing our 2009 data to that of other small liberal arts colleges.  In this context, our freshmen actually come to class prepared significantly more often than freshmen at comparable institutions.  However, our seniors come to class prepared significantly less often than seniors at comparable institutions.

Does this match what we already suspect?  Are we ok with it?  How might we address this issue?

Make it a good day,

Mark

A reason to be proud of our efforts to improve student success

There was a time in higher education when an institution’s attrition rate was a point of pride and a supreme marker of academic rigor.  More recently, it sometimes seems as if retention and graduation rates have actually surpassed educational growth as markers of institutional quality.  In reality, these two markers are clearly intertwined.  Although educational growth is paramount, such growth seems a bit empty if an institution is also hemorrhaging students somewhere between matriculation and graduation.


So if a college could actually demonstrate substantial educational growth while simultaneously increasing retention rates, the faculty and staff at that institution would have a real reason to take great pride in their collective accomplishments.


It is becoming clear that Augustana College is one such institution.  We now have both direct and indirect evidence of educational growth.  On the Collegiate Learning Assessment (CLA), which we used to measure growth in critical thinking skills between 2005 and 2009, our students improved by 28 percentile points – double the average of the students participating in the Academically Adrift study.  In addition, during the last 10 years our NSSE Academic Challenge Benchmark scores have improved significantly among first-year students – an accomplishment that was recently highlighted by the National Institute for Learning Outcomes Assessment.


By itself, this is well worth a hearty pat on the back.  However, it looks even better in the context of our increases in retaining students to the second year.  For a long time, Augustana’s first-to-second year retention rate hovered around 85%.  Three years ago, we retained the freshman class of 2008 at 82%, sparking some concern as the size of our incoming 2009 class also dropped.  After an increased focus on supporting struggling first-year students, our retention rate for the 2009 class jumped to about 87%.  But we weren’t sure if that was an anomaly or a true reflection of our efforts.


Now that we have locked in our 10th day enrollment data this fall, we are able to look at our first-to-second year retention rate for the incoming class of 2010.  Some of us had wondered aloud whether our retention rate with this class would take a hit, presupposing that it’s a lot easier to retain students from a class of 616 (the freshman class of 2009) than from a class of 753 (the freshman class of 2010).


However, our retention rate for the 2010 freshman class remained steady at 87.5%. Thus, despite an increase in class size of 137 students, we maintained Augustana’s highest retention rate on record.  Your efforts to help students succeed in the first year are bearing fruit.  We have a lot of reasons to be very proud of our community.


Make it a good day,


Mark

Is grade inflation just a bunch of hot air?

I suspect that almost everyone has heard the “it was better in the good ol’ days” claim … if we haven’t used it ourselves from time to time.  I would suggest that we have an academic version of this claim at Augustana.  The claim argues that there has been substantial grade inflation over the past several decades.  Apparently, this claim has carried some weight over the years, because we have created multiple mechanisms to prevent grade inflation – or at least stem the tide.

Luckily this is a claim we can test.  But before looking at the data, let’s make sure we share an understanding of this claim.  An assertion of grade inflation boils down to two points.

1) Grades have been creeping upward.

2) This is because faculty have shifted expectations for performance downward.

Grade inflation doesn’t just make an observation about changes in GPA; it also attributes the change to the failure of colleagues to hold the line on academic rigor.  In the context of a small college, it’s sort of a less physically damaging version of a circular firing squad.

So, testing this claim turns into two questions.  First, have grades gone up over time? And second, can we conclusively attribute this change in GPA to faculty grading practices?

Have grades gone up over time?    

Yes.

From about 1991 to the present, the average GPA of each class went up by about .15 of a grade point, whether you look at each entering cohort’s end-of-year grades from the first year to the fourth year or at each subsequent cohort’s end-of-year grades from 1991 to 2010.

Can we conclusively attribute this change in GPA to faculty grading practices?

No.

First, the increase in average GPA for each cohort from the first to the fourth year is predominantly explained by the departure – voluntary or otherwise – of students who struggled academically.  If you slice that group off the bottom of a class at the end of each year, and then factor in the likely influence of maturity and motivation among the students who remain, you would fully expect the average GPA of a particular cohort to rise over time.
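For the skeptics, here is a toy simulation – entirely made-up numbers, not registrar data – showing how simply removing the lowest-performing students each year pushes a cohort’s average GPA upward even if no one’s grading changes at all.

    # Illustrative only: simulate how attrition among low-GPA students raises
    # a cohort's average GPA even when grading standards never change.
    import numpy as np

    rng = np.random.default_rng(0)
    gpa = np.clip(rng.normal(3.0, 0.5, size=600), 0.0, 4.0)   # hypothetical first-year GPAs

    for year in range(1, 5):
        print(f"Year {year}: n = {gpa.size}, mean GPA = {gpa.mean():.2f}")
        cutoff = np.quantile(gpa, 0.08)    # suppose roughly the bottom 8% leaves each year
        gpa = gpa[gpa > cutoff]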

Second, from 1991 until 2010 the average ACT score of our incoming students improved by a full point – from 24.5 to 25.5. Since the ACT itself remained constant during that period, we can test whether the increase in GPA might be explained by the increase in students’ incoming academic ability.  It turns out that this increase in average test scores explains virtually all of the change in GPA over the twenty-year period in question.
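For those who like to see the logic laid bare: if the upward drift in GPA shrinks toward zero once incoming ACT scores are added to the model, then incoming ability – not grading – accounts for the trend.  Here is a rough sketch using hypothetical cohort-level figures (not our actual records) that mimics the one-point ACT rise described above.

    # Sketch of the logic with made-up cohort averages: if the "year" trend in GPA
    # collapses once ACT is in the model, incoming ability explains the drift.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    years = np.arange(1991, 2011)
    act = 24.5 + 0.05 * (years - 1991) + rng.normal(0, 0.1, years.size)   # ~1-point rise over 20 years
    gpa = 2.90 + 0.15 * (act - 24.5) + rng.normal(0, 0.01, years.size)    # GPA driven only by ACT here

    df = pd.DataFrame({"year": years, "act": act, "gpa": gpa})
    print(smf.ols("gpa ~ year", data=df).fit().params)         # shows an apparent upward drift in GPA
    print(smf.ols("gpa ~ year + act", data=df).fit().params)   # the year effect all but vanishes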

The Verdict:

Faculty grading behaviors may well have changed over time – maybe for the worse, maybe for the better.  But we have little evidence to suggest a relationship between those behaviors and the increase in overall GPA.  In addition, we have better evidence to suggest that a change in our students’ pre-college academic ability might have driven this change in GPA.  Interestingly, if faculty grading behaviors had changed in the way that the grade inflation claim suggests, ACT scores would likely not have been as powerful a predictor as they turned out to be.

So the next time you hear someone mention the good ol’ days in the context of academic standards and grades, you might remind them that there are other – and maybe better – explanations for this phenomenon.  You might also remind them of the relative trade-offs of a circular firing squad.


Make it a good day,

Mark