Why We All Need to Fully Invest in Symposium Day

Last Tuesday we held our third Symposium Day – an event intended to bring faculty, staff, and students together, collectively dig deeper into a specific social issue, and through our actions live the values that we espouse as a communally conscious liberal arts college.  Pulling off such a production is no easy feat, and those who organized and administered each Symposium Day deserve substantial credit for their efforts.  Furthermore, as a college community we should be generally proud of our first year of Symposium Days, since participation across the college ranged from respectable to truly impressive.  However, Symposium Day has also exposed the tendency among some of us to lean toward our personal inclinations rather than genuinely commit to a communal endeavor.  Some of us participated only sparingly, others skipped the events altogether, and a few – despite the vote of the faculty to schedule no classes on Symposium Day – stubbornly conducted their classes anyway.  And just in case you think I’m throwing stones from my own glass house, I humbly admit that I didn’t engage in the full spirit of Symposium Day to the degree that I should have, either.

I was reminded of why we chose to embark on this grand experiment that we have called Symposium Day while reviewing our recent Wabash National Study data that noted our students’ static attitudes toward civic engagement over their four years in college.  So I’d like to share our results on this particular outcome in the hope that it will bolster our commitment to making Symposium Day as educationally beneficial as possible.

The Wabash Study asked students a set of questions about their interest and willingness to engage in collective action for the good of the community at the beginning of the freshman year, the end of the freshman year, and the end of the senior year.  Augustana had 126 students who provided data at all three data collection points – enough for us to be confident about the degree to which these data might represent our overall student population.  Here are Augustana’s average scores (on a 5-point scale) at each data collection point in the study.

Beginning of Freshman Year (Fall 2008) – 2.69

End of Freshman Year (Spring 2009) – 2.67

End of Senior Year (Spring 2012) – 2.66
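These averages come from the standard Likert coding (strongly disagree = 1 through strongly agree = 5).  As a minimal illustration of the arithmetic – using fabricated responses, not the actual survey data – the computation looks like this:

```python
# Minimal sketch (not the actual analysis code): averaging Likert
# responses coded 1 (strongly disagree) through 5 (strongly agree).
CODES = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
         "agree": 4, "strongly agree": 5}

def mean_score(responses):
    """Average a list of Likert labels on the 1-to-5 scale."""
    coded = [CODES[r.lower()] for r in responses]
    return sum(coded) / len(coded)

# Five fabricated responses hovering around "disagree"/"neutral",
# the region where our civic-engagement averages sat.
sample = ["neutral", "disagree", "neutral", "agree", "disagree"]
print(round(mean_score(sample), 2))  # 2.8
```

The same averaging, run at each of the three collection points, produces the scores above.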

Essentially, the importance that our students place on civic engagement didn’t change.  Equally alarming, since the response options for each question were laid out on a five point scale (strongly disagree=1, disagree=2, neutral=3, agree=4, and strongly agree=5), our students’ average score at each point in their Augustana career translates to just south of a robust “meh!” – not exactly the marker of graduates who, as our college mission statement describes, would be prepared for a “rewarding life of leadership and service in a diverse and changing world.”

So, despite our assertion that Augustana students develop a deeper awareness of and interest in making a difference in their communities, and despite our emphasis on volunteering and a marked increase in service-learning courses in recent years, these findings suggest that our students depart much as they enter – at best somewhat ambivalent about the importance of civic engagement.
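As an aside on the mechanics: the 126-student analytic sample consists of students who responded at every wave, which amounts to intersecting the respondent IDs across the three collection points.  A toy sketch with fabricated IDs:

```python
# Hypothetical sketch of the complete-case matching behind a
# longitudinal sample: a student enters the analysis only if the
# same ID appears in all three collection waves.  IDs fabricated.
wave_ids = {
    "fall_2008":   {"s01", "s02", "s03", "s04", "s05"},
    "spring_2009": {"s01", "s02", "s04", "s05"},
    "spring_2012": {"s01", "s02", "s05", "s06"},
}

# Intersect the ID sets across waves.
complete_cases = set.intersection(*wave_ids.values())
print(sorted(complete_cases))  # ['s01', 's02', 's05']
```

Students who skip even one wave drop out of the sample, which is why senior-year data collection mattered so much.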

This brings me back to the potential for Symposium Day to help us make good on our promise to prospective students and parents regarding civic engagement.  Each of the educational outcomes that we try to develop in our students encompasses both cognitive and affective dimensions.  For example, a deeper commitment to civic engagement requires a strengthened sense of self and an increasingly nuanced understanding of the interconnectedness of our world.  Furthermore, students’ ability to use these skills in real-world contexts depends upon the extent to which we teach them to apply theories, ideals, and information in various real-world situations.  Symposium Day as a recurring program – a program that students should experience twelve times during their four years at Augustana – sets up the potential for us to influence student growth along multiple dimensions and apply this development repeatedly in a variety of real-world contexts.  In order to do that, experiences designed for freshmen, sophomores, juniors, and seniors need to be developmentally appropriate, with each subsequent Symposium Day experience building on the last.  In addition, Symposium Day provides a key launching mechanism for students to take things they’ve learned in their courses and put them into action to better understand and address real world issues.  So students need to take at least one course each term that somehow connects with the focus of that term’s Symposium Day in a way that prepares them to make the most of the experience and make meaning of it afterward.

For all of that to happen, students need to hear the same message everywhere they turn – that Symposium Day is a critical part of the Augustana learning experience.  It’s not a day off, intellectually or literally, and mere attendance shouldn’t be mistaken for authentic participation.  In the end, we – faculty, administrators, and staff – have to exemplify the value of communal engagement and the importance of our impact on our local community.  That might mean adapting our courses to incorporate the theme of Symposium Day, even if this means a little extra prep work each year.  I readily admit that this might go against some of our longstanding notions of autonomy in academia.  But I hope we would all be willing to give up a little autonomy in order to foster the kind of student learning that we have long claimed to result from an Augustana education.  Symposium Day is a wonderful opportunity for us to create a community of more fully engaged citizens.  It would be a shame for us to miss that chance, especially when we do so many other things that appear to hit an educational home run.

Make it a good day,



Can we actually increase students’ intrinsic motivation in the first year?

We’d all love to believe that our students develop a love of learning “for learning’s sake.”  But more often than not we find ourselves dealing with students who seem motivated to learn only because of some combination of potential future rewards and/or the threat of penalties or punishment.  Some have lamented that the impact of extrinsic motivators in the primary and secondary educational system (NCLB, etc.) has so thoroughly turned students’ reasoning for learning into a return-on-investment equation that the die is cast long before they enter college.  Yet prior studies of changes in motivational orientation during college suggest that students’ orientation toward intrinsic motivation does increase between the freshman and senior years.  The question I’d like answered is whether there is anything we can do to influence the development of intrinsic motivation, or whether it is simply a function of maturity over time.

As a college committed to a liberal arts philosophy and the belief that our students are better off if their actions are spurred by intrinsic motivators, I think we’d want to know which particular experiences fuel the development of intrinsic motivation.  So in the fall of 2011 we began a four-year longitudinal study of the college experiences that impact intrinsic motivation among Augustana students.  During freshman orientation we asked students to complete a survey of motivational orientations.  In the spring of 2012, LSFY 103 instructors allowed us to survey freshmen again with the same motivational orientations measure.  In addition, we included a survey of about 25 questions taken from NSSE or the Wabash National Study that we already knew had been linked to important educational growth on a variety of outcomes.

Interestingly, we found a number of predictors of an increase in intrinsic motivation.  Some of them are just what you’d expect, particularly students’ aspirations to pursue graduate school after college.  These predictors were important to account for in our analysis because we wanted to isolate the potential effect of first-year experiences . . . if we in fact found any.

Happily, we found two student experiences that appeared to increase students’ intrinsic motivational orientation.  The most prominent experience turned out to be about student-faculty interaction.  Students who said that their interactions with faculty shaped their intellectual and personal development tended to also show an increase in intrinsic motivation.  The second experience that produced a statistically significant effect was the degree to which students have informal interactions with people who are different from themselves.  These informal interactions primarily took the form of serious conversations outside of class.

Both of these findings are worth considering in more detail.  Based on our recent Wabash National Study data, while we excel in the quality of our student-faculty interaction during the senior year, our freshmen don’t report quite the same level of quality.  Though this might be attributable to all sorts of circumstances unique to the freshman year, I think it’s worth looking for ways to ensure that our freshmen are engaged in substantive conversations with faculty.  And this student experience is valuable for many reasons above and beyond positively influencing intrinsic motivation.

The impact of informal diverse interactions is also worth considering.  First, in addition to so many other findings on college students, this particular result reiterates the degree to which out-of-class experiences can influence the development of outcomes that are vital to academic success.  From the standpoint of faculty, this finding should further encourage us to develop a deeper understanding of our students’ out-of-class experiences and the way in which those experiences could be integrated with the curricular experience.  This will make us better advisers, teachers, and mentors to students.

For student affairs professionals, this finding emphasizes the degree to which the impact of student affairs staff can and should be an educational one.  Increasing the degree to which students engage in diverse interactions is by no means impossible, but it surely takes intentionality to (1) expand students’ notion of difference beyond merely gender and skin color, and (2) encourage, cajole, coerce, or even require students to participate in activities that foster, or even directly create, these kinds of interactions.

Cultivating a general level of co-curricular involvement is not enough, for students left to their own devices tend to connect with others who are just like them – there is an understandable comfort in the familiar.  Cultivating a robust environment of diverse interactions requires that we stretch our students, pushing them beyond the familiar.  In order for students to allow themselves to be stretched, we have to think carefully about designing ideal environments for learning that appropriately balance challenge and support as we push them to expand their horizons and deepen their understanding of difference.

We are an institution that has proven its ability to improve by following the data and using that evidence to make us better at what we do.  Finding ways to encourage quality student-faculty interaction and informal diverse interactions will help us continue to embody that trait.

Make it a good day,


Lest we rest on our laurels . . .

Last week I noted an important data point from our recently completed participation in the Wabash National Study of Liberal Arts Education (WNS) that suggested an increase in our seniors’ level of academic challenge.  This finding was particularly gratifying because when we instituted Senior Inquiry we had hoped that it would help us maintain the increased academic challenge that we had infused into our freshman year through the AGES curriculum several years before.  Our 2009 NSSE data had shown a marked increase since 2006 in the academic challenge benchmark among freshmen, but the parallel measure among seniors showed no change, suggesting that we might be taking our foot off the academic gas pedal after the freshman year.  This new WNS data provided evidence that we are indeed making progress toward a sustained level of academic rigor across our students’ four years.

But as with all good assessment data, the WNS data provides additional nuances that can help us continue to improve what we do even as we might (and should) celebrate our successes.  So I’d like to introduce two other data points from the WNS regarding academic challenge and student learning, consider them in the context of optimizing faculty work/life balance, and see if there might be something here worth thinking about.

College impact researchers have found that when students are (1) challenged to push their intellectual capacities through substantive assignments and (2) supported in that process with encouragement, direction, and precise and timely feedback, students are more likely to maximize their learning and growth.  In the WNS, two of the scales that address important aspects of challenge measure (a) the frequency of higher-order exams and assignments and (b) the degree to which faculty communicate and maintain high expectations for student performance.  Likewise, two other scales capture crucial aspects of support by assessing (a) the quality of students’ non-classroom interaction with faculty and (b) the frequency of prompt feedback on assignments and performance.

The two tables below report our students’ scores on the two challenge metrics and the two support metrics at the end of the first year and compare those scores to the average scores from the other similar small colleges in the WNS.  The asterisks indicate where the difference score is statistically significant (in other words, the “+” or “-” sign doesn’t necessarily mean anything by itself).

2009 Spring – Challenge Metrics (difference from comparison institutions)

Frequency of higher-order exams and assignments:  +2.3 *
Challenging classes and high faculty expectations:  +2.5 *

2009 Spring – Support Metrics (difference from comparison institutions)

Quality of non-classroom interaction with faculty:  no statistically significant difference
Prompt feedback:  no statistically significant difference
Essentially, this data suggests that in comparison to the other participant institutions in the WNS, we challenge our students during the first year a bit more while we support them at levels similar to other small colleges.

Now, look at what happens to these metrics by the end of our students’ senior year.  Again, remember the function of the asterisks in these tables.

2012 Spring – Challenge Metrics (difference from comparison institutions)

Frequency of higher-order exams and assignments:  no statistically significant difference
Challenging classes and high faculty expectations:  no statistically significant difference

2012 Spring – Support Metrics (difference from comparison institutions)

Quality of non-classroom interaction with faculty:  +5.6 *
Prompt feedback:  +3.5 *

Interestingly, the pattern in the fourth year data is reversed.  By the end of the senior year we appear to challenge our students at levels similar to the other institutions while supporting our students at levels that are significantly higher (statistically) than the other small colleges in the WNS dataset.
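For readers curious about the asterisks: a difference score earns one when a standard two-sample test finds the gap between our students and the comparison group statistically reliable.  A rough, self-contained sketch – using fabricated scale scores and a crude |t| > 2 rule of thumb rather than the study’s actual test – looks like this:

```python
import statistics

def difference_score(ours, comparison):
    """Mean difference with a rough significance flag.

    Flags the difference with '*' when the two-sample t statistic
    exceeds ~2 in magnitude (a common rule of thumb; the study
    itself would report exact p-values from a proper t-test).
    """
    diff = statistics.mean(ours) - statistics.mean(comparison)
    se = (statistics.variance(ours) / len(ours)
          + statistics.variance(comparison) / len(comparison)) ** 0.5
    t = diff / se
    flag = " *" if abs(t) > 2 else ""
    return f"{diff:+.1f}{flag}"

# Fabricated scale scores, not the actual WNS data.
augie = [52.0, 55.0, 58.0, 54.0, 57.0, 53.0, 56.0, 55.0]
comps = [50.0, 51.0, 49.0, 52.0, 50.0, 48.0, 51.0, 49.0]
print(difference_score(augie, comps))  # +5.0 *
```

In other words, a “+” or “-” without an asterisk is a difference small enough (relative to its variability) that it could easily be noise.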

So what should we make of this?  I’ve got a couple of thoughts, although I’d love to hear what strikes you (if anything) about these data points.

First, it is worth parsing some of the aspects of academic challenge that impact learning.  The academic challenge measure I described last week asks questions about the number of assignments and the amount of time spent on assignments.  Obviously, it’s tough to push students to learn if they aren’t being asked to put in the time and regularly produce substantial work.  However, the degree to which any workload can effectively impact learning is powerfully influenced by how much of the work requires complex, higher-order thinking (as opposed to simple memorization and regurgitation) and how high faculty set and communicate their expectations of quality.  Otherwise, time on task often devolves into mind-numbing busy work, and there isn’t a more effective strangler of student motivation than the perception that homework is nothing more than a black hole of directionless wheel-spinning.  Our WNS data suggests that among first-year students we’ve ramped up both the amount of work expected AND the complexity of the assignments and faculty expectations (and therefore the educational potential).  However, it appears that while we’ve increased the amount of work expected of our seniors, we haven’t necessarily matched that increase with a similarly expanded expectation of educational complexity.  I suspect that we might be able to improve on our already impressive learning gains if we could find ways to distinguish the nature of our seniors’ academic challenge in a manner similar to what seems apparent in our freshman data.

Conversely, the inverse pattern of change in student support for learning between the first and fourth year simultaneously suggests reasons to celebrate and opportunities to improve.  We have ample evidence that the quality of our support for students in their later years plays a pivotal role in their development.  However, as we look for ways to raise first-year student success ever higher (and thereby increase our retention rates), it seems that we might benefit from considering ways to increase student support in the first year to match the level of challenge that we’ve already attained.  Although it would be nice to find a singular solution (the discussion of a Center for Student Success as well as the new mechanism for math placement and remediation may well make a profound impact), I suspect that we might find additional ways to improve by further examining the clarity and uniformity of the LSFY experience and the partnership between the curricular and co-curricular experiences during the first year.

Lastly, I wonder if taken together these findings might provide an insight into a way that we might improve our faculty work/life balance even as we maintain – or even increase – student learning.  Right now our balancing of challenge and support seems to tip toward challenge in the first year and support in the fourth year.  I wonder whether there might be an opportunity to adjust this balance slightly by adding mechanisms for support in the first year and challenge in the fourth year.  In so doing, I wonder if we might find that leaning just a bit less on student-faculty interaction for our upperclass students might allow some of our work to be not quite so time intensive.

At present, it is clear that our efforts are working – but they are clearly time intensive and come at a cost.  I am in no way suggesting that we should somehow become more cold or unfeeling toward our students.  However, as I see the burden of our efforts take its toll again during the spring term, I sincerely wonder if there are ways to reduce the amount of time we spend burning the candle at both ends.  It seems to me that caring for our students shouldn’t necessitate killing ourselves to do so.

I think we owe it to our students and ourselves to at least consider this possibility.

Make it a good day.



Applied Learning Opportunities and Perceptions of Worth

In 2007 the Association of American Colleges and Universities (AAC&U) published College Learning for the New Global Century to launch the Liberal Education and America’s Promise (LEAP) initiative.  This document asserted a new way of conceptualizing the primary learning outcomes of a college education, focusing on four categories of transferable knowledge, skills, and dispositions:

  • Knowledge of Human Cultures and the Physical and Natural World
  • Intellectual and Practical Skills
  • Personal and Social Responsibility
  • Integrative Learning

It wasn’t as if the shift from a focus on content knowledge acquisition to an emphasis on transferable skills and dispositions was a brand new idea.  But the public nature of this assertion from one of the major associations of colleges and universities – if not the major one – made a powerful statement to postsecondary institutions of all kinds that the cafeteria-style content acquisition that had dominated most college curricula was no longer sufficient to prepare students to enter post-graduate life.

Throughout College Learning for the New Global Century, AAC&U urged colleges and universities to find ways for students to apply their learning in experiential settings.  They repeatedly cited the substantial body of research supporting the educational importance of application for deep and transformative learning.

At Augustana we’ve put a high value on these kinds of experiences, and our survey of seniors last spring directly asked about the degree to which students’ out-of-class experiences helped them connect what they learned in the classroom with real-life events.

Our seniors’ responses looked like this.

Strongly Disagree 2 0%
Disagree 13 3%
Neutral 77 15%
Agree 271 53%
Strongly Agree 141 28%

It is certainly heartening to see that more than 80% of our seniors indicated “agree” or “strongly agree.”  Moreover, this data confirms that many of the experiential opportunities that we provide for our students seem to be functioning in an educational capacity rather than simply serving as a respite from academic pursuits.  Analyses of other data from our participation in the Wabash National Study demonstrate that our students who engage in applied learning experiences make greater gains on a variety of learning outcomes than our students who do not.

But I want to point out another side of this finding that I think is worth considering.  I think that this data may be instructive as many of us – faculty, staff, administrators, and board members – continually try to make the case to prospective students and their parents that an Augustana education is worth the price they are asked to pay.  Moreover, not only does this data point help us focus our assertion that Augustana provides an education that is worth the cost, but I believe it should point us toward the way we need to think about the important yet slippery (and sometimes even a little bit uncomfortable) concept of “value proposition.”

At the end of the summer we analyzed our senior survey data to see if we could identify specific student experiences that increased the likelihood that our seniors would, if given the chance to relive their college decision, definitely choose Augustana again.  I think this is an important outcome question because it suggests the degree to which our seniors think that the money they spent to attend Augustana was worth it.  Since without tuition revenue we are out of business, this is an aspect of our work that we simply can’t ignore.

Our analyses revealed that the degree to which our seniors’ out-of-class experiences helped them connect their classroom learning with real-life events significantly increased the likelihood that they would definitely choose Augustana again.  I’d like to emphasize that we were testing whether students would DEFINITELY choose Augustana again – not “maybe” or “probably.”  In essence, in addition to being an important driver of student learning, I think our seniors explicitly recognized the educational value of these experiences.  As such, they were more than able to connect this educational value with the long-term benefits of the financial investment they had made.

I would suggest that this finding can guide the way that we talk about the value or worth of an Augustana education AND the way that we think about the admittedly amorphous notion of a value proposition.  At its essence, “value proposition” is supposed to represent the maximum synergy between the value promised by an institution and the perception by the student that this value will be fully delivered.  The difficulty (and the temptation, and sometimes the suspicion) is that the folks who concentrate on establishing and strengthening a value proposition tend to focus more on the glitz of the marketing than the quality of the product.  Nonetheless, whatever your opinion of the phrase, it’s hard to deny the concept’s importance.

In the context of this notion of value proposition, the data point I’ve described above puts me in mind of the famous line from the movie Field of Dreams. “If you build it, he will come.”  (No, it’s not “they” . . . and yes, I was surprised too.)  Every college in the country right now is pulling out all of the stops to create the most persuasive marketing campaign.  While we have admittedly been doing the same thing, we have also been concentrating on building an educational experience that is as fundamentally effective as it is precisely interwoven.  We may not have perfected our product, but we have developed an educational experience that is consistently producing robust evidence of strong learning outcomes.  I would humbly suggest that the key to maximizing our value proposition is in the product we build.  More than simply listing all of the experiential learning opportunities in which students can participate, when we can explain to students how each of these experiences is designed to help them apply and solidify an important aspect of their learning and development toward the person they aspire to be, we make a case for an Augustana education that is substantially more nuanced, adaptable, and compelling than the argument that prospective students hear from most other institutions.

I believe this is a way that we can ultimately communicate distinctiveness in a manner that is both powerful and personal.  More importantly, it allows us to live a story that never stops getting better.  And at the end of the day, that sure feels like we are doing what we were meant to do.

Make it a good day,




Hey, . . . how did we do that???

Welcome back!  I hope your engine is recharged for the spring term.

You might remember that about this time last year I was talking to anyone who would listen about the importance of the final round of data collection for the Wabash National Study of Liberal Arts Education (WNS).  The WNS was designed to combine learning outcome measures with student experience and pre-college characteristics data so that institutions could (1) assess student change over time on specific learning outcomes and (2) begin to identify the experiences that influenced that progress.  Augustana joined the third and final iteration of the WNS in 2008, so 2012/13 was our make-or-break year to get data from as many seniors as possible.  Since the study measured change over time, without senior year data, participation in the study would have been a giant waste of time.  After a nearly herculean effort and a paper bag full of gift cards to the Augie bookstore, we were able to entice about 190 seniors to participate – 120 of whom had also provided data during their freshman year.  Altogether, this dataset gives us a chance to thoroughly analyze the learning experience of a fairly representative sample of our 2012 graduates and make some generalizations about our overall educational effectiveness.

Last week we received the first of several long-awaited reports outlining our students’ results on the learning outcomes measured by the WNS.  I’d like to share one particular finding (I’ll share others with you over the course of the spring term) and ask your help in thinking about what might be behind it.  It’s not quite “a riddle wrapped in a mystery inside an enigma” (thank you, Winston Churchill), but it’s got me flummoxed.

One outcome of particular importance to religiously affiliated liberal arts colleges is moral and ethical judgment.  For a lot of reasons we hope that our students develop a sophisticated sense of the principles and values that shape their understanding of right and wrong.  Moreover, we hope that our graduates act as principled citizens who stand up for those values even in the face of pressure to conform or fear of reprisal.

It turns out that Augustana students made remarkable gains on the WNS measure of moral judgment.  In fact, our students’ gains were on average 50% larger than the average gains made by students at the 32 other small colleges that participated in the WNS.  Digging a little deeper, virtually all of that positive advantage (i.e., the 50% larger gain noted above) occurred during the first year.  After making substantially larger gains than students at comparable institutions, from the sophomore through the senior year our students’ growth did not differ substantially from students at other institutions in the study.  In other words, our students raced out to a big lead during the first year and held it through to graduation.
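The arithmetic behind that comparison is simple: compute each group’s gain over each span, then compare.  The numbers below are fabricated to reproduce the reported pattern (a 50% larger total gain, earned almost entirely in the first year); they are not the actual WNS values:

```python
# Fabricated moral-judgment means (begin freshman year, end
# freshman year, end senior year) -- NOT the actual WNS scores.
augustana  = {"begin_fr": 40.0, "end_fr": 47.0, "end_sr": 49.0}
comparison = {"begin_fr": 40.0, "end_fr": 44.0, "end_sr": 46.0}

def gains(scores):
    """Split total growth into first-year and later-year gains."""
    first_year = scores["end_fr"] - scores["begin_fr"]
    later_years = scores["end_sr"] - scores["end_fr"]
    return first_year, later_years

aug_fy, aug_ly = gains(augustana)
cmp_fy, cmp_ly = gains(comparison)

# Total gain is 50% larger, and the entire advantage sits in the
# first year: later-year growth is identical across groups.
total_ratio = (aug_fy + aug_ly) / (cmp_fy + cmp_ly)
print(total_ratio, aug_fy - cmp_fy, aug_ly - cmp_ly)  # 1.5 3.0 0.0
```

Decomposing the total gain this way is what reveals where the advantage is earned, which is exactly why the first-year finding stands out.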

This finding is both exciting and, to be honest, a little troubling.  First, it is exciting that we now have some hard evidence to support our claim that Augustana graduates develop deeper and more sophisticated moral and ethical judgment.  One of the major criticisms of higher education institutions is that we make bold claims with very little proof to back them up.  Now we can say with some degree of certainty that we do what we say we do.

However, there is something about this finding that troubles me – and is the issue that I’d like your help with.  The findings from the WNS suggest that the bulk of our students’ growth in moral judgment happens during their first year.  Since we would like to think that we have intentionally designed the educational experience of our students, then we should be able to point to the program or combination of programs that likely produce this remarkable gain in moral judgment.  This is from whence my flummox cometh.

Now if we were only interested in proving our educational value, this data would make me think something along the lines of “game, set, match Vikings.”  But our interest in assessing student learning shouldn’t be merely about validating claims that we’ve already made.  That is a dangerous game to play, to be sure.  Rather, I want to know how we can do what we do just a little bit better.  Instead of merely proving our worth, I’m interested in improving our quality.

And I don’t think I can pinpoint any particular program that is designed to influence this outcome.  Our only curricular mandate for first year students is the LSFY sequence.  Are there other courses to which we might attribute these gains, such as the Christian Traditions course?  I know the faculty who teach those courses do wonderful things, but I’m not sure the focus of those courses is developing moral judgment.  Is there a program designed for first year students that is run by residence life or student activities?  I just don’t know.

The reason it seems important to me to be able to identify the experiences that are driving this gain is that we should want to take full advantage of this finding and build deliberately on something that we are already doing well.  And this is where I’m stuck.  What are we doing that is working?  Is this just luck?  Coincidence?  I’d like to think not.

It seems pretty likely that there is something going on here that sets us apart from the other schools in the WNS.  The number of participants in the study and the size of the difference in gains is just too large for this to be a function of random chance.  So if you have an idea of what might be influencing our students’ gains in moral judgment, please post it in the comments section.  For us to be best able to (1) make our case as an institution to prospective students and families, and (2) maximize what we do in a way that takes full advantage of our talents and resources, we need to figure out what is driving these gains.

Make it a good day,


The Value of Providing an Intentional Curriculum

Most of us have heard about – or tried to defuse – at least one student who blew a gasket over their inability to get into a course that they thought they had to take during the next term.  Since we’ve just finished the registration period for spring, I’ve been thinking a bit more about our analysis of one item on the 2012 senior survey that relates to students’ course taking experience.  Seniors were asked to respond to the following statement.

“The courses I needed to take were available in the order in which I needed to take them.”

There were five response options ranging from “strongly disagree” to “strongly agree” (we scored them from 1 to 5 for the purposes of statistical analysis).  Our 2012 seniors’ average response score was 3.42.  Their responses were distributed like this:

Strongly Disagree 26 5%
Disagree 87 17%
Neutral 97 19%
Agree 250 49%
Strongly Agree 47 9%
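As a quick check on the figures that follow, the distribution above can be tallied in a few lines of Python.  This is just a sketch; the counts are copied from the table, and the 1–5 scoring follows the description above.

```python
# Senior-survey responses to the course-availability item, scored 1-5
# (strongly disagree .. strongly agree). Counts copied from the table above.
counts = {1: 26, 2: 87, 3: 97, 4: 250, 5: 47}

total = sum(counts.values())
share_neutral_or_below = sum(n for score, n in counts.items() if score <= 3) / total

print(f"respondents: {total}")
print(f"neutral or below: {share_neutral_or_below:.0%}")  # the "just over 40%" figure
```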

Of course, we’d probably like the vast majority of our students to indicate “agree” or “strongly agree.”  However, just over 40% of our seniors selected “strongly disagree,” “disagree,” or “neutral.”  This raises two questions:

  1. To what degree is our students’ response to this item important?
  2. What could we do to influence our students’ responses in the future?

Most of the time, the story of the aforementioned panicking student concludes with a successful resolution – at least in terms of whether or not they were able to take the courses required to graduate in four years.  Often, a student’s panic can be assuaged when they realize that there are multiple course-taking patterns that will get them to the same outcome.  So, how important should it be to us whether students think that they were able to get into the classes they wanted to take when they thought they needed to take them?

It turns out that it may actually be pretty important.  We conducted a series of analyses of our senior survey data to identify the experiences that might be most directly influential on two outcomes, 1) the degree to which a senior would choose Augustana again if they could relive their college decision (a proxy for the value that a student thinks they got out of their education), and 2) the degree to which a senior is certain that their post-graduate plan is a good fit for who they are and where they want their life to go (a proxy for the student’s sense of the quality and clarity of their preparation for life after college).  Even after accounting for differences in students’ race, sex, pre-college ACT score, socio-economic status, and a variety of other curricular experiences, the degree to which courses were available in the order the student needed to take them proved to be a positive, statistically significant predictor of both outcomes.  In other words, students who felt courses were available in the order they needed to take them were also more likely to say that they would definitely choose Augustana again and were more certain that their post-graduate plans were a good fit for who they are and where they want their life to go.
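The kind of analysis described above can be sketched in a few lines of Python.  To be clear, this is a minimal illustration on synthetic data with hypothetical variable names, not the actual senior-survey analysis (which included additional controls and both outcomes); it simply shows the mechanics of estimating the availability item’s effect while holding other predictors constant.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the survey variables (hypothetical, not real data):
act = rng.normal(25, 3, n)            # pre-college ACT score
ses = rng.normal(0, 1, n)             # socio-economic status index
availability = rng.integers(1, 6, n)  # "courses were available in order" (1-5)

# Simulate the outcome so that availability has a genuine positive effect (0.3).
choose_again = 2.0 + 0.02 * act + 0.1 * ses + 0.3 * availability + rng.normal(0, 0.5, n)

# Ordinary least squares: outcome regressed on an intercept plus the predictors,
# so the availability coefficient is net of the other variables.
X = np.column_stack([np.ones(n), act, ses, availability])
coef, *_ = np.linalg.lstsq(X, choose_again, rcond=None)

print(f"estimated availability coefficient: {coef[3]:.2f}")
```

Because the controls are in the design matrix, the availability coefficient recovers its simulated effect even though ACT and SES also move the outcome.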

It seems to me as if two things are going on here.  First, students often perceive themselves to be customers (sometimes to our great aggravation) and expect that the education for which they’ve enrolled – and are paying a lot of money – should be available in the manner that they choose it.  So if a student didn’t get into the classes they initially wanted to take, or was not able to take all of the major courses that interested them, they may well think that they didn’t get the full value of their investment.  While we’d like to provide an environment in which every student is able to take the courses they want when they want them, we all know that this is simply impossible.  This reality further emphasizes the value of an advising conversation that helps students understand their college education as replete with options and opportunity rather than constrained to a single checklist.

Second, although our students’ sense of Augustana’s educational worth is important, I am particularly intrigued by the statistically significant positive relationship between our students’ sense of sequence in their course-taking experience and their certainty that their post-graduate plans are a good fit for them.  The history of curricular design in higher education reveals a substantial shift from an entirely prescriptive curriculum with few – if any – choices a century ago to a sort of modern-day modular smorgasbord where students select from a range of choices across a series of categories.  As institutions have focused more specifically on student learning, we are repeatedly finding that this cafeteria approach, while it might give faculty more freedom to teach what they want to teach, ends up numbing students to the possibility of a holistic learning experience.  In some cases, especially at larger institutions, it also produces an almost laughable lack of awareness of what is going on outside of a given faculty member’s courses or department.  For our students, I suspect that a more sequential course-taking experience allows them to see the developmental nature of their education and to integrate each of the pieces into a cumulative whole.  In addition, it allows faculty to talk about the curriculum as a developmental construction in conversations with students.

The correlation between students’ sequential course-taking experience and their certainty of post-graduate plan fit suggests to me that the value of a more intentional curriculum can be framed around its benefits for student learning – not just about better “customer service” (a phrase that makes my skin crawl when used to refer to educational concepts).  Establishing a curriculum that embodies the developmental nature of learning encourages students to think about their own growth and, through that process, become more confident in their own progress toward their future goals.

So if you are in the midst of a conversation about curricular revision, I hope you’ll be able to shape your efforts around an explicitly intentional design.  And when you are talking with students about their course-taking choices, I hope you’ll suggest to them a strategic way of thinking about course selection.

Make it a good day,




The Educational Benefits of Student Employment

One clear trend among college students during the past several decades is the increasing proportion of students who maintain a job while attending school.  At Augustana, more than half of our students work on campus, while many more hold jobs off campus.  Typically, this phenomenon has been cast as a detriment to the college student experience since – as the argument goes – the obligations of work take away from the time that students might spend involved in co-curricular activities or studying for their courses.  I have sometimes heard folks talk about the ideal student employment as a position where the student can do their homework while sitting at a desk.  However, I’d like to suggest that work – especially if it is conceived as an educational experience – can be powerfully beneficial to our students’ development.

A few weeks ago I met with our Student Affairs senior staff to talk about ways that we can use our student data to support their work.  Soon our conversation turned to the possible educational impact of the Community Adviser (CA) position on the students who hold these jobs.  It’s time-intensive work that can sometimes be especially challenging when sorting through the whims and wiles of first year students.  And even though this position might seem to be a hybrid of co-curricular involvement and student employment, the requirements of the position obligate CAs to forego other opportunities on and off-campus.  So we thought it would be useful to test whether or not students who hold CA positions gain some unique educational benefit from the experience.

We chose to compare responses of CAs and non-CAs on one question from the senior survey that asks students to respond to the statement, “My co-curricular involvement helped me develop a better understanding of my leadership skills.”   The response options ranged from “strongly disagree” (1) to “strongly agree” (5).  The CA’s average response score was a 4.63, while the non-CA’s average response was a 4.26.  Statistically, this difference proved to be significant despite the small number of CAs (20) out of the total number of responses (511).  This suggests that there might indeed be something about a CA position that provides a unique educational benefit for those students.
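A comparison like the one above can be sketched with a two-sample test.  The snippet below is an illustration on simulated ratings (the real data are the 20 CA and roughly 490 non-CA responses), and Welch’s unequal-variance t statistic is my assumption about the appropriate test; the original analysis may have used a different procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-ins for the 1-5 survey ratings, drawn to mimic the reported
# group means (4.63 for CAs, 4.26 for non-CAs) and clipped to the scale.
ca = np.clip(rng.normal(4.63, 0.5, 20), 1, 5)
non_ca = np.clip(rng.normal(4.26, 0.8, 491), 1, 5)

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

print(f"CA mean: {ca.mean():.2f}, non-CA mean: {non_ca.mean():.2f}")
print(f"Welch t: {welch_t(ca, non_ca):.2f}")
```

Welch’s version is the cautious choice here because the two groups are so different in size (20 vs. roughly 490), which makes the equal-variance assumption of the classic t-test risky.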

While this specific finding might not qualify as the most rigorous quantitative analysis, it replicates other research on the educational benefits of student employment.  After examining the impact of work across the 2006 cohort of the Wabash National Study, my colleagues and I found that students who worked made gains on several aspects of leadership skills that non-working students did not (you can read the full study here).  Furthermore, the more hours per week that students worked, the larger the educational gain.  This held true even after we accounted for students’ other co-curricular involvement.

Now I’m not suggesting that co-curricular involvement is somehow frivolous.  There are lots of powerful educational benefits that can come from involvement in a variety of activities.  But these findings suggest that maybe work shouldn’t be considered a detriment to the student experience.  In fact, I would suggest that each of us who oversees student workers has an opportunity to uniquely influence their development in important ways.  We only miss that opportunity if we don’t conceive of the employment as a learning experience.  In the same way that we would like to develop our students as autonomous learners, we should hope to develop our student employees as autonomous workers.  That means giving them more than a simple checklist of things to do and instead asking them to help solve problems and contribute to the quality of the working environment.

So I hope that you will take the time to think about your student workers as students, and see your role in overseeing their work as an educational one.

Make it a good day,


How much could we realistically improve retention?

While we consider a variety of measures to assess our educational effectiveness, we focus on our retention rate (the proportion of full-time first year students who return for a second year) for some pretty crucial reasons.  First, it’s a legitimate proxy for the quality of our educational and socially-inclusive environment.  Second, as a tuition-dependent institution, every student we lose represents lost revenue; and there is real truth to the old adage that it costs more to recruit students than it does to retain them.  So every year we calculate our retention rate, hold it up next to the last five or ten years’ worth of numbers, and ask ourselves:

Did we do a good job of retaining students?

Most of the time, we end up telling ourselves that our retention rate falls somewhere between “decent” and “pretty good” – especially considering all of the things we can’t control.  But this conversation always leads us to the next question; one that is substantially more difficult to answer:

What should our retention rate be?

And that is where people in charge start to daydream and folks in the trenches start to cringe.  Because it’s all too common for a small group of folks – or even one folk – to arbitrarily decide on the institution’s goal for 1st-to-2nd year retention without any sense of whether or not that number is a reasonable goal.  And there’s nothing more corrosive to an educational organization’s long-term quality than assigning an unrealistic goal to the people you depend on to accomplish it.  So over the last few months, I’ve been wondering how we could get closer to figuring out what Augustana’s ideal retention rate should be.  I don’t know if I have an answer yet – or if there really is a right answer – but I’d like to share some numbers and consider their implications.

Since research on retention suggests that a primary predictor of student success is a student’s incoming academic ability or preparation, it seems reasonable to use our students’ ACT score as a starting point to test whether or not we could realistically expect to improve our retention rate.  If most of the students that we lose are also those who enter with low ACT scores, it suggests that the students we lose depart because they are academically unprepared and it’s therefore more likely that we’re already pretty close to our optimum retention rate.  However, if most of the students we lose enter with ACT scores comparable to our average freshman ACT score, then it’s likely that we still have room to improve.  And if this latter possibility proves to be so, we could consider a few additional factors and come closer to identifying a “ceiling” retention rate from which we could begin to choose a plausible goal.

To begin this process, we took the two most recent cohorts for which we can calculate retention rates (2010 and 2011) and broke down the students who departed before the beginning of their second year by incoming ACT score.  The table below shows the number of students in each of three different categories – the bottom quartile (<22), the middle 50% (22-28), and the top quartile (>28) – who departed before the second year.


[Table: students departing before the second year by incoming ACT score – bottom quartile (<22 ACT), middle 50% (22–28 ACT), and top quartile (>28 ACT) – for the 2010 and 2011 cohorts; the cell counts were not preserved in this copy.]
Clearly, in both of these cohorts the majority of the students who left entered with ACT scores in the middle 50% rather than the bottom quarter.  Thus, to the degree that ACT score is a proxy for pre-college academic preparation, it appears that there might be some room for us to realistically improve our 1st-to-2nd year retention rate.

However, ACT score doesn’t necessarily reflect the degree to which a student has the personality traits and personal habits (persistence, time management, motivation, etc.) to succeed in college.  And there are plenty of students who enter with low ACT scores and thrive at Augustana.  So another way to explore this data is to consider the number of students who left in good academic standing.  Even though good academic standing at Augustana is a 2.0, in an effort to be conservative in this analysis, I set the bar at a GPA of 2.5.

From the 2010 cohort, 48 of the students who left departed with a GPA above a 2.5.  From the 2011 cohort, 58 students fit into this category.  Again, both of these numbers suggest some degree of opportunity for improvement.  I emphasize caution here because there are many reasons why students depart that are beyond our control (health issues, financial exigency, or family emergencies).  In addition, some students leave for non-academic reasons that aren’t accounted for in this rudimentary analysis.  So we would be wise to estimate a number substantially below the 48 or 58 students noted above.

Where does that leave us?  Well, I would suggest that a reasonable starting point would be to build out from the 2010 cohort.  As it stands, our retention rate with that group was 87.6% – the highest on record.  If we assume that, with some combination of improved programming, advising, and student support, half of those 48 students could have been retained, that means that we could estimate an additional 24 students – or an increase of about 3 percentage points in our retention rate.  That would put us at an optimum retention rate – a best possible scenario – of between 90% and 91%.
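The back-of-the-envelope estimate above can be written out explicitly.  One caveat: the 2010 cohort size is not given in the text, so the ~800 figure below is my assumption, chosen because it is consistent with 24 additional students amounting to roughly 3 percentage points.

```python
# Back-of-the-envelope ceiling estimate for 1st-to-2nd year retention.
# NOTE: cohort size is an assumption (not stated in the original post),
# inferred from 24 students ~ 3 percentage points.
cohort = 800
base_rate = 0.876          # 2010 retention rate, the highest on record
recoverable = 48 // 2      # assume half of the 48 departers with GPA > 2.5 stay

gain = recoverable / cohort
ceiling = base_rate + gain

print(f"additional students retained: {recoverable}")
print(f"estimated gain: {gain:.1%}")
print(f"optimistic ceiling: {ceiling:.1%}")
```

With these assumptions the ceiling lands just over 90%, matching the 90–91% range above; a smaller assumed cohort would push the ceiling slightly higher, a larger one slightly lower.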

How does that compare to colleges like us?  A 90% retention rate would be significantly higher than that of colleges that enroll a student profile similar to Augustana’s.  What kind of financial investment would this require?  Although that is an even more difficult question to answer, the comprehensive effort necessary to improve our relatively strong retention rate would not be free and would likely require some tradeoffs.

Two final thoughts stick out in my mind.  First, while we might have some room to improve, I’d suggest that we aren’t that far away from our optimum rate.  Second, since there are as many moving parts in this equation as there are students at risk of departure, effective change may result from subtle shifts in institutional culture just as much as it might be influenced by a new program or policy.

So can we improve our average retention rate? Probably.  Will it be easy?  Probably not.  Is it the right thing to do?  Of course.  But we had better not assume that we will see a surge in revenue even if we are successful.

Make it a good day,



Wrestling with Creativity as a Student Learning Outcome

Before the holiday break, I described the evidence from our overall IDEA scores that our students’ Progress on Relevant Objectives (PRO) scores had increased substantially in the past year.  It is clear from looking at our data that this didn’t happen by accident, and I hope you have taken a moment or two to take pride in your colleagues.  Admittedly, it is gratifying to see all of the effort we have put toward maximizing our use of the new IDEA course feedback forms pay off.  So in the spirit of that effort, I want to highlight one other piece of data from our most recent overall report – the low proportion of courses that selected “Developing Creative Capacities” as an essential or important learning objective – and to advocate for more emphasis on that objective.

Of the 12 different learning objectives on the IDEA faculty forms, “Developing Creative Capacities” was selected by only 16% of the courses offered during the fall term – the least common selection (by comparison, 69% of courses indicated “gaining factual knowledge” as an essential or important learning objective).  As you might expect, “developing creative capacities” was chosen almost exclusively by fine arts courses, seemingly reflecting a traditional conception of creative capacities as something reserved for artistic expression.

Yet, as a liberal arts college, it seems that “developing creative capacities” should represent a central element of our educational goals and the culmination of a liberal arts education.  The parenthetical description of “creative capacities” in that objective includes “writing,” “inventing,” and “designing.”  Of course, these skills transcend any specific discipline.  Every time a student tries to make an argument with language, portray a concept visually, solve a problem that doesn’t have a singular solution, or articulate the implications of multiple sources of information on a particular point, their ability to do so hinges on these skills.

Moreover, in the updated version of Bloom’s Taxonomy, “creating” is the highest cognitive domain.  Not unlike synthesizing, creating requires each of the skills listed in the preceding levels of the taxonomy (remembering, understanding, applying, analyzing, and evaluating).  It strikes me that this broadened definition of creating could apply to virtually all senior inquiry projects or other student work expected of a culminating experience.  For a more detailed discussion of creating as a higher-order skill, I’d suggest the IDEA paper that examines Objective #6.

So how do we infuse “developing creative capacities” more fully into our students’ educational experience?  I regularly hear faculty talk about the difficulty that many students exhibit when trying to synthesize disparate ideas and create new knowledge.  It’s complicated work, and I’ll bet that if we were to look back on even the best of our own undergraduate work, we would likely cringe in most cases at what we might have thought at the time was the cutting edge of genius.  Thankfully, this objective doesn’t say, “Mastering Creative Capacities.”  This learning outcome is developmental and will likely be something that most students miss at least as often as they hit.  But three ideas come to mind that I’d like to propose for your consideration . . .

  1. Students need practice.  This starts with simple experiences connecting ideas and deriving insights from those connections.  Students will surely be less capable of successfully wielding this key skill when it is needed if they haven’t explicitly been asked to develop it through previous courses and experiences.
  2. Students won’t take risks if they don’t trust those who ask them to do it.  Developing creative capacities requires learning from all manner of failure.  Students won’t take the kinds of risk necessary to make real progress if there isn’t space for them to fall down and get back up – and a professor who will help them to their feet.
  3. Eventually, you just have to jump.  If nothing else, we are experts at paralysis by analysis.  Although there is always a critical mass of information or content knowledge that students must know before they can begin to effectively connect ideas or form new ones, we sometimes get caught trying to cover more material at the expense of developing thinking skills in students.  Often, trying to integrate and connect ideas without having all of the pieces is what teaches students the importance of seeking new knowledge and the awareness that there might be details critical to the development of an idea that we don’t yet know.

As you look at the role of your courses in the collective scheme of our students’ growth, I hope you’ll consider the possibility of adding this learning objective.  You may find that you are already doing many of the things in your course that make this happen.  You may find that you need to take a few risks yourself in the design of your course.  Whatever you decide, I hope you will consider the ways that you help students develop creative capacities as complex, higher-order thinking skills.  For our students to succeed in the world they will inherit, I would suggest that our collective future depends on the degree to which we develop their creative capacities to solve problems that we have not yet even seen.

Make it a good day,



Reveling in our IDEA results: A gift we gave to our students and each other

We spend a lot of time talking about the things that we would like to do better.  It’s a natural disposition for educators – continually looking for ways to perfect what is, at its core, a fundamentally imperfect enterprise.  As long as we keep in mind that our efforts to perfect are really about improvement and not about literal perfection, this mindset can cultivate a healthy environment for demonstrably increasing our educational effectiveness.

However – and I admit that I’m probably a repeat offender here – I don’t think we spend enough time reveling in our success.  Often we seem to jump from brushfire to brushfire – sometimes almost frantically so.  Though this might come from a genuinely honorable sense of urgency, I think it tends to make our work more exhausting than gratifying.  Conversely, taking the time to examine and celebrate our successes does two things.  First, it bolsters our confidence in our ability to identify a problem, analyze its cause(s), and implement a successful solution – a confidence that is vital to a culture of perpetual improvement.  Second, it helps us more naturally approach problems through a problem-solving lens.  There is a lot of evidence to show that examining the nature of a successful effort can be more beneficial than simply understanding every painful detail of how we screwed up.

So this last week before Christmas break, I want to celebrate one such success.  If I could hang mistletoe over the campus, I’d likely start doling out kisses (the chocolate kind, of course).  In the four terms since we implemented the IDEA Center course feedback process, you have significantly increased the degree to which students report learning in their courses.  Between fall of 2011 and fall of 2012, the average Progress on Relevant Objectives (PRO) score for a course increased from a 3.8 to a 4.1.  In addition, on 10 of the 12 individual IDEA learning objectives, students in Augustana courses during the fall of 2012 (last term) reported higher average learning progress scores than students from the overall IDEA database.  More specifically, the average learning gains from our own courses last term were higher than our overall Augustana average from the previous three terms on 10 out of 12 IDEA learning objectives.

Looking deeper into the data, the evidence continues to support the conclusion that our faculty have steadily improved their teaching.  Over four terms, faculty have reduced the number of objectives they select and narrowed the gap (i.e., variance – for those of you jonesing for statistical parlance) between progress on individual objectives chosen for a given course.  This narrowing precision likely indicates an increasing clarity of educational intent on the part of our faculty.  Moreover, this reduction in selected learning objectives has not come at the expense of higher order thinking objectives that might be considered more difficult to teach.  On the contrary, the selection of individual learning objectives remains similarly distributed – and equally effective – across surface and deep learning objectives.  In addition, students’ responses to the questions regarding “excellent teacher” and “excellent course” went up from 4.2 to 4.3 and from 3.9 to 4.0, respectively.  Finally, when asked whether “as a result of this course, I have more positive feelings about this field of study,” students’ average responses increased from 3.9 to 4.0.

Are there some reasons to challenge my conclusions?  Maybe.  While last year’s participation in the IDEA course feedback process was mandated for all faculty in an effort to develop institutional norms, only about 75% of courses participated this fall.  So it’s possible that the courses that didn’t participate in the fall would have pulled down our overall averages.  Or maybe our faculty have simply learned how to manipulate the system, and the increased numbers in PRO scores, individual learning objectives, and teaching methods and styles are nothing more than our improved ability to game the system.

To both of these counter-arguments, in the spirit of the holiday I say (respectfully) . . . humbug.  First of all, although older faculty are traditionally least likely to employ course evaluations (as was the case this fall), I think it is highly unlikely that these faculty are also our worst instructors.  On the contrary, many of them are master teachers who found long ago that they needed to develop other methods of gathering course feedback that matched their own approach to teaching.  Moreover, even if there were some courses taught by senior faculty in which students would have reported lesser degrees of learning, there were courses with lower PRO scores taught by faculty from all classifications.  Second, while there might be some potential for gaming the IDEA system, what I have seen some people refer to as “gaming” has actually been nothing but intentionally designed teaching.  If a faculty member selects objective 11, “learning to analyze and critically evaluate ideas, arguments, and points of view,” then tells the students that this is a focus of the course, asks students to develop this skill through a series of assignments, discussions, projects, or papers, and then explains to students when and how they were making progress on this objective . . . that all sounds to me like plain ol’ good teaching.  So if that is gaming the system or teaching to the test, then (in the words of every kid who has ever played football in the street), “GAME ON!”

Are there other data points in last term’s IDEA aggregate report that we ought to examine and seek to improve?  Sure.  But let’s have that conversation later – maybe in January.  Right now, let’s revel in the knowledge that we now have evidence to show the fruits of our labor to improve our teaching.  You made the commitment to adopt the IDEA course feedback system knowing that it might require us to step up our game.  It did, and you responded in kind.  Actually, you didn’t just meet the challenge – you rose up and proved yourselves to be better than advertised.  So congratulations.  You thoroughly deserve it.  Merry Christmas.

Make it a great day,