Increasing Students’ Inclination toward Intrinsic Motivation

Welcome to 2016!  It’s great to see all of you scurrying around campus.

Earlier this fall, I shared the first set of findings from our four-year assessment of Augustana students’ motivational orientations. As you might remember, this is the first set of results from our institutional outcomes assessment protocol in which we rotate through all of the college-wide student learning outcomes so that each year we have a new set of freshman-to-senior results for a different learning outcome. Moreover, because of the student experience data we collect over four years, we also have ways to identify experiences that appear to influence student gains (or losses) on those outcomes.

Intrinsic motivation is a key element of our students’ intrapersonal development. We talk about this attribute most specifically when we refer to the importance of a liberal arts education cultivating our students as life-long learners. Interestingly, although we aspire to strengthen our students’ orientation toward intrinsic motivation, the results of our four-year assessment revealed that the change between the average freshman score and the average senior score was not statistically significant. In other words, our students’ orientation toward intrinsic motivation didn’t change – even though other research on motivational orientations suggests that individuals become more inclined toward intrinsic motivation as they get older.

Even though the average scores didn’t move like we would have hoped, there were certainly students who showed statistically significant gains in their orientation toward intrinsic motivation. So what makes those students different from the rest? And more importantly, are there lessons that we can learn from those students’ experience that we could apply more broadly?

To find answers to those questions, we designed an analysis that would test the impact of a variety of Augustana experiences. We tested the impact of curricular experiences, advising experiences, pre-college demographics and values, and co-curricular experiences. Sure enough, students who showed significant positive growth in their inclination toward intrinsic motivation also had two experiences in common, experiences that we ought to consider cultivating more broadly as we continue to improve the quality of the Augustana experience.

First, the nature of the students’ co-curricular experiences produced a robust and statistically significant positive effect on the inclination toward intrinsic motivation. Specifically, as students more strongly agreed with the statement, “My out-of-class experiences have helped me develop a deeper understanding of myself,” they exhibited stronger gains in their inclination toward intrinsic motivation. Importantly, our findings suggest that mere participation in co-curricular experiences wasn’t enough. Instead, the effect came from the students’ perceived impact of those co-curricular experiences.

Second, students’ engagement in Symposium Day also produced a statistically significant, albeit smaller, effect. As students more strongly agreed with the statement, “Symposium Day activities influenced the way I now think about real world issues,” they made larger gains in their inclination toward intrinsic motivation. This is an item where the average score has increased each year since Symposium Day was introduced, even though senior survey results indicate that there is still substantial room for improvement (42.4% of 2015 seniors disagreed or strongly disagreed with this statement).

Somewhat surprisingly, neither the nature of the students’ classroom experiences nor their advising experiences generated any effect on an orientation toward intrinsic motivation. Likewise, we accounted for sex, race, and socioeconomic status in our analysis, and none of those variables produced a statistically significant effect.

So what might we make of these findings? First (as if we needed more evidence at this point), out-of-class experiences matter, a lot. And it’s not about quantity; it’s about quality. This is exactly the philosophy that undergirds the entire integration emphasis in the Augustana 2020 strategic plan. Students need to engage in experiences that help them grow in important ways. That kind of development doesn’t happen automatically. And every experience doesn’t necessarily produce the same type of growth, or any growth at all. This finding seems to re-emphasize the value of designing co-curricular experiences so that key teachable moments are most likely to occur, then prodding students to reflect on those moments with an eye toward how their own responses might teach them something about themselves.

Second, we’ve begun to notice anecdotal suggestions of the educational value of Symposium Day, and this finding presents further evidence that Symposium Day can affect, and is affecting, our students’ growth in important ways. One key take-away from this study reiterates that participation is necessary but not sufficient. The impact of Symposium Day in this study appears to come from the degree to which students felt that the experience shaped the way they think about real world events. In other words, the value of Symposium Day lies in the applicability of the learning – in the way the experience can inspire students to reflect on their perceptions of real world events. This suggests that all of the ways in which faculty and staff link their curricular or co-curricular work with students to elements of Symposium Day may well be producing more than a deeper understanding of content knowledge.

As we build toward the next Symposium Day on January 20th, I hope you will find more ways to connect your work with students to the events and presentations scheduled for that day. And as we continue to reassess and redesign our students’ out-of-class experiences to maximize their educational and developmental benefit, I hope you will look for ways to link these experiences to our students’ understanding of themselves.

Make it a good (albeit cold) day,


Micro-Retention: Do fall-to-winter term rates tell us anything?

Trying to identify the critical factors that influence our students’ decisions to persist or withdraw is a tricky business. In addition to tracking our overall fall-to-fall retention rates for first year students (the only retention number that is widely reported), we track the fall-to-fall retention rates for each of the other cohorts (even 5th year seniors). Furthermore, we break those cohort retention rates down by a variety of demographic categories (e.g., race/ethnicity, gender, socioeconomic status, incoming academic preparation, and first-generation status).

But fall-to-fall retention rates tell only part of the story. The decision to persist or withdraw isn’t a simple or momentary one; research clearly indicates that the major decision to stay or leave is preceded by a multitude of minor decisions that combine to pull the student toward (or push the student away from) the brink of that ultimate choice. So if we want to understand this series of decisions more fully, another way to look at it is to examine term-to-term retention rates. Although this approach is still based on evidence of the ultimate choice to leave Augustana, it might help us better understand the factors that influence students to leave after the fall term, winter term, or spring term. Since we ask departing students why they are leaving in an exit interview, we can also see whether the reasons students give for leaving differ across these three departure points. It is this kind of knowledge that might help us figure out what kind of interventions to prioritize over the course of the academic year.

Below are three sets of fall-to-winter retention rates for our traditional student cohorts. Please note that each of these percentages represents the proportion of students in each cohort who were enrolled during the prior term. These rates do not represent the proportion of an entering cohort that is still enrolled at Augustana.

Cohort       2015 Fall-to-Winter    2014 Fall-to-Winter    Four-Year Average
1st year           96.5%                  95.9%                  96.6%
2nd year           98.6%                  98.3%                  97.9%
3rd year           97.9%                  97.1%                  98.3%
4th year           99.4%                  97.4%                  98.3%
5th year           63.3%                  42.4%                  54.9%
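
For anyone curious about the mechanics, a term-to-term retention rate is simply the share of one term’s enrollees who show up in the next term’s enrollment file. Here is a minimal sketch of that calculation (the student IDs below are invented for illustration, not actual records):

```python
# Hypothetical sketch: computing a fall-to-winter retention rate from
# two term enrollment lists. Student IDs here are made up for illustration.
fall_enrolled = {"A01", "A02", "A03", "A04", "A05"}
winter_enrolled = {"A01", "A02", "A04", "A05", "B17"}

# Retention = share of fall enrollees who are still enrolled in winter.
# (New winter enrollees like "B17" don't count toward the rate.)
retained = fall_enrolled & winter_enrolled
retention_rate = len(retained) / len(fall_enrolled)
print(f"Fall-to-winter retention: {retention_rate:.1%}")  # 80.0%
```

The real calculation runs over each cohort separately, which is how the table above breaks out by class year.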

As you can see, our fall-to-winter retention rates increased for every cohort of students. In the case of the 1st through 4th year cohorts, I’d say this is a good thing. For the 5th year students, it’s more complicated (e.g., is the fact that more of them returned for the winter term a function of their particular choice of academic programs?  Or is it a function of our inability to offer them the courses they needed in a timely manner?).

What more are we to make of these numbers? By themselves, they seem to suggest what we already know – Augustana loses more first year students in the fall term than second, third, or fourth year students. This year, for example, we lost 24 first years, 9 second years, 10 third years, and 3 fourth years. While we might be able to improve retention among our first year students, there may not be much more we could do systematically to increase fall-to-winter retention among the other cohorts. At the same time, if we are going to hang our hat on being a college that is very good at building relationships with all students, then those 22 non-first-year students each represent an opportunity for us to improve. The important thing to note about the first-year students’ departure patterns is that the vast majority of them didn’t even complete the first term. Although in some cases there may not be much we can do, this fact emphasizes the degree to which we need to build relationships with our students right away instead of waiting for them to open up or make the first move.

As you might expect, we are in the process of further analyzing our data, especially in connection with the freshman data we collected right before winter registration (i.e., about week 7). To be sure, you will be the first to know if we find anything!

Make it a good day,


Some old-school advice about studying turns out to still be true

Although I’d love to think that I’m some sort of innovatus maximus, when students ask me for advice I’m pretty sure that I just repeat what somebody told me when I was in college. This is particularly true when it comes to study habits. I was emphatically told to study during the day and never study in my dorm. I suppose the reason I think this advice was so good is because when I didn’t follow it my grades tanked. But just because some bits of sage advice have been around for a long time doesn’t necessarily mean that they are still accurate or applicable to everyone. Given the wealth of changes that have impacted undergraduate lives since I was in college (i.e., the late 1980s and early 1990s), it struck me that I’d better test these study habit assumptions to see if they still hold.

Now I know that some of you might be champing at the bit to raise the “correlation doesn’t equal causation” fallacy. Maybe I was dumber than a bag of hammers when I was in college and no amount of studying would have helped. Or maybe students who come to college with a boatload of smarts can study anywhere at any time without any consequence. In all seriousness, given the vast changes in technology and the availability of library resources online, maybe the “where” isn’t all that important any more.

Luckily, we have exactly the data necessary to test this question. By linking first-year student data collected prior to enrollment, during the first year, and after the spring term, we can look at the relationship between pre-college academic preparation, study habits involving “where” and “when” one studies, and first-year cumulative GPA.

To account for pre-college academic preparation, we used the student’s ACT score and their Academic Habits score from the Student Readiness Survey (a score derived from each student’s self-assessment of their academic habits; things like preparing for exams early instead of cramming the night before the test). To account for studying “where” and “when” we used responses to three questions on the end of the first-year survey:

  • Of all the time you spent studying this year, about how much of it was in your dorm room? (1=none, 2=a little, 3=about half, 4=most, 5=all)
  • How often did you study – by yourself or in small groups – in the CSL (Tredway Library, 4th floor study spaces, Brew, or Dining Hall)? (1=never, 2=rarely, 3=sometimes, 4=often, 5=very often)
  • I made sure to set aside time to study during the day so that I wouldn’t have to do it all at night. (1=never, 2=rarely, 3=sometimes, 4=often, 5=very often)

And to account for cumulative first-year GPA, we used the final cumulative GPA in the college’s dataset that is constructed after all grades from the spring term have been logged.

I’ve inserted the results of the regression equation below, marking the statistically significant predictors with asterisks.

Variable                     Coefficient    Standard Error    Significance
ACT Score***                    .067             .011             .000
Academic Habits                 .087             .090             .337
Studying in Dorm*              -.099             .044             .028
Studying in CSL                -.033             .042             .433
Studying During the Day*        .096             .038             .014
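
For readers curious about the machinery behind a table like this, here is a minimal sketch of a multiple regression fit by ordinary least squares. Everything in it is invented for illustration – the simulated data only loosely echo the coefficients above, and the real analysis used actual survey and registrar data rather than anything generated like this:

```python
import numpy as np

# Hypothetical sketch: predicting first-year GPA from ACT score and two
# study-habit items. All data below are simulated for illustration only.
rng = np.random.default_rng(0)
n = 200
act = rng.normal(25, 4, n)                    # ACT composite scores
dorm = rng.integers(1, 6, n).astype(float)    # 1=none ... 5=all studying in dorm
day = rng.integers(1, 6, n).astype(float)     # 1=never ... 5=very often study by day

# Simulate GPA with effects resembling the table, plus noise.
gpa = 1.5 + 0.067 * act - 0.099 * dorm + 0.096 * day + rng.normal(0, 0.3, n)

# Design matrix with an intercept column, then ordinary least squares.
X = np.column_stack([np.ones(n), act, dorm, day])
coef, *_ = np.linalg.lstsq(X, gpa, rcond=None)
print(dict(zip(["intercept", "ACT", "dorm", "day"], coef.round(3))))
```

The signs of the recovered coefficients mirror the pattern in the table: positive for ACT and daytime studying, negative for studying in the dorm.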

Based on these regression results, the old-school studying advice seems to have withstood the test of time. As we would expect, pre-college academic preparation predicts first-year cumulative GPA. But even after accounting for pre-college preparation, “where” one studies (or more specifically, where one does NOT study) and “when” one studies still matters. Studying in one’s dorm room is a significant negative predictor, meaning that the more one studies in his or her dorm room the lower the first-year cumulative GPA. Conversely, studying during the day is a significant positive predictor, meaning that the more one studies during the day the higher the first-year cumulative GPA.

Interestingly, the question about studying in the CSL didn’t produce a statistically significant result. This may be the result of the question’s lack of precision. Because there is such a range of study environments in the CSL, studying in the Brew may produce a much different effect than studying on the quiet floors of the library. In the end, the effects of those differences may well cancel each other out. Moreover, this possibility might further support the notion that the problem with studying in one’s dorm room isn’t the location itself, but rather the frequency and availability of distractions from friends, neighbors, TVs, game systems, and whatever else one might have stashed away in one’s dorm room.

It’s always nice to find that some sage old advice still holds true. But what I find compelling about these findings is the fact that they come directly from Augustana students who were first-year students in 2014-15. With this in mind, we can confidently tell our advisees that Augustana students who study away from their dorm room and study during the day earn better grades than similar students who study at night in their dorm rooms. In my recent experience, it appears that our students tend to respond to guidance supported by data more than they respond to sage old advice from the balding, middle-aged quasi-intellectual. Oh well.

Welcome back from Thanksgiving break, everyone! I’m looking forward to enjoying the holiday season on campus with all of you.

Make it a good day,


A short post for a short week!

It’s hard not to think that Thanksgiving might need a better lobbyist. Every year the warm anticipation of the holiday seems to get overrun by Christmas decoration sales that start the day after Halloween, and then drowned out by the deafening onslaught of Black Friday advertising during the week of the actual holiday.

I’m not looking to start a movement, but over the weekend I decided to take back my own Thanksgiving. So from my little corner of the world, I want thank all of you, my friends and family at Augustana College. Thanks for inviting the Office of Institutional Research into your conversations, valuing our observations, and utilizing our insights. Thanks for trusting us with all of your data and for making us feel like genuine contributors to the life of the college.

I hope each of you get the chance to find a moment of peace on Thanksgiving.

Make it a good day,


Peer Mentorship: A good thing that we might make even better

No matter what you do at Augustana, I hope you found some time to get away and recharge over the break. Now that we are all back for the winter term, I’d like to introduce a new feature of Delicious Ambiguity that has no plan other than knowing that it will happen from time to time starting today. (I can’t undermine the title of the blog by imposing some sort of precisely organized plan, right?). There are a number of folks on campus who have conducted interesting and thought-provoking studies of our students’ experience. This work needs the chance to be highlighted and shared broadly. So without further ado, here is a post from Dr. Brian Leech from a study he conducted last year.


Guest Post by Brian Leech

Our college employs a number of students who provide mentorship to their peers, especially first-year students; yet these mentors tend to get overlooked when we talk about the first-year experience. Many faculty in particular know very little about how these programs can help students adjust to college. Each program either assists a specific segment of the student population or helps the general student population in a specific way.

Here is a brief run-down of some mentoring programs available on campus:

Peer Mentors: Work with faculty to help first-year students adjust to life at Augustana.

Global Ambassadors: Help newly-arrived students from other countries with culture shock.

Multicultural Ambassadors: Help students who often have trouble connecting to the Augustana community.

ACI/Chicago Network: Small group works with students from Chicago to help with adjustment.

Community Advisors: Coordinate programming at residence halls, provide emergency assistance, and perform many mentoring activities, including referrals and informal peer counseling.

Career Ambassadors: Help students with resumes and assist with career programming.

Reading/Writing Center Tutors/Fellows: Assist students with academic reading and writing. The campus also hosts a growing number of tutors in other subjects.

Admissions Ambassadors: Provide campus tours, help visitors, host overnights, and assist with visit days. Often essentially serve as mentors before students are even enrolled.

Interviewing both the people who manage these programs and a number of the students involved left me impressed. The fact that some students devote so much of their own time to helping their peers is quite admirable. The college does typically pay them, which I’m sure is a factor, but these are not particularly easy jobs. Students performing peer-mentoring duties are on the front lines of campus inclusion. Joining a majority-white community, for instance, often comes as a shock to many incoming students. The same can be said for international students arriving in the Midwest. No matter their background, many, if not all, undergraduates struggle to adapt to increasing academic expectations. Mentors who hold certain campus jobs, such as math tutors or writing fellows for first-year classes, can therefore be of great importance to students and the faculty who teach them.

Yet these mentoring programs can also use a boost. Below are the top three areas for improvement, as identified by the people involved.

Problem: Lack of knowledge. Many people across campus simply don’t know what mentoring programs are available to students, whether students they know are in a particular mentoring program, or what the different mentoring programs do. Therefore, students who could really benefit from this help are often not getting it in time. Better information sharing across campus can fix this problem.

Problem: Over-committed student mentors. Student mentors tend to be over-involved and sometimes don’t see their mentoring duties as a priority. We therefore need to improve students’ connection to and belief in their group’s mission. In other words, mentor positions should seem as vital to the student experience as they actually are. Certainly praise can help, but faculty, staff, and administrators can do more. Faculty, for instance, could partner with certain mentoring groups, help with on-going professional development, or assist student leaders within each group.

Problem: Training. Many of the above groups provide extensive training to mentors before the academic year begins. Once the year starts, however, little time exists for busy students to squeeze in further professional development. It is therefore worth exploring how the college can create and better support training that is accessible, useful, and compelling. Would an online module help? A workshop that involves joint faculty-staff-student training in mentorship?

As our college tries to improve students’ first-year experience, we should keep in mind the many student mentors who sometimes have as much of an effect on incoming students’ lives as faculty, staff, and administrators – if not more.


Thanks, Brian. This is an excellent example of one way that we could take advantage of existing programs to more fully integrate our students’ learning experience instead of adding something new.

If you’ve conducted a study of our students that you think the campus should know about, send me an email or meet me for coffee. I’d love the chance to share your work with the rest of the Augustana community.

Make it a good day,


Differences in a sense of belonging on campus by race and sex

One critical predictor of a student’s likelihood to persist (at the same college or university) is the degree to which that student feels like he or she belongs on campus. For this reason, many surveys of college students (including our own freshman and senior surveys) ask for a response to the statement “I feel a strong sense of belonging on campus” on a 5-point scale of strongly disagree, disagree, neutral, agree, and strongly agree. After we collect this data, we convert the responses to a numerical scale ranging from 1 (strongly disagree) to 5 (strongly agree) so that we can run a variety of statistical analyses. The average score from our most recent graduating class was 3.94, which roughly translates to an “agree” response. This would seem to suggest that things overall are pretty good. But looking deeper, we found some differences that might help us focus our continuing efforts to ensure that Augustana is indeed a truly inclusive campus.
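
For illustration, the Likert-to-numeric conversion and averaging work like this minimal sketch (the handful of responses below are invented, not actual survey data):

```python
# Hypothetical sketch of converting Likert responses to a 1-5 numeric
# scale and computing a group average. Responses here are invented.
scale = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}
responses = ["agree", "strongly agree", "neutral", "agree", "agree"]
scores = [scale[r] for r in responses]
mean = sum(scores) / len(scores)
print(round(mean, 2))  # 4.0 -- roughly an "agree" on average
```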

The first stage of this analysis involves parsing the responses by race/ethnicity. In last spring’s graduating class there were enough students in three race/ethnic categories to analyze separately – White, Black, and Hispanic. Although there were also graduates who identified as Asian, Native American, and multiracial (among others), the numbers in these categories were so small that we were obliged to organize them into one group for the purposes of this analysis.

When we generated average sense of belonging scores for each of these four groups (White, Black, Hispanic, and other), a particularly important difference appeared. Here are the average scores for each of the four groups.

  • White – 4.00
  • Hispanic – 3.91
  • Black – 3.29
  • Other – 4.16

Clearly, the Black students’ average response suggests a substantial gap between their sense of belonging on campus and that of each of the other groups. Further testing determined this difference to be statistically significant, validating what faculty and staff who interact closely with our Black students often report hearing about these students’ experiences at Augustana.

Since we often find that sex also plays a role in shaping our students’ experience, we added a second layer to our analysis to see if the interaction of race/ethnicity and sex would produce even more exacting differences in the data. Interestingly, we did find such a difference among one specific group. Here is how this additional stage of analysis played out.

Race/Ethnic Category       Male      Female
White                      3.90       4.06
Hispanic                   3.46       4.14
Black                      3.25       3.30
Other                      4.00       4.25

In addition to male and female Black students experiencing a lesser sense of belonging on campus, Hispanic men also expressed a lower sense of belonging on campus. This difference was statistically significant when compared to either the overall average or Hispanic women.

So what should we make of these findings? First, I think it’s important to be reminded that the multiple dimensions of diversity within our student body play out in tangible ways that can profoundly shape our students’ sense of belonging at Augustana College. Second, these findings further affirm that race/ethnicity and sex are still influential lenses through which students see and experience this community. No matter what we would like to hope to be true, the sense of belonging on campus among Black students and Hispanic men at Augustana appears to differ in a way that can have powerfully detrimental consequences. Third, designing ways to help students who feel less of a sense of belonging is complicated. There are very few universal quick fixes, and the ones that exist were likely put into place a long time ago. Now our work requires a recognition of nuance and the degree to which different perspectives shaped before coming to college can impact students’ lives. Finally, all of us – students and educators – play a critical role in addressing this dynamic.

Make it a good day,


We Are Improving a Key Aspect of the Academic Feedback Loop (And We Can Prove it!)

A few years ago we began to ask our freshmen about the degree to which they received academic feedback early enough in the term for them to adjust their study habits. The survey item read like this:

  • “I had access to my grades or other feedback early enough in the term to adjust my study habits or seek additional academic help.”

Students could respond by selecting:

  • strongly disagree
  • disagree
  • neutral
  • agree
  • strongly agree

One of the reasons we began to ask this question was because we wanted to gather more information on the nature and scope of the feedback our freshmen received during their first term. Based on the wealth of research on the critical impact of regular and clear feedback, we have always known that this is an important aspect of an ideal learning environment. However, we had been surprised by the number of struggling first-year students who claimed to be unaware of how poorly they were doing in their classes.

Upon reviewing our first round of data near the end of the 2013/14 academic year, we had to swallow hard. 46.0% of the respondents selected disagree or strongly disagree, while only 34.1% agreed or strongly agreed with that statement.

During the 2014/15 academic year we had serious, and at times even tense, conversations about these findings. Even though it was certainly possible that some students who claimed to be unaware of their grades had simply chosen to avoid checking the grades that were clearly posted for just this purpose, these conversations led to several faculty development workshops and a lot of reconsideration of the scheduling of student assignments and the nature of the feedback provided to students. In addition, a number of conversations delved deeper into the degree to which students need to be shown how to use the feedback they receive and learn how to approach learning at Augustana differently than they may have approached learning in high school.

Over the subsequent two years, we’ve seen substantial movement on this item. In 2014/15, 36.7% of respondents selected disagree or strongly disagree (a roughly 10 percentage point decrease from the prior year) and 38.6% agreed or strongly agreed (a 4.5 percentage point increase from the prior year). Although this was encouraging to see, many instructors had already planned their course for that year by the time we had begun to discuss the findings from the prior year.

Now that faculty have had a full year to contemplate and infuse this concept into course syllabi, our 2015/16 data suggests that freshmen are experiencing a substantially improved learning environment. Examining responses from 515 freshmen (a 75.8% response rate), only 24.9% disagreed or strongly disagreed while 52.8% agreed or strongly agreed.
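
The arithmetic behind these shifts is simple but worth laying out explicitly, using the percentages reported above:

```python
# Year-over-year shifts on the feedback item, in percent of respondents.
# These figures come from the survey results reported above.
disagree = {"2013/14": 46.0, "2014/15": 36.7, "2015/16": 24.9}
agree = {"2013/14": 34.1, "2014/15": 38.6, "2015/16": 52.8}

# Changes are expressed in percentage points, not percent.
disagree_drop = disagree["2013/14"] - disagree["2015/16"]
agree_gain = agree["2015/16"] - agree["2013/14"]
print(round(disagree_drop, 1), round(agree_gain, 1))  # 21.1 18.7
```

That is, disagreement fell by about 21 percentage points while agreement rose by nearly 19 – the roughly 20 point swing discussed below.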

In two years, we’ve seen a roughly 20 percentage point swing toward an improved learning environment for our students. Although there are certainly plenty of reasons to drill deeper and continue to improve the ways that we cultivate a vibrant feedback loop between instructor and student (i.e., the instructor gives the student feedback, the student applies the feedback to improve academic work, the instructor sees evidence of improvement in subsequent student work, the instructor gives feedback that notes the improvement and points to further opportunities to improve, etc.), I think we deserve to take a moment and realize that we’ve just accomplished something that many colleges only dream of but rarely get to see: actual evidence of improvement in the act of educating. This data provides concrete evidence that we identified an opportunity to get better, did the work to plug that finding into our daily efforts, and produced a real and significant change for the better.

I’m really proud of us.

Make it a good day,


“Close the Gap” Passes the Test

It’s no secret that students who choose to attend Augustana College (or for that matter, any other private liberal arts college like us) make a substantial financial commitment to their undergraduate education. As numerous economic trends over the last decade have combined to squeeze most families’ financial resources, this commitment has increasingly come under pressure to produce results. In examining our own data, it has become more and more clear that this financial pressure also contributes to students’ decision to persist or withdraw after the first year. In recent years, we’ve noted the large subpopulation of departing students who leave with a respectable, if not enviable, GPA after their first year. At the same time, we’ve seen an uptick in the number of students who claim that financial issues are a significant reason for their choice to depart.

In preparation for the incoming cohort of 2014, Augustana developed a financial aid program called “Close the Gap” to help those students who appeared to need some extra financial assistance to attend Augustana. By now, many of you know of this program. Many of you contributed to it. And the story of its success has been well documented, with about 100 freshmen in the class of 2014 receiving some assistance from this endeavor.

But with all warm and fuzzy stories of philanthropy comes the stickier question. Is this program actually effective? Does it affect more than the initial decision to attend Augustana? Specifically, would it have any impact on these students’ decision to return after their first year and continue toward graduation?

This is a tough thing to test because it’s hard to find a legitimate comparison group. We didn’t (and wouldn’t) create some sort of shadow “control” group within the first-year class of students who needed the money but didn’t get it. And we can’t really compare these Augustana students with similar students at other institutions because 1) we don’t have access to those institutions’ data, and 2) those students didn’t choose Augustana, so their first-year experience isn’t similar. In the end, the only plausible and reasonable way to test the success of this program was to identify students from prior cohorts who would likely have been offered Close the Gap funds had such a program existed, and then see if the first-to-second-year retention rates of these students differed from the rate of the students who actually received Close the Gap funds.
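To make that matching idea concrete, here is a minimal sketch of how such a comparison group might be flagged and its retention rate computed. The field names, the $4,000 unmet-need cutoff, and the toy records are all illustrative assumptions on my part, not the actual Close the Gap criteria or our real enrollment data.

```python
# Hypothetical sketch of the comparison-group construction described above.
# The qualifying rule and every value here are illustrative, not the real
# program criteria.

def likely_qualifiers(cohort, need_threshold=4000):
    """Return students whose unmet need (in dollars) exceeds an assumed cutoff."""
    return [s for s in cohort if s["unmet_need"] > need_threshold]

def retention_rate(students):
    """Share of students who returned for a second year, as a percentage."""
    return 100 * sum(s["retained"] for s in students) / len(students)

# Toy prior-cohort records standing in for real enrollment data
cohort_2012 = [
    {"id": 1, "unmet_need": 5200, "retained": True},
    {"id": 2, "unmet_need": 6100, "retained": False},
    {"id": 3, "unmet_need": 1500, "retained": True},   # below cutoff; excluded
    {"id": 4, "unmet_need": 4800, "retained": True},
]

comparison = likely_qualifiers(cohort_2012)
print(retention_rate(comparison))  # 2 of the 3 qualifiers retained
```

The same two functions would then be run over each prior cohort, and the resulting rates compared against the actual Close the Gap recipients.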

This plan gets dicey, too, because very little stays exactly the same in the world of recruitment and enrollment. Scholarship amounts change, patterns of classifying the interest level of prospective students change, and the individuals who actually do the recruiting change. Nonetheless, although this approach might not get us to an exact apples-to-apples comparison, it does get us within a pickpocket’s reach of the same fruit stand. (Yeah, I made that up.)

So here are the retention rates of students who likely would have received Close the Gap funds in 2012 and 2013, compared with the students who received those funds in 2014.

  • 2012 Cohort – 77.8%
  • 2013 Cohort – 77.3%
  • 2014 Cohort – 88.2%

There are some pretty good reasons to take this finding with a grain of salt. First, we have instituted a number of other campus-wide programs and support systems to assist our retention efforts. Second, when we put the Close the Gap program in place, we also set in motion an increased effort to track these students, which in turn likely increased our inclination to informally support these particular students during their first year. Third, every incoming class is different, and the overall makeup of the 2014 group may well have created an environment more conducive to these students’ success.

Yet, even with all of these caveats in mind, an 11 percentage point swing is big. In tracking the retention rates of many different subpopulations of students (e.g., race/ethnic categories, gender, first generation status, etc.), we never see a swing that large between two years, especially if the two years prior are almost identical.
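For what it’s worth, the size of that swing is easy to verify with back-of-the-envelope arithmetic. The snippet below uses the average of the two pre-program cohorts as a baseline, which is my own illustrative choice; comparing against either year alone gives a similar gap.

```python
# Retention rates (percent) from the bullet list above
rates = {"2012": 77.8, "2013": 77.3, "2014": 88.2}

# Average the two pre-program cohorts to form a baseline
baseline = (rates["2012"] + rates["2013"]) / 2  # 77.55
swing = rates["2014"] - baseline                # a bit under 11 points
print(f"swing: {swing:.1f} percentage points")
```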

I think it’s reasonable to suggest that the Close the Gap program has improved the retention rate of students with this particular level of need, and it appears that this improvement did contribute to an increase in our overall retention rate between last year and this year. This is certainly cause for celebration. We seem to be getting better at addressing the different needs of different types of students.

Yes, we’ve got plenty more work to do. And we are diving into those challenges, too. But for today, I think it’s o.k. to smile, celebrate some success, and give a shout-out to the folks who initiated and continue to raise the funds for this program. Thanks and Congrats!

Make it a good day,


Transparency Travails and Sexual Assault Data

The chill that dropped over campus on Monday seems like an apt metaphor for the subject that’s been on my mind for the past week. Last spring, Augustana participated in a multi-institutional study focused on sexual assault campus climate that was developed and administered by the Higher Education Data Sharing Consortium (HEDS). We hoped that the findings from this survey would help us, 1) get a better handle on the nature and prevalence of sexual assault and unwanted sexual contact among our students, and 2) better understand our campus climate surrounding sexual assault and unwanted sexual contact. We actively solicited student participation in the survey, collaborating with student government, faculty, and administration to announce the survey and encourage students to respond. The student response was unusually robust, particularly given the sensitivity of the topic.

Equally important, many people across campus – students, faculty, administrators, and staff alike – took note of our announced intentions to improve and repeatedly asked when we would have information about the findings to share with the campus community. You saw the first announcement of these results on Sunday in a campus-wide email from Dean Campbell. If you attended the Monday night screening of The Hunting Ground and the panel discussion that followed, you likely heard additional references to findings from this survey. As Evelyn Campbell indicated, the full report is available from Mark Salisbury (AKA, me!) in the IR office upon request.

It has been interesting to watch the national reporting this fall as several higher ed consortia and individual institutions have begun to share data from their own studies of sexual assault and related campus climate. While some news outlets have reported in a fairly objective manner (Inside Higher Ed and The Chronicle of Higher Education), others have tripped over their own feet trying to impose a tale of conspiracy and dark motives (Huffington Post) or face-planted trying to insert a positive spin where one doesn’t really exist (Stanford University). Moreover, the often awkward word choices and phrasing in the institutional press releases (e.g., Princeton’s press release) announcing these data seem to accentuate the degree to which colleges and universities aren’t comfortable talking about their weaknesses, mistakes, or human failings (not to mention the extent to which faculty and college administrators might need to bone up on their quantitative literacy chops!).

Amidst all of this noise, we are watching two very different rationales for transparency play out in entirely predictable ways. One rationale frames transparency as a necessary imposition from the outside, like the piercing beam of an inspector’s flashlight pointed into an ominous darkness to expose bad behavior and prove a supposition. The other rationale frames transparency as a disposition that emanates from within, cultivating an organizational dynamic that makes it possible to enact and embrace meaningful and permanent improvement.

For the most part, the noise being made in the national press about sexual assault data and college campuses comes from using transparency to beat institutions into submission. This is particularly apparent in the Huffington Post piece. If the headline, “Private Colleges Keep Sexual Assault Data Secret: A bunch of colleges are withholding sexual assault data, thanks to one group,” doesn’t convey their agenda clearly enough, then the first couple of paragraphs walk the reader through it. The problem with this approach to transparency is that the data too often becomes the rope in a giant tug-of-war between preconceived points of view. Both points of view (or neither) could have parts that are entirely valid, but the nuance critical to actually identifying an effective way forward gets chopped to bits in the heat of the battle. In the end, you just have winners, losers, and a lifeless coil of rope that no one cares about anymore.

Instead, transparency is more likely to lead to effective change when it is a disposition that emanates from within the institution’s culture. The folks at HEDS understood this notion when they designed the protocol for conducting the survey and conveying the data. The protocol they developed specifically prohibited institutions from revealing the names of other participant institutions, forcing institutions to focus the implications of their results back on themselves. Certainly, a critical part of this process at any institution is sharing its data with its entire community and collectively addressing the need to improve. But in this situation, transparency isn’t the end goal. Rather, it becomes part of a process that leads to improvement and observable change. To drive this point home, HEDS has put extensive effort into helping institutions use their data to create change that reduces sexual assault.

At Augustana, we will continue to share our own results across our community and tackle this problem head-on. Our own findings point to plenty of opportunities to improve our campus climate and reduce sexual assault. I’ll write about some of these findings in more detail in the coming weeks. In the meantime, please feel free to send me an email requesting our data. I’ll send you a copy right away. And if you’d like me to bring parts of the data to your students so that they might reflect and learn, I’m happy to do that too.

Make it a good day,


Motivated Much? Meh . . .

Intellectual curiosity is a fundamental goal of a liberal arts education. So it’s no surprise that we included it as one of Augustana’s nine learning outcomes. In our own words we chose to call this outcome “Wonder,” describing it as “a life-long engagement in intellectual growth,” and describing the students who exhibit this attribute as individuals who “take responsibility for learning.” It seems pretty clearly implied in these descriptions that we believe the graduates who exemplify intellectual curiosity would have developed a motivational orientation toward learning that is:

  • optimistic about the potential that additional learning provides,
  • continually seeking to grow and develop,
  • and intrinsically driven to pursue deeper knowledge.

As an aspirational goal, all of that sounds bright and shiny and downright wonderful. But the realities of dealing with our students’ motivations aren’t always quite so dreamy. We are often keenly aware of our students’ tendency toward external rewards such as high grades, acceptance to a prestigious grad school, or the allure of a high-paying job. Most of us have seen the blank look on a student’s face when we extol the benefits of learning just because it’s interesting and even exciting to learn. Moreover, we all understand how much more difficult it is to shift a student’s motivational tendencies when they come to college after twelve years (or more) of high-stakes testing. In short, although we each might have had some flash of brilliance about how to stoke a student’s intrinsic motivation (or maybe in some cases just get a single flame to flicker), we know less about how to reliably team up with students to build that fire and keep it burning. If that weren’t enough, we’re not even sure about the degree to which we can influence a student’s motivational orientations at all. Maybe those orientations are mostly hard-wired by earlier life experience and aren’t really malleable again until well into adulthood.

Four and a half years ago, we decided to tackle this question in more depth by studying if, and how, our students’ motivational orientations change during their college career. As a part of our rolling outcomes assessment plan (our way of utilizing each incoming cohort to study how students change on a particular aspect of our learning outcomes), the 2011 cohort took a survey instrument assessing orientations toward three different types of motivation during Welcome Week. These three orientations approximate intrinsic, extrinsic, and impersonal (i.e., when one is motivated to avoid something) motivation. You can learn more about the instrument we used here. Last spring, those same students took the same survey as a part of the senior survey, allowing us to test how their responses changed over four years. In addition, we will be able to use their responses to the senior survey questions to explore which experiences might statistically predict change on any of these three motivational orientations.

The consensus understanding of how motivational orientations change suggests that as people age, they develop a stronger orientation toward intrinsic motivation and weaker orientations toward both extrinsic and impersonal motivation. These findings seem to match up with what we know about the maturation process, as well as other research findings that suggest how people’s values shift over time. With these prior findings in mind, we tested our freshman and senior year data, hypothesizing that our students’ orientation toward intrinsic motivation would go up and their orientations toward extrinsic and impersonal motivation would go down.

Well, we were partially right.  We had complete data from 397 students and only included those cases in the analysis presented below. The range for each orientation scale is 1-5. The three asterisks (***) indicate that the change between freshman year and senior year is statistically significant (for the stats junkies, that p-value is <.001).

                                           Minimum   Maximum   Mean     Std. Deviation
Freshman year – Intrinsic Orientation      2.88      5.00      4.1243   .37228
Senior year – Intrinsic Orientation        1.00      5.00      4.0783   .51475
Freshman year – Extrinsic Orientation      1.94      4.24      3.1235   .38384
Senior year – Extrinsic Orientation ***    1.00      4.06      2.9623   .46230
Freshman year – Impersonal Orientation     1.69      4.00      2.8638   .40125
Senior year – Impersonal Orientation ***   1.29      4.12      2.7108   .50168
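For those wondering how the significance marks are determined: because the same 397 students took the survey twice, the natural test is a paired comparison of each student’s freshman and senior scores. Below is a minimal sketch of that calculation in plain Python; the four-student score lists are made-up toy data, not our survey responses, and our actual analysis may well have used different software.

```python
import math

def paired_t(freshman, senior):
    """Paired t-test on matched freshman/senior scores.

    Returns (mean_change, t_statistic, degrees_of_freedom). A negative t
    means the senior mean is lower than the freshman mean; the p-value then
    comes from the t distribution with n - 1 degrees of freedom.
    """
    diffs = [s - f for f, s in zip(freshman, senior)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)  # divide by SE of the mean difference
    return mean_d, t, n - 1

# Toy data: four students' extrinsic-orientation scores (1-5 scale)
freshman = [3.0, 4.0, 5.0, 4.0]
senior = [2.5, 3.0, 4.5, 3.0]
mean_d, t, df = paired_t(freshman, senior)
print(mean_d, round(t, 2), df)  # -0.75 -5.2 3
```

With 397 matched cases the degrees of freedom are 396, which is how a change as small as the extrinsic and impersonal drops above can still clear the p < .001 bar while the intrinsic change does not.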

Our data suggests an interesting, and potentially troubling, possibility.  Although orientations toward both extrinsic and impersonal motivation dropped over four years, the orientation toward intrinsic motivation did not change significantly. This doesn’t reflect what we hypothesized or what prior research findings would have predicted. Furthermore, the notion that our students’ orientation toward intrinsic motivation hasn’t changed doesn’t match well with our goal of developing a more robust sense of intellectual curiosity.

There are numerous ways to explain this finding as an anomaly. Maybe our students’ relatively high scores on the intrinsic motivation scale as freshmen made it harder for them to score much higher. But that doesn’t seem to comport with many faculty opinions on campus regarding an absence of intrinsic motivation in most students. Maybe the 2011 cohort of students was just an unusual group, and changes in other cohorts would parallel other research findings. Yet our analysis of Augustana’s Wabash National Study data from our 2008 cohort revealed an even more troubling pattern, in which markers of intrinsic motivation dropped precipitously between the freshman and senior year. Or maybe the measurement instrument we used doesn’t really capture the construct we are trying to measure. However, this is an instrument that seems to have been validated repeatedly by a variety of researchers to reasonably capture these three aspects of motivation.

Cultivating intrinsic motivation is certainly not an easy thing. But if one of our core goals as a liberal arts college is developing young people who possess a more substantial orientation toward intrinsic motivation at the end of their senior year than they had at the beginning of their freshman year, then it seems to me that this finding should give us pause. In future posts I’ll share the experiences that we find statistically predict an increase in intrinsic motivational orientation.  If you can think of something that we should test, by all means shoot me an email and we’ll see what happens!

Make it a good day,