What’s the Problem We’re Trying to Address?

If you’ve had to sit through more than one meeting with me, you’ve almost certainly heard me ask this question. Even though I can see how the question might sound rhetorical and maybe even a little snarky, I’m really just trying to help. Because I know from my own experience how easy it is to get lost in the weeds when trying to tackle a complex issue that is full of dicey trade-offs and unknown unknowns. So sometimes I’ve found that it can be useful to pause, take a couple of deep breaths and refocus on the problem at the core of the conversation.

By now you’ve almost certainly heard about the discussion about transitioning from an academic calendar based on trimesters to one based on semesters. Last week, Faculty Council provided a draft proposal to the faculty to be discussed, vetted, and even adjusted as legitimate concerns are identified by the community. Since I’ve already seen a calendar discussion sap us of most of our energy twice (or once if you count the two-year discussion a few years back as a single event), I hope that this time we can find a way to get through this without quite so much emotional fallout.

With that in mind, after listening to the calendar conversation for the last few months I thought it might be helpful to revisit the question at the top of this post:

What’s the problem we’re trying to address?

It is true, in one very real sense, that there is not a single answer. In fact the “problem” looks different depending upon where you sit. But since the topic of semesters was formally put back onto the front burner by the senior administration and the Board of Trustees, it’s probably useful to understand the problem as they see it. From their perspective, the problem we are facing is actually a pretty straightforward one. In a nutshell we, like a lot of colleges and universities these days, have a balance sheet problem. In other words, we are having an increasingly difficult time ensuring that our revenues keep pace with our expenses (or put differently, that our expenses don’t outpace our revenues).

The reasons for this problem have been presented countless times, so I’ll try not to dive down that rabbit-hole too far again. But suffice it to say that since American family incomes have been stagnant for a long time, each year that our costs go up we lose a few more prospective families that might otherwise be willing to pay what we charge. Combine that with a shrinking population of high school graduates in the Midwest overall, and you can imagine how it gets harder and harder to come up with the increased revenue necessary to pay for inescapable increases in expenses like electricity, gas, and water, not to mention reasonable salary raises, building and sidewalk repairs, and replacements of worn out equipment.

The possible solutions to a straightforward balance sheet problem like ours are also relatively straightforward. If we decide to think of it primarily as insufficient revenue, then we would likely choose a way to increase revenue (e.g., enroll more students, add graduate programs, start online programs . . . each of the examples in this category is perceived by many as a potential threat to our philosophical core). If we decide to think of this problem primarily as excessive expenses, then we would likely choose a way to reduce expenses (e.g., make the college demonstrably smaller, eliminate Augie Choice . . . the only examples in this category that I can think of are pretty depressing). If we don’t see plausible options to increase revenues or reduce expenses, then the only other possibility is to find ways to become more efficient (i.e., achieve similar results from smaller expenditures). Of course, we could concoct some combination of all three approaches.

From the administration’s perspective, the possibility of moving to a semester-based academic calendar addresses the balance sheet problem by giving the college access to an expanded set of opportunities for increased efficiency (i.e., achieving similar results from smaller expenditures). Some of those efficiencies are more self-evident, such as reducing the number of times we power up and power down specific buildings. Some of them are more abstract, such as reducing the number of times we conduct a large-scale process like registration. But the central problem that the semester idea attempts to address is an issue of imbalance between revenues and expenses.

Although some have suggested otherwise, the semester idea is not primarily intended to improve retention rates or increase the number of mid-year transfer students. It is possible that a semester calendar might be more conducive to retaining students who struggle initially or attracting transfer students just after the Christmas break. But there are plenty of similar institutions on semester calendars with lower retention rates and fewer transfer students. Of course, that doesn’t disprove anything either; it just demonstrates that a move to semesters doesn’t guarantee anything. Increases in retention and mid-year transfers will happen (if they happen at all) as a result of what we do within a new calendar, not because we move to a new calendar.

I truly don’t have a strong opinion on the question of calendar. Both trimesters and semesters can be done well and can be done badly. This is why Faculty Council and others have thought long and hard about how to construct a semester system that maintains our commitment to an integrated liberal arts education and delivers it in a way that allows faculty to do it well. Nonetheless, I think it is useful to remind ourselves why we are having this conversation and the nature of the problem we are trying to address. If you think that we should address our balance sheet issues by expanding revenue sources or by reducing expenses, then by all means say so. If you don’t think a balance sheet problem exists, then by all means say so. But let’s make sure we understand the nature of the problem we are trying to address. At the least, this will help us have a more transparent conversation that leaves us in a healthier place at the end, no matter what we decide to do.

And one more thing. Let’s not equate “increasing efficiency” with “doing more with less.” Increasing efficiency is doing differently with the same resources in a way that is more effective. If we are in fact continually doing more with less, in the long term we’re doing it wrong.

Make it a good day,



Improving Advising in the Major: Biology Drives our Overall Increase

Last week I shared a comparison of the overall major advising data from seniors in 2014 and 2015. Although not all of the differences between the two years of data met the threshold for statistical significance, taken together it seemed pretty likely that these improved numbers weren’t just a function of chance. As you might expect by now, another aspect of this finding piqued my curiosity. Is this change a result of a relatively small campus-wide improvement or are the increases in the overall numbers a result of a particular department’s efforts to improve?

Since the distribution of our seniors’ major choices leans heavily toward a few departments (about half of our students major in Biology, Business, Psychology, or Education), it didn’t take too long to isolate the source of our jump in major advising scores. Advising scores in Business, Psychology, and Education didn’t change much between 2014 and 2015. But in Biology? Something pretty impressive happened.

Below is a comparison of the increases on each advising question overall and the increases on each advising question for Biology and Pre-Med majors.  In particular, notice the column marked “Diff.”

Senior Survey Question                                                      Overall              Biology/Pre-Med
                                                                         2014  2015  Diff     2014  2015  Diff
Cared about my development                                               4.11  4.22  +.11     3.70  4.02  +.32
Helped me select courses                                                 3.93  4.05  +.12     3.49  3.90  +.41
Asked about career goals                                                 3.62  3.73  +.11     3.39  3.81  +.42
Connected with campus resources                                          3.35  3.47  +.12     3.11  3.36  +.25
Asked me to think about links btwn curr., co-curr., and post-grad plans  3.41  3.57  +.16     3.04  3.48  +.44
Helped make the most of college                                          3.85  3.97  +.12     3.36  3.80  +.44
How often you talked to your adviser                                     3.62  3.51  -.11     3.09  3.27  +.18

It’s pretty hard to miss the size of the increased scores for Biology and Pre-Med majors between 2014 and 2015. In every case, these increased scores are three or four times larger than the increases in overall scores.  In a word: Impressive!

So what happened?

Advising is a longstanding challenge for Biology and Pre-Med faculty. For decades this department has struggled to adequately advise a seemingly endless flow of majors. Last spring, Biology and Pre-Med graduated almost 150 students and at the beginning of the 2014-15 academic year there were 373 declared majors in either program. Moreover, that number probably underestimates the actual number of majors they have to work with since many students declare their major after the 10th day of the term (when this data snapshot was archived).

Yet the faculty in the Biology and Pre-Med department decided to tackle this challenge anyway. Despite the overwhelming numbers, maybe there was a way to get a little bit better by making even more of the limited time each adviser spent with each student. Each faculty adviser examined senior survey data from their own advisees and picked their own point of emphasis for the next year. Several of the Biology and Pre-Med faculty shared with me the kinds of things that they identified for themselves. Without fail, each faculty member decided to make sure that they talked about CORE in every meeting, be it the resources available in CORE for post-graduate preparation or just the value of making a visit to the CORE office and establishing a relationship. Several others talked about making sure that they pressed their advisees to describe the connections between the classes they were taking and the co-curricular activities in which they were involved, pushing their students to be intentional with everything they chose to do in college. Finally, more than one person noted that even though advising had always been important to them, they realized how easy it was to let one or more of the usual faculty stresses color their mood during advising meetings (e.g., succumbing to the stress of an upcoming meeting or a prior conversation). They found ways to get themselves into a frame of mind that improved the quality of their interaction with students.

None of these changes seem all that significant by themselves. Yet together, it appears that the collective effort of the Biology and Pre-Med faculty, even in the face of a continued heavy stream of students, made a powerful difference in the way students rated their advising experience in the major.

Improvement isn’t as daunting as it might sometimes seem. In many cases, it just takes an emphasis on identifying small changes and implementing them relentlessly. So three cheers for Biology and Pre-Med. You’ve demonstrated that even under pretty tough circumstances, we can improve something by focusing on it and making it happen.

Make it a good day,


We’ve gotten better at advising, and we can (almost) prove it!

With all of the focus on reaccreditation, budget concerns, employee engagement, and the consideration of a different academic calendar, it seems like we’ve spent a lot of time dwelling on things that aren’t going well or aren’t quite good enough. However, in the midst of these conversations I think we might do ourselves some good to remember that we are in the midst of doing some things very well. So before we plunge ourselves into another brooding conversation about calendar, workload disparity, or budget issues, I thought we could all use a step back from the precipice and a solid pat on the back.

You’d have to have been trapped under something thick and heavy to have missed all of the talk in recent years about the need to improve advising. We’ve added positions, increased the depth and breadth of training, and aspired to adopt an almost idyllic conception of deeply holistic advising. This has stretched many of us outside of our comfort zones and required that we apply a much more intentional framework to something that “in the old days” was supposed to be a relaxing and more open-ended conversation between scholar and student.

With this in mind, I thought it might be fun to start the spring term by sharing a comparison of 2014 and 2015 senior survey advising data.

Our senior survey asks seven questions about major advising. These questions are embedded into a section focused on our seniors’ experience in their major so that we can be sure that students’ responses refer to their advising experience in each of their majors (especially since so many students have more than one major and therefore more than one major adviser). The first six questions focus on aspects of an intentional and developmental advising experience. The last question provides us with a way to put those efforts into the nitty gritty context of efficiency. In an ideal world, our student responses would show a trend toward higher scores on the first six questions, while average scores for the seventh question would remain relatively flat or even decline somewhat.

Here is a list of the senior survey advising questions and the corresponding response options.

  • My major adviser genuinely seemed to care about my development as a whole person. (strongly disagree, disagree, neutral, agree, strongly agree)
  • My major adviser helped me select courses that best met my educational and personal goals. (strongly disagree, disagree, neutral, agree, strongly agree)
  • How often did your major adviser ask you about your career goals? (never, rarely, sometimes, often, very often)
  • My major adviser connected me with other campus resources and opportunities (OSL, CORE, the Counseling Center, etc.) that helped me succeed in college. (strongly disagree, disagree, neutral, agree, strongly agree)
  • How often did your major adviser ask you to think about the connections between your academic plans, co-curricular activities, and your career or post-graduate plans? (never, rarely, sometimes, often, very often)
  • My major adviser helped me plan to make the most of my college career. (strongly disagree, disagree, neutral, agree, strongly agree)
  • About how often did you talk with your primary major adviser? (never, less than once per term, 1-2 times per term, 2-3 times per term, we communicated regularly through each term)

A comparison of overall numbers from seniors graduating in 2014 and 2015 seems to suggest a reason for optimism.

Senior Survey Question                                                                        2014  2015
Genuinely cared about my development                                                          4.11  4.22
Helped me select the right courses                                                            3.93  4.05
How often asked about career goals                                                            3.62  3.73
Connected me with campus resources                                                            3.35  3.47
How often asked to think about links between curricular, co-curricular, and post-grad plans   3.41  3.57
Helped me make the most of my college career                                                  3.85  3.97
How often did you and your adviser talk                                                       3.62  3.51

As you can tell, the change between 2014 and 2015 on each of these items aligns with what we would hope to see. We appear to be improving the quality of the student advising experience without taking more time to do so. Certainly this doesn’t mean that every single case reflects this overall picture, but taken together this data seems to suggest that our efforts to improve are working.

I suspect that more than a few of you are wondering whether or not these changes are statistically significant. Without throwing another table of data at you, here is what I found. The change in “how often advisers asked students to think about the links between curricular, co-curricular, and post-grad plans” (.16) solidly crossed the threshold of statistical significance. The change in “genuinely cared about my development” (.11) was not statistically significant. The change in each of the other five items (.11 to .12 in absolute value) turned out to be “marginally significant,” meaning, in essence, that the difference between the two average scores is worth noting even if it doesn’t meet the gold standard for statistical significance.
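For readers curious about the mechanics, comparisons of mean survey scores across two graduating classes are typically run as independent-samples t-tests. Below is a minimal pure-Python sketch of Welch’s t statistic; the response data here is entirely made up for illustration, not the actual senior survey dataset or test procedure.

```python
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic for comparing the means of two independent samples."""
    mean_a, mean_b = statistics.fmean(sample_a), statistics.fmean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    n_a, n_b = len(sample_a), len(sample_b)
    return (mean_b - mean_a) / ((var_a / n_a + var_b / n_b) ** 0.5)

# Hypothetical 1-5 Likert responses for one advising item in each class year.
class_2014 = [4, 4, 3, 5, 4, 3, 4, 5, 3, 4] * 40   # mean = 3.9
class_2015 = [4, 5, 4, 5, 4, 3, 4, 5, 4, 4] * 40   # mean = 4.2
t = welch_t(class_2014, class_2015)
# With samples this large, |t| greater than about 1.96 corresponds to p < .05.
print(round(t, 2))
```

With several hundred respondents per class, even a difference of a tenth of a point or so can clear (or just miss) the p < .05 threshold, which is why some of the changes above land in “marginally significant” territory.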

The reason I would argue that these changes are worth noting emerges when we look at all of them together. The probability that all seven of these items would move in our intended direction purely by chance is less than 1% (.0078, to be exact). In other words, it’s likely that something is going on that would push all of these items in the directions we had hoped. Given the scope of our advising emphasis recently, these findings seem to me to suggest that we are indeed on the right track.
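The “less than 1%” figure is a simple sign test: if each of the seven items were equally likely to drift up or down by chance, the probability that all seven land in the hoped-for direction is (1/2)^7. A quick check:

```python
from math import comb

# If each of 7 items is an independent 50/50 coin flip, the chance that
# all 7 move in the hoped-for direction is (1/2)^7.
n_items = 7
p_all = 0.5 ** n_items
print(round(p_all, 4))  # 0.0078

# Equivalently, via the binomial distribution: P(exactly 7 successes in 7 trials).
p_binom = comb(n_items, n_items) * (0.5 ** n_items)
assert p_binom == p_all
```

This is the logic behind treating a consistent pattern across many small, individually marginal changes as meaningful.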

I know that there are plenty of reasons to pull the “correlation doesn’t equal causation” handbrake. But I’m not arguing that this data is inescapable proof. Rather, I’m arguing that these findings make a pretty strong case for the possibility that our efforts are producing results.

So before we get ourselves tied into knots about hard questions and tough choices over the next 10 weeks, maybe take a moment to remember that we can tackle issues that might at first seem overwhelming. It might not be easy, but where is the fun in “easy”?

Make it a good day,


Instructor behaviors can impact students’ ability to balance

One well-known feature of our trimester calendar is the crazy crush of busyness during the last few weeks of the term. You can see it in the eyes of almost every student as they trudge up and down the quad (and if you look closely you can see it in many instructors’ eyes too). Although it’s not fun, I’m not sure that experiencing this kind of crunch of deadlines is such a bad thing. After all, our students will more than likely be required to successfully juggle multiple time-sensitive pressures often after they graduate. So the issue may not be whether or how we might dial back the seemingly inevitable onslaught of curricular deadlines. Instead, our greatest contribution to our students might be to:

  1. help them learn how to successfully navigate these experiences, and
  2. ensure that our instructional interactions allow them to focus their energies on increasingly efficient navigating strategies instead of adding extraneous noise, unnecessarily vague direction, or downright confusion into the mix.

Last week, I shared findings that explored the things students can do to help them develop better time-management skills. This week I thought it might be useful to focus on the potential impact of instructional behaviors on this equation. Just like last week, these findings are the result of teamwork in the IR office. Katrina, one of my three student workers, and I worked together to develop the analysis and run the statistical tests; I’m responsible for writing up what we found.

In our mid-year freshman survey (data collected during the last third of fall term), we ask first-year students to tell us how often they struggle to balance their academics with their extra-curricular activities. They can choose from five response options that range from “most or all of the time” to “never.” Most of the students select “sometimes,” the middle response (i.e., a three), while the second most popular response selected is “rarely” (AKA a four).

Like last week’s analysis, we wanted to hold constant some of the typical potentially confounding variables – demographic traits and pre-college academic preparation – so that we could home in on instructional behaviors that appear influential for all types of students. (This turned out to be particularly important for reasons that I’ll discuss later.) After a series of preliminary tests, we added three items to our analysis to see if they produced statistically significant results.

  • I had access to my grades or other feedback early enough in the term to adjust my study habits or seek additional academic help as necessary.
  • My LSFY/Honors instructor helped me develop at least one specific way to be a more successful college student.
  • My first year adviser helped me understand my Student Readiness Survey (SRS) results.

The first two items offer five response options that range from “strongly disagree” to “strongly agree.” The question about the SRS includes a different set of four response options that seem more appropriate to the question: “We never talked about them (What is the SRS?),” “Only briefly,” “Yes, but they weren’t all that useful,” and “Yes, and they influenced how I approached the beginning of my freshman year.” We hypothesized that all three of these items might have a positive effect on students’ frequency of struggling to balance academic and extra-curricular activities.

Our analysis confirmed our hypotheses in two out of the three cases (last week we were right once, wrong once, and neither once, so maybe we’ve improved our prognosticating skills a little bit since last week!). Even though one of the elements of the Student Readiness Survey focuses on academic habits, the first year adviser’s use of the Student Readiness Survey didn’t have any effect one way or the other. Maybe it was a bit aspirational to think that one conversation could impact something as potentially continuous (or, more likely, intermittent) as struggling to balance academic and co-curricular activities.

However, both early access to grades or other feedback and an LSFY/Honors instructor who helped the respondent develop at least one specific way to be a better student produced statistically significant positive effects. In other words, as students agreed more strongly that they had received access to information that helped them calibrate the effectiveness of their study habits they struggled less often to balance academics and co-curricular activities. Similarly, as students agreed more strongly that their LSFY/Honors instructor helped them become a better college student, the students indicated that they struggled less often in balancing academics and co-curricular activities. In both cases, these findings held even after accounting for differences in race/ethnicity, gender, socio-economic status, and pre-college academic preparation (i.e., ACT score).
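For the statistically curious, “accounting for differences” means including the demographic and preparation measures as covariates in a multiple regression alongside the instructional-behavior items. The sketch below uses entirely simulated data and invented variable names (it is not the actual survey dataset or analysis code); it simply illustrates how ordinary least squares recovers a predictor’s effect while a control variable sits in the model.

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. Educational sketch only."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    # Forward elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0] * k
    for r in reversed(range(k)):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Simulated data: outcome built from an intercept, an ACT-like control,
# and an early-feedback rating with a known true effect of 0.4.
random.seed(1)
rows, ys = [], []
for _ in range(500):
    act = random.uniform(20, 32)          # hypothetical control variable
    feedback = random.uniform(1, 5)       # hypothetical predictor of interest
    y = 1.0 + 0.05 * act + 0.4 * feedback + random.gauss(0, 0.1)
    rows.append([1.0, act, feedback])
    ys.append(y)
coef = ols(rows, ys)
# With ACT held constant in the model, the feedback coefficient recovers ~0.4.
print(round(coef[2], 1))
```

The same logic underlies the analysis described above: the coefficient on each instructional item is estimated net of the demographic and preparation covariates, which is what lets us say a finding “held” after accounting for those differences.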

Thus, it appears that instructional behaviors can play an important role in helping students develop the ability to effectively balance their curricular and co-curricular obligations. Instructors can certainly design their courses to provide substantive feedback early in the term instead of waiting until a mid-term exam or late term paper assignment. Likewise, LSFY/Honors instructors can insert experiences or assignments into their courses that, instead of merely focusing on the production of intellectual work, also focus on the process of more efficiently and effectively producing intellectual work.

Now, remember my foreshadowing about the importance of demographic traits in this analysis? Even in the final equation, being female or being white still produced statistically significant positive effects. In other words, even after accounting for all of the other variables in our equation, white students and female students struggled to balance less often than their non-white and male peers.

It makes intuitive sense that gender would appear significant in this equation. We know from all sorts of research on men that their time-management skills and general maturity usually lag behind those of women at this age. This finding highlights the additional effort that we need to make to help male students grow. Finally, the finding about race struck me as particularly important. In the context of the other research we’ve published about lower sense of belonging scores for non-white students, maybe the findings in the current study reflect another way in which the collateral damage that comes from feeling less like you belong on our campus hinders the success of minority students. We have all experienced the negative effects of disturbing distractions that hinder productivity. Maybe the persistent negative effect of being non-white is a function of just such a destructive distraction and is another confirmation of the corrosive effect of stereotype threat.

We all want our students to be active and involved on campus. We also want them to develop and maintain a healthy balance between their academic and co-curricular obligations. It seems to me that historically we have put the onus for maintaining this balance on our students, providing all sorts of guidance on time management, priority setting, and life organization. Maybe, just maybe, we could also contribute to their success by focusing more on what we do to set them up for success. And I don’t mean dialing back our expectations of them. Instead, we might add to their potential for growth if we help them obtain and use the right information at the right time in order to continually (re)calibrate their balancing efforts. In the midst of the seemingly accelerated trimester calendar, this might be even more important than usual.

Good luck with week 10, everyone. For all of you who by now have noted that I’ve pointed out the increased busyness that occurs at the end of the term and followed that with an unusually long blog post that has just soaked up that much more of your time . . . I’m truly sorry.  Ok, well kind of sorry.

Make it a good day,


Another reason to study during the day . . .

A couple of months ago, I shared our findings from last year’s freshman data about the impact of study behaviors on GPA.  In essence, we found that studying in one’s dorm room appears to reduce GPA while studying during the day appears to increase it. One of my student workers, Katrina Friedrich, and I have been digging further into this set of data to identify significant predictors of another indicator of academic success: time management. It turns out that even after accounting for several other important student characteristics, studying during the day (again) makes a difference.

At the end of the first year, we ask freshmen to respond to the statement, “During the year I got better at balancing my academics with my out-of-class activities.” Students can choose from a set of five “strongly agree” to “strongly disagree” response options. Since we also ask these students to evaluate their own academic habits in the Student Readiness Survey (a survey they take before they enroll at Augustana), as well as asking them near the end of the first term how often they think they struggle to balance academics and extra-curricular activities, we have a pretty good way of holding constant their pre-college and first-term assessment so that we can home in on their sense of improvement over the course of the first year.

In addition to accounting for where our students might be on a spectrum of time management prior to college, our analysis also needs to account for variations in unscheduled time during the first year. Balancing curricular and co-curricular obligations is certainly impacted by the activities one chooses to join. For example, the pressure to balance multiple responsibilities would likely be higher for student-athletes because they have less discretionary time. The same is likely true for students who participate in music ensembles, for those who have committed themselves to be highly involved in a student organization or club, or even those students who work.

With both of these factors (early assessments of time management and potentially confounding first-year experiences) held in check, we tested three study behaviors to see if, and how, they might influence students’ sense of their own improvement in balancing curricular and co-curricular obligations:

  • Using a planner
  • Studying in one’s dorm room
  • Studying during the day

For those of you scoring at home, studying during the day was the clear winner. The more students said they studied during the day, the higher they rated their improvement in balancing academics and co-curricular activities – regardless of their other obligations or pre-college time management skills. Although studying in the dorm negatively affected GPA in our prior study, it didn’t have any effect one way or the other this time around. I suppose that it’s good to know that studying in the dorm didn’t hurt time management, but that isn’t exactly a ringing endorsement for studying in one’s dorm room, either.

The finding regarding the use of a planner was perplexing. Although the effect barely missed the threshold for statistical significance (for all of you stats nerds, the p-value equaled .053 when it needed to be less than .05), the direction of the effect was negative. In other words, students’ increased use of a planner appears to inhibit their sense of improvement in balancing academics and co-curricular activities.

With my apologies to the stats gods, for the purposes of this discussion let’s round that .053 to a .05 and say that our finding was statistically significant (after all, a p-value of .053 means that if chance alone were at work, we’d see a difference this large only about 5.3% of the time). Why might the increased use of a planner reduce one’s sense of improvement in balancing academics and co-curricular activities? Katrina suggests that many students don’t use their planner for anything beyond keeping a list of what they have to do, as opposed to allotting differing amounts of time to complete specific tasks, take care of various chores, etc. Or maybe students, particularly freshmen in this case, don’t really have the ability to estimate how long most academic tasks take. Some of you might have insights to share here, since I think we all probably tell students to use a planner, assuming that they will know how to use it.

In the end, we again found that making it a priority to study during the day is beneficial to student success, this time in terms of the development of a key skill: time management. In conjunction with the previous positive effects we’ve found that result from studying during the day, it seems that this should be one of the things we emphatically encourage our students to do.

But I’m curious to hear what you might think of the findings involving the use of a planner. It could be something messy in the data or the result of a variable that we haven’t accounted for yet. Even though it didn’t quite meet the significance threshold, the direction of the effect makes me wonder if this is another thing that we tell students to do while not realizing that they need much more information on the “how” and “why” in order to gain the benefit that we assume will come from it. Curious, indeed.

Make it a good day,


No matter how bad you have it . . .

Hi Everyone,

I suspect some of you are still scarred from my many requests for syllabuses in preparation for the HLC visit. I won’t soon forget the sheer size of that project.

Yet I am outdone once again by a news report of a syllabus collecting project that makes my little effort seem like a game of pick-up sticks.

Thanks to Cyrus Zargar for sharing this with me – even if I am a little concerned by how quickly my name came to mind!

“What a Million Syllabuses Can Teach Us”


Make it a good day,



Making the Most of Symposium Day

There is a lot going on this week, so I hope I can make this brief! (I know, I know . . .)

Last week I shared the second round of results from our study of Augustana students’ changing (or not changing) motivational orientations. The first layer of results was a mixed bag . . . our students’ inclination toward extrinsic motivation decreased, but their inclination toward intrinsic motivation after four years remained unchanged. The second layer of results found that, perhaps surprisingly to some, none of our measures of curricular experiences impacted our students’ inclination toward intrinsic motivation. Instead, the primary driver of an increased intrinsic motivation orientation was our students’ perception of how much their out-of-class experiences had impacted their understanding of themselves.

But it was the other finding highlighted in my last post that I thought was particularly appropriate for the upcoming week. In addition to the impact of students' out-of-class experiences, we found that the degree to which our students said that "Symposium Day activities influenced the way I now think about real world issues" significantly predicted (in a statistical sense) an increase in our students' inclination toward intrinsic motivation.  In responding to my post one reader raised a good question, asking if we could discern whether these students' engagement in Symposium Day was a reflection of their own dispositions or if this engagement in Symposium Day was a function of faculty and staff educational designs.  In essence, this question goes to the heart of our effort to tease out relationships in our data that we can rely on to design influential change.  And it gets at a pretty important issue for higher education folks: Can our educational efforts alter the course of our students' learning or is their learning a function of the dispositions and abilities that they bring with them to college?

Of course, the question I just posed is really a false dichotomy.  It would be pretty hard to believe that any college's educational efforts could completely overrun the dispositions that our students bring with them to college.  But we can test whether the things that we do light a spark that wouldn't otherwise appear, functioning as an accelerator for those who are already heading in the right direction or as a pair of jumper cables for those whose engine just won't seem to turn over.  Automotive metaphors aside, we can examine our data to find out whether integrating learning experiences is actually improving our students' growth.

So do we have any way to identify predictors of our students' response to the statement "Symposium Day activities influenced the way I now think about real world issues"?  In fact, we do.  In our freshman year survey, we also ask our first-year students to respond (on a five-point scale ranging from strongly disagree to strongly agree) to the statement, "My instructors integrated themes or ideas from Symposium Day into their courses."  Although this data comes from a different set of students than the senior survey data, I think it's reasonable to suggest that whatever we find in our first-year data is applicable to all Augustana students.

It turns out that the degree to which first-year students say that their instructors integrated Symposium Day into their courses is a powerful predictor of the degree to which students say that Symposium Day activities influenced the way that they now think about real world issues, even after accounting for these students’ pre-college dispositions and traits.  While I wasn’t too surprised that this statistical relationship would appear, I was impressed by the size of the effect.

To put it simply, it seems pretty likely that when instructors integrate Symposium Day into their courses students come away from Symposium Day genuinely influenced by what they’ve heard.  And based on the analysis I shared last week, it may also be true that integrating courses with Symposium Day indirectly impacts our students’ development of a stronger inclination toward intrinsic motivation.  Maybe some students arrive at a presentation only because they have been required to attend. But by the time they leave, it appears that the chances are good that they will come away with something that sticks.

Well, I’ll be … who’d a thunk it!  Wednesday of this week just happens to be Symposium Day.  It isn’t too late to find a way to link your course to something happening on that day. Integrating learning experiences makes a difference.  More and more, we have evidence to show it.

Make it a good day (today and Wednesday),



Increasing Students’ Inclination toward Intrinsic Motivation

Welcome to 2016!  It’s great to see all of you scurrying around campus.

Earlier this fall, I shared the first set of findings from our four-year assessment of Augustana students’ motivational orientations. As you might remember, this is the first set of results from our institutional outcomes assessment protocol in which we rotate through all of the college-wide student learning outcomes so that each year we have a new set of freshman-to-senior results for a different learning outcome. Moreover, because of the student experience data we collect over four years, we also have ways to identify experiences that appear to influence student gains (or losses) on those outcomes.

Intrinsic motivation is a key element of our students' intrapersonal development. We talk about this attribute most specifically when we refer to the importance of a liberal arts education cultivating our students as life-long learners. Interestingly, although we aspire to strengthen our students' orientation toward intrinsic motivation, the results of our four-year assessment revealed that the change between the average freshman score and the average senior score was not statistically significant. In other words, our students' orientation toward intrinsic motivation didn't change – even though other research on motivational orientations suggests that individuals become more inclined toward intrinsic motivation as they get older.
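For those who like to see the machinery behind a claim like "the change was not statistically significant," here is a minimal sketch of the kind of comparison involved: a two-sample t statistic on average freshman versus senior scores. The scores below are hypothetical numbers invented purely for illustration; they are not actual Augustana survey data.

```python
# Hypothetical 5-point intrinsic-motivation scores -- illustration only,
# NOT actual Augustana survey data.
import math
import statistics

freshmen = [3.8, 4.1, 3.5, 4.0, 3.9, 3.6, 4.2, 3.7]
seniors = [3.9, 4.0, 3.7, 4.1, 3.8, 3.5, 4.3, 3.8]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se = math.sqrt(statistics.variance(a) / len(a)
                   + statistics.variance(b) / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

t = welch_t(freshmen, seniors)
# With |t| well under ~2, a freshman-senior difference this small would
# not be statistically significant at the conventional .05 level.
```

A real analysis would of course compute the exact p-value (and, for matched freshman-senior pairs, would use a paired rather than independent test), but the logic is the same: a difference in averages only "counts" when it is large relative to the variation in the scores.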

Even though the average scores didn’t move like we would have hoped, there were certainly students who showed statistically significant gains in their orientation toward intrinsic motivation. So what makes those students different from the rest? And more importantly, are there lessons that we can learn from those students’ experience that we could apply more broadly?

To find answers to those questions, we designed an analysis that would test the impact of a variety of Augustana experiences. We tested the impact of curricular experiences, advising experiences, pre-college demographics and values, and co-curricular experiences. Sure enough, students who showed significant positive growth in their inclination toward intrinsic motivation also had two experiences in common, experiences that we ought to consider cultivating more broadly as we continue to improve the quality of the Augustana experience.

First, the nature of the students’ co-curricular experiences produced a robust and statistically significant positive effect on the inclination toward intrinsic motivation. Specifically, as students more strongly agreed with the statement, “My out-of-class experiences have helped me develop a deeper understanding of myself,” they exhibited stronger gains in their inclination toward intrinsic motivation. Importantly, our findings suggest that mere participation in co-curricular experiences wasn’t enough. Instead, the effect came from the students’ perceived impact of those co-curricular experiences.

Second, students' engagement in Symposium Day also produced a statistically significant, albeit smaller, effect. As students more strongly agreed with the statement, "Symposium Day activities influenced the way I now think about real world issues," they made larger gains in their inclination toward intrinsic motivation. This is an item where the average score has increased each year since Symposium Day was introduced, even though senior survey results indicate that there is still substantial room for improvement (42.4% of 2015 seniors disagreed or strongly disagreed with this statement).

Somewhat surprisingly, neither the nature of the students' classroom experiences nor their advising experiences generated any effect on an orientation toward intrinsic motivation. Likewise, we accounted for sex, race, and socioeconomic status in our analysis and none of those variables produced a statistically significant effect.

So what might we make of these findings? First (as if we needed more evidence at this point), out-of-class experiences matter, a lot. And it’s not about quantity; it’s about quality. This is exactly the philosophy that undergirds the entire integration emphasis in the Augustana 2020 strategic plan. Students need to engage in experiences that help them grow in important ways. That kind of development doesn’t happen automatically. And every experience doesn’t necessarily produce the same type of growth, or any growth at all. This finding seems to re-emphasize the value of designing co-curricular experiences so that key teachable moments are most likely to occur, then prodding students to reflect on those moments with an eye toward how their own responses might teach them something about themselves.

Second, we've begun to notice some anecdotal suggestions of the educational value of Symposium Day, and this finding presents further evidence that Symposium Day can, and does, impact the growth of our students in important ways. One key take-away from this study reiterates that participation is necessary but not sufficient. The impact of Symposium Day in this study appears to come from the degree to which students felt that the experience shaped the way they think about real world events. In other words, the value of Symposium Day lies in the applicability of the learning and in the way the experience can inspire students to reflect on their perceptions of real world events. This suggests that all of the ways in which faculty and staff link their curricular or co-curricular work with students to elements of Symposium Day may well be producing more than a deeper understanding of content knowledge.

As we build toward the next Symposium Day on January 20th, I hope you will find more ways to connect your work with students to the events and presentations scheduled for that day. And as we continue to reassess and redesign our students’ out-of-class experiences to maximize their educational and developmental benefit, I hope you will look for ways to link these experiences to our students’ understanding of themselves.

Make it a good (albeit cold) day,


Micro-Retention: Do fall-to-winter term rates tell us anything?

Trying to identify the critical factors that influence our students’ decisions to persist or withdraw is a tricky business. In addition to tracking our overall fall-to-fall retention rates for first year students (the only retention number that is widely reported), we track the fall-to-fall retention rates for each of the other cohorts (even 5th year seniors). Furthermore, we break those cohort retention rates down by a variety of demographic categories (e.g., race/ethnicity, gender, socioeconomic status, incoming academic preparation, and first-generation status).

But tracking the fall-to-fall retention rates only tells part of the story. The decision to persist or withdraw isn't a simple or momentary decision, and research clearly indicates that the major decision to stay or leave is preceded by a multitude of minor decisions that combine to pull the student toward (or push the student away from) the brink of this ultimate choice. So if we want to more fully understand the nature of this series of decisions, another way to look at it is to examine the term-to-term retention rates. Although this approach is still based on evidence of the ultimate choice to leave Augustana, it might allow us to better understand something about the factors that influence students to leave after the fall term, winter term, and spring term (since we ask students who leave why they are leaving in an exit interview), thereby giving us the opportunity to see if there are differences in the reasons students give for leaving across these three departure points. It is this kind of knowledge that might help us figure out what kind of interventions to prioritize over the course of the academic year.

Below are three sets of fall-to-winter retention rates for our traditional student cohorts. Please note that each of these percentage rates represents the proportion of students in each cohort who were enrolled during the prior term. These rates do not represent the proportion of an entering cohort that is still enrolled at Augustana.

Cohort       2015 Fall-Winter    2014 Fall-Winter    Four-Year Average
1st year     96.5%               95.9%               96.6%
2nd year     98.6%               98.3%               97.9%
3rd year     97.9%               97.1%               98.3%
4th year     99.4%               97.4%               98.3%
5th year     63.3%               42.4%               54.9%
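For anyone curious about how these figures are derived, the computation itself is simple: each rate is the share of a cohort's fall-term enrollees who returned for the winter term. A minimal sketch (the enrollment counts below are hypothetical, not our actual figures):

```python
def term_retention(fall_enrolled: int, winter_enrolled: int) -> float:
    """Percent of fall-term enrollees who returned for the winter term."""
    return 100.0 * winter_enrolled / fall_enrolled

# Hypothetical cohort: 685 students enrolled in fall, 661 returned.
rate = term_retention(685, 661)
print(f"{rate:.1f}%")  # prints 96.5%
```

The same arithmetic, applied cohort by cohort and term by term, produces every cell in the table above.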

As you can see, our fall-to-winter retention rates increased for every cohort of students. In the case of the 1st through 4th year cohorts, I’d say this is a good thing. For the 5th year students, it’s more complicated (e.g., is the fact that more of them returned for the winter term a function of their particular choice of academic programs?  Or is it a function of our inability to offer them the courses they needed in a timely manner?).

What more are we to make of these numbers? By themselves, they seem to suggest what we already know – Augustana loses more first-year students in the fall term than second-, third-, or fourth-year students. This year, for example, we lost 24 first-years, 9 second-years, 10 third-years, and 3 fourth-years. While we might be able to improve among our first-year students, it appears there might not be much more we could do systematically to increase fall-to-winter retention among the other cohorts. At the same time, if we are going to hang our hat on being a college that is very good at building relationships with all students, then those 22 non-first-year students each represent an opportunity for us to improve. The important thing to note about the first-year students' departure patterns is that the vast majority of them didn't even complete the first term. Although in some cases there may not be much we can do, this fact emphasizes the degree to which we need to build relationships with our students right away instead of waiting for them to open up or make the first move.

As you might expect, we are in the process of further analyzing our data, especially in connection with the freshman data we collected right before winter registration (i.e., about week 7). To be sure, you will be the first to know if we find anything!

Make it a good day,


Some old-school advice about studying turns out to still be true

Although I'd love to think that I'm some sort of innovatus maximus, when students ask me for advice I'm pretty sure that I just repeat what somebody told me when I was in college. This is particularly true when it comes to study habits. I was emphatically told to study during the day and never study in my dorm. I suppose the reason I think this advice was so good is because when I didn't follow it my grades tanked. But just because some bits of sage advice have been around for a long time doesn't necessarily mean that they are still accurate or applicable to everyone. Given the wealth of changes that have impacted undergraduate lives since I was in college (i.e., the late 1980s and early 1990s), it struck me that I'd better test these study habit assumptions to see if they still hold.

Now I know that some of you might be champing at the bit to raise the "correlation doesn't equal causation" objection. Maybe I was dumber than a bag of hammers when I was in college and no amount of studying would have helped. Or maybe students who come to college with a boatload of smarts can study anywhere at any time without any consequence. In all seriousness, given the vast changes in technology and the availability of library resources online, maybe the "where" isn't all that important any more.

Luckily, we have exactly the data necessary to test this question. By linking first-year student data collected prior to enrollment, during the first year, and after the spring term, we can look at the relationship between pre-college academic preparation, study habits involving “where” and “when” one studies, and first-year cumulative GPA.

To account for pre-college academic preparation, we used each student's ACT score and Academic Habits score from the Student Readiness Survey (a score derived from each student's self-assessment of their academic habits; things like preparing for exams early instead of cramming the night before the test). To account for studying "where" and "when," we used responses to three questions on the end-of-first-year survey:

  • Of all the time you spent studying this year, about how much of it was in your dorm room? (1=none, 2=a little, 3=about half, 4=most, 5=all)
  • How often did you study – by yourself or in small groups – in the CSL (Tredway Library, 4th floor study spaces, Brew, or Dining Hall)? (1=never, 2=rarely, 3=sometimes, 4=often, 5=very often)
  • I made sure to set aside time to study during the day so that I wouldn’t have to do it all at night. (1=never, 2=rarely, 3=sometimes, 4=often, 5=very often)

And to account for cumulative first-year GPA, we used the final cumulative GPA in the college’s dataset that is constructed after all grades from the spring term have been logged.

I've inserted the results of the regression equation below, marking the statistically significant predictors with asterisks.

Variable                     Coefficient    Standard Error    Significance
ACT Score***                     .067            .011             .000
Academic Habits                  .087            .090             .337
Studying in Dorm*               -.099            .044             .028
Studying in CSL                 -.033            .042             .433
Studying During the Day*         .096            .038             .014

Based on these regression results, the old-school studying advice seems to have withstood the test of time. As we would expect, pre-college academic preparation predicts first-year cumulative GPA. But even after accounting for pre-college preparation, “where” one studies (or more specifically, where one does NOT study) and “when” one studies still matters. Studying in one’s dorm room is a significant negative predictor, meaning that the more one studies in his or her dorm room the lower the first-year cumulative GPA. Conversely, studying during the day is a significant positive predictor, meaning that the more one studies during the day the higher the first-year cumulative GPA.
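For readers who like to peek under the hood, a model like this is an ordinary least-squares regression of first-year GPA on the five predictors. Here is a minimal sketch using NumPy's least-squares solver on a handful of hypothetical student rows (invented for illustration only, not actual student records; a full analysis would also report the standard errors and p-values shown in the table above).

```python
import numpy as np

# Hypothetical student rows -- illustration only, NOT actual records.
# Columns: intercept, ACT, academic habits, dorm, CSL, day-studying.
X = np.array([
    [1, 24, 3.2, 4, 2, 2],
    [1, 28, 3.8, 2, 4, 4],
    [1, 22, 3.0, 5, 1, 1],
    [1, 30, 4.1, 1, 3, 5],
    [1, 26, 3.5, 3, 3, 3],
    [1, 25, 3.3, 4, 2, 3],
    [1, 27, 3.6, 2, 5, 4],
    [1, 23, 3.1, 5, 2, 2],
], dtype=float)
gpa = np.array([2.8, 3.5, 2.4, 3.8, 3.1, 2.9, 3.4, 2.6])

# Least-squares fit: the coefficients that minimize ||X @ b - gpa||.
coef, *_ = np.linalg.lstsq(X, gpa, rcond=None)
for name, b in zip(["intercept", "ACT", "habits", "dorm", "CSL", "day"],
                   coef):
    print(f"{name:>9}: {b:+.3f}")
```

The key idea the table illustrates is exactly what this fit produces: each coefficient is the estimated change in GPA for a one-unit change in that predictor, holding the other predictors constant.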

Interestingly, the question about studying in the CSL didn't produce a statistically significant result. This may be the result of the question's lack of precision. Because there is such a range of study environments in the CSL, studying in the Brew may produce a much different effect than studying on the quiet floors of the library. In the end, the effects of those differences may well cancel each other out. Moreover, this possibility might further support the notion that the problem with studying in one's dorm room isn't the location itself, but rather the frequency and availability of distractions from friends, neighbors, TVs, game systems, and whatever else one might have stashed away in their dorm room.

It’s always nice to find that some sage old advice still holds true. But what I find compelling about these findings is the fact that they come directly from Augustana students who were first-year students in 2014-15. With this in mind, we can confidently tell our advisees that Augustana students who study away from their dorm room and study during the day earn better grades than similar students who study at night in their dorm rooms. In my recent experience, it appears that our students tend to respond to guidance supported by data more than they respond to sage old advice from the balding, middle-aged quasi-intellectual. Oh well.

Welcome back from Thanksgiving break, everyone! I’m looking forward to enjoying the holiday season on campus with all of you.

Make it a good day,