“Close the Gap” Passes the Test

It’s no secret that students who choose to attend Augustana College (or, for that matter, any other private liberal arts college like us) make a substantial financial commitment to their undergraduate education. As numerous economic trends over the last decade have combined to squeeze most families’ financial resources, this commitment has come under increasing pressure to produce results. Our own data make it increasingly clear that this financial pressure also shapes students’ decisions to persist or withdraw after the first year. In recent years, we’ve noted the large subpopulation of departing students who leave with a respectable, if not enviable, GPA after their first year. At the same time, we’ve seen an uptick in the number of students who cite financial issues as a significant reason for their choice to depart.

In preparation for the incoming cohort of 2014, Augustana developed a financial aid program called “Close the Gap” to help those students who appeared to need some extra financial assistance to attend Augustana. By now, many of you know of this program. Many of you contributed to it. And the story of its success has been well documented, with about 100 freshmen in the class of 2014 receiving some assistance from this endeavor.

But with every warm and fuzzy story of philanthropy comes a stickier question: Is this program actually effective? Does it affect more than the initial decision to attend Augustana? Specifically, does it have any impact on these students’ decisions to return after their first year and continue toward graduation?

This is a tough thing to test because it’s hard to find a legitimate comparison group. We didn’t (and wouldn’t) create some sort of shadow “control” group within the first-year class of students who needed the money but didn’t get it. And we can’t really compare these Augustana students with similar students at other institutions because 1) we don’t have access to those institutions’ data, and 2) those students didn’t choose Augustana, so their first-year experience isn’t comparable. In the end, the only plausible and reasonable way to test the success of this program was to identify students from prior cohorts who would likely have been offered Close the Gap funds had such a program existed, and then see whether the first-to-second-year retention rates of those students differed from the rate of the students who actually received Close the Gap funds.

This plan gets dicey, too, because very little stays exactly the same in the world of recruitment and enrollment. Scholarship amounts change, patterns of classifying the interest level of prospective students change, and the individuals who actually do the recruiting change. Nonetheless, although this approach might not get us to an exact apples-to-apples comparison, it does get us within a pickpocket’s reach of the same fruit stand. (Yeah, I made that up.)

So here are the retention rates of students who likely would have received Close the Gap funds in 2012 and 2013, compared with the students who received those funds in 2014.

  • 2012 Cohort – 77.8%
  • 2013 Cohort – 77.3%
  • 2014 Cohort – 88.2%
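
For those who like to see the arithmetic, a two-proportion z-test gives a rough sense of whether a jump like the one from 2013 to 2014 could plausibly be chance. This is only a sketch: the cohort sizes below are assumptions (loosely based on the roughly 100 recipients mentioned above), not the actual group sizes.

```python
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z statistic for comparing two retention rates."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)              # combined retention rate
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error of the difference
    return (p2 - p1) / se

# Assumed group sizes of 100 each; the real sizes would change the result.
z = two_proportion_z(0.773, 100, 0.882, 100)
print(round(z, 2))  # 2.04; |z| > 1.96 corresponds to p < .05
```

Even under these assumed group sizes, the 2013-to-2014 jump clears the conventional significance bar, if only barely – one more reason to keep the grain of salt below close at hand.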

There are some pretty good reasons to take this finding with a grain of salt. First, we have instituted a number of other campus-wide programs and support systems to assist our retention efforts. Second, when we put the Close the Gap program in place we also set in motion an increased effort to track these students, which in turn likely increased our inclination to informally support these particular students during their first year. Third, every incoming class is different, and the overall makeup of the 2014 group may well have created an environment more conducive to these students’ success.

Yet, even with all of these caveats in mind, an 11-percentage-point swing is big. In tracking the retention rates of many different subpopulations of students (e.g., race/ethnicity categories, gender, first-generation status, etc.), we never see a swing that large between two years, especially when the two prior years’ rates are nearly identical.

I think it’s reasonable to suggest that the Close the Gap program has improved the retention rate of students with this particular level of need, and it appears that this improvement did contribute to an increase in our overall retention rate between last year and this year. This is certainly cause for celebration. We seem to be getting better at addressing the different needs of different types of students.

Yes, we’ve got plenty more work to do. And we are diving into those challenges, too. But for today, I think it’s o.k. to smile, celebrate some success, and give a shout-out to the folks who initiated and continue to raise the funds for this program. Thanks and Congrats!

Make it a good day,


Motivated Much? Meh . . .

Intellectual curiosity is a fundamental goal of a liberal arts education. So it’s no surprise that we included it as one of Augustana’s nine learning outcomes. In our own words we chose to call this outcome “Wonder,” describing it as “a life-long engagement in intellectual growth,” and describing the students who exhibit this attribute as individuals who “take responsibility for learning.” It seems pretty clearly implied in these descriptions that we believe the graduates who exemplify intellectual curiosity would have developed a motivational orientation toward learning that is:

  • optimistic about the potential that additional learning provides,
  • continually seeking to grow and develop,
  • and intrinsically driven to pursue deeper knowledge.

As an aspirational goal, all of that sounds bright and shiny and downright wonderful. But the realities of dealing with our students’ motivations aren’t always quite so dreamy. We are often keenly aware of our students’ tendency toward external rewards such as high grades, acceptance to a prestigious grad school, or the allure of a high-paying job. Most of us have seen the blank look on a student’s face when we extol the benefits of learning just because it’s interesting and even exciting to learn. Moreover, we all understand how much more difficult it is to shift a student’s motivational tendencies when they come to college after twelve years (or more) of high-stakes testing. In short, although we each might have had some flash of brilliance about how to stoke a student’s intrinsic motivation (or maybe in some cases just get a single flame to flicker), we know less about how to reliably team up with students to build that fire and keep it burning. If that weren’t enough, we’re not even sure about the degree to which we can influence a student’s motivational orientations at all. Maybe those orientations are mostly hard-wired by earlier life experience and aren’t really malleable again until well into adulthood.

Four and a half years ago, we decided to tackle this question in more depth by studying if, and how, our students’ motivational orientations change during their college career. As a part of our rolling outcomes assessment plan (our way of utilizing each incoming cohort to study how students change on a particular aspect of our learning outcomes), the 2011 cohort took a survey instrument assessing orientations toward three different types of motivation during Welcome Week. These three orientations approximate intrinsic, extrinsic, and impersonal (i.e., when one is motivated to avoid something) motivation. You can learn more about the instrument we used here. Last spring, those same students took the same survey as a part of the senior survey, allowing us to test how their responses changed over four years. In addition, we will be able to use their responses to the senior survey questions to explore which experiences might statistically predict change on any of these three motivational orientations.

The consensus understanding of how motivational orientations change suggests that as people age, they develop a stronger orientation toward intrinsic motivation and a weaker orientation toward both extrinsic and impersonal motivation. These findings seem to match up with what we know about the maturation process, as well as other research findings that suggest how people’s values shift over time. With these prior findings in mind, we tested our freshman- and senior-year data, hypothesizing that our students’ orientation toward intrinsic motivation would go up and their orientations toward extrinsic and impersonal motivation would go down.

Well, we were partially right.  We had complete data from 397 students and only included those cases in the analysis presented below. The range for each orientation scale is 1-5. The three asterisks (***) indicate that the change between freshman year and senior year is statistically significant (for the stats junkies, that p-value is <.001).

                                           Minimum   Maximum    Mean    Std. Deviation
Freshman year – Intrinsic Orientation        2.88      5.00    4.1243      .37228
Senior year – Intrinsic Orientation          1.00      5.00    4.0783      .51475
Freshman year – Extrinsic Orientation        1.94      4.24    3.1235      .38384
Senior year – Extrinsic Orientation ***      1.00      4.06    2.9623      .46230
Freshman year – Impersonal Orientation       1.69      4.00    2.8638      .40125
Senior year – Impersonal Orientation ***     1.29      4.12    2.7108      .50168
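
For the stats junkies, the asterisks come from paired-samples t-tests on each student’s matched freshman and senior scores. Here’s a minimal sketch of that calculation; the scores below are invented for six hypothetical students, purely to illustrate the mechanics, not drawn from the actual 397 cases.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(freshman, senior):
    """Paired-samples t statistic on matched freshman/senior scale scores."""
    diffs = [s - f for f, s in zip(freshman, senior)]   # senior-year change per student
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Invented extrinsic-orientation scores for six students (1-5 scale)
freshman = [3.2, 3.0, 3.4, 2.9, 3.1, 3.3]
senior = [3.0, 2.8, 3.1, 2.9, 2.9, 3.0]
print(round(paired_t(freshman, senior), 2))  # -4.47 (negative = scores dropped)
```

With 397 matched cases the degrees of freedom are large, so a |t| above roughly 3.3 corresponds to the p < .001 threshold flagged in the table above.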

Our data suggests an interesting, and potentially troubling, possibility. Although orientations toward both extrinsic and impersonal motivation dropped over four years, the orientation toward intrinsic motivation did not change significantly. This doesn’t reflect what we hypothesized and what prior research findings would have predicted. Furthermore, the notion that our students’ orientation toward intrinsic motivation hasn’t changed doesn’t match well with our goal of developing a more robust sense of intellectual curiosity.

There are numerous ways to explain this finding as an anomaly. Maybe our students’ relatively high scores on the intrinsic motivation scale as freshmen made it harder for them to score much higher. But that doesn’t seem to comport with many faculty opinions on campus regarding an absence of intrinsic motivation in most students. Maybe the 2011 cohort of students was just an unusual group and changes in other cohorts would parallel other research findings. Yet our analysis of Augustana’s Wabash National Study data from our 2008 cohort revealed an even more troubling pattern in which markers of intrinsic motivation dropped precipitously between the freshman and senior year. Or maybe the measurement instrument we used doesn’t really capture the construct we are trying to measure. However, this is an instrument that has been validated repeatedly by a variety of researchers as reasonably capturing these three aspects of motivation.

Cultivating intrinsic motivation is certainly not an easy thing. But if one of our core goals as a liberal arts college is developing young people who possess a more substantial orientation toward intrinsic motivation at the end of their senior year than they had at the beginning of their freshman year, then it seems to me that this finding should give us pause. In future posts I’ll share the experiences that we find statistically predict an increase in intrinsic motivational orientation. If you can think of something that we should test, by all means shoot me an email and we’ll see what happens!

Make it a good day,


Celebrate another number – Zero!

It’s been almost two years in the making, but today we submit the completed Assurance Argument to the Higher Learning Commission in order to maintain our status as an accredited institution of higher learning.  The completed file is 33,839 words long and comes with a massive file of supporting documents.  These supporting documents include ten years of financial statements, meeting minutes from all of the most prominent faculty committees, a litany of strategic planning documents, all of the handbooks and catalogs that we use, and a host of other copies of emails, memos, spreadsheets, data summaries, and who knows what else. If that weren’t enough, we also submitted over 1,300 pages of documents to show that we’ve met the long list of federal compliance standards.

So after months of writes, rewrites, edits, re-edits, a couple of start-overs, and even a re-rewrite (and way too much wordsmithing – we are academics and we can’t help ourselves!), we have declared it to be done. Sometimes it’s entirely appropriate to celebrate selfishly. And for the IR office and Academic Affairs, this is just such a day. Yahoo! Zero more days of HLC assurance arguing!

Of course, there is no way that one office or one committee could possibly put all of this together.  Many of you contributed to this project by writing parts of these documents, finding evidence of a change that Augustana made at some point in the last ten years, editing big parts or small sections of these documents, identifying further evidence that might bolster an argument or better answer a question, or just reading over various parts of the text and telling us that things looked pretty good from your point of view.

Please accept a gigantic thank you on behalf of myself and all of us who’ve been consumed with this project for the last six months.

Lastly, there is no way that this project would look anything like its final iteration if it weren’t for Kimberly Dyer.  Many of you know Kimberly already and know that she is the main reason why the IR office is able to pull off all of the things that we do.  She spent countless hours writing, editing, researching, and scouring the backwaters of our computer drives and servers for just the right documentation to go into the Assurance Argument.  She also kept an amazingly complicated record of all the evidence items that were needed, requested, found, missing, entered, and ultimately linked in the text.

Next month (October 19 and 20 to be exact), a team of external reviewers will come to Augustana to follow up with us, ask additional questions of faculty, administrators, staff, and students, and conduct their due diligence to complete the process of the HLC accreditation cycle.  You’ll see a lot more information about that visit as we get closer to it, no doubt.  But for now, I just wanted to give everyone who contributed to creating the Assurance Argument a big shout out.  And if you see Kimberly around campus this week, don’t hesitate to applaud, bow, thank, or express gratitude in whatever way you choose. She deserves it.

Oh . . . what’s that?  You say you’d like to read the whole thing?  Really???  I believe that the entire file will still be available on the HLC site if you want to use the login and password that we set up last spring for anyone who wanted to review it.  If that doesn’t work, let me know and I’ll find a way to get you a copy of the final documents.  In the meantime, revel in the fact that no one is going to ask you to write an accreditation document for a long while!

Make it a good day,


Take a Moment to Be Happy about a Number!

With all the talk of a shrinking high school student population, changing demographics within that population, and the increasing number of college students who take courses online or transfer on a whim, it’s hard not to feel like the sky is about to come crashing down on higher education institutions like ours. Although I have no idea how to gauge the “threat level” given all of the external changes that are happening simultaneously (whatever happened to our good ol’ color-coded threat barometer from Homeland Security?), if you listen hard enough you can hear the entire system creaking and groaning like an old ship in tumultuous water. So even if it’s not the beginning of a fiery apocalypse, surviving all of this isn’t necessarily a foregone conclusion, and survival is not the same as coming out none the worse for wear.

Yet in the midst of all this high anxiety, it’s easy to get so caught up in the fear of the unknown that we forget to notice moments worth celebrating. A big part of navigating change is keeping a balanced frame of mind and paying attention to evidence that we might be moving in the right direction. With this in mind, today I’d like to point to one number that is worth smiling about.

86.1%.  That is the proportion of the 2014 cohort of freshmen who returned for their second year at Augustana.  For short, we call that our retention rate.

The reason why that number is worth celebrating is that over the last few years we’ve been retaining somewhere between 82.9% and 85.1% of freshmen to the second year.

There are certainly several reasons to keep this party to a dull roar. Retention rates fluctuate, and even though we have instituted several good programs to help different types of students find a niche and succeed, managing the decision-making patterns of 19-year-olds is not a precise exercise. But today, it is worth noting that our retention rate of first-to-second year students is higher than it has been in three years.

That is worth letting yourself smile for a moment. It’s even worth going to someone on campus who works with first-year students – LSFY instructors, 100- and 200-level course instructors, first-year advisers, financial aid administrators, learning commons administrators, librarians, residence life staff, coaches, and student life administrators (you get the idea at this point . . . there are a lot of people who influence the lives of first-year students) – and congratulating them. If you are one of the many who play a role in first-year students’ lives, take a moment to smile and be proud of your effort.

Make it a good day,




Want to Improve Our Work Culture? Own Up to Your Blind Spots

Whether you want to call it employee “climate,” “culture,” “satisfaction,” or “engagement,” I think we all know the difference between a vibrant and a corrosive working environment. A vibrant work environment can make it feel like you love every minute on the job (believe it or not, that is actually possible!). A corrosive work culture makes it feel like you can’t get out the door fast enough. Even though we’d probably all like to think otherwise, if we’re honest I suspect we can all remember experiencing both kinds of workplace vibes in our professional lives, maybe even here at Augustana.

You might remember that last spring we conducted two surveys of Augustana employees to better understand the nature of our workplace culture and employee engagement. Although those of us who are here during the summer started mulling over the wide array of findings right away, now that everyone is back on campus the Employee Engagement Taskforce (full disclosure: I’m the chair of this Taskforce) has officially begun to delve deeper into the results of those surveys. Our charge from President Bahls is two-fold. First, we need to learn about the underlying factors that produced our employees’ responses by talking with people across all of the functional areas of the college. Second, after triangulating the data from our surveys with the insights gathered from these conversations, we need to identify a set of changes (recommendations that will almost certainly vary based on local circumstances) that we can make in ourselves, our policies, or our organizational structure that will help us improve the culture in which we all work, ultimately improving our overall level of employee engagement.

Yet while the Employee Engagement Taskforce is doing its work, it seems strange to me that we might all simply “keep calm and carry on” while waiting for some edict from on high. In fact, a wealth of research on the nature of organizations has found that it is the collective “we” that plays the dominant role in shaping employee culture, not the amorphous “they” (no matter how badly I’d love to blame someone else for my annoyances du jour). So if there were something that I could do right away, I wouldn’t want to wait to read it in a report.

It turns out that an influential predictor of a healthy work environment that repeatedly pops up in our own analyses is something that we could all plug into our work right away. Consistently, how often we thought that our co-workers tried to understand the perspectives of others on campus predicted higher perceptions of transparency and trust. In turn, higher perceptions of transparency and trust predicted workplace satisfaction. Even more specifically, while perceptions of the degree to which co-workers tried to understand the perspectives of others on campus mattered regardless of the role of the co-worker, this effect was most pronounced when respondents perceived that administrators exhibited this trait.

Both findings are important. First, all of us can make our campus a better place by purposefully trying to understand issues from the perspectives of others. This doesn’t mean that you have to change your mind about something or acquiesce to someone else’s wishes. It just means that it needs to be apparent to others that you’ve recognized a measure of legitimacy in their perspective. Second, if you are someone in an administrative role, the impact of adopting this behavior is potentially transformative. With the added capital that comes from a position of authority, the choice to genuinely show others that you want to understand their perspective – even if you ultimately choose to take a different course – goes a long way toward cultivating an environment that increases employee engagement across the board.

But in order to adopt this behavior, we all have to own up to our own blind spots. We’ve all got them, even if we aren’t so good at admitting it. In my case, I’ve got more than a few potential blind spots. For example, I can be overly (bordering on naively) optimistic. In addition, although I know something about student learning, I don’t have nearly the direct classroom experience of a seasoned faculty member. In fact, because I don’t interact with students nearly as much as most of you, I am susceptible to confusing what I see in our quantitative data with the true breadth of our student population. Finally, I don’t know what it’s like to work in any other place at Augustana than Academic Affairs. So even though I’ve held plenty of other jobs in my life, I could assume that I know more than I do about the working lives of our non-academic employees.

These are just a few of my blind spots. My perpetual challenge is to make sure that I own up to them and seek to understand how the world looks through the lens of others before starting to dream up possible solutions. One of the early exercises of the Employee Engagement Taskforce was to collectively own up to each of our potential blind spots and to realize that others on the committee can help shine a light for each other. Furthermore, to a person we recognized that we will have to go outside of our group often if we are to fully understand the nature of the employee experience at Augustana and to identify the right set of recommendations to improve our work culture and employee engagement.

What are your blind spots? If you can own up to them, you are that much closer to making Augustana a better place to work. If we can all do that together, hmmm . . . .

Uh, oh – I think my optimism might be kicking into overdrive!

Make it a good day,



Most of us have heard the old light bulb joke . . . .

Question: “How many faculty does it take to change a light bulb?”

Answer: “Change?!?!”

Even if that quip sparks something between a snicker and a harrumph (depending on your point of view and sense of humor), the snark underlying it should really be applied to all of higher education. Most higher education institutions’ response to stagnant or slipping retention numbers makes for a telling example of this phenomenon. After decades of shifting student demographics, the dominant narrative about student persistence continues to emphasize the degree to which the students who leave are in some way not smart enough, not mature enough, or not adaptable enough to acclimate to the rarefied air of the college campus. In short, the prevailing opinion is that students need to change to fit in, regardless of the cultural distance between their lives prior to college and the embedded environment at their college. So although the demographic makeup of college students has been changing for a long time, most institutions have done little more than add small spaces or programs at the margins while leaving the historically homogenous dominant culture of the campus intact.

Until the early part of the last decade, Augustana could probably have gotten away with that approach. For example, the proportion of students who were not white hovered between five and eight percent throughout the 1990s. But in the early 2000s that proportion began to climb substantially. By the fall of 2014, non-white students had reached 20% of the total student population.

The scope of this change really jumps out if we look at two numbers across a roughly 25-year span. In the fall of 1990, there was a total of 134 non-white students at Augustana (scattered throughout an overall student population of 2,253). By comparison, in the fall of 2014 there were 146 non-white students in the freshman class alone and a total of 489 non-white students among an overall student population of 2,473.

Although this change is substantial, it is only one of several ways in which Augustana’s student demographics have shifted dramatically. In the last ten years, the number of students who qualify for a Pell Grant has almost doubled (from 355 to 606) and the proportion of freshmen with unmet financial need has jumped almost twenty percentage points (from 39.6% to 56.6%). At the same time, the proportion of Lutheran students has dropped by more than half since 1990 (from 31.7% to 12.8%) while the proportion of students with no religious affiliation has almost doubled (from 9.2% to 17.2%). Add to these changes a growing LGBT population (a number we didn’t even track until a few years ago), and the multi-dimensional scope of change in our student demographic makes previously narrow definitions of diversity – especially those that limit their focus to the color of one’s skin – surprisingly insufficient. Furthermore, the implications of this explosion of difference suggest that merely revising our assumptions, or even adding more layers of assumptions, about the backstory of our students will almost certainly leave us short. Things are changing in too many ways simultaneously for us to merely come up with a new “normal.” Even if we were to come up with a new background template for the typical Augustana student, we would almost certainly be wrong more often than we are right.

Instead, the extended scope of this change and the increased prominence of this tapestry require that we revisit an old but useful adage: we must genuinely know our students. That doesn’t mean just knowing their names, their high schools, and their academic ability. We must know their backstory: the multi-layered context through which they will make meaning of this educational experience. It is the nuance of each individual context that will define the lens through which each student sees us and the way that they hear what we say. Knowing this context, and knowing how it might shape our students’ first impressions, will make a world of difference in helping all of us – student, educator, and institution – adapt together to ensure that every student succeeds.

Make it a good day,


… and Warm Fuzzy beats Cranky Skeptic by a nose!

It’s the last week of our spring horse race. No, I’m not referring to the Preakness (although that horse race was run this weekend and the most poignant reason yet to run a spell-checker won again). I’m referring to the horse race that we all feel at the end of the year, thundering around the final turn (some great horse race calls here or Spike Jones epic spoof here) to finish classes, deal with students, turn in grades, and send our graduates off to the next phase of their lives – all so that we can get out to our own summer pastures.  In the midst of trying to slog through all of this end-of-the-term slop (not unlike the muddy track at the Preakness on Saturday), it’s easy to let our cranky side get the best of us.

So in honor of the end-of-the-year horse race, those wonderfully quirky horse names, and the warm fuzzy that we could all use right about now, I thought it would be a perfect time to share some data fresh from our 2015 senior survey that is worth smiling about – maybe even worth a solid pat on the back.

Here are three years’ worth of results from three senior survey questions that, if I were forced to cut the survey down to a handful of items, would be among the ones I’d keep. Read ’em and smile!

  • I felt a strong sense of belonging on campus. (response options scored from 1-5: strongly disagree, disagree, neutral, agree, or strongly agree)
    • 2013 – 72.1% agree or strongly agree
    • 2014 – 66.8% agree or strongly agree
    • 2015 – 75.4% agree or strongly agree!
  • I am certain that my post-graduate plans are a good fit for who I am right now and where I want my life to go. (response options scored from 1-5: strongly disagree, disagree, neutral, agree, or strongly agree)
    • 2013 – 75.5% agree or strongly agree
    • 2014 – 76.7% agree or strongly agree
    • 2015 – 81.2% agree or strongly agree!
  • If you could relive your college decision, would you choose Augustana again? (response options scored from 1-5: definitely no, probably no, not sure, probably yes, and definitely yes)
    • 2013 – 80.6% probably yes or definitely yes
    • 2014 – 72.3% probably yes or definitely yes
    • 2015 – 83.0% probably yes or definitely yes!

When three items are all moving in the same positive direction over time, I think we can put aside our wonky skeptical stuff for a few minutes and enjoy it.  That’s right – kick back, relax for a moment, and smile a big toothy grin.

You worked hard to make Augustana a better place for our students this year.  It just might have paid off.  Now let yourself enjoy it – you deserve it.

Make it a good day (and a great summer),


So after the first year, can we tell if CORE is making a difference?

Now that we are a little over a year into putting Augustana 2020 in motion, we’ve discovered that assessing the implementation process is deceptively difficult. The problem isn’t that the final metrics to which the plan aspires are too complicated to measure or even too lofty to achieve. Those are goals that are fairly simple to assess – we either hit our marks or we don’t. Instead, the challenge at present lies in devising an assessment framework that tracks implementation, not the end results. Although Augustana 2020 is a relatively short document, in actuality it lays out a complex, multi-layered plan that requires a series of building blocks to be constructed separately, fused together, and calibrated precisely before we can legitimately expect to meet our goals for retention and graduation rates, job acquisition and graduate school acceptance rates, or improved preparation for post-graduate success. Assessing the implementation, especially at such an early point in the process, by using the final metrics to judge our progress would be like judging a car manufacturer’s increased production speed right after the company had added a faster motor to one of the assembly lines. Of course, without having retrofitted or changed out all of the other assembly stages to adapt to this new motor, by itself such a change would inevitably turn production into a disaster.

Put simply, judging any given snapshot of our current state of implementation against the fullness of our intended final product doesn’t really help us build a better mousetrap; it just tells us what we already know (“It’s not done yet!”). During the process of implementation, assessment is much more useful if it identifies and highlights intermediate measures that give us a more exacting sense of whether we are moving in the right direction. In addition, assessing the process should tell us if the pieces we are putting in place will work together as designed or if we have to make additional adjustments to ensure the whole system works as it should. This means narrowing our focus to the impact of individual elements on specific student behaviors, testing the fit between pieces that have to work together, and tracking the staying power of experiences that are intended to permanently impact our students’ trajectories.

With all of that said, I thought it would be fitting to try out this assessment approach on arguably the most prominent element of Augustana 2020 – CORE. Now that CORE is finishing its first year at the physical center of our campus, it seems reasonable to ask whether we have any indicators in place that could tell us whether this initiative is bearing the kind of early fruit we had hoped for. Obviously, since CORE is designed to function as part of a four-year plan of student development and preparation, it would be foolhardy to judge CORE’s ultimate effectiveness on some of the Augustana 2020 metrics until at least four years have passed. However, we should look to see if there are indications that CORE’s early impact triangulates with the student behaviors or attitudes necessary for improved post-graduate success. This is the kind of data that would be immediately useful to CORE and the entire college. If the indicators suggest that we are moving in the right direction, then we can move forward with greater confidence. If they suggest that things aren’t working as we’d hoped, then we can make adjustments before too many other things are locked into place.

In order to find data that suggests impact, we need more than just the numbers of students who have visited CORE this year (even though it is clear that student traffic in the CORE office and at the many CORE events has been impressive). To be fair, these participation patterns could simply be an outgrowth of CORE’s new location at the center of campus (“You’ve got candy, I was just walking by, why not stop in?”). To give us a sense of CORE’s impact, we need to find data where we have comparable before-and-after numbers. At this early juncture, we can’t look at our recent graduate survey data for employment rates six months after graduation since our most recent data comes from students who graduated last spring – before CORE opened.

Yet we may have a few data points that shine some light on CORE’s impact during its first year. To be sure, these data points shouldn’t be interpreted as hard “proof.” Instead, I suggest that they are indicators of directionality and, when put in the presence of other data (be they usage numbers or the preponderance of anecdotes), we can start to lean toward some conclusions about CORE’s impact in its first year.

The first data point we can explore is a comparison of the number of seniors who have already accepted a job offer at the time they complete the senior survey. Certainly the steadily improving economy, Augustana’s existing efforts to encourage students to begin their post-graduate planning earlier, and the unique attributes of this cohort of students could also influence this particular data point. However, if we were to see a noticeable jump in this number, it would be difficult to argue that CORE should get no credit for this increase.

The second data point we could explore is the proportion of seniors who said they were recommended to CORE or the CEC by other students and faculty. This seems a potentially indicative data point based on the assumption that neither students nor faculty would recommend CORE more often if the reputation and results of CORE’s services were no different from those of the similar services provided by the CEC in prior years. To add context, we can also look at the proportion of seniors who said that no one recommended CORE or the CEC to them.

These data points all come from the three most recent administrations of the senior survey (including this year’s edition, for which we already have 560 of 580 eligible respondents). The 2013 and 2014 numbers are from before the introduction of CORE, and the 2015 numbers are from after CORE’s first year. I’ve also calculated the job-acceptance figures as a proportion of all students whose immediate plan after graduation is to work full-time, in order to account for differences in the size of the graduating cohorts.

Seniors with jobs accepted when completing the senior survey –

  • 2013 – 104 of a possible 277 (37.5%)
  • 2014 – 117 of a possible 338 (34.6%)
  • 2015 – 145 of a possible 321 (45.2%)

Proportion of seniors indicating they were recommended to CORE or the CEC by other students –

  • 2013 – 26.9%
  • 2014 – 24.0%
  • 2015 – 33.2%

Proportion of seniors indicating they were recommended to CORE or the CEC by faculty in their major or faculty outside their major, respectively –

  • 2013 – 47.0% and 18.8%
  • 2014 – 48.1% and 20.6%
  • 2015 – 54.6% and 26.0%

Proportion of seniors indicating that no one recommended CORE or the CEC to them –

  • 2013 – 18.0%
  • 2014 – 18.9%
  • 2015 – 14.4%
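As a rough gauge of whether the jump in accepted jobs (34.6% in 2014 to 45.2% in 2015) is bigger than sampling noise alone would produce, we can run the counts above through a simple two-proportion z-test. This is an illustrative back-of-the-envelope check using only the numbers reported here, not part of any official analysis:

```python
from math import sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: is the difference between two sample
    proportions larger than random sampling variation would suggest?"""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pool the two samples to estimate the standard error under the
    # assumption that the true proportions are equal.
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Seniors with jobs accepted: 2014 (pre-CORE) vs. 2015 (CORE's first year)
z = two_proportion_z(117, 338, 145, 321)
print(f"z = {z:.2f}")  # values above ~1.96 are unlikely under chance alone
```

The 2014-to-2015 difference comes out well above the conventional 1.96 threshold, which is consistent with the “directionality” reading above – though, as noted, it cannot by itself attribute the change to CORE.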

Taken together, these data points seem to suggest that CORE is making a positive impact on campus.  By no means do these data points imply that CORE should be ultimately judged as a success, a failure, or anything in between at this point. However, this data certainly suggests that CORE is on the right track and may well be making a real difference in the lives of our students.

If you’re not sure what CORE does or how they do it, the best (and probably only) way to get a good answer to that question is to go there yourself, talk to the folks who work there, and see for yourself.  If you’re nice to them, they might even give you some candy!

Make it a good day,


What about faculty retention?

Last week my colleague in the institutional research office, Kimberly Dyer, suggested that although we talk about student retention all the time, it’s reasonable to argue that faculty retention may also be an important metric worth tracking. Since turnover and longevity are well-documented markers of a healthy organizational environment, it certainly makes sense for us to delve into our employee data and see what we find.

From my perspective, this question also presents an opportunity to spell out the critical importance of context in making sense of any institutional data point. In the same way that we want our students to develop the ability to withhold judgment while evaluating a claim, we help ourselves in all sorts of ways by knowing how to place institutional metrics in their proper context before concluding that everything is “just peachy,” or that “the sky is falling,” or that, more realistically, we are somewhere in between those two extremes.

Although it would be interesting to look at employee retention across all the different positions that Augustana employees hold, the variation across these positions makes it pretty hard to address the implications of all those differences in a single blog post. So today I’ll focus on faculty retention primarily because, since faculty work is so closely tied to the traditional academic calendar, we can apply an already familiar framework for understanding retention (i.e., students being retained from one fall to the next) to this discussion.

Making sense of faculty retention numbers requires an understanding of two contextual dimensions. The first involves knowing something about the range of circumstances that might lead a proportion of faculty to leave their teaching positions at Augustana. Every year there are faculty who retire and faculty who move into administrative roles (just as there are individuals who give up their administrative roles to return to teaching). In addition, there are numerous term-limited visiting and fellowship positions that are designed to turn over. There are also the cases of faculty who leave because they are not awarded tenure (although, if we’re being honest with ourselves, we know that in some of these cases this decision may not be entirely because of deficiencies exhibited by the individual faculty member). Obviously, if 10% of the faculty leave in a given year, it would be silly to assume that all of those individuals left because Augustana’s work environment drove them away. To make better sense of a faculty retention data point, it’s critical to understand what proportion of departures is attributable to flaws, weaknesses, or dysfunctions in our community climate versus the subset that results from the normal and healthy movement of faculty within the institution (or within higher education generally) and/or within the life course.

The second contextual dimension requires some sense of what should be considered “normal.” Since it is probably not reasonable to expect an organization to have no turnover, the next question becomes: What do similar institutions experience in faculty retention and turnover?  Without this information, we are left with the real possibility that our biases, loyalties, and aspirations will coerce us into setting expectations far above what is reasonable. Comparable data helps us check our biases at the door.

So after all of that . . . what do our faculty retention numbers look like? To come up with some numbers, we first removed all of the visiting and fellowship positions from this analysis in order to avoid counting folks whom we expect to leave. Instead, we focused our analysis on tenured and tenure-track faculty.

Without accounting for any of the faculty who moved into an administrative post or faculty who retired, our retention rate of tenured and tenure-track faculty has been 91% in each of the last three years.  When you exclude retirements and internal movement, those proportions jump to 96%, 95%, and 94% respectively. In terms of actual people (with about 150 tenured/tenure-track faculty each year), this translates into about 6 people each year. This group of people would include faculty who aren’t awarded tenure as well as those who leave for any other reason.
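To make the relationship between the raw and adjusted rates concrete, here is a minimal sketch of the arithmetic, removing expected departures (retirements and moves into administration) from both sides of the fraction. The departure counts are illustrative round numbers chosen to roughly match the rates above, not our actual personnel figures:

```python
def adjusted_retention(headcount, total_departures, expected_departures):
    """Retention rate after removing departures we expect by design
    (retirements, internal moves) from both numerator and denominator."""
    stayed = headcount - total_departures
    at_risk = headcount - expected_departures  # faculty who could have left unexpectedly
    return stayed / at_risk

# Illustrative: ~150 tenured/tenure-track faculty, 14 total departures,
# 8 of which are retirements or internal moves.
raw = (150 - 14) / 150                     # ~91%, the unadjusted rate
adjusted = adjusted_retention(150, 14, 8)  # ~96%, the adjusted rate
```

The point of the sketch is simply that a 91% raw rate and a roughly 95% adjusted rate describe the same small group of unexpected departures, counted against different denominators.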

The one obstacle to fully placing these numbers in context is that we don’t have any real way of establishing comparable numbers from similar institutions. Maybe most institutions like us would give a lot of money for a 95% faculty retention rate. Or, maybe none of them have lost a single faculty member in the last ten years. All we know is that the number of Augustana tenured or tenure-track faculty departing each year is relatively small. In the end, even if we begrudgingly accept faculty retention as the roughest of proxies for the quality of our organizational climate, these numbers seem to suggest that we have maintained a reasonably healthy faculty climate at Augustana in the last few years.

Of course, in these cases there may well be entirely understandable reasons for each departure that have nothing to do with our working environment. At the same time it’s always worth asking, no matter how small the number of people who choose not to come back, if there are things we can do to improve the quality of our work environment. Certainly there are things that we can improve that might never become so influential as to drive someone to leave. With the almost-completed Augustana College Employee Engagement study, we are on our way to identifying some of those issues. But at least on one measure of organizational quality that seems a reasonable, albeit rough, metric, we might actually be doing pretty well.

Make it a good day,



Riding the waves of within-year retention

I was talking with the Faculty Council recently about this year’s term-to-term retention rates when one council member suggested that I should share these numbers with the campus community.  Of course, this was a very good idea – and something that I should have done several weeks ago. So, with apologies to everyone who cares about retention (AKA everyone), here we go.

In the table below, I’ve listed the fall-to-winter term and fall-to-spring term retention rates for each class as well as the four-year averages for these data points in order to give some of these numbers context.

Class       Fall-to-Winter Term Retention     Fall-to-Spring Term Retention
            4 Yr. Avg.    This Year           4 Yr. Avg.    This Year
1st Year    96.5%         95.7%               92.9%         93.4%
2nd Year    97.9%         98.3%               95.5%         95.4%
3rd Year    98.3%         97.1%               97.9%         96.7%
4th Year    98.3%         97.4%               93.6%         93.3%
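For anyone who would rather see the gaps than eyeball eight percentages at once, a quick sketch using the table’s values (positive numbers mean this year sits above the four-year average):

```python
# This year's term-to-term retention vs. the four-year averages,
# in percentage points (values taken from the table above).
rates = {
    # class: (winter 4-yr avg, winter this year, spring 4-yr avg, spring this year)
    "1st Year": (96.5, 95.7, 92.9, 93.4),
    "2nd Year": (97.9, 98.3, 95.5, 95.4),
    "3rd Year": (98.3, 97.1, 97.9, 96.7),
    "4th Year": (98.3, 97.4, 93.6, 93.3),
}

gaps = {
    cls: (round(winter_now - winter_avg, 1), round(spring_now - spring_avg, 1))
    for cls, (winter_avg, winter_now, spring_avg, spring_now) in rates.items()
}

for cls, (winter_gap, spring_gap) in gaps.items():
    print(f"{cls}: winter {winter_gap:+.1f} pts, spring {spring_gap:+.1f} pts")
```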

There are a couple of things that jump off the page immediately when trying to take in all of these numbers at once. First, breaking retention down to this level of detail can make it pretty overwhelming. It is easy to get a little vertigo staring at all the different percentages, wondering how in the world anyone decides which ones are good or bad or somewhere in between.

Second, the numbers – as well as the differences between any particular number and its corresponding four-year average – bounce around a bit. For example, although the first year students’ fall-to-winter retention rate was slightly below the four-year average, their fall-to-spring retention rate exceeds the four-year norm. Conversely, while the second year students’ fall-to-winter retention rate was higher than the four-year average, their fall-to-spring retention rate ensures that we don’t get a big head.

Third, it’s not necessarily true that a given year’s retention rate below the four-year average is uniformly a bad thing. For example, over the last several years we’ve been watching the number of seniors who finish a term early inch upward. It seemed inevitable that this would happen at some point with the increasing number of college and AP credits that incoming students bring to Augustana. And as the cost of college has jumped, we probably shouldn’t be surprised at all if a few more students want to avoid that 12th term of tuition by graduating after the winter term. I get that fewer students = less tuition = budget reductions = more stress. But if our mission is to educate, and if a student has completed all that we have asked him or her to do, then I’m not sure we can be all that disappointed that they don’t stay for the spring term – especially since we haven’t designed the broader Augustana experience to culminate in any unique way during the spring of the senior year. This is not a criticism one way or another; rather I only point to this example to demonstrate how complicated this retention conversation can be.

In the end, making accurate sense of any particular within-year retention number requires a black belt in withholding judgment, a hefty dose of context, and a battle-tested nervous system. Retention data is sort of like the “check engine” light in your car. When it lights up, it might mean that the only thing that doesn’t work is the fuse that controls the “check engine” light. Or it might mean that something serious is going wrong under the hood and you could be in big trouble if you don’t take your car to a mechanic today. Either way, you don’t panic just because the light comes on. At the same time, you don’t shrug it off. You take a deeper look at what you are doing and try to figure out if there is anything you could do better.

Make it a good day.