So after the first year, can we tell if CORE is making a difference?

Now that we are a little over a year into putting Augustana 2020 in motion, we’ve discovered that assessing the implementation process is deceptively difficult. The problem isn’t that the final metrics to which the plan aspires are too complicated to measure or even too lofty to achieve. Those are goals that are fairly simple to assess – we either hit our marks or we don’t. Instead, the challenge at present lies in devising an assessment framework that tracks implementation, not the end results. Although Augustana 2020 is a relatively short document, in actuality it lays out a complex, multi-layered plan that requires a series of building blocks to be constructed separately, fused together, and calibrated precisely before we can legitimately expect to meet our goals for retention and graduation rates, job acquisition and graduate school acceptance rates, or improved preparation for post-graduate success. Assessing the implementation, especially at such an early point in the process, by using the final metrics to judge our progress would be like judging a car manufacturer’s increased production speed right after the company had added a faster motor to one of the assembly lines. Of course, without having retrofitted or changed out all of the other assembly stages to adapt to this new motor, by itself such a change would inevitably turn production into a disaster.

Put simply, judging any given snapshot of our current state of implementation against the fullness of our intended final product doesn’t really help us build a better mousetrap; it just tells us what we already know (“It’s not done yet!”). During the process of implementation, assessment is much more useful if it identifies and highlights intermediate measures that give us a more exacting sense of whether we are moving in the right direction. In addition, assessing the process should tell us whether the pieces we are putting in place will work together as designed or whether we have to make additional adjustments to ensure the whole system works as it should. This means narrowing our focus to the impact of individual elements on specific student behaviors, testing the fit between pieces that have to work together, and tracking the staying power of experiences that are intended to permanently impact our students’ trajectories.

With all of that said, I thought that it would be fitting to try out this assessment approach on arguably the most prominent element of Augustana 2020 – CORE. Now that CORE is finishing its first year at the physical center of our campus, it seems reasonable to ask whether we have any indicators in place that could assess whether this initiative is bearing the kind of early fruit we had hoped for. Obviously, since CORE is designed to function as part of a four-year plan of student development and preparation, it would be foolhardy to judge CORE’s ultimate effectiveness on some of the Augustana 2020 metrics until at least four years have passed. However, we should look to see if there are indications that CORE’s early impact triangulates with the student behaviors or attitudes necessary for improved post-graduate success. This is the kind of data that would be immediately useful to CORE and the entire college. If indicators suggest that we are moving in the right direction, then we can move forward with greater confidence. If the indicators suggest that things aren’t working as we’d hoped, then we can make adjustments before too many other things are locked into place.

In order to find data that suggests impact, we need more than just the numbers of students who have visited CORE this year (even though it is clear that student traffic in the CORE office and at the many CORE events has been impressive). To be fair, these participation patterns could simply be an outgrowth of CORE’s new location at the center of campus (“You’ve got candy, I was just walking by, why not stop in?”). To give us a sense of CORE’s impact, we need to find data where we have comparable before-and-after numbers. At this early juncture, we can’t look at our recent graduate survey data for employment rates six months after graduation since our most recent data comes from students who graduated last spring – before CORE opened.

Yet we may have a few data points that shine some light on CORE’s impact during its first year. To be sure, these data points shouldn’t be interpreted as hard “proof.” Instead, I suggest that they are indicators of directionality and, when put in the presence of other data (be they usage numbers or the preponderance of anecdotes), we can start to lean toward some conclusions about CORE’s impact in its first year.

The first data point we can explore is a comparison of the number of seniors who have already accepted a job offer at the time they complete the senior survey. Certainly the steadily improving economy, Augustana’s existing efforts to encourage students to begin their post-graduate planning earlier, and the unique attributes of this cohort of students could also influence this particular data point. However, if we were to see a noticeable jump in this number, it would be difficult to argue that CORE should get no credit for this increase.

The second data point we could explore is the proportion of seniors who said they were recommended to CORE or the CEC by other students and faculty. This seems a potentially indicative data point based on the assumption that neither students nor faculty would recommend CORE more often if the reputation and results of CORE’s services were no different from the reputation and results of similar services provided by the CEC in prior years. To add context, we can also look at the proportion of seniors who said that no one recommended CORE or the CEC to them.

These data points all come from the three most recent administrations of the senior survey (including this year’s edition, from which we already have 560 of 580 eligible respondents). The 2013 and 2014 numbers come from before the introduction of CORE, and the 2015 numbers come from after CORE’s first year. I’ve also calculated each proportion against the number of students whose immediate plan after graduation is to work full-time in order to account for differences in the size of the graduating cohorts.

Seniors with jobs accepted when completing the senior survey -

  • 2013 – 104 of a possible 277 (37.5%)
  • 2014 – 117 of a possible 338 (34.6%)
  • 2015 – 145 of a possible 321 (45.2%)
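For anyone who wants to check whether the 2015 jump is bigger than ordinary year-to-year noise, a pooled two-proportion z-test is one quick, rough gauge. This is a sketch only – it can’t separate CORE’s effect from the economy or cohort differences noted above – and it uses nothing beyond the Python standard library:

```python
from math import erf, sqrt

def two_prop_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test: did the rate change between two years?"""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 2014 vs. 2015 seniors with jobs in hand (counts from the bullets above)
z, p = two_prop_z(117, 338, 145, 321)
```

On these numbers, the 2014-to-2015 increase comes out around z ≈ 2.8 (p < 0.01) – well beyond what chance alone would typically produce – while the 2013-to-2014 dip does not approach significance. That pattern is consistent with reading 2015 as a directional signal rather than noise.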

Proportion of seniors indicating they were recommended to CORE or the CEC by other students -

  • 2013 – 26.9%
  • 2014 – 24.0%
  • 2015 – 33.2%

Proportion of seniors indicating they were recommended to CORE or the CEC by faculty in their major or faculty outside their major, respectively -

  • 2013 – 47.0% and 18.8%
  • 2014 – 48.1% and 20.6%
  • 2015 – 54.6% and 26.0%

Proportion of seniors indicating that no one recommended CORE or the CEC to them -

  • 2013 – 18.0%
  • 2014 – 18.9%
  • 2015 – 14.4%

Taken together, these data points seem to suggest that CORE is making a positive impact on campus.  By no means do these data points imply that CORE should be ultimately judged as a success, a failure, or anything in between at this point. However, this data certainly suggests that CORE is on the right track and may well be making a real difference in the lives of our students.

If you’re not sure what CORE does or how they do it, the best (and probably only) way to get a good answer to that question is to go there yourself, talk to the folks who work there, and see for yourself.  If you’re nice to them, they might even give you some candy!

Make it a good day,


What about faculty retention?

Last week my colleague in the institutional research office, Kimberly Dyer, suggested that although we talk about student retention all the time, it’s reasonable to argue that faculty retention may also be an important metric worth tracking. Since turnover and longevity are well-documented markers of a healthy organizational environment, it certainly makes sense for us to delve into our employee data and see what we find.

From my perspective, this question also presents an opportunity to spell out the critical importance of context in making sense of any institutional data point. In the same way that we want our students to develop the ability to withhold judgment while evaluating a claim, we help ourselves in all sorts of ways by knowing how to place institutional metrics in their proper context before concluding that everything is “just peachy,” or that “the sky is falling,” or that, more realistically, we are somewhere in between those two extremes.

Although it would be interesting to look at employee retention across all the different positions that Augustana employees hold, the variation across these positions makes it pretty hard to address the implications of all those differences in a single blog post. So today I’ll focus on faculty retention primarily because, since faculty work is so closely tied to the traditional academic calendar, we can apply an already familiar framework for understanding retention (i.e., students being retained from one fall to the next) to this discussion.

Making sense of faculty retention numbers requires an understanding of two contextual dimensions. The first involves knowing something about the range of circumstances that might influence a proportion of faculty to leave their teaching positions at Augustana. Every year there are faculty who retire and faculty who move into administrative roles (just as there are individuals who give up their administrative roles to return to teaching). In addition, there are numerous term-limited visiting and fellowship positions that are designed to turn over. There are also the cases of faculty who leave because they are not awarded tenure (although, if we’re being honest with ourselves, we know that in some of these cases the decision may not be entirely because of deficiencies exhibited by the individual faculty member). Obviously, if 10% of the faculty leave in a given year it would be silly to assume that all of those individuals left because Augustana’s work environment drove them away. To make more insightful sense of a faculty retention data point, it’s critical to understand the proportion of those individuals whose departure is attributable to flaws, weaknesses, or dysfunctions in our community climate versus the subset of faculty departures that result from the normal and healthy movement of faculty within the institution (or within higher education generally) and/or within the life course.

The second contextual dimension requires some sense of what should be considered “normal.” Since it is probably not reasonable to expect an organization to have no turnover, the next question becomes: What do similar institutions experience in faculty retention and turnover?  Without this information, we are left with the real possibility that our biases, loyalties, and aspirations will coerce us into setting expectations far above what is reasonable. Comparable data helps us check our biases at the door.

So after all of that . . . what do our faculty retention numbers look like? To come up with some numbers, we first removed all of the visiting and fellowship positions from this analysis in order to avoid counting folks whom we expect to leave. Instead, we focused our analysis on tenured and tenure-track faculty.

Without accounting for any of the faculty who moved into an administrative post or faculty who retired, our retention rate of tenured and tenure-track faculty has been 91% in each of the last three years.  When you exclude retirements and internal movement, those proportions jump to 96%, 95%, and 94% respectively. In terms of actual people (with about 150 tenured/tenure-track faculty each year), this translates into about 6 people each year. This group of people would include faculty who aren’t awarded tenure as well as those who leave for any other reason.
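The arithmetic behind those percentages is simple enough to sketch; the headcounts below are the approximate figures from this post (about 150 tenured/tenure-track faculty, and about 6 departures a year once retirements and internal moves are excluded):

```python
def retention_rate(headcount, departures):
    """Share of faculty retained from one fall to the next."""
    return (headcount - departures) / headcount

# ~150 tenured/tenure-track faculty; ~6 leave for reasons other than
# retirement or a move into an administrative post
rate = retention_rate(150, 6)  # 0.96
```

Swapping in the full count of departures (including retirements and internal moves) is what pulls the rate down to the 91% figure.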

The one obstacle to fully placing these numbers in context is that we don’t have any real way of establishing comparable numbers from similar institutions. Maybe most institutions like us would give a lot of money for a 95% faculty retention rate. Or, maybe none of them have lost a single faculty member in the last ten years. All we know is that the number of Augustana tenured or tenure-track faculty departing each year is relatively small. In the end, even if we begrudgingly accept faculty retention as the roughest of proxies for the quality of our organizational climate, these numbers seem to suggest that we have maintained a reasonably healthy faculty climate at Augustana in the last few years.

Of course, in these cases there may well be entirely understandable reasons for each departure that have nothing to do with our working environment. At the same time it’s always worth asking, no matter how small the number of people who choose not to come back, if there are things we can do to improve the quality of our work environment. Certainly there are things that we can improve that might never become so influential as to drive someone to leave. With the almost-completed Augustana College Employee Engagement study, we are on our way to identifying some of those issues. But at least on one measure of organizational quality that seems a reasonable, albeit rough, metric, we might actually be doing pretty well.

Make it a good day,



Riding the waves of within-year retention

I was talking with the Faculty Council recently about this year’s term-to-term retention rates when one council member suggested that I should share these numbers with the campus community.  Of course, this was a very good idea – and something that I should have done several weeks ago. So, with apologies to everyone who cares about retention (AKA everyone), here we go.

In the table below, I’ve listed the fall-to-winter term and fall-to-spring term retention rates for each class as well as the four-year averages for these data points in order to give some of these numbers context.

                 Fall-to-Winter Term Retention     Fall-to-Spring Term Retention
Class            4 Yr. Avg.      This Year         4 Yr. Avg.      This Year
1st Year         96.5%           95.7%             92.9%           93.4%
2nd Year         97.9%           98.3%             95.5%           95.4%
3rd Year         98.3%           97.1%             97.9%           96.7%
4th Year         98.3%           97.4%             93.6%           93.3%

There are a couple of things that jump off the page immediately when trying to take in all of these numbers at once. First, breaking retention down to this level of detail can be pretty overwhelming. It is easy to get a little vertigo staring at all the different percentages, wondering how in the world anyone decides which ones are good or bad or somewhere in between.

Second, the numbers – as well as the differences between any particular number and its corresponding four-year average – bounce around a bit. For example, although the first year students’ fall-to-winter retention rate was slightly below the four-year average, their fall-to-spring retention rate exceeds the four-year norm. Conversely, while the second year students’ fall-to-winter retention rate was higher than the four-year average, their fall-to-spring retention rate ensures that we don’t get a big head.

Third, it’s not necessarily true that a given year’s retention rate below the four-year average is uniformly a bad thing. For example, over the last several years we’ve been watching the number of seniors who finish a term early inch upward. It seemed inevitable that this would happen at some point with the increasing number of college and AP credits that incoming students bring to Augustana. And as the cost of college has jumped, we probably shouldn’t be surprised at all if a few more students want to avoid that 12th term of tuition by graduating after the winter term. I get that fewer students = less tuition = budget reductions = more stress. But if our mission is to educate, and if a student has completed all that we have asked him or her to do, then I’m not sure we can be all that disappointed that they don’t stay for the spring term – especially since we haven’t designed the broader Augustana experience to culminate in any unique way during the spring of the senior year. This is not a criticism one way or another; rather I only point to this example to demonstrate how complicated this retention conversation can be.

In the end, making accurate sense of any particular within-year retention number requires a black belt in withholding judgment, a hefty dose of context, and a battle-tested nervous system. In many ways, retention data is sort of like the “check engine” light in your car. When it lights up it might mean that the only thing that doesn’t work is the fuse that controls the “check engine” light. Or it might mean that something serious is going wrong under the hood and you could be in big trouble if you don’t take your car to a mechanic today. Either way, you don’t panic just because the light comes on. At the same time, you don’t shrug it off. You take a deeper look at what you are doing and try to figure out if there is anything you could do better.

Make it a good day.



Don’t look now, but the wheels of improvement are already in motion

430 responses.  Wow.

About 75% of Augustana’s full-time employees responded to the Augustana College Employee Survey over the last three weeks. Moreover, we got a great response from each segment of Augustana employees – faculty, staff, and administrators. I have to admit, after doing almost everything I could to encourage responses short of marching around campus in a sandwich board and chicken costume, I am thrilled. I would have been genuinely happy with 350 responses.

So … Congratulations! This means that the average response to each item is almost certain to closely reflect the perception of the entire employee population. As a result, we can be confident that whatever issues emerge from this data are not mere artifacts of the numbers we happen to collect. In addition, the quality of this data set will allow us to pursue all sorts of interesting analyses of various smaller segments of our employee population, further improving the potential for this study to help us legitimately improve the environment in which we all work.
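To see why a 75% response rate earns that kind of confidence, consider the margin of error. With 430 responses out of roughly 575 full-time employees (the exact headcount here is my assumption for illustration), the worst-case 95% margin of error on a yes/no item, once the finite-population correction is applied, is only a couple of percentage points:

```python
from math import sqrt

def margin_of_error(n, N, p=0.5, z=1.96):
    """95% margin of error for a proportion, with finite-population correction."""
    return z * sqrt(p * (1 - p) / n) * sqrt((N - n) / (N - 1))

# 430 respondents out of an assumed ~575 full-time employees
moe = margin_of_error(430, 575)  # roughly +/- 2.4 percentage points
```

In other words, even on a maximally divisive 50/50 question, the survey average should sit within a few points of the true all-employee figure.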

Of course, this also means that if your earnestly held belief about a prevailing attitude among Augustana employees is contradicted by the findings of this study, you are going to be faced with a gnarly dilemma. Either you’ll have to accept the strong likelihood that your opinion is mistaken, or you’ll have to present compelling evidence that refutes these findings. I suppose you could choose to double down on your belief, facts be damned, full speed ahead. But the reality of a 75% response rate means that, like it or not, the findings from this survey are pretty solid. And just so you don’t think that I’m trying to be some sort of righteous researcher reveling in my own rectitude (that line sounds great if you roll your R’s), I’ve already had to eat crow on one issue where the data makes it pretty clear that I was dead wrong. (Yes, it tastes about like what you’d think.)

Today (Monday, April 20th, 2015) we start collecting data for the second half of our employee engagement project. Later today I’ll send out an email with an invitation to participate in the Gallup Employee Engagement Survey (otherwise known as the Q12 if you want to sound hip and “in the know” around other geeked-out quant researchers). While our first survey was designed internally so that we could home in on some important questions specific to Augustana College, the second survey, the Q12, gives us some comparison data that can function as a sort of grounding point to more realistically assess ourselves. Moreover, we will be able to get data from Gallup that we can use to compare ourselves to other educational organizations, giving us an even better sense of how we might realistically improve. The Gallup Q12 Survey is built on several decades of in-depth research on employee engagement. Some of the questions might strike you as unusual at first, but know that a virtual ocean of analysis has gone into developing the questions that compose this survey.

And in order to ensure that we don’t unintentionally bias the responses to the Q12, we won’t publicly release the results of our own Augustana Employee Survey until the Gallup data has been collected.  Although the questions in the two surveys are not identical, there is enough overlap that we need to be careful. Besides, this will give me a few weeks to process all of the data and turn it into something that will be a lot easier to read. As much as my inner quant geek would love it, I suspect that you don’t want me to send you a massive Excel spreadsheet and call it good!

You will receive an email soon with a link to participate in the Q12. You’ll hear about this survey from me more than a few times in the next three weeks. Just like the first half of this project, your participation in the Q12 matters immensely.

The 430 responses to our first survey make a giant statement about how much we value making Augustana a great place to work and a great community to join. They also mean that this community has made a collective commitment to improve – even if you did not individually complete the first survey. Whether you like it or not, the improvement train has left the station and we’re all on it.

Make it a good day,


The race to get old started yesterday. Hurry up!

A little over a week ago the Wall Street Journal published a short piece entitled “Today’s Anxious Freshmen Declare Majors Far Faster Than Their Elders: Weak job market and high debt loads prompt broad shift away from intellectual exploration.” They cited data from their own small but random survey of colleges and universities suggesting that more and more freshmen declare their majors earlier. While the article and those interviewed for it speculated about a variety of factors that might be driving this phenomenon, the conclusion seemed pretty clear: college is now much less about discovering yourself first and finding a career later and much more about locking into a track for a career.

I thought it would be interesting to see if our own data reflected a similar trend. We were able to examine data over a similar time period, exploring the differences between students who entered Augustana as freshmen in the fall of 2007 and students who entered Augustana as freshmen in the fall of 2013. In addition, I thought it would be interesting to expand on the Wall Street Journal analysis since they aren’t clear about when the institutional data they presented was collected (in the fall of the first year? in the spring of the first year? at the beginning of the second year?). So we compared the two freshmen cohorts noted above in three ways. First, what proportion of the class indicated that they were undecided on their major when they applied to Augustana? Second, what proportion of those undecided students had declared a major by the beginning of their second year? And third, what proportion of the entire freshman class had declared a major by the beginning of the second year?

Our Augustana results seem to parallel the findings reported by the Wall Street Journal. During the application process, 16% (111 of 713) of the 2007 first-year cohort indicated that they were undecided about their major. During the 2013 cohort’s application process, only 11% (70 of 627) selected “undecided” when asked about their intended major. Interestingly, the proportion of these initially undecided students who remained undeclared at the beginning of their second year did not change appreciably between the fall of 2008 and the fall of 2014. Of the undecided majors from the 2007 cohort, 68% (63 of 92 – the remaining 19 did not persist to the second year) had still not selected a major one year later. From the 2013 cohort, 69% (40 of 58 – the remaining 12 did not return to Augustana) of the initially undecided remained undeclared.
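For anyone who wants to retrace the arithmetic, the percentages in this paragraph can be reproduced straight from the counts. A minimal sketch:

```python
# (undecided at application, class size, still undecided a year later,
#  undecided students who persisted to the second year)
cohorts = {
    2007: (111, 713, 63, 92),
    2013: (70, 627, 40, 58),
}

# percent undecided at entry, percent of persisting undecideds still undeclared
summary = {
    year: (round(100 * und / size), round(100 * still / pers))
    for year, (und, size, still, pers) in cohorts.items()
}
```

`summary` works out to 16%/68% for 2007 and 11%/69% for 2013 – the same figures quoted above.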

The biggest difference between the two cohorts can be found in the proportion of students who had declared a major by the beginning of the second year. Remember, the position taken by the Wall Street Journal article was that students take less time for intellectual pursuits and narrow their focus on a major earlier than in previous years. At Augustana, it appears that we are seeing a similar phenomenon.  While 54% of the 2007 first-year cohort had not yet declared a major by the beginning of the second year, only 36% of the 2013 cohort were still undeclared at that point.

So . . . is this a bad thing?

Honestly, I’m not sure.  In the end, I don’t know that we will have much success telling students that they are wrong to respond to external pressures of a tight job market and high student debt by choosing their major earlier. That kind of approach is likely to come across as tone-deaf to some very real concerns. It seems to me that this data re-emphasizes the importance of timely and substantive conversations between students and all of us who impact their education (faculty, administrators, work supervisors, residence life staff, student life staff, and fellow students) that push students to develop themselves even as they are preparing for life after college. Personal and intellectual development and career preparation ought to be a “both/and” enterprise.

If we can do that, our students are likely to grow and change in just the ways that we hoped they would.

Make it a good day,


What if our students could point to their most important learning moments?

If we could make a college education work perfectly, our students would do more than learn. In addition, they would be able to point to the actual moments during their college career when an interaction, an experience, or a discovery altered their trajectory regarding their plans for life after college. Although this might sound a little dreamy and aspirational, it turns out that students who can talk about their learning experiences in this way tend to have a sort of educational momentum that seems to set them apart from their peers. These are the students who do the little things to put themselves in the early running for advantageous opportunities that ultimately lead to a deeper sense of purpose and direction as well as stronger job and graduate school applications. These students make folks like me wish we’d had some of what they have when we were their age.

That’s why it makes a lot of sense to find out what proportion of our freshmen have this kind of perspective after their first year at Augustana. Ideally, we’d like to figure out whether there are things we can do to cultivate that deeper level of awareness in more of our students. So at the end of the year we ask our freshmen to agree or disagree with the following statement: “Reflecting on the past year, I can think of specific experiences or conversations that helped me clarify my life/career goals (e.g. conversations with faculty/staff, organized activities with other students, community involvement, specific classes, etc.).”

Here’s how last year’s freshmen responded (remember that not all freshmen completed this survey):

  • Strongly disagree – 9 (4%)
  • Disagree – 15 (6%)
  • Neutral – 64 (28%)
  • Agree – 100 (43%)
  • Strongly agree – 44 (19%)
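A quick tally of those counts shows where the percentages come from (they are rounded, with 232 total respondents):

```python
responses = {
    "Strongly disagree": 9,
    "Disagree": 15,
    "Neutral": 64,
    "Agree": 100,
    "Strongly agree": 44,
}

total = sum(responses.values())  # 232 respondents
percentages = {k: round(100 * v / total) for k, v in responses.items()}

# respondents who did not land in an affirmative category
non_affirmative = sum(responses[k] for k in ("Strongly disagree", "Disagree", "Neutral"))
```

The non-affirmative group (strongly disagree, disagree, and neutral combined) totals 88 students.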

My reaction to this bit of data is a little mixed.  On the one hand, most of the students either agreed or strongly agreed with this statement.  On the other hand, 88 of our respondents couldn’t put themselves in an affirmative category.

To be fair, it would be a little naive to think that we could hand out inspirational moments like some kind of kitschy swag. At the same time, it would be awfully useful to know whether there are things we could do to increase the likelihood that a given freshman would say that they could point to a specific experience in their first year that helped them clarify their life or career goals.

After testing a host of possibilities, we found five items that significantly increased the likelihood of this perspective among our freshmen. Interestingly, in addition to coming from all facets of residential college life, these items point to a certain type of experience that provides some guidance for our work.  Here are those five items:

  • How frequently did your faculty ask you to try to understand someone else’s views by imagining how an issue looks from his or her perspective?
  • My instructors recommended specific out-of-class experiences that would enhance my learning and growth.
  • My adviser asked me about my career goals and post-graduate aspirations.
  • My out-of-class experiences helped involve me in community service off-campus.
  • About how often have you had serious conversations with students who are very different from you?

Again, as we’ve found in other analyses of our student data, the ideal college experience depends upon the work that each of us does, whether it happens inside or outside of a classroom. But today I want to highlight the role of faculty reflected in these items. Instructors who often ask students to practice perspective-taking in order to better understand someone else’s views, instructors who take the time to recommend specific out-of-class learning experiences, and advisers (in other words, faculty who are first-year advisers) who ask students about their career goals and post-graduate aspirations all appear to contribute significantly to the quality of our students’ educational experience. Students who experience these kinds of faculty interactions seem to be more likely to be able to point to specific moments in their first year that helped them home in on their post-graduate goals.

The other thing I like about this list of faculty interactions is that, no matter the course or the discipline, at least one of these items seems possible. If your course doesn’t lend itself to perspective-taking exercises, you could point students toward particularly valuable educational experiences on campus or in the community. If your class is composed of students who are already highly involved, you could engage them in perspective-taking skill development. And when students engage you outside of class, you could take a moment to ask them about their life goals beyond college.  I hope you will consider finding a way to plug one of these items into your regular interactions with students.  Good luck with your spring term!

Make it a good day,


We’ve Still Got a Long Way to Go

Most of the time, I try to write a post that includes both a deep dive into some morsel of data and a few implications that I think might be embedded in that data.  But this week, I think I’m going to try to dispense with a longer examination of implications and just lay out a set of responses to a single question from our recent survey of prospective students that we conducted in collaboration with the Hanover Research Group.

An early question in the survey asked the respondents to select the top five words that best described the college they would most like to attend.  You might recall that last week I pointed out that “affordable” was the most frequently selected word (not a big surprise, right?) and that “liberal arts” was pretty far down the list.

Although it’s certainly interesting to see the ordering of selected words from highest to lowest, it’s also potentially enlightening to look at how different subgroups of respondents respond to similar words. Parsing the responses of white and non-white respondents exposes a stark difference worth noting.

In digging deeper into the responses to this same question, the disparity between white and non-white respondents in selecting the word “diverse” really jumped out at me. White respondents selected this word 15% of the time. Non-white respondents selected this word 46% of the time. Given the substantial demographic shifts that are already underway across our primary recruiting region, this difference seems particularly important.  In addition to the moral imperative for us to continue to diversify our student body, it appears that ignoring such an imperative could increase our future economic risk as well.

While this finding is interesting, asking respondents to choose their top five words from a long list of possible options can complicate the interpretation of the results. So I want to show you another set of responses, parsed by white and non-white respondents, to a very specific question that asks respondents to indicate how important a diverse student body is to them when selecting a college.

Response Option                 White    Non-White

Not at all important              11%        2%
Slightly important                19%        3%
Moderately important              40%       27%
Very important                    25%       38%
Essential                          5%       31%
Very important + Essential        30%       69%
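
For anyone who wants to check the bottom row, the combined figure is just the sum of the top two response categories. Here is a minimal sketch in Python using the percentages from the table above (the variable and category names are mine, not the survey instrument's):

```python
# Response distributions from the table above (percent of respondents).
importance = {
    "White":     {"Not at all important": 11, "Slightly important": 19,
                  "Moderately important": 40, "Very important": 25, "Essential": 5},
    "Non-White": {"Not at all important": 2, "Slightly important": 3,
                  "Moderately important": 27, "Very important": 38, "Essential": 31},
}

def top_two_share(dist):
    # "Very important + Essential" is simply the sum of the top two categories.
    return dist["Very important"] + dist["Essential"]

for group, dist in importance.items():
    print(group, top_two_share(dist))  # White 30, Non-White 69
```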

As you can probably tell, non-white respondents trend toward thinking that a diverse student body matters a lot.  By contrast, it appears that white respondents trend toward thinking that a diverse student body matters some, but not nearly as much.

Yes, there is probably more than one reason for this difference in responses. And it’s not as if the two sets of responses are in complete opposition to each other. But I hope this data will further underscore the reasons why we need to be active champions for equality. We’ve still got a long way to go.

Make it a good day,


Who are we talking to when we use the term “liberal arts”?

If a complete stranger had stumbled onto campus the weekend before last, they might have thought that Augustana was the busiest college on the planet. That Saturday (January 17th), the Admissions Office hosted one of our largest annual open-house events for prospective students and families. While this event always draws large numbers, this year the number of visitors to campus (prospective students and their parents combined) may well have exceeded the actual number of Augustana students living on campus.

With the college recruiting season hurtling into the most critical few months of the year, every little bit of information that we can learn about prospective students, their parents, and their decision-making process matters. To that end, we’ve been gathering data on the things that are most important to our prospective students and their parents as they evaluate, and ultimately select, a college. One way that researchers try to get at this kind of information is to ask folks to pick five words or phrases from a longer list that they think best describe an ideal college experience. As you might expect from this year’s prospective students, “affordable” topped the list with 57% of the respondents choosing it. Other words near the top of the list included “friendly” (41%), “safe” (39%), “respected” (38%), and “career-oriented” (33%).

Much further down the list, 15th to be exact, sits the phrase “liberal arts” (just 12% of respondents thought this was a top-five word for them). Since rank ordering the words selected ends up clustering “liberal arts” with a seemingly contradictory group of terms (e.g., “small,” “large,” “rigorous,” and “flexible”), it’s clear that we probably shouldn’t go all Chicken Little just yet. Look on the bright side: only 6% of the respondents selected “party school.”

The question this finding raises for me, however, isn’t really about the exact ranking of the term “liberal arts.” My concern is that there seems to be a substantive gap between the degree to which we (faculty, staff, administrators, board members) use the phrase “liberal arts” to describe who we are and the level of importance that prospective students responding to this survey gave it. To make matters worse, this data doesn’t come from some general survey of potential college-going students; these responses came from students in our own inquiry pool (i.e., students who have either contacted us directly or students who fit a profile of those who might be interested in us).

Now please don’t conclude that I’m suggesting the elimination of the term or the philosophy behind it. On the contrary, I happen to think that if we are going to remain a viable college then we will have to explicitly embody a liberal arts philosophy that focuses on integrating and synthesizing preexisting knowledge. Almost exactly a year ago, I went on a three-post rant about it here, here, and here.

Rather, I suspect that the term “liberal arts” means very little of substance to prospective students. Maybe, like many other words that get used over and over again in marketing materials, the phrase means one thing to an internal audience and something else to an external audience. When we use the term, even though we might not all agree exactly, I think we could describe the dispositions of a liberally educated individual with relative precision. This finding increases my worry that when an external audience, most notably prospective students, sees the term, they have a much less precise sense of its meaning. In that context, “liberal arts” might mean little more than “small” or “rigorous.” It could also end up being interpreted to mean “lots of classes in fields I’m not interested in” or, even worse, “a club that maybe we’ll let you into.”

I certainly don’t have a brilliant answer to this challenge. But I think it is worth noting that just because we have a term that we believe describes us well doesn’t mean that this term will compel others who are new to the concept of college to buy what we are selling. There’s nothing wrong with believing in what we do; even drinking our own Kool-Aid. We just better be able to spell out what we do and why it works in a way that makes sense to regular folks who seem to care a lot more about affordability.

Make it a good day,


Setting a high bar for equality in graduation

US News rankings have never been my favorite part of higher education. For many years these rankings did little more than con colleges and universities into an illusory arms race under the guise of increasing educational quality. But recently US News has started to use their data, power, and influence to prod more useful conversations that might lead to improvements at higher education institutions. Last week, US News released their rankings for “Which top-ranked colleges operate most efficiently.” Like last year, Augustana appeared near the top of the list among liberal arts colleges, suggesting that we apply our limited resources effectively to educate our students. Whether conversations about “efficiency” give you a warm fuzzy or a cold shudder, I don’t think it’s particularly controversial to say that such recognition is, at the very least, more good than bad.

But in keeping with their deified status in higher education, the US News rankings giveth and the US News rankings taketh away. A few weeks ago, they released another set of rankings that I found particularly intriguing given our recent campus discussions about equality and social justice. This set of rankings focused on the graduation rates of low-income students, and contrasted the proportion of low-income students who ultimately graduate from each institution with each institution’s overall graduation rate. Based on these two numbers, US News identified colleges and universities that they called “top performers,” “over performers,” and “under performers.” Sadly, Augustana appeared in the under performer group with a 13 percentage point deficit between our overall six-year graduation rate (78%) and our six-year graduation rate for low-income students (65%). Just in case you’re wondering, these graduation rates come from students who entered college in the fall of 2007.

Because of the focused nature of this particular analysis, US News combined all institutions from their two national ranking categories (national universities and national liberal arts colleges) to create these three groups. The presence of several familiar institutions in each group suggests that we might learn something about graduating low-income students from similar institutions, which could in turn help us narrow our own disparity in graduation rates.

The criteria for the “top performer” category required that the institution’s overall graduation rate was above 80% and that the graduation rate of low-income students was the same (or within a percentage point). While there were numerous national liberal arts colleges on the list, they were generally highly ranked institutions with well known pedigrees. However, two familiar institutions appeared in this category that seemed worth highlighting.

  • St. Olaf College – overall grad rate: 88%, low-income grad rate: 87%
  • Gustavus Adolphus College – overall and low-income grad rate: 82%

The criterion for the “over performer” category was simply that low-income students graduated at a higher rate than the overall student population. There were several institutions in this group that are not too different from us, particularly based on their US News overall ranking (remember, Augustana was ranked #105 this year). These institutions include:

  • Drew University (#99) – overall grad rate: 69%, low-income grad rate: 76%
  • College of the Atlantic (#99) – overall grad rate: 69%, low-income grad rate: 75%
  • Knox College (#81) – overall grad rate: 79%, low-income grad rate: 83%
  • Lewis & Clark College (#77) – overall grad rate: 74%, low-income grad rate: 79%
  • Beloit College (#61) – overall grad rate: 78%, low-income grad rate: 83%

Interestingly, there were also some institutions in the over performer group that probably wouldn’t dare to dream of a ranking approaching the top 100. In other words, they would probably trade their place for ours in a heartbeat. A few to note include:

  • Oglethorpe University (#148) – overall grad rate: 62%, low-income grad rate: 67%
  • Illinois College (#155) – overall grad rate: 64%, low-income grad rate: 68%
  • Warren Wilson College (#165) – overall grad rate: 51%, low-income grad rate: 60%
  • Ouachita Baptist University (#176) – overall grad rate: 60%, low-income grad rate: 80%
  • Wisconsin Lutheran College (#178) – overall grad rate: 64%, low-income grad rate: 75%

Finally, the under performer group noted institutions where low-income students graduated at rates lower than the overall graduation rate. Some similar/familiar liberal arts colleges in this group included:

  • Augustana College (#105) – overall grad rate: 78%, low-income grad rate: 65%
  • Washington College (#105) – overall grad rate: 68%, low-income grad rate: 49%
  • Hampden-Sydney College (#105) – overall grad rate 62%, low-income grad rate: 43%
  • St. Mary’s College of Maryland (#89) – overall grad rate: 73%, low-income grad rate: 64%
  • Wittenberg University (#139) – overall grad rate: 63%, low-income grad rate: 49%
  • Alma College (#139) – overall grad rate: 61%, low-income grad rate: 44%
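
Since each group is determined by just two numbers per school, the classification can be reproduced mechanically. Here is a rough sketch of the rules as I understand them from the descriptions above (US News hasn’t published exact code; the one-percentage-point tolerance for “top performers” is taken from the criteria stated earlier):

```python
def classify(overall, low_income):
    """Classify an institution the way the US News analysis appears to:
    'top performer'   -> overall rate above 80% and low-income rate equal
                         (or within one percentage point),
    'over performer'  -> low-income students graduate at a higher rate,
    'under performer' -> low-income students graduate at a lower rate."""
    if overall > 80 and abs(overall - low_income) <= 1:
        return "top performer"
    if low_income > overall:
        return "over performer"
    if low_income < overall:
        return "under performer"
    return "at parity"

# A few schools from the lists above (overall %, low-income %):
print(classify(88, 87))  # St. Olaf -> top performer
print(classify(79, 83))  # Knox -> over performer
print(classify(78, 65))  # Augustana -> under performer
```

Note that the checks run in order, so a school like Gustavus Adolphus (82% and 82%) lands in “top performer” before the over/under comparison is ever reached.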

Although we ought to be careful not to jump to rash conclusions from this data alone, there are a couple of suppositions that this data seems to contradict. First, although the national graduation rates for low-income students consistently lag behind overall graduation rates, this is not necessarily so at every institution. Some institutions graduate low-income students at substantially higher rates than the rest of their students. Second, it does not appear that institutional wealth, prestige, or academic profile guarantees graduation equity. There are institutions at both ends of the ranking spectrum that manage to graduate low-income students at a higher rate than the rest of their students. Third, geographical location doesn’t necessarily ensure success or failure. Successful institutions are located in both urban and rural locations.

I don’t know what makes each of these successful institutions achieve graduation equality. But in looking at our own disparity in graduation rates, it seems to me that we might learn something from these institutions that have found ways to graduate low-income students at rates similar to the rest of their students. We have set our own bar pretty high (our overall graduation rate of 78% is comparable to or higher than that of every institution I listed from the US News over performer category). Now it’s up to us to make sure that every student we enroll can clear that height. We shouldn’t be satisfied with anything less.

Make it a good day,


Some Key Findings from our Recent Alumni Survey

Every once in a while you get lucky enough to have multiple studies that all point pretty clearly to the same conclusions.  So in the spirit of Christmas, I give you a gift of confirmatory evidence that all of what we do at Augustana – in the classroom and outside of it – matters for student learning.  Special thanks should go to my student assistant, Melanie, who did all of the data analysis and even wrote the first draft of this post.  Thanks, Melanie!

The Recent Alumni Survey asks a cohort of graduates about their experiences in the nine months since they walked across the stage to receive their diploma. Three items in this survey are designed to get at some of the intended outcomes of an Augustana education.  Those items ask:

  • To what extent do you feel your Augustana experience prepared you to succeed in your current program?
  • To what extent do you feel your Augustana experience prepared you to succeed in your current position/job?
  • To what degree does your current professional/educational status align with your long term career goals?

The first two questions address our graduates’ perception of the quality of their preparation for their next step in adult life, be it graduate school or their first foray into the world of work. Because we care about the full arc of our graduates’ adult lives, the third question addresses the degree to which that “next step” – the one for which our mission demands that we play an important role in preparation and selection – aligns with their long term goals.

To help us improve the quality of an Augustana education, we want to determine the nature of the relationship between college experiences that we already believe to be important (gleaned from our last senior survey) and our graduates’ lives nine months after they graduated. To this end, we linked responses from our 2013 senior survey to the responses of the same individuals who completed our recent alumni survey in the winter and early spring of 2014. After identifying which senior survey items significantly predicted (in a statistical sense) these recent alumni outcomes, we expanded our analysis to account for several factors that might confound our findings: race, socio-economic status, gender, and cumulative GPA. The table below shows the experiences that emerged as statistically significant positive predictors for each outcome, organized by the nature of the environment in which those experiences exist.
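
The linking step itself is conceptually simple: an inner join of the two survey files on a shared student identifier, keeping only students who answered both. Here is a minimal sketch of that merge in Python (the IDs, field names, and values are hypothetical illustrations, not our actual survey variables):

```python
# Hypothetical rows from the two surveys, keyed by a shared student ID.
senior_2013 = {
    "s001": {"adviser_asked_career_goals": 4, "one_on_one_faculty": 5},
    "s002": {"adviser_asked_career_goals": 2, "one_on_one_faculty": 3},
    "s003": {"adviser_asked_career_goals": 5, "one_on_one_faculty": 4},
}
alumni_2014 = {
    "s001": {"prepared_for_job": 5},
    "s003": {"prepared_for_job": 4},
    # s002 did not respond to the alumni survey, so it drops out of the merge.
}

# Inner join: keep only students present in both surveys, combining
# their responses into one record per student for the regression step.
linked = {
    sid: {**senior_2013[sid], **alumni_2014[sid]}
    for sid in senior_2013.keys() & alumni_2014.keys()
}

print(len(linked))  # 2 students appear in both surveys
```

Only the linked records can enter the predictive analysis, which is one reason the usable sample is always smaller than either survey on its own.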

Outcomes: career goal alignment (“To what degree does your current professional/educational status align with your long term career goals?”, asked of all alumni); graduate school preparation (“To what extent do you feel your Augustana experience prepared you to succeed in your current program?”, asked of alumni in grad school); job preparation (“To what extent do you feel your Augustana experience prepared you to succeed in your current position/job?”, asked of alumni in the workforce).

Co-curricular experiences
  • Career goal alignment: “My out-of-class experiences have helped me develop a deeper understanding of myself”
  • Graduate school preparation: “My out-of-class experiences helped me develop a deeper understanding of myself”
  • Job preparation: “My out-of-class experiences helped me develop a deeper understanding of how I relate to others”

Advising
  • All three outcomes: “How often did your major adviser ask you about your career goals and aspirations?”
  • Job preparation: “My major adviser connected me with other campus resources”

Experiences in the major
  • Job preparation: “Faculty in this major cared about my development as a whole person” and “In this major, how frequently did your faculty emphasize making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions?”

Overall curricular experience
  • Graduate school and job preparation: “My one-on-one interactions with faculty have had a positive influence on my intellectual growth and interests in ideas”

Clearly there are multiple experiences across a range of settings that influence these three outcomes. Moreover, these findings are similar to the results of prior alumni data analyses and replicate findings from analyses of senior survey data.  In short, we can be confident that the experiences noted in the table above play a critical role in shaping the success of Augustana graduates.

These findings strongly emphasize the importance of quality, purposeful faculty interactions with students. The item “my one-on-one interactions with faculty have had a positive influence on my intellectual growth and interests in ideas” significantly predicted a sense of preparedness both for those entering graduate programs and for those who went into the workforce. This item focuses on more than the frequency of students’ interactions with faculty or the friendliness of those interactions. Instead, it emphasizes the nature of faculty influence: encouraging, inspiring, cajoling, pushing, prodding, and even challenging students to engage tough questions and complicated ideas while at the same time supporting them as they struggle with the implications and ramifications of their own evolving values, beliefs, and worldview.

Faculty influence was again evident in the advising relationship. The question, “How often did your major adviser ask you about your career goals and aspirations?” significantly predicted all three outcomes. In addition, for graduates in the workforce, faculty attention to connecting students with other campus resources also influenced the graduates’ sense of preparedness. Furthermore, faculty impact on our graduates’ success is apparent in the major experiences that predicted students’ sense of preparation for their careers. Two items were significantly predictive: “Faculty in this major cared about my development as a whole person,” and “In this major, how frequently did your faculty emphasize making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions?” In addition to confirming the caring aspect of quality, purposeful faculty interactions with students, this finding also highlights the value of classroom experiences that cultivate higher order thinking skills.

It is also worth noting the importance of out-of-class experiences in predicting our graduates’ success. Again, the importance of the developmental quality of these experiences is paramount. Instead of items that denote participation in particular types of organizations or activities, the items that proved predictive emphasize that the experiences that matter are ones that help students develop in two ways. First, they help students develop a deeper understanding of themselves.  Second, they help students develop a deeper understanding of how they relate to others. Obviously, these skills are critical for success in every manner of adult life.  The key for Augustana is to ensure that every out-of-class experience contributes – directly or indirectly – to this kind of growth.

The goal of this analysis was not to determine which experiences (faculty interactions or co-curricular experiences) play a larger role in shaping Augustana graduates’ outcomes. Instead, it is clear that all facets of an Augustana education contribute to our students’ success. It is also clear that not all graduates experience Augustana in a way that maximizes the potential impact of quality, purposeful faculty interaction or developmental out-of-class activities. Throughout the institution, we can use these findings as principled guidelines for improving the work that we do with our students.

Make it a good day (and a great holiday break),