How many responses did you get? Is that good?

As most of you know by now, the last half of the spring term sometimes feels like a downhill sprint. Except in this case you’re less concerned about how fast you’re going and more worried about whether you’ll get to the finish line without face-planting on the pavement.

Well, it’s no different in the IR Office.  At the moment, we have four large-scale surveys going at once (the recent graduate survey, the senior survey, the freshman survey, and the employee survey), we’ve just finished sending a year’s worth of reports to the Department of Education, and we’re preparing to send all of the necessary data to the arbiter of all things arbitrary, U.S. News College Rankings. That is in addition to all of the individual requests for data gathering and reporting and administrative work that we do every week.

So in the midst of all of this stuff, I wanted to thank everyone who responded to our employee survey as well as everyone who has encouraged others to participate. After last week’s post, a few of you asked how many responses we’ve received so far and how many we need. Those are good questions, but as is my tendency (some might say “my compulsion”), the answer is more complicated than you’d probably prefer.

In essence, we need as many as we can get from as many different types of employees as we can get. But in terms of an actual number, defining “how many responses is enough” can get pretty wonky with formulas and unfamiliar symbols. So I shoot for 60% of an overall population. That means, since Augustana has roughly 500 full-time employees, we would cross that threshold with 300 employee survey responses.

However, that magic 60% applies to any situation where we are looking at the degree to which a set of responses to a particular item can be confidently applied to the overall population. What if we want to look at responses from a certain subgroup of employees (e.g., female faculty)? In that case, we need responses from 60% of the female faculty, something that isn’t guaranteed just because we have 300 out of 500 total responses.
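
If it helps to see that arithmetic spelled out, here’s a minimal sketch in Python. The 500 employees and the 60% threshold come from the paragraphs above; the subgroup numbers are made-up placeholders, not actual counts.

```python
import math

THRESHOLD = 0.60  # my rule-of-thumb response rate

def responses_needed(population: int) -> int:
    """Smallest number of responses that reaches the 60% threshold."""
    return math.ceil(population * THRESHOLD)

# Campus-wide: roughly 500 full-time employees.
print(responses_needed(500))  # 300

# The same check applies to any subgroup we might want to analyze later.
# Hypothetical subgroup: 45 female faculty, 21 of whom responded.
print(21 >= responses_needed(45))  # False -- 21 < 27, so no confident subgroup analysis
```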

This is why I am constantly hounding everyone about our surveys in order to get as many responses as possible. Because we don’t know all of the subgroups that we might want to analyze when we start collecting data; those possibilities arise during the analysis. And once we find out that we don’t have enough responses to dig into something that looks particularly important, we are flat out of luck.

So this week, I’m asking you to do me a favor.  Ask one person who you don’t necessarily talk to every day if they’ve taken the survey. If they haven’t, encourage them to do it. It might end up making a big difference.

Make it a good day,

Mark

A casual and incomplete FAQ for our current employee survey

Even though this is my fifth year at Augustana, the concept of Muesday still throws me for a loop. Maybe this is because I don’t have to think about it much, counting beans in my little office all day every day like I do. By contrast, most faculty I know talk about it as if it’s the most normal concept in the world, no matter if they’ve taught at Augustana for a couple of years or a couple of decades. And even though I think I’ve developed a failsafe cover to hide my ignorance (toss my head back, laugh, lean in while I bat the air in front of my face, say emphatically, “of course, what was I thinking!” while rolling my eyes), it’s an annual reminder for me that the concepts each of us take for granted aren’t always so obvious to everyone else.

I’ve been reminded of this reality again as I’ve been inviting everyone to fill out the current Augustana College Employee Survey.  More than a few people have expressed concerns about anonymity and confidentiality.  A few have even floated impressive conspiracy theories of NSA-caliber data scrubbing.  So before I have to run off to my weekly administrator neural network reprogramming and empathy reduction session, I thought that I’d try to answer the anonymity and confidentiality questions in a little more detail. (Yes, I’m kidding. The administrator neural network reprogramming and empathy reduction sessions are every OTHER week and don’t meet this week because it’s MUESDAY!)

When I promise anonymity to everyone who responds to the Augustana College Employee Survey, that means that I don’t ask for your name or other information that directly identifies you. It also means that the software doesn’t collect your Augustana user ID or the IP address of the computer that you used to complete the survey. In order to do this, I turn off a setting in the Google Forms software that would normally add this information to the dataset.

Turning this feature off also means that the survey is publicly accessible – a potential downside to be sure. So it is technically possible that some of the 340+ survey responses I’ve received didn’t actually come from Augustana employees. But that would mean that somebody somewhere else has acquired the web address of the survey and has spent their days and nights filling the survey out over and over with just enough variation in answer choices to avoid suspicion.  Yeah, I doubt it.

Some folks have pointed out that there are enough demographic questions that there might be a way to identify some respondents. This is technically true: if someone had access to both the college’s employee database and the current employee survey dataset, one could probably figure out a way to be pretty sure about the identity of some of the respondents, particularly if one were to triangulate several demographic characteristics (e.g., race and age data) to pick out subgroups of employees that have only a few members. Of course, the only person on campus who has access to both of these datasets is, well, me. If you think that this is a likely explanation for how I spend my time … I guess I sort of doubt that you are even reading this post. Nonetheless, to be clear – I’m not trying to figure out what you said in your survey. And I’m not taking that information and slipping it under someone else’s door so that they can hire henchmen to come to your office and hide your keys. It’s not that I don’t care.  I’m just too busy.

All joking aside, this survey does ask some questions that can easily be perceived as risky to answer. So, if you are concerned about anonymity but want to respond to the survey, just leave blank any demographic question that cuts too close to the quick. That way you don’t have to worry about having your anonymity violated. I think we’d rather be able to stir your opinion into the mix even if it might not get included in more complex analyses.

Confidentiality is a little different from anonymity. There are numerous student surveys where we promise confidentiality but not anonymity. We often ask students for their ID number so that we can merge the data they provide with prior institutional data, letting us take a longer view of our students’ four years at Augie and look for patterns across the entirety of their college experience. Confidentiality specifically refers to how we will share any of our survey findings. When I promise confidentiality, I am promising that I won’t share the data in any way that might link your set of responses to you. Instead, all findings will be shared as averages of groups, whether that be the entire group of respondents or small subgroups of respondents.

This does again raise the question that some have asked about protecting the anonymity and confidentiality of those who are members of sparsely populated subgroups. When I promise confidentiality, I also have to consider the possibility that presenting data in all of the ways that it can be sliced and diced could lead to violating someone’s confidentiality. To allay this concern, I am ensuring confidentiality by simply not sharing any results in a way that might allow folks to reasonably infer any individual’s responses. I will not share any average responses to questions where the number of respondents in that particular subgroup is fewer than five. This makes it much less likely that anyone could determine the nature of someone’s individual responses based on the average responses from any particular subgroup of respondents. So, for example, if we have fewer than five respondents in the category of employees who have worked here between six and ten years, then we won’t share any results for any question by the number of years employees have worked at Augustana.
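
For anyone curious about the mechanics, here’s a minimal sketch of that suppression rule, assuming the responses sit in a pandas DataFrame. The column names and the data are hypothetical.

```python
import pandas as pd

MIN_CELL = 5  # withhold any breakdown that contains a subgroup smaller than this

# Hypothetical data: one row per respondent, with a years-of-service band
# and a 1-5 rating on some survey item.
responses = pd.DataFrame({
    "years_band": ["0-5"] * 5 + ["6-10"] * 2 + ["11+"] * 3,
    "item_score": [4, 5, 3, 4, 2, 3, 4, 5, 4, 2],
})

summary = responses.groupby("years_band")["item_score"].agg(n="count", mean="mean")

# If any years-of-service cell has fewer than five respondents,
# don't share the years-of-service breakdown for this item at all.
if (summary["n"] < MIN_CELL).any():
    print("Breakdown withheld: a subgroup has fewer than five respondents.")
else:
    print(summary)
```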

Just like the anonymity question above, if you are worried that your confidentiality will be violated, don’t provide answers to those specific questions.

Even though we have already received many responses to this survey, we still need many more. The more responses we have, the more likely it is that we can dig into subgroups and analyze the data without violating anonymity or confidentiality.

Getting better as an organization is hard work. At its core, it requires that we all put something into it.  Completing this survey is a big first step. I hope you’ll all give it a shot.

Make it a good day,

Mark


The race to get old started yesterday. Hurry up!

A little over a week ago the Wall Street Journal published a short piece entitled “Today’s Anxious Freshmen Declare Majors Far Faster Than Their Elders: Weak job market and high debt loads prompt broad shift away from intellectual exploration.” They cited data from their own small but random survey of colleges and universities suggesting that more and more freshmen declare their majors earlier. While the article and those interviewed for it speculated about a variety of factors that might be driving this phenomenon, the conclusion seemed pretty clear: college is now much less about discovering yourself first and finding a career later and much more about locking into a track for a career.

I thought it would be interesting to see if our own data reflected a similar trend. We were able to examine data over a similar time period, exploring the differences between students who entered Augustana as freshmen in the fall of 2007 and students who entered Augustana as freshmen in the fall of 2013. In addition, I thought it would be interesting to expand on the Wall Street Journal analysis since they aren’t clear about when the institutional data they presented was collected (in the fall of the first year? in the spring of the first year? at the beginning of the second year?). So we compared the two freshmen cohorts noted above in three ways. First, what proportion of the class indicated that they were undecided on their major when they applied to Augustana? Second, what proportion of those undecided students had declared a major by the beginning of their second year? And third, what proportion of the entire freshman class had declared a major by the beginning of the second year?

Our Augustana results seem to parallel the findings reported by the Wall Street Journal. During the application process, 16% (111 of 713) of the 2007 first-year cohort indicated that they were undecided about their major. During the 2013 cohort’s application process, only 11% (70 of 627) selected “undecided” when asked about their intended major. Interestingly, the proportion of these initially undecided students who remained undeclared at the beginning of their second year did not change appreciably between the fall of 2008 and the fall of 2014. Of the undecided majors from the 2007 cohort, 68% (63 of 92 – the remaining 19 did not persist to the second year) had still not selected a major one year later.  From the 2013 cohort, 69% (40 of 58 – the remaining 12 did not return to Augustana) of the initially undecided remained undeclared.
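
If you want to check my arithmetic, here’s the whole calculation in Python; every count comes straight from the paragraph above.

```python
# All counts are taken from the cohort figures above.
cohorts = {
    "2007": {"applied": 713, "undecided": 111, "left": 19, "still_undecided": 63},
    "2013": {"applied": 627, "undecided": 70, "left": 12, "still_undecided": 40},
}

for year, c in cohorts.items():
    returned = c["undecided"] - c["left"]  # initially undecided students who persisted
    print(f'{year}: {c["undecided"] / c["applied"]:.0%} undecided at application; '
          f'{c["still_undecided"] / returned:.0%} of those who returned were still undeclared')

# 2007: 16% undecided at application; 68% of those who returned were still undeclared
# 2013: 11% undecided at application; 69% of those who returned were still undeclared
```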

The biggest difference between the two cohorts can be found in the proportion of students who had declared a major by the beginning of the second year. Remember, the position taken by the Wall Street Journal article was that students take less time for intellectual pursuits and narrow their focus on a major earlier than in previous years. At Augustana, it appears that we are seeing a similar phenomenon.  While 54% of the 2007 first-year cohort had not yet declared their major by the beginning of the second year, only 36% of the 2013 cohort were still undeclared at the same point.

So . . . is this a bad thing?

Honestly, I’m not sure.  In the end, I don’t know that we will have much success telling students that they are wrong to respond to external pressures of a tight job market and high student debt by choosing their major earlier. That kind of approach is likely to come across as tone-deaf to some very real concerns. It seems to me that this data re-emphasizes the importance of timely and substantive conversations between students and all of us who impact their education (faculty, administrators, work supervisors, residence life staff, student life staff, and fellow students) that push students to develop themselves even as they are preparing for life after college. Personal and intellectual development and career preparation ought to be a “both/and” enterprise.

If we can do that, our students are likely to grow and change in just the ways that we hoped they would.

Make it a good day,

Mark

Participation: A Prerequisite for Improvement

Usually I post to my blog on Monday mornings, but I hope you’ll indulge this early post and keep it in mind as you begin your week.

Sexual assault is a problem on virtually every college campus. Yet it is only very recently that colleges and universities, no doubt pushed by public outcry and increasingly stern federal action, have begun to face the need to more fully understand and address this issue.

Within the last year, Augustana substantially revised a host of policies regarding sexual assault. But other than those cases that are reported, we don’t know nearly enough about our students’ perception of, and experiences with, sexual assault on campus.

For the last two weeks we’ve been participating in a survey of campus climate and sexual assault conducted by the Higher Education Data Sharing Consortium. This upcoming Friday, March 27th, the data collection phase of our participation in this survey will end. Although we have repeatedly invited responses from all students, as of last Thursday we had only received responses from 570 individuals. While that means we’ve heard from almost 25% of our student body (enough, in statistical terms, to make some inferences based on the results), we need to hear from as many students as possible. This is in large part because the most useful information is likely to come from those who are most reticent to share their experiences, making the total number of responses all the more important.

So I am asking – no matter if you interact with students as their instructor, their mentor, their work supervisor, or even their friend – that you encourage your students to complete this survey. Please remind them that participation in this survey is a prerequisite for improvement. In other words, we can’t improve what we do as a college if we don’t know what our students experience.

I know you have plenty of things on your mind as you prepare for this week. But your comments to students, even brief ones, will demonstrate the degree to which Augustana is serious about facing this issue and eradicating sexual assault from our campus. I know that eliminating sexual assault might seem like a rather high bar; I just don’t know how we could aim for anything less.

So please mention this survey to your students. They all received an email on Sunday evening inviting those who had not yet participated to respond one last time. Their unique link to the survey was in that email. They can complete it any time this week, but after Friday the survey will no longer accept new data.

Thank you for your help.

Make it a good day,

Mark


The Problem with Aiming for a Culture of Assessment

In recent years I’ve heard a lot of higher ed talking heads imploring colleges and universities to adopt a “culture of assessment.” As far as I can tell (at least from a couple of quick Google searches), the phrase has been around for almost two decades and varies considerably in what it actually means. Some folks seem to think it describes a place where everyone uses evidence (some folks use the more slippery term “facts”) to make decisions, while others seem to think that a culture of assessment describes a place where everyone measures everything all the time.

There is a pretty entertaining children’s book called Magnus Maximus, A Marvelous Measurer that tells the story of a guy who gets so caught up in measuring everything that he ultimately misses the most important stuff in life. In the end he learns “that the best things in life are not meant to be measured, but treasured.” While there are some pretty compelling reasons to think twice about the book’s supposed life lesson (although I dare anyone to float even the most concise post-modern pushback to a five-year-old at bedtime and see how that goes), the book delightfully illustrates the absurdity of spending one’s whole life focused on measuring if the sole purpose of that endeavor is merely measuring.

In the world of assessment in higher education, I fear that we have made the very mistake that we often tell others they shouldn’t make by confusing the ultimate goal of improvement with the act of measuring. The goal – or “intended outcome” if you want to use the eternally awkward assessment parlance – is that we actually get better at educating every one of our students so that they are more likely to thrive in whatever they choose to do after college. Even in the language of those who argue that assessment is primarily needed to validate that higher education institutions are worth the money (be it public or private money), there is always a final suggestion that institutions will use whatever data they gather to get better somehow. Of course, the “getting better” part seems to always be mysteriously left to someone else. Measuring, in any of its forms, is almost useless if that is where most or all of the time and money is invested. If you don’t believe me, just head on down to your local Institutional Research Office and ask to see all of the dusty three-ring binders of survey reports and data books from the last two decades. If they aren’t stacked on a high shelf, they’re probably in a remote storage room somewhere.

Measuring is only one ingredient of the recipe that gets us to improvement. In fact, given the myriad moving parts that educators routinely deal with (only some of which educators and institutions can actually control), I’m not sure that robust measuring is even the most important ingredient. An institution no more achieves improvement just because it measures things than a chef bakes a cake by throwing a bag of flour in an oven (yes, I know there are such things as flourless tortes … that is kind of my point). Without cultivating and sustaining an organizational culture that genuinely values and prioritizes improvement, measurement is just another thing that we do.

Genuinely valuing improvement means explicitly dedicating the time and space to think through any evidence of mission fulfillment (be it gains on learning outcomes, participation in experiences that should lead to learning outcomes, or the degree to which students’ experiences are thoughtfully integrated toward a realistic whole), rewarding the effort to improve regardless of success or failure, and perpetuating an environment in which everyone cares enough to continually seek out things that might be done just a little bit better.

Peter Drucker is purported to have said that “culture eats strategy for lunch.” Other strategic planning gurus talk about the differences between strategy and tactics. If we want our institutions to actually improve and continually demonstrate that, no matter how much the world changes, we can prepare our students to take adult life by the horns and thrive no matter what they choose to do, then we can’t let ourselves mistakenly think that maniacal measurement magically perpetuates a culture of anything. If anything, we are likely to just make a lot more work for quantitative geeks (like me) while excluding those who aren’t convinced that statistical analysis is the best way to get at “truth.” And we definitely will continue to tie ourselves into all sorts of knots if we pursue a culture of assessment instead of a culture of improvement.

Make it a good day,

Mark


What if our students could point to their most important learning moments?

If we could make a college education work perfectly, our students would do more than learn. In addition, they would be able to point to the actual moments during their college career when an interaction, an experience, or a discovery altered their trajectory regarding their plans for life after college. Although this might sound a little dreamy and aspirational, it turns out that students who can talk about their learning experiences in this way tend to have a sort of educational momentum that seems to set them apart from their peers. These are the students who do the little things to put themselves in the early running for advantageous opportunities that ultimately lead to a deeper sense of purpose and direction as well as stronger job and graduate school applications. These students make folks like me wish I’d had some of what they have when I was their age.

That’s why it makes a lot of sense to find out what proportion of our freshmen have this kind of perspective after their first year at Augustana. Ideally, we’d like to figure out whether there are things we could do to cultivate that deeper level of awareness in more of our students. So at the end of the year we ask our freshmen to agree or disagree with the following statement: “Reflecting on the past year, I can think of specific experiences or conversations that helped me clarify my life/career goals (e.g. conversations with faculty/staff, organized activities with other students, community involvement, specific classes, etc.).”

Here’s how last year’s freshmen responded (remember that not all freshmen completed this survey):

  • Strongly disagree – 9 (4%)
  • Disagree – 15 (6%)
  • Neutral – 64 (28%)
  • Agree – 100 (43%)
  • Strongly agree – 44 (19%)

My reaction to this bit of data is a little mixed. On the one hand, most of the students either agreed or strongly agreed with this statement. On the other hand, 88 of our respondents couldn’t seem to put themselves in an affirmative category.

To be fair, it would be a little naive to think that we could hand out inspirational moments like some kind of kitschy swag. At the same time, it would be awfully useful to know whether there are things we could do to increase the likelihood that a given freshman would say that they could point to a specific experience in their first year that helped them clarify their life or career goals.

After testing a host of possibilities, we found five items that significantly increased the likelihood of this perspective among our freshmen. Interestingly, while these items come from all facets of residential college life, they point to a particular type of experience that provides some guidance for our work. Here are those five items (a sketch of one way to test this kind of relationship follows the list):

  • How frequently did your faculty ask you to try to understand someone else’s views by imagining how an issue looks from his or her perspective?
  • My instructors recommended specific out-of-class experiences that would enhance my learning and growth.
  • My adviser asked me about my career goals and post-graduate aspirations.
  • My out-of-class experiences helped involve me in community service off-campus.
  • About how often have you had serious conversations with students who are very different from you?
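
For those wondering what “significantly increased the likelihood” means in practice: one standard way to run this kind of test is a logistic regression predicting whether a student agreed or strongly agreed with the goals statement. Here’s a minimal sketch of that approach; the file and column names are hypothetical stand-ins, not our actual variables, and this illustrates the general technique rather than our exact model.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical file and column names.
df = pd.read_csv("freshman_survey.csv")

# Outcome: 1 if the student agreed or strongly agreed (on a 1-5 scale) that
# they can point to specific goal-clarifying experiences, 0 otherwise.
y = (df["goal_clarity"] >= 4).astype(int)

items = ["perspective_taking", "recommended_experiences",
         "adviser_asked_goals", "community_service", "diverse_conversations"]
X = sm.add_constant(df[items])

result = sm.Logit(y, X).fit()
print(result.summary())  # look for items with significant positive coefficients
```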

Again, as we’ve found in other analyses of our student data, the ideal college experience depends upon the work that each of us does, no matter if it is inside or outside of a classroom. But today I want to highlight the role of faculty reflected in these items. Instructors who often ask students to practice perspective-taking in order to better understand someone else’s views, instructors who take the time to recommend specific out-of-class learning experiences, and advisers (in other words, faculty who are first-year advisers) who ask students about their career goals and post-graduate aspirations all appear to significantly contribute to the quality of our students’ educational experience. Students who experience these kinds of faculty interactions seem more likely to be able to point to specific moments in their first year that helped them home in on their post-graduate goals.

The other thing I like about this list of faculty interactions is that, no matter the course or the discipline, at least one of these items seems possible. If your course doesn’t lend itself to perspective-taking exercises, you could point students toward particularly valuable educational experiences on campus or in the community. If your class is composed of students who are already highly involved, you could engage them in perspective-taking skill development. And when students engage you outside of class, you could take a moment to ask them about their life goals beyond college.  I hope you will consider finding a way to plug one of these items into your regular interactions with students.  Good luck with your spring term!

Make it a good day,

Mark

Some Myths Just Won’t Die

I’ve recently heard more than a few folks suggest that the number of administrators at Augustana College is going up at the expense of faculty positions. This seems to be a particularly popular hypothesis, one that has been around at both the national level and on our campus for a long time. I tested this assertion with our local data several years ago, but, to be fair, it’s worth retesting hypotheses every once in a while to make sure that previous findings, and more importantly previous conclusions, still hold true.

Below I’ve laid out a table of our own Augustana data over the last ten years that includes instructional faculty numbers, non-instructional staff numbers, student enrollment, and ratios that give some sense of the relationships among them. Please note that the first column is the academic year 2014-15; the data moves back in time from left to right.

                                                                2014-15  2013-14  2012-13  2011-12  2010-11  2009-10  2008-09  2007-08  2006-07  2005-06
Tenured Professors                                                  114      102       98      104      102       94       90      102       90       94
Tenure-Track Professors                                              33       42       52       51       62       64       55       35       46       41
Total Tenure and Tenure-Track                                       147      144      150      155      164      158      145      137      136      135
Full-Time Instructors Off the Tenure Track                           50       44       36       27       20       16       35       36       28       14
% of Full-Time Instructional Workforce Off the Tenure Track        25.4     23.4     19.4     14.8     10.9      9.2     19.4     20.8     17.1      9.4
Academic Administration/Salaried Operations Administration            *      153      135      171      158      172      183      172      167      159
Hourly Employees                                                      *      170      178      158      158      171      197      190      192      190
Total Full-Time Non-Instructional Employees                           *      323      313      329      316      343      380      362      359      349
Student Enrollment (FTE)                                           2483     2514     2538     2506     2529     2455     2531     2516     2450     2371
Ratio of Non-Instructional Employees to Full-Time Instructors         *      1.7      1.7      1.8      1.7      2.0      2.1      2.1      2.2      2.3
Ratio of “Administrators” to Full-Time Instructors                    *      0.8      0.7      0.9      0.9      1.0      1.0      1.0      1.0      1.1
Ratio of “Administrators” to Tenure/Tenure-Track Faculty              *      1.1      0.9      1.1      1.0      1.1      1.3      1.3      1.2      1.2
Ratio of Students to Full-Time Instructors                         12.6     13.4     13.6     13.8     13.7     14.1     14.1     14.5     14.9     15.9
Ratio of Students to Non-Instructional Employees                      *      7.8      8.1      7.6      8.0      7.2      6.7      7.0      6.8      6.8
Ratio of Students to “Administrators”                                 *     16.4     18.8     14.7     16.0     14.3     13.8     14.6     14.7     14.9

* Not reported to IPEDS until April 2015.

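If you’re wondering how the ratio rows derive from the counts, here’s the arithmetic for 2013-14 (the most recent year with complete data) spelled out in a short Python sketch. Note that “full-time instructors” here means tenure-system faculty plus full-time instructors off the tenure track.

```python
# 2013-14 counts, read straight from the table above.
tenure_system = 144      # total tenured and tenure-track faculty
off_track = 44           # full-time instructors off the tenure track
administrators = 153     # academic/salaried operations administration
hourly = 170
students_fte = 2514

instructors = tenure_system + off_track       # 188 full-time instructors
non_instructional = administrators + hourly   # 323 non-instructional employees

print(round(non_instructional / instructors, 1))   # 1.7
print(round(administrators / instructors, 1))      # 0.8
print(round(administrators / tenure_system, 1))    # 1.1
print(round(students_fte / instructors, 1))        # 13.4
print(round(students_fte / non_instructional, 1))  # 7.8
print(round(students_fte / administrators, 1))     # 16.4
```
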
First, while the number of tenured professors has gone up and the number of tenure-track professors has gone down over the last ten years, the total number of traditional faculty (i.e., faculty within the tenure system) has gone up 9%. Moreover, the overall number of full-time instructional faculty has increased over the last ten years by 32%. (Although it’s a somewhat separate issue for a separate post, I couldn’t help but note the increase in the proportion of our full-time instructional workforce that is not a part of the tenure system.)

Second, the number of administrators and the number of hourly employees have both dropped over the last ten years, from 159 to 153 and from 190 to 170, respectively. This change strikes me as particularly interesting given the increase in student enrollment over the same period, especially for the hourly employees, who are often on the front lines of serving students’ non-academic needs.

Finally, I’ve included six lines of ratios that put these relationships between numbers of faculty, administrators, staff, and students into context over the past ten years. As you can see, there are now fewer non-instructional employees for every full-time instructor, fewer administrators for every full-time instructor, and fewer administrators for every tenured or tenure-track faculty member. Moreover, even though the total number of students has increased, the number of students per instructor has dropped while the number of students per non-instructional employee and the number of students per administrator have gone up.

So no matter how you slice it, asserting that the total number of administrators has gone up while the total number of faculty has gone down is, well, hogwash. Even in the context of the relationships between administrators and faculty, administrators and students, or faculty and students, this assertion is, well, hogwash. Nationally this assertion might hold some water, but at Augustana College . . . it just ain’t so.

Certainly, within those big-picture numbers there are lots of positions that have been moved from one office to another or faculty lines that have been moved from one department to another. You might not agree with one or more of those moves, but that sounds to me like a separate issue entirely – one worth a robust discussion no doubt, but a separate issue nonetheless.

Make it a good day,

Mark

We’ve Still Got a Long Way to Go

Most of the time, I try to write a post that includes both a deep dive into some morsel of data and a few implications that I think might be embedded in that data.  But this week, I think I’m going to try to dispense with a longer examination of implications and just lay out a set of responses to a single question from our recent survey of prospective students that we conducted in collaboration with the Hanover Research Group.

An early question in the survey asked the respondents to select the top five words that best described the college they would most like to attend.  You might recall that last week I pointed out that “affordable” was the most frequently selected word (not a big surprise, right?) and that “liberal arts” was pretty far down the list.

Although it’s certainly interesting to see the ordering of selected words from highest to lowest, it’s also potentially enlightening to look at how different subgroups of respondents respond to similar words. Parsing the responses of white and non-white respondents exposes a stark difference worth noting.

When I dug deeper into the responses to this same question, the disparity between white and non-white respondents in selecting the word “diverse” really jumped out at me. White respondents selected this word 15% of the time. Non-white respondents selected this word 46% of the time. Given the substantial demographic shifts that are already underway across our primary recruiting region, this difference seems particularly important.  In addition to the moral imperative for us to continue to diversify our student body, it appears that ignoring such an imperative could increase our future economic risk as well.

While this finding is interesting, asking respondents to choose their top five words from a long list of possible options can complicate the interpretation of the results. So I want to show you another set of responses, parsed by white and non-white respondents, to a very specific question that asks respondents to indicate how important a diverse student body is to them when selecting a college.

Response Option               White    Non-White
Not at all important            11%           2%
Slightly important              19%           3%
Moderately important            40%          27%
Very important                  25%          38%
Essential                        5%          31%
Very important + Essential      30%          69%

As you can probably tell, non-white respondents trend toward thinking that a diverse student body matters a lot.  By contrast, it appears that white respondents trend toward thinking that a diverse student body matters some, but not nearly as much.

Yes, there is probably more than one reason for this difference in responses. And it’s not as if the two sets of responses are in complete opposition to each other. But I hope this data will further underscore the reasons why we need to be active champions for equality. We’ve still got a long way to go.

Make it a good day,

Mark

Who are we talking to when we use the term “liberal arts”?

If a complete stranger had stumbled onto campus the weekend before last, they might have thought that Augustana was the busiest college on the planet. That Saturday (January 17th), the Admissions Office hosted one of our largest annual open-house events for prospective students and families. While this event always draws large numbers, this year the number of visitors to campus (prospective students and their parents combined) may well have exceeded the actual number of Augustana students living on campus.

With the college recruiting season hurtling into the most critical few months of the year, every little bit of information that we can learn about prospective students, their parents, and their decision-making process matters. To that end, we’ve been gathering data on the things that are most important to our prospective students and their parents as they evaluate, and ultimately select, a college. One way that researchers try to get at this kind of information is to ask folks to pick five words or phrases from a longer list of words or phrases that they think best describe an idyllic college experience. As you might expect from this year’s prospective students, “affordable” topped the list with 57% of the respondents choosing it.  Other words near the top of the list included “friendly” (41%), “safe” (39%), “respected” (38%), and “career-oriented” (33%).

Much further down the list, 15th to be exact, sits the phrase “liberal arts” (just 12% of respondents thought this was a top-five word for them). Since rank ordering the selected words ends up clustering “liberal arts” with a seemingly contradictory group of terms (e.g., “small,” “large,” “rigorous,” and “flexible”), it’s clear that we probably shouldn’t go all Chicken Little just yet. Look on the bright side: only 6% of the respondents selected “party school.”

The question this finding raises for me, however, isn’t really about the exact ranking of the term “liberal arts.” My concern is that there seems to be a substantive gap between the degree to which we (faculty, staff, administrators, board members) use the phrase “liberal arts” to describe who we are and the level of importance that prospective students responding to this survey gave it. To make matters worse, this data doesn’t come from some general survey of potential college-going students; these responses came from students in our own inquiry pool (i.e., students who have either contacted us directly or students who fit a profile of those who might be interested in us).

Now please don’t conclude that I’m suggesting the elimination of the term or the philosophy behind it. On the contrary, I happen to think that if we are going to remain a viable college then we will have to explicitly embody a liberal arts philosophy that focuses on integrating and synthesizing preexisting knowledge. Almost exactly a year ago, I went on a three-post rant about it here, here, and here.

Rather, I suspect that the term “liberal arts” means very little of substance to prospective students. Maybe it is, like many other words that get used over and over again in marketing materials, a case where the phrase means one thing to an internal audience and something else to an external audience. When we use the term, even though we might not all agree exactly, I think we could describe relatively precisely the dispositions of a liberally educated individual.  This finding increases my worry that when an external audience, most notably prospective students, sees this term, they have a much less precise sense of its meaning. In that context, “liberal arts” might mean little more than “small” or “rigorous.” It also could end up being interpreted to mean “lots of classes in fields I’m not interested in” or, even worse, “a club that maybe we’ll let you into.”

I certainly don’t have a brilliant answer to this challenge. But I think it is worth noting that just because we have a term that we believe describes us well doesn’t mean that this term will compel others who are new to the concept of college to buy what we are selling. There’s nothing wrong with believing in what we do, even drinking our own Kool-Aid. We’d just better be able to spell out what we do and why it works in a way that makes sense to regular folks who seem to care a lot more about affordability.

Make it a good day,

Mark

Setting a high bar for equality in graduation

US News rankings have never been my favorite part of higher education. For many years these rankings did little more than con colleges and universities into an illusory arms race under the guise of increasing educational quality. But recently US News has started to use their data, power, and influence to prod more useful conversations that might lead to improvements at higher education institutions. Last week, US News released their rankings for “Which top-ranked colleges operate most efficiently.” Like last year, Augustana appeared near the top of the list among liberal arts colleges, suggesting that we apply our limited resources effectively to educate our students. Whether conversations about “efficiency” give you a warm fuzzy or a cold shudder, I don’t think it’s particularly controversial to say that such recognition is, at the very least, more good than bad.

But in keeping with their deified status in higher education, the US News rankings giveth and the US News rankings taketh away. A few weeks ago, they released another set of rankings that I found particularly intriguing given our recent campus discussions about equality and social justice. This set of rankings focused on the graduation rates of low-income students, contrasting the proportion of low-income students who ultimately graduate from each institution with each institution’s overall graduation rate. Based on these two numbers, US News identified colleges and universities that they called “top performers,” “over performers,” and “under performers.” Sadly, Augustana appeared in the under performer group with a 13-percentage-point deficit between our overall six-year graduation rate (78%) and our six-year graduation rate for low-income students (65%). Just in case you’re wondering, these graduation rates come from students who entered college in the fall of 2007.

Because of the focused nature of this particular analysis, US News combined all institutions from their two national ranking categories (national universities and national liberal arts colleges) to create these three groups. The presence of several familiar institutions in each group suggests that we might learn something about graduating low-income students from similar institutions, which in turn might help us narrow our own disparity in graduation rates.

The criteria for the “top performer” category required that the institution’s overall graduation rate be above 80% and that the graduation rate of low-income students be the same (or within a percentage point). While there were numerous national liberal arts colleges on the list, they were generally highly ranked institutions with well-known pedigrees. However, two familiar institutions appeared in this category that seemed worth highlighting.

  • St. Olaf College – overall grad rate: 88%, low-income grad rate: 87%
  • Gustavus Adolphus College – overall and low-income grad rate: 82%

The criterion for the “over performer” category was simply that low-income students graduated at a higher rate than the overall student population. There were several institutions in this group that are not too different from us, particularly based on their US News overall ranking (remember, Augustana was ranked #105 this year).  These institutions include:

  • Drew University (#99) – overall grad rate: 69%, low-income grad rate: 76%
  • College of the Atlantic (#99) – overall grad rate: 69%, low-income grad rate: 75%
  • Knox College (#81) – overall grad rate: 79%, low-income grad rate: 83%
  • Lewis & Clark College (#77) – overall grad rate: 74%, low-income grad rate: 79%
  • Beloit College (#61) – overall grad rate: 78%, low-income grad rate: 83%

Interestingly, there were also some institutions in the over performer group that probably wouldn’t dare to dream of a ranking approaching the top 100. In other words, they would probably trade their place for ours in a heartbeat. A few to note include:

  • Oglethorpe University (#148) – overall grad rate: 62%, low-income grad rate: 67%
  • Illinois College (#155) – overall grad rate: 64%, low-income grad rate: 68%
  • Warren Wilson College (#165) – overall grad rate: 51%, low-income grad rate: 60%
  • Ouachita Baptist University (#176) – overall grad rate: 60%, low-income grad rate: 80%
  • Wisconsin Lutheran College (#178) – overall grad rate: 64%, low-income grad rate: 75%

Finally, the “under performer” group contained institutions where low-income students graduated at rates lower than the overall graduation rate. Some similar/familiar liberal arts colleges in this group included:

  • Augustana College (#105) – overall grad rate: 78%, low-income grad rate: 65%
  • Washington College (#105) – overall grad rate: 68%, low-income grad rate: 49%
  • Hampden-Sydney College (#105) – overall grad rate 62%, low-income grad rate: 43%
  • St. Mary’s College of Maryland (#89) – overall grad rate: 73%, low-income grad rate: 64%
  • Wittenberg University (#139) – overall grad rate: 63%, low-income grad rate: 49%
  • Alma College (#139) – overall grad rate: 61%, low-income grad rate: 44%

Although we ought to be careful not to jump to rash conclusions from this data alone, there are a few suppositions that this data seems to contradict. First, although the national graduation rates for low-income students consistently lag behind overall graduation rates, this is not necessarily so at every institution. Some institutions graduate low-income students at substantially higher rates than the rest of their students. Second, it does not appear that institutional wealth, prestige, or academic profile guarantees graduation equity. There are institutions at both ends of the ranking spectrum that manage to graduate low-income students at a higher rate than the rest of their students. Third, geographic location doesn’t necessarily ensure success or failure. Successful institutions are located in both urban and rural settings.

I don’t know what makes each of these successful institutions achieve graduation equality. But in looking at our own disparity in graduation rates, it seems to me that we might learn something from these institutions that have found ways to graduate low-income students at rates similar to the rest of their students. We have set our own bar pretty high (our overall graduation rate of 78% is comparable to or higher than that of all the institutions I listed from the US News over performer category). Now it’s up to us to make sure that every student we enroll can clear that height. We shouldn’t be satisfied with anything less.

Make it a good day,

Mark