Transparency Travails and Sexual Assault Data

The chill that dropped over campus on Monday seems like an apt metaphor for the subject that’s been on my mind for the past week. Last spring, Augustana participated in a multi-institutional study of sexual assault and the related campus climate that was developed and administered by the Higher Education Data Sharing Consortium (HEDS). We hoped that the findings from this survey would help us 1) get a better handle on the nature and prevalence of sexual assault and unwanted sexual contact among our students, and 2) better understand our campus climate surrounding sexual assault and unwanted sexual contact. We actively solicited student participation in the survey, collaborating with student government, faculty, and administration to announce the survey and encourage students to respond. The student response was unusually robust, particularly given the sensitivity of the topic. Equally important, many people across campus – students, faculty, administrators, and staff alike – took note of our announced intentions to improve and repeatedly asked when we would have findings to share with the campus community. You saw the first announcement of these results on Sunday in a campus-wide email from Dean Campbell. If you attended the Monday night screening of The Hunting Ground and the panel discussion that followed, you likely heard additional references to findings from this survey. As Evelyn Campbell indicated, the full report is available from Mark Salisbury (AKA, me!) in the IR office upon request.

It has been interesting to watch the national reporting this fall as several higher ed consortia and individual institutions have begun to share data from their own studies of sexual assault and related campus climate. While some news outlets have reported in a fairly objective manner (Inside Higher Ed and The Chronicle of Higher Education), others have tripped over their own feet trying to impose a tale of conspiracy and dark motives (Huffington Post) or face-planted trying to insert a positive spin where one doesn’t really exist (Stanford University). Moreover, the often awkward word choices and phrasing in the institutional press releases (e.g., Princeton’s press release) announcing these data seem to accentuate the degree to which colleges and universities aren’t comfortable talking about their weaknesses, mistakes, or human failings (not to mention the extent to which faculty and college administrators might need to bone up on their quantitative literacy chops!).

Amidst all of this noise, we are watching two very different rationales for transparency play out in entirely predictable ways. One rationale frames transparency as a necessary imposition from the outside, like the piercing beam of an inspector’s flashlight pointed into an ominous darkness to expose bad behavior and prove a supposition. The other rationale frames transparency as a disposition that emanates from within, cultivating an organizational dynamic that makes it possible to enact and embrace meaningful and permanent improvement.

It seems that most of the noise being made in the national press about sexual assault data and college campuses comes from using transparency to beat institutions into submission. This is particularly apparent in the Huffington Post piece. If the headline, “Private Colleges Keep Sexual Assault Data Secret: A bunch of colleges are withholding sexual assault data, thanks to one group,” doesn’t convey their agenda clearly enough, then the first couple of paragraphs walk the reader through it. The problem with this approach to transparency is that the data too often becomes the rope in a giant tug-of-war between preconceived points of view. Both (or neither) points of view could have parts that are entirely valid, but the nuance critical to actually identifying an effective way forward gets chopped to bits in the heat of the battle. In the end, you just have winners, losers, and a lifeless coil of rope that no one cares about anymore.

Instead, transparency is more likely to lead to effective change when it is a disposition that emanates from within the institution’s culture. The folks at HEDS understood this notion when they designed the protocol for conducting the survey and conveying the data. The protocol they developed specifically prohibited institutions from revealing the names of other participating institutions, forcing each institution to focus the implications of its results back on itself. Certainly, a critical part of this process at any institution is sharing its data with its entire community and collectively addressing the need to improve. But in this situation, transparency isn’t the end goal. Rather, it becomes part of a process that necessarily leads to improvement and observable change. To drive this point home, HEDS has put extensive effort into helping institutions use their data to create change that reduces sexual assault.

At Augustana, we will continue to share our own results across our community as well as tackle this problem head-on. Our own findings point to plenty of issues that, if addressed, could improve our campus climate and reduce sexual assault. I’ll write about some of these findings in more detail in the coming weeks. In the meantime, please feel free to send me an email requesting our data. I’ll send you a copy right away. And if you’d like me to bring parts of the data to your students so that they might reflect and learn, I’m happy to do that too.

Make it a good day,


Welcome back to a smorgasbord of ambiguity!

Every summer I get lonely.  Don’t get me wrong, I love the people I work with in Academic Affairs and in Founders Hall . . . probably more than they love me sometimes.  But the campus just doesn’t feel right unless there is a certain level of manageable chaos, the ebb and flow of folks scurrying between buildings, and a little bit of nervous anticipation in the air.  Believe it or not, I genuinely missed our student who sat in the trees and sang out across the quad all last year!  Where are you, Ellis?!

For those of you who are new to Augustana, I write this column/blog every week to try to drop a little dose of positive restlessness into the campus ether.  I first read the phrase “positive restlessness” in the seminal work by George Kuh, Jillian Kinzie, John Schuh, and Liz Whitt titled Student Success in College. This 2005 book describes the common threads the authors found among 20 colleges and universities that, no matter the profile of students they served or the amount of money squirreled away in their endowment portfolio, consistently outperformed similar institutions in retention and graduation rates.

More important than anything else, the authors found that the culture on each of these campuses seemed energized by a perpetual drive to improve. No matter if it was a massive undertaking or a tiny little tweak, the faculty, staff, and students at these schools seemed almost hungry to get just a little bit better at who they were and how they did what they did every day.  This doesn’t mean that the folks on these campuses were some cultish consortium of maniacal change agents or evangelical sloganeers. But over and over it seemed that the culture at each of the schools featured in this study coalesced around a drive to do the best that they could with the resources that they had and to never let themselves rest on their laurels for too long.

What continues to strike me about this attribute is the degree to which it requires an optimistic willingness to wade into the unknown. If we were to wait until we figured out the failsafe answer to every conundrum, none of us would be where we are now and Augustana would have almost certainly gone under a long time ago.  Especially when it comes to educating, there are no perfect pedagogies or guaranteed solutions. Instead, the best we can do is continually triangulate new information with our own experience to cultivate learning conditions that are best suited for our students. In essence, we are perpetually focused on the process in order to increase the likelihood that we can most effectively influence the product.

The goal of this blog is to present little bits of information that might combine with your expertise to fuel a sense of positive restlessness on our campus.  Sometimes I point out something that we seem to be doing well.  Other times I’ll highlight something that we might improve.  Either way, I’ll try to present this information in a way that points us forward with an optimism that we can always make Augustana just a little bit better.

By a lot of different measures, we are a pretty darn good school.  And we have a healthy list of examples of ways in which we have embodied positive restlessness on this campus (if you doubt me, read the accreditation documents that we will be submitting to the Higher Learning Commission later this fall).  We certainly aren’t perfect, but chasing perfection would be a fool’s errand because perfection is a static concept – and maintaining an effective learning environment across an entire college campus is by definition a perpetually evolving endeavor.

So I raise my coffee mug to all of you and to the deliciously ambiguous future that this academic year holds.  Into the unknown we stride together.

Make it a good day!



So after the first year, can we tell if CORE is making a difference?

Now that we are a little over a year into putting Augustana 2020 in motion, we’ve discovered that assessing the implementation process is deceptively difficult. The problem isn’t that the final metrics to which the plan aspires are too complicated to measure or even too lofty to achieve. Those goals are fairly simple to assess – we either hit our marks or we don’t. Instead, the challenge at present lies in devising an assessment framework that tracks implementation, not end results. Although Augustana 2020 is a relatively short document, in actuality it lays out a complex, multi-layered plan that requires a series of building blocks to be constructed separately, fused together, and calibrated precisely before we can legitimately expect to meet our goals for retention and graduation rates, job acquisition and graduate school acceptance rates, or improved preparation for post-graduate success. Judging the implementation by the final metrics, especially at such an early point in the process, would be like judging a car manufacturer’s overall production speed right after the company had installed a faster motor on one of its assembly lines. Without retrofitting or replacing all of the other assembly stages to adapt to the new motor, such a change by itself would inevitably turn production into a disaster.

Put simply, judging any given snapshot of our current state of implementation against the fullness of our intended final product doesn’t really help us build a better mousetrap; it just tells us what we already know (“It’s not done yet!”). During implementation, assessment is much more useful if it identifies and highlights intermediate measures that give us a more exacting sense of whether we are moving in the right direction. In addition, assessing the process should tell us if the pieces we are putting in place will work together as designed or if we have to make additional adjustments to ensure the whole system works as it should. This means narrowing our focus to the impact of individual elements on specific student behaviors, testing the fit between pieces that have to work together, and tracking the staying power of experiences that are intended to permanently impact our students’ trajectories.

With all of that said, I thought that it would be fitting to try out this assessment approach on arguably the most prominent element of Augustana 2020 – CORE. Now that CORE is finishing its first year at the physical center of our campus, it seems reasonable to ask whether we have any indicators in place that could tell us if this initiative is bearing the kind of early fruit we had hoped for. Obviously, since CORE is designed to function as a part of a four-year plan of student development and preparation, it would be foolhardy to judge CORE’s ultimate effectiveness on some of the Augustana 2020 metrics until at least four years have passed. However, we should look to see if there are indications that CORE’s early impact triangulates with the student behaviors or attitudes necessary for improved post-graduate success. This is the kind of data that would be immediately useful to CORE and the entire college. If indicators suggest that we are moving in the right direction, then we can move forward with greater confidence. If the indicators suggest that things aren’t working as we’d hoped, then we can make adjustments before too many other things are locked into place.

In order to find data that suggests impact, we need more than just the numbers of students who have visited CORE this year (even though it is clear that student traffic in the CORE office and at the many CORE events has been impressive). To be fair, these participation patterns could simply be an outgrowth of CORE’s new location at the center of campus (“You’ve got candy, I was just walking by, why not stop in?”). To give us a sense of CORE’s impact, we need to find data where we have comparable before-and-after numbers. At this early juncture, we can’t look at our recent graduate survey data for employment rates six months after graduation since our most recent data comes from students who graduated last spring – before CORE opened.

Yet we may have a few data points that shine some light on CORE’s impact during its first year. To be sure, these data points shouldn’t be interpreted as hard “proof.” Instead, I suggest that they are indicators of directionality and, when put in the presence of other data (be they usage numbers or the preponderance of anecdotes), we can start to lean toward some conclusions about CORE’s impact in its first year.

The first data point we can explore is a comparison of the number of seniors who have already accepted a job offer at the time they complete the senior survey. Certainly the steadily improving economy, Augustana’s existing efforts to encourage students to begin their post-graduate planning earlier, and the unique attributes of this cohort of students could also influence this particular data point. However, if we were to see a noticeable jump in this number, it would be difficult to argue that CORE should get no credit for this increase.

The second data point we could explore would be the proportion of seniors who said they were recommended to CORE or the CEC by other students and faculty. This seems a potentially indicative data point based on the assumption that neither students nor faculty would recommend CORE more often if the reputation and results of CORE’s services were no different from the reputation and results of similar services provided by the CEC in prior years. To add context, we can also look at the proportion of seniors who said that no one recommended CORE or the CEC to them.

These data points all come from the three most recent administrations of the senior survey (including this year’s edition, for which we already have responses from 560 of 580 eligible seniors). The 2013 and 2014 numbers are prior to the introduction of CORE, and the 2015 number is after CORE’s first year. I’ve also calculated each proportion using all students whose immediate plan after graduation was to work full-time, in order to account for differences in the size of the graduating cohorts. (If you like to check the math, a short sketch follows the lists below.)

Seniors with jobs accepted when completing the senior survey -

  • 2013 – 104 of a possible 277 (37.5%)
  • 2014 – 117 of a possible 338 (34.6%)
  • 2015 – 145 of a possible 321 (45.2%)

Proportion of seniors indicating they were recommended to CORE or the CEC by other students -

  • 2013 – 26.9%
  • 2014 – 24.0%
  • 2015 – 33.2%

Proportion of seniors indicating they were recommended to CORE or the CEC by faculty in their major or faculty outside their major, respectively -

  • 2013 – 47.0% and 18.8%
  • 2014 – 48.1% and 20.6%
  • 2015 – 54.6% and 26.0%

Proportion of seniors indicating that no one recommended CORE or the CEC to them -

  • 2013 – 18.0%
  • 2014 – 18.9%
  • 2015 – 14.4%
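
For those who like to check the math, here is a minimal sketch in Python of how one might recompute the jobs-accepted proportions above. The two-proportion z-test at the end is my own hypothetical addition comparing the last pre-CORE year (2014) with CORE’s first year (2015), offered as a rough directional check rather than proof of anything.

    import math

    # Seniors reporting an accepted job at senior-survey time, out of all
    # seniors whose immediate post-graduation plan was full-time work.
    cohorts = {2013: (104, 277), 2014: (117, 338), 2015: (145, 321)}

    for year, (accepted, total) in sorted(cohorts.items()):
        print(f"{year}: {accepted}/{total} = {accepted / total:.1%}")

    # Hypothetical extra step (not part of the original write-up): a
    # two-proportion z-test comparing 2014 with 2015.
    x1, n1 = cohorts[2014]
    x2, n2 = cohorts[2015]
    p_pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pooled * (1 - p_pooled) * (1 / n1 + 1 / n2))
    z = (x2 / n2 - x1 / n1) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    print(f"z = {z:.2f}, two-sided p ~ {p_two_sided:.4f}")

Run as written, this reproduces the 37.5%, 34.6%, and 45.2% figures above and puts the 2014-to-2015 jump at roughly z = 2.8, a small enough p-value to suggest the increase isn’t just survey noise (though, as noted above, it says nothing about what caused it).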

Taken together, these data points seem to suggest that CORE is making a positive impact on campus.  By no means do they imply that CORE should be ultimately judged a success, a failure, or anything in between at this point. However, these data certainly suggest that CORE is on the right track and may well be making a real difference in the lives of our students.

If you’re not sure what CORE does or how they do it, the best (and probably only) way to get a good answer to that question is to go there yourself, talk to the folks who work there, and see for yourself.  If you’re nice to them, they might even give you some candy!

Make it a good day,


How many responses did you get? Is that good?

As most of you know by now, the last half of the spring term sometimes feels like a downhill sprint. Except in this case you’re less concerned about how fast you’re going and more worried about whether you’ll get to the finish line without face-planting on the pavement.

Well, it’s no different in the IR Office.  At the moment, we have four large-scale surveys going at once (the recent graduate survey, the senior survey, the freshman survey, and the employee survey), we’ve just finished sending a year’s worth of reports to the Department of Education, and we’re preparing to send all of the necessary data to the arbiter of all things arbitrary, U.S. News College Rankings. That is in addition to all of the individual data requests, reporting, and administrative work that we handle every week.

So in the midst of all of this stuff, I wanted to thank everyone who responded to our employee survey as well as everyone who has encouraged others to participate. After last week’s post, a few of you asked how many responses we’ve received so far and how many we need. Those are good questions, but as is my tendency (some might say “my compulsion”) the answer is more complicated than you’d probably prefer.

In essence, we need as many responses as we can get, from as many different types of employees as possible. But in terms of an actual number, defining “how many responses is enough” can get pretty wonky, with formulas and unfamiliar symbols. So I shoot for 60% of the overall population. That means, since Augustana has roughly 500 full-time employees, we would cross that threshold with 300 employee survey responses.

However, that magic 60% applies to any situation where we are looking at the degree to which a set of responses to a particular item can be confidently applied to the overall population. What if we want to look at responses from a certain subgroup of employees (e.g., female faculty)?  In that case, we need responses from 60% of the female faculty, something that isn’t guaranteed just because we have 300 of 500 total responses.
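
To make the subgroup wrinkle concrete, here is a minimal sketch in Python with entirely made-up headcounts and response counts (the real figures live in the IR office): the overall response rate can clear the 60% bar while a subgroup still falls short.

    TARGET = 0.60  # the 60% rule of thumb described above

    # Hypothetical headcounts and response counts, for illustration only.
    population = {"female faculty": 45, "male faculty": 55, "staff": 400}
    responses = {"female faculty": 21, "male faculty": 40, "staff": 239}

    total_pop = sum(population.values())
    total_resp = sum(responses.values())
    print(f"overall: {total_resp}/{total_pop} = {total_resp / total_pop:.1%}")

    for group, headcount in population.items():
        rate = responses[group] / headcount
        flag = "ok" if rate >= TARGET else "short of 60%"
        print(f"{group}: {responses[group]}/{headcount} = {rate:.1%} ({flag})")

In this made-up example the overall survey hits 60.0% on the nose, yet the female faculty subgroup sits at 46.7%, which is exactly the situation described above.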

This is why I am constantly hounding everyone about our surveys in order to get as many responses as possible: we don’t know all of the subgroups that we might want to analyze when we start collecting data; those possibilities arise during the analysis. And once we find out that we don’t have enough responses to dig into something that looks particularly important, we are flat out of luck.

So this week, I’m asking you to do me a favor.  Ask one person who you don’t necessarily talk to every day if they’ve taken the survey. If they haven’t, encourage them to do it. It might end up making a big difference.

Make it a good day,


The Problem with Aiming for a Culture of Assessment

In recent years I’ve heard a lot of higher ed talking heads imploring colleges and universities to adopt a “culture of assessment.” As far as I can tell (at least from a couple of quick Google searches), the phrase has been around for almost two decades and varies considerably in what it actually means. Some folks seem to think it describes a place where everyone uses evidence (some use the more slippery term “facts”) to make decisions, while others seem to think that a culture of assessment describes a place where everyone measures everything all the time.

There is a pretty entertaining children’s book called Magnus Maximus, A Marvelous Measurer that tells the story of a guy who gets so caught up measuring everything that he ultimately misses the most important stuff in life. In the end he learns “that the best things in life are not meant to be measured, but treasured.” While there are some pretty compelling reasons to think twice about the book’s supposed life lesson (although I dare anyone to float even the most concise post-modern pushback to a five-year-old at bedtime and see how that goes), the book delightfully illustrates the absurdity of spending one’s whole life focused on measuring if the sole purpose of that endeavor is merely measuring.

In the world of assessment in higher education, I fear that we have made the very mistake that we often tell others they shouldn’t make by confusing the ultimate goal of improvement with the act of measuring. The goal – or “intended outcome” if you want to use the eternally awkward assessment parlance – is that we actually get better at educating every one of our students so that they are more likely to thrive in whatever they choose to do after college. Even in the language of those who argue that assessment is primarily needed to validate that higher education institutions are worth the money (be it public or private money), there is always a final suggestion that institutions will use whatever data they gather to get better somehow. Of course, the “getting better” part seems to always be mysteriously left to someone else. Measuring, in any of its forms, is almost useless if that is where most or all of the time and money is invested. If you don’t believe me, just head on down to your local Institutional Research Office and ask to see all of the dusty three-ring binders of survey reports and data books from the last two decades. If they aren’t stacked on a high shelf, they’re probably in a remote storage room somewhere.

Measuring is only one ingredient of the recipe that gets us to improvement. In fact, given the myriad of moving parts that educators routinely deal with (only some of which educators and institutions can actually control), I’m not sure that robust measuring is even the most important ingredient. An institution no more achieves improvement just because it measures things than a chef bakes a cake by throwing a bag of flour in an oven (yes, I know there are such things as flourless tortes … that is kind of my point). Without cultivating and sustaining an organizational culture that genuinely values and prioritizes improvement, measurement is just another thing that we do.

Genuinely valuing improvement means explicitly dedicating the time and space to think through any evidence of mission fulfillment (be it gains on learning outcomes, participation in experiences that should lead to learning outcomes, or the degree to which students’ experiences are thoughtfully integrated toward a realistic whole), rewarding the effort to improve regardless of success or failure, and perpetuating an environment in which everyone cares enough to continually seek out things that might be done just a little bit better.

Peter Drucker is purported to have said that “culture eats strategy for lunch.” Other strategic planning gurus talk about the differences between strategy and tactics. If we want our institutions to actually improve and continually demonstrate that, no matter how much the world changes, we can prepare our students to take adult life by the horns and thrive no matter what they choose to do, then we can’t let ourselves mistakenly think that maniacal measurement magically perpetuates a culture of anything. If anything, we are likely to just make a lot more work for quantitative geeks (like me) while excluding those who aren’t convinced that statistical analysis is the best way to get at “truth.” And we definitely will continue to tie ourselves into all sorts of knots if we pursue a culture of assessment instead of a culture of improvement.

Make it a good day,



A little thing happened while you were away . . .

Welcome back to campus! I hope you enjoyed a restful winter break. Although I was able to find a few days of legitimate relaxation (I actually read fiction for fun!), a little thing happened at the end of last week that yanked me back into focus and kept my mind spinning over the weekend.

Friday morning’s big reveal from the higher ed press was the announcement from President Obama that he is proposing a program to make community college free.  The details and the obligatory range of reactions were dutifully reported here and here, and by this morning it seems that almost every news outlet with an education beat has polled the usual suspects for comment, analysis, and knee-jerk reaction. The chatter about this policy proposal doesn’t need any more faux smart people to weigh in, so I’ll refrain from adding an unfocused “thumbs-up” or “thumbs-down” to the mix.  However, I think that the mere emergence of this policy proposal holds a couple of important implications that could matter a lot for those of us at Augustana College (as well as other small liberal arts colleges).

First, a big part of this proposal turns on the caveat that “Community colleges will be expected to offer … academic programs that fully transfer credits to local public four-year colleges and universities.”  This sounds great, except for the fact that the destination institution is the one that determines whether academic programs or credits transfer fully, not the individual community college from whence the student originates.  Whether or not the President’s policy proposal comes to fruition, I think it reflects an increasingly common belief that students should be able to move seamlessly between higher education institutions, no matter if they are moving between two-year or four-year institutions (not to mention the individual online courses, degree programs, or prior learning credits).

If I’m right here, then we will continue to see more and more students transfer credits to and from Augustana as they become less associated with a particular institution and more connected to the degree they are intending to earn or career they intend to pursue.  Again, if I’m right, that will make it even more difficult for us to know, a) if our graduates have learned everything that we believe an Augustana degree represents, and b) if the students sitting in front of us on the first day of the term already possess the prerequisite knowledge and skills to succeed in each class. However we respond to this issue (for example, offering remediation services for students who struggle, signing articulation agreements with individual community colleges to assure some degree of vetting prior coursework for transfer students, or designing competency-based assessments for students to demonstrate their readiness for advanced academic work and graduation), the challenges that emerge when students increasingly enter and depart colleges and universities at times other than the beginning and the end of that institution’s designed educational experience are, as a 2012 study suggests, likely to become more prevalent.

Second, if this proposal does in fact signal that earning credits from multiple institutions to complete a degree is gaining in both numbers and legitimacy, then we would be smart to take a hard look at all of the ways in which our institutional practices might subtly dissuade transfer students from considering Augustana.  Since our study of transfer students’ experience a couple of years ago, we’ve already made some changes to make Augustana a better destination for transfer students. But we still have some work to do – not because we have dropped the ball in responding to our findings, but because this kind of work is just plain hard.

Third, it seems to me that this trend further emphasizes the degree to which we need to be able to show that the totality of the Augustana experience – not just the academic coursework – produces the critical learning that we intend for our students.  Otherwise, we are likely to fall victim to the external framing of what constitutes a college education (aka an accumulation of academic credits valued the same whether earned as a whole or as the sum of their parts), making it even more difficult to differentiate ourselves in product or perception.

I’m sure that you can think of specific issues that we ought to examine if transfer students are going to become an increasingly large segment of the college-going public.  As the number of high school graduates in the Midwest continues to decrease over the next decade or so, it seems that this question becomes that much more important.  If you have some thoughts, please feel free to post them in the comment section below.  Maybe we can have a conversation without having to brave the frigid temperatures outside?

Make it a good day,


What is your definition of a “Plan B?”

I often get pegged as “the numbers guy.” Even though the words themselves seem pretty simple, I’m never really sure how to interpret that phrase. Sometimes people seem to use it to defer to my area of expertise (and that feels nice). But sometimes it seems vaguely dismissive, as if they’re a little surprised to find that I’ve escaped from my underground statistical production bunker (that doesn’t feel so nice).

With data points, it’s not the numbers by themselves that make the difference; it’s the meaning that gets assigned to them. The same is true with phrases that we all too often toss around without a second thought. I stumbled into a prime example of this issue recently while talking to several folks about the way that they think about helping students prepare for life after college. It turns out that we can run ourselves into a real buzzsaw of a problem if we don’t mean the same thing when we talk to students about developing a “Plan B.”

Essentially, a Plan B is simple – it’s a second plan if the first plan doesn’t work out. But underneath that sort of obvious definition lies the rub. For what purpose does the Plan B exist? Is it to get to a new and different goal, or is it to take an alternative path to get to the original goal?

For some, helping a student construct a Plan B means identifying a second career possibility in case the student’s first choice post-graduation plan doesn’t work out. For example, a student who intends to be a doctor may not have the grades or references to guarantee acceptance into med school. At this point, a faculty adviser might suggest that the student investigate other careers that might match some of the student’s other interests (maybe in another health field, maybe not). This definition of a Plan B assumes a career change and then begins to formulate a plan to move toward that new goal.

But for others, helping a student construct a Plan B doesn’t mean changing career goals at all. Instead, this definition of a Plan B recognizes that there are often multiple pathways to get into a particular career. For the aspiring med school student who may not have slam-dunk grades in biology or chemistry but still wants to be a doctor, one could envision a Plan B that includes taking a job at a hospital in some sort of support role, retaking specific science courses at a local university or college, then applying to medical school with stronger credentials, potentially better references, and more experience. In this case the end goal didn’t change at all. The thing that changed was the path to get there.

In no way am I suggesting that one definition of a Plan B is better than another. On the contrary, both are entirely appropriate. In fact, the student would probably be best served by laying out both possibilities and walking through the relevant implications. But the potential for a real disaster comes when two people (maybe a faculty member and a career adviser) are separately talking to the same student about the need to devise a Plan B, yet the faculty member and the adviser mean very different things when they use the same phrase.

As you can imagine, the student would probably feel as though she is getting conflicting advice. In addition, she might well think that the person encouraging a different career choice just doesn’t believe in her (and that the person suggesting an alternate path to her original career goal is the one who really cares about her). Moreover, the person encouraging the student to explore another career choice might feel seriously undermined by the person who has suggested to the student an alternative way to continue toward the original career goal. In the end, the student’s trust in our ability to guide her accurately and effectively is seriously eroded, and a rift has likely developed between the two individuals who both genuinely care about the student in question.

Absolutely, there are times when we have to tell students that they need to explore alternative career plans. We do them no favors by placating them. At the same time, we all know students who, although they seemed to lack motivation and direction when they were at Augustana, kicked it in after graduation and eventually found a way into the career they had always wanted to pursue.

I’m certainly not suggesting that we should adopt one official definition of the phrase “Plan B.” Rather, my suspicion is that this is one of those phrases that we use often without realizing that we might not all mean the same thing. If our goal is to collectively give students the kind of guidance that they need to succeed after graduation, we probably ought to make sure that in each case we all mean the same thing when we talk to a student about a Plan B.

Make it a good day,



“Lean” in and learn something new

I think it’s fair to say that most educators cringe at the idea of applying practices from the world of business to education. So many times we’ve read or heard someone talk about education as if it were a simple transaction in which students or parents purchase a product as an investment toward future earnings. Of course, one only needs to spend a few days trying to get students to learn something that challenges their prior assumptions to know that viewing education through such a transactional lens leads to a gross misunderstanding of what we do and how education works. I’d love to see a list of all the times when a business framework was misapplied to an educational setting with disastrous results.

So, does this mean that everything developed in a business setting is guaranteed to fail in an educational setting? It’s okay if you’re inclined to say “yes” (especially if you’ve been down that road a few times). When President Bahls suggested that we could apply principles of lean management to improve a variety of processes at Augustana College, I’ll admit that I shuddered. Maybe like you, I imagined an internal apocalypse: budget cuts and position reductions with no changes in expectations. But after reading up on the concept of lean management and spending last week as a member of the first Rapid Improvement Team, I have to admit that my shudder was merely emblematic of my own ignorance. While lean management has its own set of terminology that might seem foreign to educators, the values embedded in a lean management philosophy embody the same values that we aspire to uphold in a collaborative and transparent organization dedicated to educating students. I found the framework and the process to be deeply gratifying and potentially applicable to the range of domains in which we operate.

First, “lean” doesn’t mean thinner.  It’s not about losing weight, downsizing, or cutting out the fat. It’s not an acronym. The term refers to the degree to which processes are conducted efficiently while best serving the needs of the beneficiary (i.e., anyone who benefits from that process).

Second, lean management philosophy asserts that the people best positioned to make improvement happen are those who are intimately involved in that particular job or process. Not only do those folks know the ins and outs of that work better than anyone else; they also need to believe in the efficacy of any identified changes in order to give those changes the best chance of turning into demonstrable and lasting improvements. For these reasons, any attempt to improve a process must genuinely involve the people who do that work.

Third, lean management philosophy argues that improvement of a process shows up in those who benefit from that process. Although the beneficiaries of our work are usually students, many of our operations benefit others as well. The beneficiaries of payroll are anyone who gets paid. The beneficiaries of the salad bar are anyone who eats a salad. As a result, the way to determine whether we have improved a process is to identify clear ways of demonstrating an improved impact on the beneficiaries of that process.

Fourth, lean management starts with the belief that the collective ability of an organization’s people can find and put in place substantial improvements to a process.  Effective lean management begins by collaborating to develop a shared understanding of the current state of a process or problem.  Only after the problem is fully understood as something worthy of improvement would an improvement team begin to consider potential solutions.

Fifth, lean management philosophy focuses on continual improvement, not perfection. There are simply too many external and unpredictable influences to expect perfection.  Furthermore (especially in the work that we do), just when we find that a particular educational practice works well, the students change and we have to continue to adjust.

Everything the Rapid Improvement Team did last week reflected all of these values.  I was impressed with the way the process was designed to keep them at the forefront while moving us toward a set of suggestions that were extremely likely to improve the process.

If you would like to see a recording of the presentation from the Rapid Improvement Team from last Friday, you can see it here.

Ultimately, the lesson I learned from this process was that it is possible (shocking, I know) for something that has been developed in the business world over the last several decades to be applied successfully in an educational institution in a way that actually strengthens our ability to enact the values we espouse.  In addition, I (re)learned that we have some amazing people at Augustana who are willing to put their hearts into doing what we do better. I’m lucky to be a part of it.

Make it a good day,



A taste of my own medicine

One of the fundamental tenets of Delicious Ambiguity has been that all of us who contribute to the development of students are at our best when we approach our work through a lens of “positive restlessness.” That phrase was introduced into the lexicon of higher education writers by George Kuh and his colleagues in their book Student Success in College, describing a pervasive philosophy that his research team saw at colleges that always seemed to be seeking out ways to get better no matter how successful they already were. Anyone who knows me recognizes that I relish the chance to look for ways to improve. But I think it is an entirely fair criticism to suggest that I might have an overly rosy view of change and that I should be forced to get elbow-deep in the down-and-dirty work of actually fixing a complicated and convoluted process.

So this week, if you’ve ever thought that I needed a dose of your version of reality, you are in luck. My “comeuppance” has appeared in the form of participation in a weeklong, immersive Rapid Improvement Event (RIE). I’ll be joining a team of Augustana employees trying to wrangle a portion of the payroll process and hopefully improve it. I don’t know much about payroll – so they tell me I’m “perfect” for the job.

So here goes!

Make it a good day,



Hey, what’s this I hear about the Winding Path Study?

Some of you have heard me mention a study that we (AKA our massive juggernaut of an IR office better known to most of you as Kimberly and Mark) started last spring called the Winding Path Study. In short, this study was designed to gather information from all living Augustana alumni (at least those for whom we had working email addresses) about the nature of their adult lives from the time they entered Augustana up until last spring.

During the twelve months of strategic planning discussions, one of the things that stood out to me was how much we really don’t know about the long-term impact of an Augustana education. Don’t get me wrong; we have lots of wonderful stories about Augustana graduates excelling in all sorts of professional and personal pursuits. But we don’t know nearly as much as we would like about the nature of our alums’ lives after college: the ways that they have handled success and failure, the adjustments they have had to make when life throws them a curveball, or the ways that their Augustana experiences might have influenced twists and turns in their life’s path right after graduation or much later in life. This information matters because, if we are preparing students to succeed throughout their adult lives, we need to know how those lives play out across personal and professional domains and as our alums grow and change over time.

After looking through all of the different ways that colleges have tried to survey their alumni, we couldn’t find any approach that matched our conceptual frame or addressed the questions we had constructed. So we rolled up our metaphorical sleeves and built a study from scratch based on the sociological theory of Life Course Perspective, a construct that describes the life course as a series of trajectories, transitions, and turning points.

In this post I’d like to share a few summary findings just to give you a flavor for what we’ve seen from the almost 2,800 responses we received last spring in the first stage of this project.

The first few questions explored the nature of our alums’ paths when entering Augustana, moving through their undergraduate years, and heading out the door.

  • Did you have a specific career goal or major in mind when you came to Augustana?
    • 53% – Yes; I was pretty sure I knew what I wanted to do
    • 33% – Somewhat; I had some ideas but wasn’t set on anything in particular
    • 12% – No; I didn’t know what I wanted to do at all
  • Did you change majors or career goals while you were an undergraduate?
    • 38% – Yes
    • 61% – No
  • What path did you take right after graduation from Augustana?
    • 26% – Went to grad school in the same field that I studied
    • 8% –   Went to grad school in a different field than I studied
    • 42% – Took a job or volunteered in the same field that I studied
    • 15% – Took a job or volunteered in a different field than I studied
    • 2% –   Took time off to pursue other interests
    • 7% –   Other

The next set of questions explored the varied nature of our graduates’ adult lives. Although we couldn’t have possibly captured every facet of an adult life, our goal was to gather a first glimpse that could be explored in more detail later.

  • How many times have you changed jobs since you graduated from Augustana?
    • 16% – None
    • 27% – 1-2
    • 28% – 3-4
    • 16% – 5-6
    • 11% – 7 or more
  • How many of those job changes occurred because of a professional opportunity that you chose to pursue?
    • 22% – None
    • 36% – 1-2
    • 22% – 3-4
    • 11% – 5-6
    • 5% –   7 or more
  • How many of those job changes occurred because of a professional disruption (downsizing, bankruptcy, termination, etc.)?
    • 67% – None
    • 23% – 1-2
    • 3% –   3-4
    • 1% –   5-6
    • 0% –   7 or more
  • Were any of your job changes influenced by family considerations?
    • 39% – Yes
    • 56% – No
  • Were any of your job changes influenced by personal considerations?
    • 54% – Yes
    • 41% – No

(Please note that some folks didn’t respond to every question, so the percentages for a given question may sum to less than 100%.)
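
As a quick illustration of that note, here is a tiny Python sketch using the change-of-majors item above: the published percentages are computed over everyone who took the survey, so skipped items leave a remainder below 100%, and renormalizing among those who answered gives the respondent-only split.

    # "Did you change majors or career goals?" from above: 38% yes, 61% no.
    yes, no = 0.38, 0.61
    missing = 1.0 - (yes + no)  # the ~1% who skipped the item
    print(f"item nonresponse: {missing:.0%}")

    # Renormalizing among those who answered the question:
    print(f"yes, among respondents only: {yes / (yes + no):.1%}")  # ~38.4%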

These findings deepened our understanding of the variety of pathways that students pursue after college.  Almost one quarter of our graduates, immediately after college, entered graduate school or took a job in a different field than their major.  These findings also strengthened our belief that preparing students for successful lives after college goes far beyond one’s major or minor and extends long past the first job, first graduate school degree, or whatever the first thing a student chooses to do after college might be.

As you can also see, many if not most Augustana alumni have likely led adult lives that look more like winding paths than straight lines. These findings – even if they might seem fairly obvious to anyone who has lived through the reality of an unpredictable life – have shaped our thinking as we continue to design a college experience that prepares every student to carve through life after college – no matter what comes out of the woodwork.

The first stage of this study concluded with a question at the end of the survey asking if the respondent would be willing to participate in a half-hour interview.  Based on prior research experience, we expected 100-200 positive responses.  We received over 1,400!  So the next step for us, after spending the last six months analyzing all of the open-ended responses, is to develop a framework for the interviews and a way to select potential interviewees.  We would like to interview as many as possible, but frankly, the specter of 1,400 interviews is a bit daunting! Moreover, because we also asked for other categorizing information, like the year the respondent graduated, their major, and their current profession, we have all kinds of ways to organize and analyze this data.

Over the rest of the academic year, I hope I’ll have another update on the results of the second phase of this study. In the meantime, enjoy the last week of the fall term!

Make it a good day,