What is your definition of a “Plan B”?

I often get pegged as “the numbers guy.” Even though the words themselves seem pretty simple, I’m never really sure how to interpret that phrase. Sometimes people seem to use it to defer to my area of expertise (and that feels nice). But sometimes it seems vaguely dismissive, as if they’re a little surprised to find that I’ve escaped from my underground statistical production bunker (that doesn’t feel so nice).

With data points, it’s not the numbers by themselves that make the difference; it’s the meaning that gets assigned to them. The same is true with phrases that we all too often toss around without a second thought. I stumbled into a prime example of this issue recently while talking to several folks about the way that they think about helping students prepare for life after college. It turns out that we can run ourselves into a real buzzsaw of a problem if we don’t mean the same thing when we talk to students about developing a “Plan B.”

Essentially, a Plan B is simple – it’s a second plan if the first plan doesn’t work out. But underneath that sort of obvious definition lies the rub. For what purpose does the Plan B exist? Is it to get to a new and different goal, or is it to take an alternative path to get to the original goal?

For some, helping a student construct a Plan B means identifying a second career possibility in case the student’s first-choice post-graduation plan doesn’t work out. For example, a student who intends to be a doctor may not have the grades or references to guarantee acceptance into med school. At this point, a faculty adviser might suggest that the student investigate other careers that might match some of the student’s other interests (maybe in another health field, maybe not). This definition of a Plan B assumes a career change and then begins to formulate a plan to move toward that new goal.

But for others, helping a student construct a Plan B doesn’t mean changing career goals at all. Instead, this definition of a Plan B recognizes that there are often multiple pathways to get into a particular career. For the aspiring med school student who may not have slam-dunk grades in biology or chemistry but still wants to be a doctor, one could envision a Plan B that includes taking a job at a hospital in some sort of support role, retaking specific science courses at a local university or college, then applying to medical school with stronger credentials, potentially better references, and more experience. In this case the end goal didn’t change at all. The thing that changed was the path to get there.

In no way am I suggesting that one definition of a Plan B is better than another. On the contrary, both are entirely appropriate. In fact, the student would probably be best served by laying out both possibilities and walking through the relevant implications. But the potential for a real disaster comes when two people (maybe a faculty member and a career adviser) are separately talking to the same student about the need to devise a Plan B, yet the faculty member and the adviser mean very different things when they use the same phrase.

As you can imagine, the student would probably feel as though she is getting conflicting advice. In addition, she might well think that the person encouraging a different career choice just doesn’t believe in her (and that the person suggesting an alternate path to her original career goal is the one who really cares about her). Moreover, the person encouraging the student to explore another career choice might feel seriously undermined by the person who has suggested an alternative way to continue toward the original career goal. In the end, the student’s trust in our ability to guide her accurately and effectively is seriously eroded, and a rift has likely developed between the two individuals who both genuinely care about the student in question.

Absolutely, there are times when we have to tell students that they need to explore alternative career plans. We do them no favors by placating them. At the same time, we all know students who, although they seemed to lack motivation and direction when they were at Augustana, kicked it in after graduation and eventually found a way into the career they had always wanted to pursue.

I’m certainly not suggesting that we should adopt one official definition of the phrase “Plan B.” Rather, my suspicion is that this is one of those phrases that we use often without realizing that we might not all mean the same thing. If our goal is to collectively give students the kind of guidance that they need to succeed after graduation, we probably ought to make sure that in each case we all mean the same thing when we talk to a student about a Plan B.

Make it a good day,

Mark

 

“Lean” in and learn something new

I think it’s fair to say that most educators cringe at the idea of applying practices from the world of business to education. So many times we’ve read or heard someone talk about education as if it were a simple transaction in which students or parents purchase a product as an investment toward future earnings. Of course, one only needs to spend a few days trying to get students to learn something that challenges their prior assumptions to know that viewing education through such a transactional lens leads to a gross misunderstanding of what we do and how education works. I’d love to see a list of all the times when a business framework was misapplied to an educational setting with disastrous results.

So, does this mean that everything developed in a business setting is guaranteed to fail in an educational setting? It’s okay if you’re inclined to say “yes” (especially if you’ve been down that road a few times). When President Bahls suggested that we could apply principles of lean management to improve a variety of processes at Augustana College, I’ll admit that I shuddered. Maybe like you, I imagined an internal apocalypse: budget cuts and position reductions with no changes in expectations. But after reading up on the concept of lean management and spending last week as a member of the first Rapid Improvement Team, I have to admit that my shudder was merely emblematic of my own ignorance. While lean management has its own set of terminology that might seem foreign to educators, the values embedded in a lean management philosophy embody the same values that we aspire to uphold in a collaborative and transparent organization dedicated to educating students. I found the framework and the process to be deeply gratifying and potentially applicable to the range of domains in which we operate.

First, “lean” doesn’t mean thinner.  It’s not about losing weight, downsizing, or cutting out the fat. It’s not an acronym. The term refers to the degree to which processes are conducted efficiently while best serving the needs of the beneficiary (i.e., anyone who benefits from that process).

Second, lean management philosophy asserts that the people best positioned to make improvement happen are those who are intimately involved in that particular job or process. Not only do those folks know the ins and outs of that work better than anyone else; they also need to believe in the efficacy of any identified changes in order to give those changes the best chance of turning into demonstrable and lasting improvements. For these reasons, any attempt to improve a process must genuinely involve the people who do that work.

Third, lean management philosophy argues that improvement of a process is demonstrated through its impact on those who benefit from that process. Although the beneficiaries of our work are often students, many of our operations benefit more than just students. The beneficiaries of payroll are anyone who gets paid. The beneficiaries of the salad bar are anyone who eats a salad. As a result, the way to determine whether we have improved a process is to identify clear means of demonstrating an improved impact on the beneficiaries of that process.

Fourth, lean management starts with the belief that the collective ability of an organization’s people can find and put in place substantial improvements to a process.  Effective lean management begins by collaborating to develop a shared understanding of the current state of a process or problem.  Only after the problem is fully understood as something worthy of improvement would an improvement team begin to consider potential solutions.

Fifth, lean management philosophy focuses on continual improvement, not perfection. There are simply too many external and unpredictable influences to expect perfection. Furthermore (especially in the work that we do), just when we find that a particular educational practice works well, the students change and we have to continue to adjust.

Everything the Rapid Improvement Team did last week reflected all of these values. I was impressed with the way the process was designed to keep those values at the forefront while moving us toward a set of suggestions that were extremely likely to improve the process.

If you would like to see a recording of last Friday’s presentation by the Rapid Improvement Team, you can find it here.

Ultimately, the lesson I learned from this process was that it is possible (shocking, I know) for something that has been developed in the business world over the last several decades to be applied successfully in an educational institution in a way that actually strengthens our ability to enact the values we espouse.  In addition, I (re)learned that we have some amazing people at Augustana who are willing to put their hearts into doing what we do better. I’m lucky to be a part of it.

Make it a good day,

Mark

 

Hey – what’s this I hear about the Winding Path Study?

Some of you have heard me mention a study that we (AKA our massive juggernaut of an IR office better known to most of you as Kimberly and Mark) started last spring called the Winding Path Study. In short, this study was designed to gather information from all living Augustana alumni (at least those for whom we had working email addresses) about the nature of their adult lives from the time they entered Augustana up until last spring.

During the twelve months of strategic planning discussions, one of the things that stood out to me was how much we really don’t know about the long-term impact of an Augustana education. Don’t get me wrong; we have lots of wonderful stories about Augustana graduates excelling in all sorts of professional and personal pursuits. But we don’t know nearly as much as we would like about the nature of our alums’ lives after college: the ways that they have handled success and failure, the adjustments they have had to make when life throws them a curveball, or the ways that their Augustana experiences might have influenced twists and turns in their life’s path right after graduation or much later in life. This information matters because, if we are preparing students to succeed throughout their adult lives, we need to know how those lives play out across personal and professional domains and as our alums grow and change over time.

After looking through all of the different ways that colleges have tried to survey their alumni, we couldn’t find any approach that matched our conceptual frame or addressed the questions we had constructed. So we rolled up our metaphorical sleeves and built a study from scratch based on the sociological theory of Life Course Perspective, a construct that describes the life course as a series of trajectories, transitions, and turning points.

In this post I’d like to share a few summary findings just to give you a flavor for what we’ve seen from the almost 2,800 responses we received last spring in the first stage of this project.

The first two questions explored the nature of our alums’ paths when entering Augustana and moving through their undergraduate years.

  • Did you have a specific career goal or major in mind when you came to Augustana?
    • 53% – Yes; I was pretty sure I knew what I wanted to do
    • 33% – Somewhat; I had some ideas but wasn’t set on anything in particular
    • 12% – No; I didn’t know what I wanted to do at all
  • Did you change majors or career goals while you were an undergraduate?
    • 38% – Yes
    • 61% – No
  • What path did you take right after graduation from Augustana?
    • 26% – Went to grad school in the same field that I studied
    • 8% –   Went to grad school in a different field than I studied
    • 42% – Took a job or volunteered in the same field that I studied
    • 15% – Took a job or volunteered in a different field than I studied
    • 2% –   Took time off to pursue other interests
    • 7% –   Other

The next set of questions explored the varied nature of our graduates’ adult lives. Although we couldn’t have possibly captured every facet of an adult life, our goal was to gather a first glimpse that could be explored in more detail later.

  • How many times have you changed jobs since you graduated from Augustana?
    • 16% – None
    • 27% – 1-2
    • 28% – 3-4
    • 16% – 5-6
    • 11% – 7 or more
  • How many of those job changes occurred because of a professional opportunity that you chose to pursue?
    • 22% – None
    • 36% – 1-2
    • 22% – 3-4
    • 11% – 5-6
    • 5% –   7 or more
  • How many of those job changes occurred because of a professional disruption (downsizing, bankruptcy, termination, etc.)?
    • 67% – None
    • 23% – 1-2
    • 3% –   3-4
    • 1% –   5-6
    • 0% –   7 or more
  • Were any of your job changes influenced by family considerations?
    • 39% – Yes
    • 56% – No
  • Were any of your job changes influenced by personal considerations?
    • 54% – Yes
    • 41% – No

(Please note that some folks didn’t respond to every question, so some sets of proportions total less than 100%.)
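Since each question’s response options should account for every respondent, the size of the nonresponse group falls out of simple arithmetic: 100 minus the sum of the reported percentages. A quick sketch in Python (the figures are the job-change percentages reported above; the variable names are just for illustration):

```python
# Reported response distribution for "How many times have you changed jobs?"
job_changes = {"None": 16, "1-2": 27, "3-4": 28, "5-6": 16, "7 or more": 11}

# The shortfall from 100% is the share of alumni who skipped the question.
nonresponse_pct = 100 - sum(job_changes.values())
print(nonresponse_pct)  # 2
```

The same check applied to any of the distributions above shows how much of each gap is nonresponse rather than rounding.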

These findings deepened our understanding of the variety of pathways that students pursue after college.  Almost one quarter of our graduates, immediately after college, entered graduate school or took a job in a different field than their major.  These findings also strengthened our belief that preparing students for successful lives after college goes far beyond one’s major or minor and extends long past the first job, first graduate school degree, or whatever the first thing a student chooses to do after college might be.

As you can also see, many if not most Augustana alumni have likely led adult lives that look more like winding paths than straight lines. These findings – even if they might seem fairly obvious to anyone who has lived through the reality of an unpredictable life – have shaped our thinking as we continue to design a college experience that prepares every student to navigate life after college, no matter what it throws at them.

The first stage of this study concluded with a question at the end of the survey asking if the respondent would be willing to participate in a half-hour interview. Based on prior research experience, we expected 100-200 positive responses. We received over 1,400 positive responses! So the next step for us, after spending the last six months analyzing all of the open-ended responses, is to develop a framework for the interviews and a way to select potential interviewees. We would like to interview as many as possible, but frankly, the specter of 1,400 interviews is a bit daunting! Moreover, because we also collected other categorizing information – the year the respondent graduated, their major, and their current profession – we have all kinds of ways to organize and analyze this data.
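One way to tame 1,400 potential interviews is a stratified draw: group the willing respondents by one of those categorizing fields and randomly sample a few from each group, so every stratum is represented in a manageable pool. A minimal sketch, using entirely hypothetical respondent records (the field names and counts are illustrative, not drawn from the actual study):

```python
import random
from collections import defaultdict

# Hypothetical respondent records; in practice the grouping key could be
# graduation year, major, or current profession.
respondents = [
    {"name": f"Alum {i}", "grad_decade": decade, "willing": True}
    for i, decade in enumerate(["1970s", "1980s", "1990s", "2000s", "2010s"] * 20)
]

def stratified_sample(records, key, per_stratum, seed=0):
    """Draw up to `per_stratum` records at random from each group defined by `key`."""
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible
    groups = defaultdict(list)
    for record in records:
        groups[record[key]].append(record)
    sample = []
    for group in groups.values():
        rng.shuffle(group)
        sample.extend(group[:per_stratum])
    return sample

pool = stratified_sample(respondents, "grad_decade", per_stratum=3)
print(len(pool))  # 15: three interviewees from each of five decades
```

A draw like this keeps the interview load realistic while guarding against a pool dominated by, say, recent graduates.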

Over the rest of the academic year, I hope I’ll have another update on the results of the second phase of this study. In the meantime, enjoy the last week of the fall term!

Make it a good day,

Mark

Data doesn’t have to be numbers

As some of you might know, every once in a while I get asked to talk to other colleges or universities about my study abroad research. Yesterday I was fielding some questions at one such workshop when a faculty member who takes students on study abroad trips told a story about her experiences talking with her students about what they had learned during their trip abroad. She talked at length about the students’ descriptions of their own growth but ended by saying, “Of course, I don’t have any data on this.”

I hear that line so often when talking with faculty or student affairs staff about their experiences with students. And although I’m sure I’ll say this again at some point (and hopefully in a kind and caring way!), I just wanted to quickly say to anyone reading this blog today . . . Data is information. Sometimes information comes in the form of numbers. Sometimes information comes in the form of comments. Sometimes information comes in the form of student assignments. Data does not have to be numbers. In fact, sometimes the worst data out there comes in the form of numbers. So if you have gathered information about students, then you have data. It might be indicative of something that a broader swath of students experience, or it might only be illustrative of the small group of students who provided it. But it’s ALL data.

The subtitle of this blog is “using evidence to improve student learning.” Evidence comes in all forms, and the vetting process that turns information into evidence doesn’t have to be numerical. So if you have data that you think qualifies as evidence – bring it! Please don’t hesitate – or, worse, short-change yourself and your work – by thinking that it only qualifies if it comes in numbers. That’s just not so.

Make it a good day,

Mark

Swimming in the 2014 Senior Survey data!

I think I’ve resigned myself to the fact that it is nearly impossible to explain to someone who hasn’t experienced Augustana’s 10-week terms what it feels like to go from standing still less than a month ago to flying by the seat of your pants at week four. But here it is verging on mid-term time, and I’m hurtling through space trying everything I can to get my bearings!

So, to show that the Institutional Research and Assessment staff (AKA Kimberly and I) doesn’t just sit around dreaming up ways to collect more data, I thought I’d share with you . . . more data. (I guess this doesn’t really debunk my assumptions about your assumptions, does it?)

Every spring, we ask our graduating seniors to complete a survey that asks all sorts of questions about their experiences at Augustana. In addition, we ask a few questions that we’ve found to be useful outcome measures (would you choose Augustana again, is your post-graduate plan a good fit for who you are and where you want your life to go, and do you already have a job or grad school placement).

It takes a lot of work to process this data into a readable report, but it’s finally finished and posted on the IR web page.  Here is the direct link to the 2014 Senior Survey results.

2014 Senior Survey Report and Findings

Now you could jump on that link right away and start swimming in the data – percentages, averages, and standard deviations (oh, my!). And you might survive the experience, although your eyes will probably start to glaze over as you look at mean score after mean score, and your brain will likely start to go soft wondering (rightly so) what exactly each average score means – is it good? is it bad? is it just right? (sort of like Goldilocks in the house of the three bears . . . replacing the bowls of porridge with Excel spreadsheets, of course).

So if I may, let me make a suggestion that I hope will make some of this data more meaningful.  Instead of looking at the numbers first, put a sheet of paper over the numbers and look at the questions first.  Reflect on why each question might matter for students and what might be the “about right” distribution of responses.  Pick out 3-5 questions that seem particularly interesting to you.

THEN take away the sheet of paper covering the numbers. Do your musings match up with the average score or the distribution of responses? If the difference between your reflections and the actual scores suggests that institutional improvement might be valuable, what more would you like to know to get a better handle on what we could do to improve?

This data isn’t of much use if it doesn’t help us get better at what we do. And you – the people on the ground floor who are working with students every day – are the ones who are ideally suited to tackle this data, jump into this process, and benefit from the results of your efforts.

If you have any questions, comments, suggestions, or criticisms (I prefer to think of it as constructive feedback!) about the senior survey, PLEASE PLEASE PLEASE contact us – Kimberly or me – in the IR office. Nothing we have built is so important that it can’t be changed . . . especially if those changes make the survey better.

Make it a good day,

Mark

What happens when you build assessment around improvement?

Well hello, everyone!

It’s great to feel the energy on campus again. And it’s exciting to restart my Delicious Ambiguity blog: Season 4 (I’ve been renewed!). Each week I share some tidbit from our own Augustana student data (drawn from statistics or focus groups) that will help you do what you do just a little bit better the next time you do it. If you’re new to Delicious Ambiguity, you might also want to know that you can search the three years of previous posts (about 100 in all) for everything from athletes to introverts, Greeks to geeks. In addition to a ton of useful findings, you might even find a few funny quips (AKA bewildering side comments).

By now you’ve probably heard me say on at least one occasion that building assessment efforts around genuine improvement, as opposed to doing assessment to find out what’s already happened (i.e., to prove what you think you are already doing), thoroughly changes every part of the assessment process.  More importantly, it’s the only way to actually get better because:

  1. You’ve backward-designed the entire project around finding out what you need to do to get better instead of just finding out what happened, and
  2. You’ve humbled yourself to the possibility of improvement and thereby matched your efforts with the way that educational processes actually work.

I’d like to share an example of one program at Augustana that has clearly benefited from an “assessment for improvement” approach.  My goal here isn’t to brag, but rather to walk you through an example of such a process in the hope that something I share might be applicable to your own unique context.

Augustana has run some version of freshman orientation for a very long time. And by and large, it’s been a pretty successful program. Yet everyone involved has always wondered what they might do to make it just a little bit better. Much of our prior data only told us the proportion of students who were “satisfied” with the experience. Although we could pat ourselves on the back when the numbers were decent (which they were virtually every year), we had no way of turning that information into specific changes that we could trust would actually make the experience demonstrably more effective.

So a few years ago, folks from Student Affairs, Academic Affairs, and the IR office began applying an improvement-centered assessment approach to orientation. First, we talked at length about drilling down to the core learning goals of freshman orientation. Sure, we have lots of things we’d love to result from orientation, but realistically there are only so many things you can do in four days for a group of 18-year-olds who are some combination of giddy, overwhelmed, and panicked – none of which makes for particularly fertile learning conditions.

So with that in mind, we needed to strip our goals down to a “triage” set of learning goals for orientation.  We settled on three concepts.

  • Welcome Week will connect new students with the people necessary for success.
  • Welcome Week will connect new students with the places necessary for success.
  • Welcome Week will connect new students with the values necessary for success.

The people we identified included all of the individuals who might influence a student’s first-year experience – other students, student affairs and residential life staff, and specific faculty members. The concept of place involved a) knowing EXACTLY how to walk to one’s classes and specific first-year resources, and b) finding other places on campus that a student might use for emotional rejuvenation as well as intellectual work. The values we discussed focused on clarifying a strategy for getting the most out of a liberal arts college setting. This meant introducing students to a mindset that focuses on actively participating in a process of learning and growth, and showing them how this approach will increase their likelihood of success, both in the first year and beyond.

Once we spelled out our goals for Welcome Week, then we could set about our work from two directions. First, we could start to alter the design of the experience to meet those goals. Second, we could build a survey that examined the degree to which students came away from Welcome Week connected to the people, places, and values that substantially increase the likelihood of success in the first year.

Over the last two years the survey findings have provided a number of interesting insights into the degree to which certain experiences were already meeting the goals we had set. More importantly, the survey data has become a critical conversation guide for specific improvements. Because the questions were built around specific experiences, it has given everyone – particularly peer mentors – a clear target to shoot for with each student. For example, if the goal was to ensure that each student would say that they knew exactly how to get to their classes on the first day, then the peer mentor could shift from merely pointing at buildings while walking around campus to creating some way for new freshmen to walk right up to the door of the room where their class would be.

At the same time that we were using data to guide specific adjustments, the folks planning Welcome Week examined the design of the entire program. This led them to introduce several changes, including the Saturday morning concurrent sessions titled “Augie 101,” focusing on all kinds of issues that would specifically increase the likelihood of successful academic acclimation.

We will survey the freshmen in the next week or so to find out how the most recent set of changes impacted their experience during Welcome Week. But even without that data, I suspect that the new programming this year improved the experience. My confidence comes from one particularly compelling data point that isn’t a number (I know – sit down and take a deep breath!). During the Augie 101 sessions, peer mentors and other older students who were assisting with Welcome Week kept saying, “I wish we would have had something like this during my Welcome Week experience.”  To me, that is a powerful endorsement of our efforts.

We will likely never be perfect, but we have mounting evidence that we keep getting better at what we do. That doesn’t mean that we have any reason to brag or rest on our laurels.  It just means that we are doing things right.  And that’s what makes doing this work so much fun.

Make it a good day,

Mark

A new (and maybe better) way to understand the impact of an Augustana education

As you probably know by now, the new Augustana 2020 strategic plan places our graduates’ success after college at the center of our institutional mission.  In real terms, this means that what our students learn in college matters to the degree that it contributes to their success after college.  Put another way, even if our students learn all kinds of interesting knowledge and complicated skills, if what they have learned can’t be effectively and meaningfully applied to life after college, then we haven’t really done our job.

Now whether you think that this is the last nail in the liberal arts coffin or the long-awaited defibrillator to revive liberal arts education, our own success hinges on something else that I’m pretty sure we haven’t thought much about. Exactly what are we talking about when we talk about a successful life after college? Do we have a working definition of what might make up a successful life for an Augustana graduate? In order to grapple with those vertigo-inducing questions, we have to know a lot more about what happens to our graduates after college.  But do we have anything more than vague notions about our own graduates’ lives?

I’m afraid that the answers to those questions are probably no, no, and no. In part, it’s because these are big, hard questions.  And to be fair, I don’t know of a college that has tried to get a real handle on these ideas.  So . . . . here we go . . . .

This is the kind of research project that can keep you up at night. Because it isn’t just about getting data to figure out the relationship between one thing (an Augustana college experience) and another thing (a successful life after college). For starters, these are two monstrously complicated constructs. Distilling them down to some essential qualities may well be impossible. I’m not saying that it’s NOT possible; I’m just admitting to the fact that I’m intimidated by the very idea of trying to identify a set of valid essential qualities. And as if that weren’t enough, we (higher education researchers writ large) have yet to develop a conceptual framework that is complex enough to account for the almost infinite range of ways in which people’s lives evolve. To date, every effort to link alumni success to their college experience has presumed a straight line – even when we know that very few of us traveled a straight path to get to where we are now.

So over the past six months or so, Kimberly and I have built a multi-stage study in an attempt to get at some of these questions. We settled on calling it “The Winding Path Study” (all credit to Kimberly for the title) and we have organized it around two initial stages, with room for additional exploration. First, we had to find a conceptual framework that fit the way that people live their lives. We found one that I think works: Life Course Perspective, which comes out of sociology and anthropology. Essentially, this framework describes lives as amazingly complex and almost infinitely unique, yet full of three common elements – trajectories, transitions, and turning points. While Life Course scholars have detailed definitions for each of these terms that I won’t try to summarize here, I think we all know what they mean because we can likely point to moments in our own lives where the impact of these concepts became clear.

Next, we built a survey (but of course!) to try to get a better sense of the range of trajectories, transitions, and turning points that our graduates have experienced. I hoped that we might get 1,000 responses. From those respondents, I hoped that we might find 100 who were willing to participate in a 30-minute interview.

Well, apparently we struck a chord. We got 1,000 responses from Augustana alumni in the first 12 hours of the survey and finished with 2,792. In addition, over 1,200 respondents said that they would be willing to participate in a 30-minute interview.

I’ll share more about this project in the next several months as we pore over the data. One thing that jumped out at me as I began to watch the data coming in was the extent to which people were willing to tell us surprisingly personal details about their lives. Our respondents wrote and wrote and wrote. We now have a treasure trove of data that we have to read through and organize. At the end of this project, however, we will likely have a much greater understanding of the range of life courses that our alums have taken. Better yet, we hope to find some patterns that will help us think about the way that we guide our students during college.

The goals of the Augustana 2020 strategic plan are lofty and complicated.  I’m not sure we even realized how challenging this plan would be when the Board approved it in the winter or when we designed it last fall.  But now that we’ve started to roll up our sleeves, I think we already have information on our graduates that most colleges could only wish that they had.  Now comes the fun part!

Make it a good day,

Mark

Refocusing on the Connections Instead of Just Making Better Parts

This is the second of three posts about our need to reframe, refocus, and refine the way that we operationalize (i.e., deliver) the liberal arts. Near the end of last week’s post (where I suggested reframing how we deliver the liberal arts around enabling our graduates to thrive in the midst of change) I suggested:

So the learning experiences that matter the most may in fact be the things that we consider the least. Right now we focus the most time, resources, and energy on the classes we offer, the activities we organize, the experiences we sponsor. Reframing the way that we deliver the liberal arts means placing increased focus on the way that students connect these experiences and apply the ideas from one experience to succeed in another. Moreover, it means guiding students to strategically set up the ideal set of inter-experience connections that best prepare them to achieve their post-graduate aspirations.

When Augustana was founded, the connections between classes (at the time considered the primary, if not only, learning experiences offered by the college) were a foregone conclusion because the curriculum was virtually identical for every student.  New content assumed the delivery of prior content and students moved in lockstep from beginning courses to advanced seminars.  Only near the end of their schooling were students allowed to deviate from the central educational path to take courses that fit their vocational intentions in law, medicine, the clergy, civil service, etc.  Furthermore, extra-curricular experiences weren’t seen as potential learning experiences since they didn’t have anything to do with the content delivered through the curriculum.

This earlier version of a liberal arts education and the one that we now endeavor to provide could not be more different. While the curriculum of yesteryear was almost entirely predetermined both in terms of the courses one took and when one took them, today (although some majors are more prescribed than others) I doubt you could find two students who took the same courses in the same order during the same year – let alone throughout an entire undergraduate degree. In addition, we now know that students develop, learn, and grow at least as much through their out-of-class experiences as they do through their courses, resulting in the wide support for everything from student organizations to study abroad.

In the context of all of this curricular and co-curricular opportunity, it’s no wonder that students’ efforts to convey the impact of their college experience on a resume often devolve into a list whose length is assumed to convey something about an individual’s preparation and potential for future success. Of course, those considering recent college graduates for graduate school, employment, or long-term service have figured out that these lists are a ruse and can be a red flag for someone who is more surface than substance.

But if we refocus the way that we operationalize the liberal arts so that students can highlight “why” they chose the experiences they chose and how they took charge of constructing the person they have become, the grocery list of college experiences (AKA the resume) suddenly comes to life as a story of perpetual improvement. This doesn’t mean that these students are fully formed when they receive their diploma. But it does mean that they can probably tell their own story in a way that shows an emerging clarity of purpose and an accelerating sense of momentum toward it. We all know from our own experiences that those students stand out even when they aren’t trying to make an impression.

So how might we operationalize the way that we deliver the liberal arts to highlight this new focus?

First of all, we don’t need to go back to the days of an overly prescribed college experience.  With the diversity of our students’ pre-college experiences, learning needs and interests, and post-graduate aspirations, treating our students as if they were all the same would be stunningly foolish.

Instead, we begin by mapping every activity and every course that students can take in terms of what learning is intended to emerge from that experience and how that learning contributes to the larger learning goals and mission of the college. Since this mapping is intended to be an iterative exercise, it may well result in adapting, adding, or even subtracting some courses or experiences. It might also result in altering some experiences to more specifically meet certain learning goals. The primary result of this exercise is not just a complete catalog of the learning experiences in which students can engage. Instead, the goal is to produce customizable flow charts that show the variety of ways that different types of students can identify a sequence of experiences that together cultivate the learning each student needs to be fully prepared to succeed after graduation.

These maps become the primary tools for the college to help students construct a college experience that builds upon their pre-college experiences and abilities, fills in the areas in which they need additional opportunities to learn and grow, and gives them the best chance to be the kind of person they aspire to be when they graduate. Ultimately, the totality of each student’s college experience can be conveyed through a cohesive narrative that tells the story of his or her college journey from start to finish.

The challenges to making such a refocus work are not without consequence. Most important, we have to actually enact our commitment to student growth and development in everything that we do. That likely means changing something that we currently do (even if it is something we really like to do a certain way) to make it more educationally effective for students. We often ask the question during planning conversations, “But what are we going to take away?” This mapping exercise often identifies things that we could and probably should take away. The challenge is whether we are willing to give those things up.

In broader terms, this means that we have to be able to “zoom out” and see the forest instead of the trees. There will be a myriad of ways that a student could put together the learning experiences necessary for post-graduate success. The most important goal here is that students can lay out their path, retrace their steps, and explain why they took each one of them, situating the reasons for their choices in the context of their post-graduate aspirations. Of course there will likely be students who, despite all of our best efforts, don’t follow the guidance that we provide for them. But if all of our students learn the value of thinking about their own lives as a strategic effort to grow and develop, the chances are pretty good that they will all be on their way to succeeding in life and embodying the results of a liberal arts education when they walk across the stage to accept their diploma.

Make it a good day,

Mark

For the want of a response, the data was crap

Any time I hear someone use data from one of the new freshman, senior, or recent graduate surveys to advocate for a particular idea, I can’t help but smile a little.  It is deeply gratifying to see faculty and administrators comfortably use our data to evaluate new policy, programming, and strategic direction ideas.  Moreover, we can all point to a growing list of data-driven decisions that we know have directly improved student learning.

So it might seem odd, but that smile slips away almost as quickly as it appears. Because underneath this pervasive use of data lies a deep trust in the veracity of those numbers. And the quality of our data depends almost entirely upon the participation of 18- to 22-year-olds who are . . . let’s just say “still developing.” Data quality is like milk – it can turn on you overnight. If students begin to think that survey questions don’t really apply to them, or they start to suspect that the results aren’t valued by the college, they’ll breeze through the questions without giving them much thought or blow off the survey entirely. If that happens on a grand scale . . . I shudder to think about it. So you could say that I was “mildly concerned” as I organized fall IDEA course feedback forms for processing a few weeks ago and noticed several where the only bubbles colored in were “fives.” A few minutes later I found several where the only darkened bubbles were “ones.”

Fortunately, a larger sampling of students’ IDEA forms put my mind at ease. I found that on most forms the distribution of darkened circles varied and, as best as I could tell, students’ responses to the individual questions seemed to reflect at least a minimal effort to provide truthful answers. However, this momentary heart attack got me wondering: to what degree might students’ approach to our course feedback process impact the quality of the data that we get? This is how I ended up in front of Augustana’s student government (SGA) earlier this week talking about our course feedback process, the importance of good data, the reality of students’ perceptions of and experiences with these forms, and ways that we might convince more students to take this process seriously.
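Survey researchers call the all-fives and all-ones pattern “straight-lining,” and it can be screened for automatically rather than by eyeballing bubble sheets. Here is a minimal sketch of such a check, assuming each form’s bubbles arrive as a list of 1–5 integers; the function name, the minimum-item threshold, and the sample forms are all hypothetical, not part of the actual IDEA process:

```python
def flag_straight_lining(responses, min_items=5):
    """Flag a form whose answers show no variation at all.

    `responses` holds one respondent's numeric answers (1-5 bubbles);
    a form with enough answered items but only one distinct value is
    suspect. Short forms are left unflagged, since a few identical
    answers can easily be honest.
    """
    answered = [r for r in responses if r is not None]
    return len(answered) >= min_items and len(set(answered)) == 1

# Hypothetical forms: all fives, all ones, and one with plausible variation.
forms = {
    "A": [5, 5, 5, 5, 5, 5],
    "B": [1, 1, 1, 1, 1, 1],
    "C": [4, 5, 3, 4, 2, 5],
}
flagged = [form_id for form_id, answers in forms.items()
           if flag_straight_lining(answers)]
# flagged == ["A", "B"]
```

A flag like this is only a prompt for a closer look, not proof of bad faith – a student who genuinely loved (or loathed) every aspect of a course would produce the same pattern honestly.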

During this conversation, I learned three things that I hope you’ll take to heart. First, our students really come alive when they feel they are active participants in making Augustana the best place it can be. However, they start to slip into the role of passive bystanders when they don’t know the “why” behind processes in which they are expected to be key contributors. When they become bystanders, they are much less likely to invest their own emotional energy in providing accurate data. Many of the students honestly didn’t think that the IDEA data they provided on the student form was used very often – if ever. If the data doesn’t really matter anyway, so their thinking goes, the effort that they put into providing it doesn’t matter all that much either.

Second, students often felt that not all of the questions about how much progress they made on specific objectives applied in all classes equally.  As I explained to them how the IDEA data analysis worked and how the information that faculty received was designed to connect the objectives of the course with the students’ sense of what they learned, I could almost hear the light bulbs popping on over their heads.  They were accustomed to satisfaction-type surveys in which an ideal class would elicit a high score on every survey question.  When they realized that they were expected to give lower scores to questions that didn’t fit the course (and that this data would be useful as well), their concern about the applicability of the form and all of the accompanying frustrations disappeared.

Third, even though we – faculty, staff, and administrators – know exactly what we mean when we talk about learning outcomes, our students still don’t really know that their success in launching their life after college is not just a function of their major and all the stuff they’ve listed on their resume. On numerous occasions, students expressed confusion about the learning objectives because they didn’t understand how they applied to the content of the course. Although they may have seen the lists of skills that employers and graduate schools look for, it seems that our students think these are skills that are largely set in stone long before they get to college, and that college is mostly about learning content knowledge and building a network of friends and “connections.” So when they see learning objectives on the IDEA forms, unless they have been clued in to understand that these are skills that the course is designed to develop, they are likely to be confused by the very idea of learning objectives above and beyond content knowledge.

Although SGA and I plan to work together to help students better understand the value of the course feedback process and its impact on the quality of their own college experience, we – faculty, staff, and administrators – need to do a much better job of making sure that our students understand the IDEA course feedback process. From the beginning of the course, students need to know that they will be learning more than content. They need to know exactly what the learning goals are for the course. Students need to know that faculty want to know how much their students learned and what worked best in each class to fuel that learning, and that satisfaction doesn’t always equate to learning. And students need to know how faculty have used course feedback data in the past to alter or adapt their classes. If you demonstrate to your students how this data benefits the quality of their learning experience, I think they will be much more willing to genuinely invest in providing you with good data.

Successfully creating an evidence-based culture of perpetual improvement that results in a better college requires faculty, staff, and administrators to take great care with the sources of our most important data.  I hope you will take just a few minutes to help students understand the course feedback process.  Because in the end, not only will they benefit from it, but so will you.

Make it a good day,

Mark

Could a focus on learning outcomes unwittingly sacrifice process for product?

A central tenet of the learning outcomes movement is that higher education institutions must articulate a specific set of skills, traits, and/or dispositions that all of their students will learn before graduation. Then, through legitimate means of measurement, institutions must assess and publicize the degree to which their students make gains on each of these outcomes. Although many institutions have yet to implement this concept fully (especially regarding the thorough assessment of institutional outcomes), this idea is more than just a suggestion. Each of the regional accrediting bodies now requires institutions to identify specific learning outcomes and demonstrate evidence of outcomes assessment as a standard of practice.

This approach to educational design seems at the very least reasonable. All students, regardless of major, need a certain set of skills and aptitudes (things like critical thinking, collaborative leadership, intercultural competence) to succeed in life as they take on additional professional responsibilities, embark (by choice or by circumstance) on a new career, or address a daunting civic or personal challenge. In light of the educational mission our institutions espouse, committing ourselves to a set of learning outcomes for all students seems like what we should have been doing all along.

Yet too often the outcomes that institutions select to represent the full scope of their educational mission, and the way that those institutions choose to assess gains on those outcomes, unwittingly limit their ability to fulfill the mission they espouse. For when institutions narrow their educational vision to a discrete set of skills and dispositions that can be presented, performed, or produced at the end of an undergraduate assembly line, they often do so at the expense of their own broader vision of cultivating in students a self-sustaining approach to learning. What we measure dictates the focus of our efforts to improve. As such, it’s easy to imagine a scenario in which the educational structure that currently produces majors and minors in content areas is simply replaced by one that produces majors and minors in some newly chosen learning outcomes. Instead of redesigning the college learning experience to alter the lifetime trajectory of an individual, we allow the whole to be nothing more than the sum of the parts – because all we have done is swap one collection of parts for another. Although there may be value in establishing and implementing a threshold of competence for a bachelor’s degree (for which a major serves a legitimate purpose), limiting ourselves to this framework fails to account for the deeply held belief that a college experience should approach learning as a process – one that is cumulative, iterative, multi-dimensional, and, most importantly, self-sustaining long beyond graduation.

The disconnect between our conception of a college education as a process and our tendency to track learning as a finite set of productions (outcomes) is particularly apparent in the way that we assess our students’ development as life-long learners. Typically, we measure this construct with a pre-test and a post-test that tracks learning gains between the ages of 18 and 22 – hardly a lifetime (the fact that a few institutions gather data from alumni five and ten years after graduation doesn’t invalidate the larger point). Under these conditions, trying to claim empirically that (1) an individual has developed and maintained a perpetual interest in learning throughout their life, and that (2) this life-long approach is directly attributable to one’s undergraduate education, probably borders on the delusional. The complexity of life even under the most mundane of circumstances makes such a hypothesis deeply suspect. Yet we all know of students who experienced college as a process through which they found a direction that excited them and a momentum that carried them down a purposeful path that extended far beyond commencement.

I am by no means suggesting that institutions should abandon assessing learning gains on a given set of outcomes. On the contrary, we should expect no less of ourselves than substantial growth in all of our students as a result of our efforts. Designed appropriately, a well-organized sequence of outcomes assessment snapshots can provide information vital to tracking student learning over time and potentially increasing institutional effectiveness. However, because the very act of learning occurs (as the seminal developmental psychologist Lev Vygotsky would describe it) in a state of perpetual social interaction, taking stock of the degree to which we foster a robust learning process is at least as important as taking snapshots of learning outcomes if we hope to gather information that helps us improve.

If you think that assessing learning outcomes effectively is difficult, then assessing the quality of the learning process ought to send chills down even the most skilled assessment coordinator’s spine. Defining and measuring the nature of process requires a very different conception of assessment – and, for that matter, a substantially more complex understanding of learning outcomes. Instead of merely measuring what is already in the rearview mirror (i.e., whatever has already been acquired), assessing the college experience as a process requires a look at the road ahead, emphasizing the connection between what has already occurred and what is yet to come. In other words, assessment of the learning that results from a given experience would include the degree to which a student is prepared or “primed” to make the most of a future learning experience (either one that is intentionally designed to follow immediately, or one that is likely to occur somewhere down the road). Ultimately, this approach would substantially improve our ability to determine the degree to which we are preparing students to approach life in a way that is thoughtful, proactively adaptable, and even nimble in the face of both unforeseen opportunity and sudden disappointment.

Of course, this idea runs counter to the way that we typically organize our students’ postsecondary educational experience. For if we are going to track the degree to which a given experience “primes” students for subsequent experiences – especially subsequent experiences that occur during college – then the educational experience can’t be so loosely constructed that the number of potential variations in the ordering of different students’ experiences virtually equals the number of students enrolled at our institution. This doesn’t mean that we return to the days in which every student took the same courses at the same time in the same order, but it does require an increased level of collective commitment to the intentional design of the student experience, a commitment to student-centered learning that will likely come at the expense of an individual instructor’s or administrator’s preference for which courses they teach or programs they lead and when they might be offered.

The other serious challenge is the act of operationalizing a concept of assessment that attempts to directly measure an individual’s preparation to make the most of a subsequent educational experience. But if we want to demonstrate the degree to which a college experience is more than just a collection of gains on disparate outcomes – whether these outcomes are somehow connected or entirely independent of each other – then we have to expand our approach to include process as well as product.  Only then can we actually demonstrate that the whole is greater than the sum of the parts, that in fact the educational process is the glue that fuses those disparate parts into a greater – and qualitatively distinct – whole.

Make it a good day,

Mark