Week 10 + Halloween + Slicing Data = Disengaged Zombie Students!

I suspect that the confluence of Week 10 and Halloween brings out a little crazy in each of us.  So I thought I’d share a brief response that I prepared for a recent media request regarding the potential existence of one underserved student population on our campus.

From our senior survey data, we find that students who self-report as Zombies also report statistically significantly lower levels of engagement across a wide range of important student experiences. These differences include lower levels of participation in class discussion despite higher satisfaction with faculty feedback.

Zombie students also report lower levels of co-curricular influence on understanding how one relates to others. Further qualitative study suggests a broad lack of self-awareness.

In addition, Zombie students indicate that they have fewer serious conversations with students who differ by race, ethnicity, socioeconomic status, or social values.  Instead, Zombie students seem to congregate together and rarely reach out of their comfort zone.

Interestingly, our first-to-second year retention rate of student zombies is 100%, despite the high number of PUGS and CARE reports.  Yet our six-year graduation rate is 0%. While some have expressed concern over this dismal data point, a few administrators who are closely involved in managing the graduation ceremony have suggested that the ceremony is long enough already without having Zombie students shuffling aimlessly across the stage to get their diplomas.

Curiously, Zombie students report an increased level of one-on-one student/faculty interaction outside of class.  We find no evidence to suggest that this correlates in any way with the substantial drop in the number of part-time and adjunct faculty from last year (108) to this year (52).

Happy Halloween and have a wonderful Week 10.

Make it a good day,

Mark

Does our educational community lose something when seniors live off campus?

I’ve yet to find an Augustana senior who wishes they lived on campus.  In fact, the seniors I’ve talked to seem almost relieved to finally stretch their wings and move into the surrounding neighborhoods, even though they often say they had hoped to find a cheaper or nicer place nearby.  As far as I can tell, seniors have lived off campus at least since the 1970s, and this practice is so embedded into our culture that the very name of our junior students’ housing – Transitional Living Areas (TLAs) – announces our desire to prepare seniors to live on their own.

As our strategic planning discussions have coalesced around designing and implementing a purposefully integrated, comprehensive Augustana learning experience, I’ve been thinking about the real challenge of creating a plan that allows us to balance the individualized needs of each student with the core elements of a genuine community.  Although this might not appear all that difficult at first, efforts to achieve goals for individuals or certain subgroups of students can sometimes run at cross-purposes with maintaining a community culture optimal for student learning.  Several years ago we found an interesting example of such unintended consequences when we discovered that our efforts to encourage students to join multiple campus organizations (knowing that such behavior often enhances social integration and ultimately influences retention) were likely, albeit unintentionally, limiting the chances for conversations between students from substantially different backgrounds or demographic groups (thus undermining our efforts to increase students’ intercultural competence).

With all of this in mind, I was struck by one data point from last year’s seniors about the impact of our fourth-year residential status. The question asked our graduating seniors, “How often did you participate in on-campus events during your senior year?”  Responses ranged as follows:

  • less than when I lived on campus (200 – 39.9%)
  • about the same as when I lived on campus (279 – 55.7%)
  • more than when I lived on campus (22 – 4.4%)

So how does this relate to the aforementioned tension between encouraging individual development and fostering an ideal educational community?

First of all, when we talk about Augustana College, we almost uniformly talk about the educational and developmental benefits of a four-year residential experience.  I suspect that when we talk in these terms, we imagine that this distinguishing characteristic plays an influential role at both the level of the individual and the community.  At the individual level it presents itself in the form of leadership positions and the responsibility of being the senior class.  At the communal level it presents itself through those same channels, but in terms of the influence of those leaders on younger students and the atmosphere and legacy that a senior class creates – one that can permeate an entire campus.  While this can play out in both directions through formal channels and during formally organized events, the broader impacts are likely more pervasive through informal rituals and signaling (to use a term familiar to social psychologists and anthropologists).

However, if our seniors are living off campus in their last year, it seems like this could, at the very least, limit the educational potential and influence of fourth-year students on the rest of the student community.  Based on the substantial proportion of seniors who indicated that they participated in fewer campus events than when they lived on campus, and taking into account our other data that clearly shows a high level of overall involvement among our students, I’d suggest that we might have set up a situation in which we have maintained the educational opportunities that contribute to individual development among our seniors, but we may be missing out on some of the benefits to a residential educational community that our senior class might provide if they lived on campus.

There are lots of reasons to be cautious in drawing too many conclusions from this particular data point.  Many of our seniors may be busy with off-campus internships, graduate school applications, or other involvements that emerge as they begin to prepare for life after college.  They could also be hosting off-campus parties that have varied effects – both good and bad – on our campus community.  And given the long history of seniors living off campus, I’ll bet that there is a certain set of beliefs or mythologies about one’s senior year deeply embedded in the student culture.

Yet as we endeavor to create an integrated learning experience that is truly comprehensive and clearly distinctive in terms of preparing students for lives of financial independence, unintended discoveries, and a legacy of success, I hope we are willing to seriously consider all of the possible design elements that might make such an educational experience and environment possible.  And I hope that we are able to bravely balance the necessary elements of the culture we hope to foster with the developmental needs of our individual students.

Make it a good day,

Mark

In Search of the Mysterious Muddler

On several recent occasions I have heard it said that about 25% of our students aren’t involved in anything on campus.  I am always intrigued by the way that some assertions or beliefs evolve into facts on a college campus, and this number seemed ripe for investigating.  Researchers who study human behavior have documented this phenomenon repeatedly and suggest that, because we want to believe our own intuition to be true, we tend to perk up at data points or anecdotes that support our beliefs.  We’ve all fallen prey to this temptation at least once – at least I have.  So I thought it might be worth testing this claim just to see if it holds up under the glare of our actual survey data.

First – to be fair, this claim isn’t totally crazy.  I can think of a particular data point that clearly nods in the direction of the 25% uninvolved claim.  For a few years, we’ve tracked the proportion of seniors who don’t use their Augie Choice money, and – although the number is steadily declining – over the last few years an average of about 25% have foregone those funds.  Others have suggested that every year we have a group of somewhere between 600 and 800 students (henceforth called “the muddlers”) who aren’t involved in anything co-curricular: athletics, music groups, or student clubs and organizations.  More ominously, some have suggested that there is a sub-population of students who are only involved in Greek organizations and that these students help to create an environment that isn’t conducive to our efforts to make Augustana a rigorous learning experience. (All of that is a wordy euphemism for “these lazy bums party too much.”)

Although the question of what should count as true involvement is a legitimate one, the question of simple participation is an empirical one that we can test.  So we looked at two sets of data – our 2013 senior survey data and our 2013 freshman survey data – to see what proportion of students report not being involved in anything co-curricular: no athletics, no music, and no student clubs or organizations.  Then we added the question of Greek membership just to see if the aforementioned contingent of deadbeats really does exist in numbers large enough to foment demonstrable mayhem. (Another wordy euphemism for “be loud and break stuff.”)

Well, I’ve got bad news for the muddlers.  Your numbers aren’t looking so hot.  Of the students who graduated last spring, only 17 out of 495 said that they didn’t participate in anything (athletics, music, student groups, or Greek organizations).  When we took the Greek question out of the equation we gained only 5 students, ultimately finding that only about 5% (23/495) of our graduating seniors said that they didn’t participate in athletics, music, or some student group.

But what about the freshmen?  After all, the seniors are the ones who have stayed for four years.  If involvement is the magic ingredient for retention that some think it is, then we should expect this proportion to be quite a bit bigger in the freshman class.

Alas, though our muddler group appears a little bigger in the first year, it sure doesn’t approach the 25% narrative.  After eliminating freshmen who participated in athletics, music, a student group, or a Greek organization, we were left with only 15 out of 263 first-year students who responded to our survey.  When we left out Greek membership, we gained only 4 students, increasing the number to 19 out of 263 (7%).  Now it’s fair to note one limitation of this data: we got responses from only about 45% of the freshman class.  However, even after calculating the confidence intervals (the “+/-”) in order to generalize with 95% confidence to the entire freshman class, we still end up with a range of somewhere between 4 and 9 percent of students not involved in anything co-curricular.
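For anyone curious how that “+/-” works, here is a minimal sketch of the calculation.  It uses the standard normal approximation with a finite-population correction; the freshman class size of roughly 580 is inferred from the ~45% response rate rather than stated above, so treat the exact endpoints as illustrative.

```python
import math

def proportion_ci(successes, n, pop_size=None, z=1.96):
    """95% confidence interval for a sample proportion (normal
    approximation), with an optional finite-population correction
    applied when the sample covers a large share of the population."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    if pop_size is not None:
        # Finite-population correction: the margin shrinks because
        # the sample is a sizable fraction of the whole class.
        margin *= math.sqrt((pop_size - n) / (pop_size - 1))
    return p - margin, p + margin

# 19 of 263 respondents reported no co-curricular involvement;
# a ~45% response rate implies a freshman class of roughly 580.
low, high = proportion_ci(19, 263, pop_size=580)
print(f"{low:.1%} to {high:.1%}")  # → 4.9% to 9.5%
```

Without the finite-population correction the interval widens to roughly 4–10%, which is why the exact bounds reported in any given analysis depend on the method chosen.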

There are two other possible considerations regarding the muddler mystery.  One possibility is that there are more muddlers than we know, because non-participants may also be less likely to fill out the freshman survey.  On the other hand, as some of our faculty have observed, it’s possible that our muddlers are also the students who study more seriously – just the kind of students faculty often dream of teaching.

My reason for writing this post is NOT to suggest that we don’t have some students who need to be more involved in something outside of their classes.  We certainly have those students, and if they make up almost 10% of our freshman class (as the upper bound of the confidence interval suggests), then we clearly have work to do.  Rather, it seems to me that this is another reason to think more carefully about the nature of involvement’s impact on students, because it appears that the students who depart after the first year are not merely uninvolved recluses (again, the limitations of the sample require that I caution against jumping to too certain a conclusion).  It seems to me that this evidence is another reason to think about involvement as a means to other outcomes that are central to our educational mission instead of an end in and of itself.

Make it a good day,

Mark

Sometimes assessing might be the wrong thing to do

Because of the breakneck pace of our work lives, we tend to reach for pre-determined processes to address problems instead of considering whether another approach might increase the chances of a successful long-term solution.  This makes sense, since pre-determined processes often feel like they help solve complicated problems by giving us a vetted action plan.  But if we begin defaulting to this option too easily, we can sometimes create more work for ourselves just because we absentmindedly opted for “doing it the way we’re supposed to do it.”  So I thought it might be worthwhile to share an observation about our efforts to improve our educational effectiveness that could help us be more efficient in the process.

We have found tremendous value in gathering evidence to inform our decisions instead of relying on anecdotes, intuition, or speculation.  Moreover, the success of our own experiences seems to have fostered a truly positive sea change, both in the frequency of requests for data that might inform an upcoming discussion or decision and in the desire to ask new questions that might help us understand more deeply the nature of our educational endeavors.  So why would I suggest that sometimes “assessing might be the wrong thing to do”?

First, let’s revisit two different conceptions of “assessment.”  One perceives assessment as primarily about measuring.  It’s an act that happens over a finite period of time and produces a finding that essentially becomes the end of the act of measuring.  Another conception considers assessment a process composed of various stages: asking a question, gathering data, designing an intervention, and evaluating the effectiveness of that intervention.  Imagine the difference between the two as the difference between a dot (a point in time) and a single loop within a coil (a perpetually evolving process).  So in my mind, “measurement” is a singular act that might involve numbers or theoretical frameworks. “Assessment” is the miniature process that includes asking a question, engaging in measurement of some kind, and evaluating the effectiveness of a given intervention.  “Continuous improvement” is an organizational value that results in the perpetual application of assessment.  The focus of this post is to suggest that we might help ourselves by expanding the potential points at which we could apply a process of assessment.

Too often, after discovering the possibility that student learning resulting from a given experience might not be what we had hoped, we decide that we should measure the student learning in question.  I think we expect to generate a more robust set of data that confirms, or at least complicates, the information we think we already know. Usually, after several months of gathering data (and if all goes well with that process), our hunch turns out to be right.

I’d like to suggest a step prior to measuring student learning that might get us on track to improvement more quickly.  Instead of applying another means of measurement to evaluate the resultant learning, we should start by applying what we know about effective educational design to assess whether or not the experience in question is actually designed to produce the intended learning.  Because if the experience is not designed and delivered effectively, then the likelihood of it falling short of expectations is pretty high.  And if there is one truth about educating that we already know, it’s that if we don’t teach our students something, they won’t learn it.

Assessing the design of a program or experience takes a lot less time than gathering learning outcome data.  And it will get you to the fun part of redesigning the program or experience in question much sooner.

So if you are examining a learning experience because you don’t think it’s working as it should, start by tearing apart its design.  If the design is problematic, then skip the measuring part . . . fix it, implement the changes, and then test the outcomes.

Make it a good day,

Mark