Sorry – I’m busy collecting data!

This is a potentially massive week for Augustana College.  We are hosting two data collections for the Wabash National Study.  The first one is tonight – Monday, March 26th – from 6-8 PM in Olin Auditorium.  The second is Thursday, March 29th – from 6-8 PM in Hanson Science 102.


PLEASE PLEASE PLEASE go out of your way to encourage any senior you know to come to one of those two sessions.  We still have about 300 of the $25 gift cards to the Augie bookstore to give away to the seniors who show up.


Frankly, I’ve got nothing else to say at the moment.  Put more honestly, I’ve got no time to write anything right now – I’m doing everything I can to increase our participation rates so that all of you have data that we can use over the next several years.


Yes, I really am “all about you.”


Make it a great day – make it a Wabash National Study day!


Mark

Look what happens when you use your data to improve!

Even though I know you have plenty of things to think and fret about these days with the start of a new term and the little matter of a proposed calendar and curriculum revision, I hope you are enjoying the weather and finding ways to keep your students motivated despite it!


With that said, I hope you’ve also had a chance to look through your IDEA course reports from the winter term and your packets of student forms.  Although many of you have attended one of the “interpreting the IDEA reports” sessions over the last year or so, I know that some of you continue to have questions.  I’m glad to sit down with you any time and answer any questions you might have.


I would like to share some of my observations after seeing almost every report over the last two terms.  My hope is that these observations are helpful, not only as you think about using your reports to inform your course design for future terms, but also as you consider whether the switch to the IDEA Center process has helped Augustana College improve our teaching and student learning.


First, it appears to me as if the average PRO score (Progress on Relevant Objectives) went up between fall and winter terms.  There are a number of potential explanations for this – the types of courses offered, student acclimation to college (within the year as well as for first-year students), and general attrition among those least able to succeed at Augustana.  But it struck me that there are also some reasons why we might expect learning (as represented by the PRO score) to decrease in the winter term – most notably the big break in the middle of the term and its impact on students’ motivation to restart the academic engine or remember what they had learned prior to the holiday break.  So I don’t think it’s completely out of bounds to suggest that the increase in the overall PRO score is worth noting.


Second, it appears that many faculty members reduced the number of learning objectives they selected for their individual courses.  I would argue that this is probably a good thing in the vast majority of cases.  First, I interpret the number of objectives selected as an indication of focus rather than an indication of learning.  In other words, as I’ve noted to some of you, in many cases your students reported substantial learning on objectives that you did not select.  In fact, it wasn’t uncommon at all to find that faculty who selected fewer objectives could have selected additional objectives and the PRO score would have remained the same or even gone up.  The decision to select fewer objectives and focus on them set the conditions for the “spillover” learning that was then evident on your reports.


Conversely, for faculty who initially selected many objectives, the results of those reports suggested that the diffusion effect I have mentioned repeatedly held true more often than not.  Folks who initially selected many objectives often found that, although some of the objectives they selected played out as they had intended, there were enough objectives on which students reported lower average learning that the average PRO score suffered as a result.  The drop in the average number of objectives selected suggests to me that more faculty have engaged in exactly the kind of purposeful thinking about course design and course outcomes that the adoption of this instrument was intended to produce.  Some of you might argue that this is only evidence of “gaming the system.”  I would argue that if “gaming the system” sets better conditions for learning, then you can call it “manipulating,” “negotiating,” or “peppermint bon bon” for all I care.


With all of the uncertainty and ambiguity that goes with the work that we do – especially when it comes to trying to make decisions about the future of Augustana College – I think it is useful to look at a decision the faculty made last year and assess its impact.  In the case of the decision to switch to the IDEA Center system, I think that there is preliminary evidence to suggest that this switch is helping us improve the conditions for optimal student learning.  Whether or not it actually directly impacts student learning – I think that is a question for another Delicious Ambiguity Column that I will write more than a few years from now.


Make it a great day,


Mark

What do we know from our prior Wabash National Study data?

I am going to cut to the chase here – tonight is the first opportunity for seniors to participate in the final phase of data collection for the Wabash National Study of Liberal Arts Education.  It all starts at 6 PM in Hanson 102.  Please encourage your senior students to participate.  And remember – tell them that the first 400 participants get a $25 gift card to the Augie bookstore.


Instead of telling you why I think the Wabash National Study might be so valuable to Augustana College, I thought I’d show you.  Over the course of this year, I’ve written 21 columns, almost all of them trying to help us think about ways that we can use our institutional data to improve what we do.  Nine of these columns examine data that is part of the Wabash National Study.  Just in case you’ve forgotten, I’ve listed them below and provided links to the full columns.



And these columns are only a minuscule sampling of the kinds of questions that could be answered using this dataset.  Moreover, if we can get enough seniors to participate, we could answer these same questions – and many others – within the context of each major.  This is the kind of data that would be gold for anyone thinking about how to make their major experience the best it can possibly be.


I hope this demonstrates a little bit of why I’m asking you to help promote this study and encourage your students to participate.  If you have any questions about it, please don’t hesitate to email me.


Make it a great day,


Mark

Why should our seniors participate in the Wabash National Study?

When things get really hectic, I have a hard time remembering what month it is.  Judging by the snow falling outside as I write this first column of the spring term, it’s not just me.  Fortunately, we all have our anchoring mechanisms – our teddy bear or our safe space that keeps us grounded.  For me, it’s the Wabash National Study senior data collection that will occur in March and April.  At long last, it’s time to find out from our seniors how their Augustana experience impacted their development on many of the primary intended outcomes of a liberal arts education.  (I know.  Own it!)


I believe that the data we gather from the Wabash National Study could be the most important data that Augustana has collected in its 150+ year history.  I’d like to give you three reasons why I make this claim, and three ways that I need your help.


First, the Wabash National Study measures individual gains across a range of specific outcomes.  Instead of taking a snapshot of a group of freshmen and a snapshot of a different group of seniors and assuming that those two sets of findings represent change over time, in this study we will have actually followed the same group of students from the first year to the fourth year.  Furthermore, instead of tracking only one outcome, this study tracks 15 different outcomes, allowing us to examine how gains on one outcome might relate to gains on another outcome.


Second, the Wabash National Study is the first and only study that allows us to figure out which student experiences significantly impact our students’ change on each outcome measure.  In other words, from this data we can determine which experiences improve gains, which experiences inhibit gains, and which experiences seem to have little educational impact. Furthermore, this data allows us to determine whether the gains we identify on each outcome are a function of pre-college characteristics (like intellectual aptitude) or a function of an experience that happened during college (like meaningful student-faculty interaction).  This gives us the kind of information on which we can more confidently base decisions about program design, college policies, and the way we link student experiences to optimize learning.


Third, as we continue to try to more fully embody a college that assesses itself based on what we do rather than what we have, this data can provide a foundation as we think about clearly articulating the kind of institution we want to be in the future and how we are best able to get there.  In the past decade, we have collected bits and pieces of this kind of data from NSSE, CLA, and various Teagle-sponsored studies – all important evidence on which we have based critical decisions that have improved the quality of the education we provide.  This time around, we will have all of that data in one study, allowing us to answer many of the questions that we need to answer now; questions that have previously been exceedingly difficult to answer because the applicable data was scattered across different, often incompatible, studies.


But just because we are going to try to collect this data from our seniors over the next two months doesn’t mean we automatically get to have our cake and eat it, too.  Our seniors have to volunteer to provide this data.  Although we have some pretty decent incentives ($25 gift cards to the bookstore and group incentives for some student groups), this thing could be a monumental belly flop if no one shows up to fill out our surveys.  This brings me to how you can help.


1)      Make it your mission to tell every senior with whom you interact to participate in the survey.  We are going to invite them by email, announce this study at various student venues, and hopefully have some articles in the Observer.  But the students need to be encouraged to participate at every turn.

2)      Tell them why they should participate!  It’s not enough to ask them to do it.  They need to know that this will fundamentally shape the way that we construct Augustana College for the next generation of students.  They can play a massive role in that effort just by showing up and filling out some surveys.  Oh, if only the rest of life were so easy!

3)      Remind them to participate.  We will have four different opportunities for seniors to provide data.  We will give $25 gift cards to the first 100 students at each session – so if they all wait until the last session to participate, most of them won’t get the incentives we would like to give them.  The dates, times, and locations of these sessions are:


  1. Monday, March 12, 6-8:30 PM in Science 102
  2. Monday, March 26, 6-8:30 PM in Olin Auditorium
  3. Thursday, March 29, 6-8:30 PM in Science 102
  4. Thursday, April 26, 10:30 AM – 12:30 PM in John Deere Lecture Hall


Thank you so much for your help.  Just to let you know ahead of time, I’m not going to shut up about this data collection effort until we give away all of the gift cards or we run out of data collection dates.  Yes, it’s that important.


Make it a good day,


Mark