In education these days it’s all about the data: assessment data, learning outcomes, retention rates, graduation rates. But what strikes me is that much of this drive for data is carried out with seemingly little knowledge of, or interest in, the kind of information that could actually yield useful insights into students’ behavior and how to provide them with valuable learning and advising experiences.
Here’s what I mean. In one course I teach, Introduction to Logic, there is a specific learning outcome we are supposed to measure. Every section of the course taught at the college has the very same learning outcome and the same assessment tool to measure it. The learning outcome is: Formulate a deductive argument. The assessment tool is just as simple: Formulate a deductive argument in 250 words or fewer. The data we collect are simpler still. We track how many students attempted the assessment and how many of those completed it successfully. So, for example, in one section of the course 17/20 students successfully completed the assessment. In another, 20/20.
Now the question we should ask is: What, if anything, can be inferred from this data? Are there any useful insights to be gained from this data set? I can’t imagine that there are. The sample is too small. We don’t even know how the sample compares with the class size: the fact that 17/20 were successful doesn’t tell us how many students in the class didn’t complete the assessment at all. And what about students who withdrew from the course before the assessment was given? For that matter, what about those who withdrew after it was given? There is nothing useful this data can tell us if we are interested in such things as why students succeed in the course, why they don’t, why they stop coming to class, why they never complete the class, what they really experienced in the class, or what they really learned.
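Even setting aside the students who never took the assessment, the statistical uncertainty in a sample this small is striking. As a rough sketch (using a standard Wilson score interval, not anything from our actual reporting forms), here is what a 17-out-of-20 pass rate actually licenses us to conclude about the underlying success rate:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

low, high = wilson_interval(17, 20)
# The observed 85% pass rate is consistent with a true rate
# anywhere from roughly 64% to 95%.
print(f"17/20 passing: plausible true rate between {low:.0%} and {high:.0%}")
```

In other words, the same 17/20 result is consistent with a course where nearly everyone masters the outcome and with one where more than a third do not — the number by itself settles almost nothing.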
I could be wrong, and I hope that I am, but it seems that the vibrant research going on in behavioral economics and psychology is largely unknown in the circle of educrats who are most adamant about data collection. I say this because of the kinds of questions being asked and the kind of data being sought. The questions show no awareness of the important findings of behavioral economics, and the kind of data being sought shows no awareness of what data would actually provide the most insight into student behavior and success.
Here are just a few resources that could really inform our teaching practices and would provide educators with useful insights into the information we could really use to help our students the most.
Daniel Kahneman: Thinking, Fast and Slow
Dan Ariely: Predictably Irrational
Chip & Dan Heath: Made to Stick
Dan Pink: Drive
Steve J. Martin: The Small Big
These books provide a rich, sometimes counterintuitive, view of human behavior and motivation, and of how people respond to incentives. If you’re in education and in charge of collecting, analyzing, and using student data, and you are not familiar with these books, stop. Take time to do some research. I know there are probably better, kinder ways to say this, but the bottom line is that you are missing a deep appreciation of the landscape, not only of data collection and use but of how to effectively implement learning strategies that will have a positive effect on students.
One reason I can make these claims is that, in reading and researching these ideas, I have come to realize how much I myself do not know about what motivates students, how to create effective learning environments and assignments, and how to engage students in meaningful dialogue and learning experiences. I still have much to learn, which is why it is so shocking that there are those higher up in the system than I who seem to know even less about this.
As I said above, I could be, and actually hope that I am, wrong about this. But all I have to go on is data! In this case, the data consists of the ideas being presented, the questions being asked, and the forms to be completed. All of it suggests that there is much work to be done before we will really be ready to learn about the effectiveness of what we as educators are doing and what we can do to improve.