In the world of Instructional Technology, evaluating technology tools is a widely misunderstood practice. School districts are under pressure to quantify the impact of technology integration on student achievement, and that pressure often leads districts to try to draw a direct line between student achievement and a tool. They’ll often hear community questions like “Where is the data that shows that ‘this’ works?” or “How do these devices raise test scores?” These questions are certainly important, and I don’t want to make light of parents and school leaders wanting to see their children meet their highest potential. However, there is an inconvenient truth that must be recognized, and that truth is this: that is not how tools are measured! Let me repeat. That. Is. Not. How. Tools. Are. Measured. In this post, I will explain why this is the case, as well as how tools should actually be evaluated.
Imagine that you go for a walk through your neighborhood. Birds are chirping, the sun is shining, the breeze is blowing. It’s a beautiful day. You choose to walk down a street you’ve never ventured onto before. As you turn the corner, you see a house that makes your heart skip a beat. In fact, you are certain that you’ve never seen a house so exceptional. You decide in that moment that it is time to sell your house and have a new one built. With reckless abandon, you head for the front door in hopes of speaking with the homeowner. You ring the bell, and the owner answers the door. At that point you ask, “What kind of hammers, saws, and drills were used to build this home? I need to know what tools to buy so that I can be sure that my newly built home is just as beautiful as this one.” The homeowner gives you a confused look and slowly closes the door. But nothing can dampen your mood. You decide to go out to dinner to celebrate this exciting event. After a delicious meal, you have the most incredible chocolate souffle, and you notice that the head chef has come out into the dining room to greet guests. You wave to her, and she comes over. “I absolutely loved the souffle, and I’d really like to make one at home. What kind of pots, spoons, and measuring cups did you use to make it taste so delicious?” The chef’s face displays an expression that looks oddly familiar, and then she slowly backs away.
Now you might think that these examples sound silly, because things like hammers and pots are merely tools, and tools can’t be held directly responsible for a culminating outcome. You know what? You’d be exactly right. Tools don’t work like that, and they can’t be measured by those kinds of outcomes. So now imagine the average school classroom. You will find tools like pencils, paper, books, and rulers, all the way up to interactive whiteboards and digital devices. It is absolutely impossible to draw a direct line from any of those tools to the culminating outcome of student achievement. The reason lies in two very important words. Those words are “primary” and “ancillary.” In the examples above, the craftsmanship of the architect, builders, and chef is primary, in that the culminating outcomes are a direct result of their ability to apply and utilize various tools to achieve those outcomes. The tools are ancillary, in that they merely support the craftsmanship and skill of the individuals, and don’t truly serve a purpose until they are incorporated into a well-designed task.
Okay, okay, I can hear your frustration. “Are you saying that there is no way to truly evaluate a tool?!” Absolutely not. However, it’s important that you evaluate against the correct criteria. If you remember nothing else from this post, please remember this: tools are measured by a very simple guiding question. That question is, “To what degree does the tool perform the task that it is made to perform?” It’s just that simple. To what degree does the architect’s adjustable triangle help to draw straight lines? To what degree does the builder’s nail gun drive nails into wood? To what degree do the chef’s pots efficiently conduct heat? To what degree do digital devices help students to collaborate? To what degree do they make it easy for teachers to collect formative data? To what degree do they make it possible for students to have choice when learning new content? You should never ask, “To what degree do adjustable triangles and nail guns create beautiful houses?” You should never ask, “To what degree do pots make a delicious souffle?” And lastly, you should NEVER ask, “To what degree does a digital device lead to student achievement?” Because, and I know that you may have heard this once or twice, that’s not how tools work.
Now let’s get down to business, because I don’t want you to get the wrong idea. Let me make this perfectly clear, and I can’t say it strongly enough: student achievement is vitally important, and we should be constantly evaluating everything that has an impact on the degree to which students achieve. But make no mistake, when you are measuring student outcomes, and that which has the most direct impact on those outcomes, you are in fact evaluating the teaching. You are evaluating the degree to which the curriculum, teaching strategies, and overall pedagogy resulted in students’ ability to grasp new content and to demonstrate that new mastery. Frankly, THIS is the discussion that we should all be having. If you are a person who is trying to find a direct link between instructional tools and instructional outcomes, I urge you to shift your thinking. Instead, ask the question, “To what degree is the design and implementation of teaching and learning best meeting the needs of my child?” Ask the question, “To what degree can my child’s learning environment allow her to have choice in deciding her own learning path?” And lastly, “To what degree is my child able to leverage his strengths in demonstrating his understanding of new content?” If we can shift our thinking to analyzing that which has the most direct impact on student achievement, we will be one giant step closer to ensuring that all students are able to achieve to their greatest potential.