Just the shortest of backgrounds before I get started: I have an MS and PhD in Education, with a specialization in Curriculum and Instructional Technology for both. I have done professional tech support off and on for more than 20 years. And I have been a Professor in a College of Education - teaching classes on all manner of topics related to instructional technology, assessment, accreditation, teacher preparation, and learning theories - for more than 16 years.

Now, the rant:

My life as a student closely followed the introduction of computers into the classroom. There were days when the computer was rolled into the classroom on its own cart, waiting eagerly to be used by the first kid who finished his regular homework. As I finished high school and entered college, the “Computer Lab” was a sacred space - a space guarded by people who desperately wanted to protect the devices from anyone who was there for any reason other than “serious work”. Despite this, there were still games.

Then I went to grad school, and computers started coming down off their pillar as objects of awe and wonder and started seeing practical uses. A computer could help you learn your math facts, or guide you on a fantastic around-the-world adventure, or immerse you in a simulation. Soon, it became obvious - at least to me - that gaming hardware had way more processing power than the units available to classrooms. Why couldn’t a PlayStation 2 host an immersive simulation of some complex topic? The PS2 had driving games, and flight simulators, and all manner of first-person games that offered rough approximations of these kinds of learning events.

When I finished my PhD, I was ready to take on the world, man. I knew how to design immersive simulations, and all I needed was a programmer to develop one. I even got involved with a couple of grant proposals in which the user was supposed to learn why certain types of molecules are hard (or impossible) to make, because the “size” of the atoms can get in the way of their forming strong bonds. There was even a “proof-of-concept” activity created, showing how the size and vibration of a molecule can prevent atoms of certain sizes from bonding with it (despite the theoretical possibility). That idea got dashed because the proposal pointed to a location on a host’s website that got restructured without forwarding requests to the new location (in short: things got screwed up). To say I was disappointed would be understating it substantially.

I was a professor by then, ready to do all kinds of amazing work in the field, with an initial aim toward helping people learn science more effectively and efficiently. Despite my interest, I couldn’t get on the same page with the science education folks to make anything really cool. And then service came calling. I became a program director. I became the assistant dean. I coordinated a very successful accreditation visit. I returned to the faculty a couple of years ago.

So, in that span of 16 years or so, things have gotten way better, right? Technology is used in valuable ways to help people learn more effectively and efficiently, right? I taught the early course in the MS program that introduces the fundamentals of the field. The students in that class knew how to use their devices very well, and they knew how to incorporate certain kinds of activities into their teaching. But when asked to pay attention to the value that technology brings, or to articulate how technology helps learners learn better or faster, their examples were all anecdotal. “My kids love Classroom Dojo…” or “We use a lot of Padlet to get students to participate…” or “I use Kahoot to test whether students understand…”, etc., etc., etc.

The problem is that these uses are no better than well-structured classroom discussions, worksheets, or tutorials. Where are the examples of students using available materials to create something amazing? Or of students who use technology to go beyond what is provided as evidence of understanding? Where is the sandboxing, or prototyping, or “goofing around” in a way that helps a student truly build understanding? Instead, it seems like these students are merely finding the most efficient way to learn the stuff on the exit ticket, without putting themselves at risk of actually understanding anything better than they did before.

And in this sense, it is a waste to use expensive technology to do nothing better than fill out worksheets and select the best available answer to the provided question. Just as the worksheet-and-multiple-choice method was not much better than rote memorization, manipulating personal devices for no material learning gain is no better than a complex gimmick - like the machine that goes “ping”, which so fascinated the hospital administrator in that sketch from Monty Python’s “The Meaning of Life”.

To this point, I’ve been engaging in what I would call a “heavy dose of complaining”: expressing dissatisfaction without offering any useful options for improvement. So let’s talk, in bullet-list terms, about what could be better:

  1. Teachers need better training in using technology to teach ALL students. This is the role of teacher education programs.
  2. There must be better software created to allow the wide variety of impressive technologies to support learning, rather than merely entertain.
  3. Students must be taught technology literacy skills from the very first moment possible. Simple exposure is not enough.
  4. Once students have impressive personal devices and strong literacy skills, attention can turn toward the self-management of inquiry.
  5. Lastly, I know that nobody truly *needs* technology. But, to have it and use it poorly is a *shame*.

Other rants in this section will talk about these topics, and others will be added as they become necessary!

Comments?