Mane (CRIN 604)

Digital tools in education

March 8, 2010

A Comedy of Errors

Filed under: Academic Technology, Opinio — mepada @ 1:59 pm

If I have learned anything at all during my almost 27 years in Academic Technology (it used to be called Instructional Technology), it is this:

You may have found the coolest and most innovative of technology tools and want to use it in your classroom, meeting, etc., but unless you carefully plan its implementation, consider its affordances and constraints, and test it to make sure it works the way you want it to work…you could be inviting disaster!

This is exactly what happened a couple of weeks ago, when the implementation of a technology tool went terribly wrong!

Even though this particular incident did not take place in the classroom, I have witnessed similar situations unfold inside learning environments.

The culprit?  Opinio, William & Mary’s supported survey/research tool.

This is how this event developed:

I was contacted by a faculty member who wanted me to develop a survey that would be sent to the whole faculty body.

Guided by the text I was given, I designed the survey and sent it back to the faculty member with some instructions:

  • Check the questions for typos and grammatical errors.
  • Check the survey for clarity to make sure the questions are clear and concise.
  • Check the features I selected for the survey to make sure it “behaves” exactly the way they want (attributes such as “save and return”, authentication, anonymity, “look and feel”, etc.).
  • Make sure the invitations are set up properly (email addresses, number of respondents, etc.).
  • Test it! Test it! And test it again!
  • Send a message to the faculty letting them know that this survey is coming their way.

I then received a message indicating that everything “looked fine” and that I should go ahead and launch the survey.

I launched the survey.

Another survey successfully launched! …or so I thought!

  1. That same day, I started to receive emails from faculty who did not know what this survey was about or why they were receiving it at all. They asked me if it was “legit” and whether they should even open the link.

No advance email had been sent to the faculty. They were not aware this survey was coming their way!

I immediately contacted the committee and asked them why they had not sent a “heads up” email to the faculty. Answer: they had forgotten.

They promised they would send a message ASAP explaining what this survey was and why they needed to complete it. (I am sure this came a bit too late for some, who by then did not want to “waste their time”.)

I re-sent the survey after the faculty email was sent to the whole group.

  2. The next day, when I was in the middle of my radio show, I got a voicemail with an urgent message to stop the survey.

I rushed to a computer and stopped it.

The survey was not supposed to launch yet. It had been launched two days earlier than the committee had planned. They had sent me the wrong send date!

I proceeded to re-send the survey on the scheduled date, which meant that I had to delete the responses already stored and send a message to those respondents telling them they needed to re-take the survey! (Not an effective way of delivering a product at all! I am sure some did not bother to respond a second time.)

  3. A couple of hours later, I got another phone call: I needed to stop the survey again! This time, a member of the committee told me that the survey text was not the correct text! They had missed a couple of important items. Difficult to believe, right?

I was now having a difficult time believing this comedy of errors unfolding in front of me!

I stopped the survey again.

They sent me the text with the missing items.

I rewrote it.

I re-sent the survey.

At this point, the committee had no choice but to send a message to the whole faculty body explaining what had happened and asking them to please re-take the survey one more time! The plan was to delete all stored responses (140 by then) and start fresh!

No “glitches” occurred after that!

Once the survey closed, I generated the reports: less than half of the faculty ended up participating. We usually get a much higher response rate.

Lessons learned? You bet this committee plans to do things differently in the future.

When discussing the whole experience with them, I found out the obvious: they had not followed my pre-launch instructions at all! They apologized profusely and promised me a cup of coffee (or a glass of beer) for all my troubles!

On my part, I will have to become more of a baby-sitter and micro-manager (which I hate to do!). We all need to understand the importance of good planning, testing, assessment, and follow-up to implement any academic technology successfully and effectively!

Some places recognize that planning is essential and offer robust support systems:

Harvard: http://icg.harvard.edu/icb/icb.do?keyword=atg&tabgroupid=icb.tabgroup78642

University of Minnesota: http://dmc.umn.edu/consultations.shtml

Link to Opinio: http://www.objectplanet.com/opinio/