I just read the Key Findings (pdf) of The ECAR Study of Undergraduate Students and Information Technology, 2007 and see some really big differences between my relationship with technology compared to that of these American undergraduates. I wanted to share some of my observations with you here.
And yes, I am conscious that I’m comparing my behavior (as a current postgraduate student of online and distance education with a UK university while living in the US) with that of undergraduate students in the US, but I think that this is still a really useful reflective exercise to see how I am positioned within this group of students. After all, as a learning technologist, I am indirectly affecting their learning experience by working directly with their teachers in developing teaching aids that integrate information technology to varying degrees. Essentially, what I’m saying is that I feel it’s important to know your audience. If my job is to work with the teachers who educate undergraduates, then one way to ensure that I’m doing the best job possible is to know how my ICT recommendations relate to the end user.
So, now for some telling comparisons…
The above comparison explains a bit of my aversion to m-learning: I’m not really that interested in cell phones, and perhaps this makes it more difficult to get excited about mobile learning. On the other hand, my heavy use of electronic music/video devices perhaps also helps explain why I’m an advocate of podcasts and audiobooks as learning tools.
How much do those closest to us influence the way we view and use technology? Is developing ICT skills more of a social activity than a solitary act? Do we participate because we want to keep up? …to remain relevant in conversations? …because we see that others’ lives have been enriched? …or does like attract like? Do innovators all ride the techie wave while the laggards collectively dip in one toe at a time? Can an innovator convert a laggard?
Only 2.8 percent prefer courses that use technology exclusively?! I thought the majority would be in this grouping! If I ask myself why I held this perception, it’s probably because I assumed that since this generation grew up immersed in a technology-rich environment, they would want that environment to extend naturally into their formal learning space.
I am one of those students who will opt for the 100% technology-based course almost every time! The study notes that 60% of those asked prefer a ‘moderate’ integration of technology into their courses. What does moderate mean to an undergraduate these days? Moderate to me would mean a standard course wiki, student blogs, podcasts, the occasional webcast (not necessarily live) and the obvious course management system. Is that moderate to you?
Perhaps a key to answering this last question reveals itself in the results from the open-ended questions where students indicated that IT becomes a barrier to learning when its proliferation creates a more complex learning environment. Could this mean that if they knew how to use the technology from the beginning of a course, it would no longer represent a barrier? Does this mean that secondary education isn’t preparing students enough for post-secondary learning? Or are students expected to learn about how to create a blog, contribute to a wiki and subscribe to a podcast in their own time?
I have to wonder whether evaluating potential candidates for learning technologist / instructional designer jobs in this way would help institutions and private companies better align competencies and interests with job goals and broader organizational targets.