There is a concept, based on common sense and first articulated by Marc Prensky, which holds that children brought up with digital devices around them can be considered “digital natives” – they are comfortable and resilient with technology. Others, who struggle to adopt new technologies, are “digital immigrants” – technology is not as user-friendly for them as it is for digital natives. The only problem with this idea is that it is complete nonsense.
I can agree that today’s students, whom many are calling “digital natives,” want and expect to learn in a different way than their parents or grandparents did. Though they certainly can learn without technology, teachers assume students prefer using computers, smartphones, and video games to learn. These things are considered part of students’ natural environment. They are considered givens. The idea of the digital native presupposes not only that they use technology, but that they are adept at it, and that these innate skills transfer to any technology. As Marc Scott’s article “Kids Can’t Use Computers… And This Is Why It Should Worry You” points out, this couldn’t be further from the truth.
Yes, students love games and apps, but ask them to fill out a form or set up a Google Drive account and share a document and they will most certainly flounder. Case in point: my brother. My brother is 10 years younger than me and most certainly fits the definition of a digital native. There are many things he can do on his computer: play video games, download trojans, download malware, install bloatware to get rid of the malware, and download viruses. He has been through at least three computers, all of which were “too slow” because of the massive amount of crap he had inadvertently downloaded. I don’t know how many times I have taught him to download movies at a hefty discount, and he still can’t learn. Now, my brother is not stupid. Nor does he have a learning disability. It’s just that he has never taken the time to learn any useful computer skills. He is not a digital native. He is a digital tourist – just visiting the sights without learning the culture.
Another telling case in point is Korea. It is considered the most wired nation on Earth and has been responsible for advancing all sorts of mobile computing technologies. A large proportion of the population is glued to their phones. People have died from gaming for too long. But ask a student to go to YouTube and they will take this circuitous route: type “Naver” (Korea’s search portal) into the address bar, click on the Naver link, search for “YouTube” or “UTube,” and then click on the YouTube link. Maybe it’s a pet peeve, but when I saw my middle school students doing this I was dumbstruck. Here was the epitome of the digital native, and they couldn’t even complete a simple search. I don’t know how many times I told them my website address, and they would constantly search for anthony.com or anthonyteacher in Naver. I gave my university students a Google-shortened URL (something akin to http://goo.gl/RTY17ok). I can forgive the problems with case sensitivity, but more than half were trying to search for the URL, and when I asked them to use the browser, they pointed to Naver and said they were. If these are students’ web-browsing and search skills, what other skills are they missing? Like my brother, these students are digital tourists.
I think those whom people consider digital natives are in fact digital tourists. Prensky’s digital immigrants would fit this term as well. Digital tourists see and interact with only the surface of the digital world. They know how to choose their tour packages, but they can’t go it alone. They don’t spend enough time learning the environment, playing with it, manipulating it, crashing it (and then fixing it). No one is born a native. Unlike eating, socializing, or sex, digital technology is not a human instinct. We are all tourists. It’s just that at some point some of us decide to “go native.”
Recently, there have been efforts to make more students go native. “Coding literacy” is a buzzword going around the education world these days. Basically, it means teaching programming as a core subject, especially to young learners. I am 100% in support of this. However, if it means teaching programming at the expense of basic computer skills (hardware, software, and the web) and proficiency with word processing, presentation, image, and video software, then I am not in support of it.