When I was in what would here be called high school, around the early 1990s, I took informatics as one of my classes. The students in that class had a considerably wider spread of prior knowledge than in any other class; some had already dabbled in a programming language, while others were afraid their computer would break if they typed in the wrong letter. Perhaps unsurprisingly, the teacher had much more fun interacting with the former group.
When I was an undergraduate, I also took informatics as a minor. The lectures and tutorials had an interesting audience. The percentage of female students was somewhere south of 10%, something I was totally unused to from my biology lectures, where it was closer to 60%. More relevant for present purposes, there was again the same massive spread between people who already knew how to program and others who seemed to be touching a computer for the first time.
I particularly remember two fellow biology students who dropped out of informatics after the first few weeks. The whole idea of how you need to make a computer understand what it should do just never clicked for them, down to concepts as simple as a variable – when tasked with increasing an integer variable called x by one they would, right to the end, try things like “integer + 1” instead of “x = x + 1”. And while one might argue that the instructors failed these students, who can blame the instructors when most of the other students were progressing well?
Given the ubiquity of computers in young people's lives today, I had kind of assumed that these days were gone. The past week, however, I inadvertently put that belief to the test. I gave a practical on using quantitative analyses of morphological data for species delimitation, specifically ordination, hierarchical clustering, and non-hierarchical clustering. My chosen software was R, and although I had prepared a script that the students only had to execute without needing to understand any of the commands, I felt some trepidation.
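For readers curious what such a practical might involve, here is a minimal base-R sketch of the three analyses mentioned above. The data frame, its column names, and all parameter choices are invented for illustration; the script the students actually ran differed.

```r
# Hypothetical morphological data: 30 specimens of two putative species,
# four measurements each (all values simulated for illustration)
set.seed(42)
morpho <- data.frame(
  leaf_length = c(rnorm(15, 10, 1),   rnorm(15, 14, 1)),
  leaf_width  = c(rnorm(15, 4, 0.5),  rnorm(15, 6, 0.5)),
  petiole     = c(rnorm(15, 2, 0.3),  rnorm(15, 3, 0.3)),
  tooth_count = c(rpois(15, 5),       rpois(15, 9))
)

# Ordination: principal component analysis on scaled measurements
pca <- prcomp(morpho, scale. = TRUE)
plot(pca$x[, 1:2], main = "PCA of morphological measurements")

# Hierarchical clustering on Euclidean distances between specimens
hc <- hclust(dist(scale(morpho)), method = "average")
plot(hc, main = "UPGMA dendrogram")

# Non-hierarchical clustering: k-means with two groups
km <- kmeans(scale(morpho), centers = 2)
table(km$cluster)
```

Everything here is in the stats package that R loads by default, so a script along these lines can indeed be executed step by step without installing anything.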
It starts with the computers themselves. Just a few years ago I had the feeling that very nearly every student owned a laptop suited to such a computer practical. But the past few years have seen a shift in what hardware people are likely to own, so now perhaps half of them have only a tablet; great for reading a PDF or watching YouTube clips, not so great for using a scripting language.
More importantly, I found that the spread in prior knowledge had not changed in the intervening twenty-plus years. About five students raised their hands when I asked if anybody had used R before, something I found seriously impressive for a second-year course. At the other end of the spectrum were at least two students who appeared to find it difficult to interpret the meaning of the term “working directory” (or folder).
I want to make it clear that I have no problem whatsoever with the latter. I have no expectation that second-year students necessarily have any experience with computers beyond checking email and using a search engine. I could happily design a course to give them a good, useful learning experience.
But what I have no idea how to do is design a course that gives both them and those who have already used R a good, useful learning experience at the same time. Focus on walking the beginners through every step, and you will find the experienced students getting bored and starting to chat about their weekend plans. Focus on engaging the experienced, and you will hopelessly frustrate the beginners, to the point where they simply give up.
I may be mistaken, but I think there is really no other area where the same thing happens. Yes, students come with somewhat different levels of prior knowledge in every subject, but surely we don't have some of them walking into a molecular genetics practical saying “ah yes, Western blot, I already did that in summer camp when I was twelve” while others have never even heard of DNA.
So what to do? At the moment I don't have any good ideas.