Recently, a colleague posted a comment about how some educational technology leaders were taking the focus off technology and putting it back onto learning. This is good. Many of us in the educational technology field have always put the emphasis on the learning, with technology used only where appropriate. By “technology” I mean any item that has been invented in our conscious lifetimes (e.g., smart boards, blogs). Sometimes the best approach is old-school tech like chalk.
What bothered me was the blog post by George Siemens that was linked. I was surprised at his interpretation of the two articles he referenced, and at how the five elements of his framework for ed tech are not applied more broadly in education. Even though Siemens’ interpretation surprised me, I am not worried about how he moves forward. His Learning Innovation and Networked Knowledge (LINK) Research Lab at The University of Texas at Arlington works toward “advancing social and technological networks, designing innovative learning models, and exploring the future of higher education.”
My concern is with those who read his post and take it at face value.
Making Sense of the Articles
The first article is “Teaching tomorrow,” an interview with Sebastian Thrun in The Economist. From this article, Siemens cites the opening quote, “BECAUSE of the increased efficiency of machines, it is getting harder and harder for a human to make a productive contribution to society.” Siemens then goes on to state:
If that is true, why is his startup trying to teach humans? Why not drop the human teaching thing altogether and just develop algorithms for making the stated productive contribution to society? He also details nanodegrees which are essentially what we in academia have to date called “certificates”. Perhaps we can call them nano-robo-certificates. Making up words is fun when media attention is petitioned. Most discouraging about this is that I’ve met Sebastian and he is a friendly, caring, deeply motivated person. The Thrun-of-media doesn’t align with the thoughtful Thrun-in-person.
The last sentence suggests a conflict, in Argyris and Schön’s terms, between Thrun’s theory-in-use and his espoused theory. This would be troubling if it were the case, yet within the same paragraph as Thrun’s first quote, Thrun is quoted as saying “To the extent we are seeing the beginning of a battle between artificial intelligence (AI) and humanity, I am 100% loyal to people.” Nice words, but why Udacity and nanodegrees? Thrun states:
The dream of lifetime employment has gone. In my field, whatever you’ve learned becomes obsolete within five years. If you only spend six months on your first degree, as opposed to the average six years for a bachelor’s degree today, you can afford to get more education when you need it again later on.
Udacity claims of its nanodegrees: “Compact & Flexible. Designed for busy people. Complete in 6-12 months.” I feel there is value to longer-term learning beyond 6-12 months, but I do not discount the value of any learning, whether it takes six years or 6-12 months. More important, what do people outside of higher ed value? You know, the ones threatened by AI.
Siemens asks “why is his startup trying to teach humans? Why not drop the human teaching thing altogether and just develop algorithms for making the stated productive contribution to society?” The article ends with the answer from Thrun:
We have a situation where the gap between well-skilled people and unskilled people is widening. Udacity is my response to the development of AI. The mission I have to educate everybody is really an attempt to delay what AI will eventually do to us, because I honestly believe people should have a chance.
Thrun is not the only person concerned about the advancement of AI. Bill Gates, Elon Musk, and Stephen Hawking have expressed similar concerns. Back in 2009 I read a book by Martin Ford titled The Lights in the Tunnel, in which he laid out his concern that AI will eliminate much of the productive work done by people, and that people need to be productive in some form in order to maintain a sense of purpose. When I talked with a colleague about the book, he referred me to a piece of science fiction from 1952, Player Piano by Kurt Vonnegut, Jr. For me, Vonnegut sums it all up nicely with the very last words of the book: “Forward March.” Thrun recognizes we are marching forward, and he appears to be attempting to mitigate the negative effects upon people.
This Robot Tutor Will Make Personalizing Education Easy
The second article is “This Robot Tutor Will Make Personalizing Education Easy,” an interview with Jose Ferreira about his company, Knewton, in Wired. Siemens cites a quote from the very end of the article where Ferreira states “but this robot tutor can essentially read your mind.” Siemens then goes on to state:
His rhetoric doesn’t align with the real challenges of education where cognitive capability alone is a small factor in learner success. Robot tutors will not make personalized learning easy. Learning is contextual, social, and involves whole person dynamics.
I agree with Siemens about cognitive capability alone being “a small factor in learner success.” Contrary to the belief of some in higher ed, we have the whole person in front of us, not just their mind. Beyond this, Siemens seems to think Ferreira just wants students to “sit and click.” While Knewton could certainly facilitate just this approach, a more engaged teacher would use the personalization of this one aspect of content acquisition to free up time for different learning activities related to the subject. This is the idea behind a “flipped classroom.” If each student were able to complete a personalized lesson prior to coming to class, an engaged teacher could design an activity built upon that lesson.
Prior to Siemens’ blog post, I had not worked with Knewton, so I took a lesson on the Scientific Method to see what one snapshot of Knewton was like. From that experience alone I am unable to tell how my lesson differed from someone else’s, but I will allow that Knewton personalized the lesson based upon my answers. Given this, if this were all the teacher had me do, I would be in total agreement with Siemens.
What if the teacher had me complete the Knewton lesson before coming to class, and in class I worked in a small group of my peers creating a hypothesis? Each group could come up with a hypothesis, and then we could switch tables to review the hypotheses and determine if they were properly formed. No “sit and click” here. The teacher could be going around asking questions of students that would feed the “confusion process of learning” Siemens and I desire, and the students would be engaged in active learning. Knewton is sounding more promising when viewed in this light.
Hidden in Siemens’ assertion:
Both Udacity and Knewton require the human, the learner, to become a technology, to become a component within their well-architected software system. Sit and click. Sit and click. So much of learning involves decision making, developing meta-cognitive skills, exploring, finding passion, taking peripheral paths. Automation treats the person as an object to which things are done.
is the notion that we always do something different than this in a “traditional” class. What if I reworded Siemens’ quote to be:
Courses require the human, the learner, to become a receptacle, to become a component within the teacher’s well-architected course. Sit and listen. Sit and listen. So much of learning involves decision making, developing meta-cognitive skills, exploring, finding passion, taking peripheral paths. Traditional teaching treats the person as an object to which things are done.
I would certainly hope many teachers would take umbrage at this statement. Unfortunately, I expect all of us can remember a course that was exactly as described above. There is no magic fairy dust when people are involved. This is why the fairies, not humans, have the dust.
Knewton is just a tool. It may be used for good or ill. The human without the fairy dust will determine how the tool is used or abused.
A Framework for More Than Ed Tech
Siemens states that his current framework for technologies in the educational technology space rests on the following five elemental questions:
- Does the technology foster creativity and personal expression?
- Does the technology develop the learner and contribute to her formation as a person?
- Is the technology fun and engaging?
- Does the technology have the human teacher and/or peer learners at the centre?
- Does the technology consider the whole learner?
I like these questions with the exception that I feel the teacher should not be at the center (#4). I think they will guide his LINK Research Lab well.
Why stop at educational technology? What if we took these questions and expanded their focus?
- Does the technology (or teacher, major, college) foster creativity and personal expression?
- Does the technology (or teacher, major, college) develop the learner and contribute to her formation as a person?
- Is the technology (or teacher, major, college) fun and engaging?
- Does the technology (or teacher, major, college) have the ~~human teacher~~ learner and/or peer learners at the centre?
- Does the technology (or teacher, major, college) consider the whole learner?
Now we are getting somewhere. Before someone tells me this is what they (i.e., teacher, major, college) are already doing, they should be prepared for me to ask what their evidence is. Put up or shut up.
In any event, forward march!