Warning - this is more of a conference report than a blog post.
Learning Analytics and Knowledge Conference - Indianapolis, March 2014
Both of my flights (Sydney -> Los Angeles -> Indianapolis) were three hours late: the first because of weather and a technical fault on the A380, the second because of a mid-flight medical emergency that forced an unscheduled stopover in Kansas City. Having handed the patient over to paramedics, the airline realised it had to fly in oxygen tanks to replace those used, and then, unbelievably, when we finally started taxiing the plane hit something on the runway and we had to sit and wait for engineers to inspect the damage. Flying is such fun!
The conference was held downtown, in the middle of March Madness fever. As luck would have it, my coat was red, which happened to be the colour of the local team :)
In the sessions I was able to attend there were three kinds of papers:
A. lots of what I would call really interesting but small-scale learning analytics projects
B. development of approaches to using technologies in learning
C. discussion of the systems and policies needed to scale up analytics to the institutional level.
Learning Analytics is an emergent field and we need all of these to succeed in scaling up the work. We need the smaller-scale projects to get some runs on the board, but we also need to work at the institutional and policy levels if we are to get the necessary attention and funding to scale up.
A. Of the sessions I attended, the standout individual projects were:
1. Xavier Ochoa's 'Techniques for data-driven curriculum analysis' (see www.slideshare.com/xaoch)
One of his approaches is to estimate the degree of difficulty of each subject from the differences between students' GPAs and their subject grades: where the distribution of GPAs sits above the distribution of grades for a subject, the subject is deemed difficult.
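The idea can be sketched in a few lines. This is a minimal illustration of the GPA-vs-grade comparison, not Ochoa's actual method: the data, the per-subject mean-gap measure and the 0.5 threshold are all invented for the example.

```python
# Hypothetical sketch: compare students' overall GPAs with their grades in
# each subject; subjects where grades fall well below GPAs are flagged as
# difficult. Records, measure and threshold are invented for illustration.
from statistics import mean

# (student_gpa, subject_grade) pairs per subject, on the same 0-4 scale
records = {
    "Calculus I":    [(3.4, 2.1), (3.0, 2.4), (2.8, 2.0)],
    "Intro Biology": [(3.2, 3.3), (2.9, 3.0), (3.5, 3.4)],
}

def difficulty(pairs):
    """Mean gap between GPA and subject grade; a large positive gap
    suggests students do worse here than in their studies overall."""
    return mean(gpa - grade for gpa, grade in pairs)

for subject, pairs in records.items():
    gap = difficulty(pairs)
    label = "difficult" if gap > 0.5 else "typical"
    print(f"{subject}: gap={gap:.2f} ({label})")
```

With these made-up numbers, Calculus I shows a large positive gap and gets flagged, while Intro Biology does not.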
2. Alyssa Wise's e-listening project focuses on understanding what it means to listen online (as well as the more common approach of studying speaking or contributions). The website for her project is at http://www.sfu.ca/~afw3/research/e-listening/
3. Duygu Simsek won the best demo award for her project on analytics for academic writing. A poster illustrating her work is at
B. Graesser's opening keynote was a good example of the use of technologies in learning. The basic idea is to develop software agents that help students learn. Sometimes the agent is represented by some kind of avatar; other times it simply appears as recommendations on the screen. He is interested in the former.
An interesting example he showed that I hadn't seen before was Betty's Brain. This blog reports on the presentation better than I could:
In the Q&A afterwards, Graesser responded to a question on the worth of agents in learning when humans can clearly do so much better (since pretty much everything has to be pre-programmed for a software-based agent). He asked how often we actually see good educational practice rather than telling - for example, where are modelling, coaching and fading, or building on existing knowledge, in face-to-face teaching? He also showed a very amusing video demonstrating learning mediated via a talking fish.
And the standout projects at the university level were:
Jeff Grann is from Capella University, where they have 35,000 students, of whom 75% are female, with an average age of 40. They have done some impressive work on assessment with:
· an automated scoring guide tool, where each assessment criterion includes clickable descriptors for what constitutes non-performance, basic, proficient and distinguished performance. There is also scope for the tutor to add further comments for each criterion, and the score is automatically calculated at completion.
· a competency map (it looks like a dashboard) that each student has access to, showing where their performance currently sits for each competency.
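The scoring guide described above can be sketched as a tiny data model. This is an assumption-laden illustration, not Capella's tool: the four level names come from the talk, but the point values, criteria and comments are invented.

```python
# Hypothetical sketch of an automated scoring guide: each criterion gets one
# of four clickable performance levels plus an optional tutor comment, and
# the total score is calculated automatically. Point values are invented.
LEVELS = {"non-performance": 0, "basic": 1, "proficient": 2, "distinguished": 3}

def score_assessment(selections):
    """selections maps criterion -> (level, tutor comment).
    Returns (points earned, points possible)."""
    earned = sum(LEVELS[level] for level, _comment in selections.values())
    possible = len(selections) * max(LEVELS.values())
    return earned, possible

selections = {
    "Argument structure": ("proficient", "Clear thesis."),
    "Use of evidence":    ("distinguished", ""),
    "Referencing":        ("basic", "Check citation style."),
}
earned, possible = score_assessment(selections)
print(f"{earned}/{possible}")  # 6/9
```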
C.
1. I participated in a panel with Rebecca Ferguson and Doug Clow from the Open University in the UK, Leah McFadyen from the University of British Columbia in Canada, and Shane Dawson from the University of South Australia. We began by reviewing various frameworks for taking a systems approach to implementing learning analytics at the institutional level.
I then used UTS as a case study, outlining our systems approach to using analytics in all aspects of the university's work, i.e. in research, in teaching and learning, and in administration. I also talked about our project to ensure that our staff and students are numerate, and our forthcoming Master's degree in Data Science and Innovation. Finally, the bombshell! I announced that we have established a Connected Intelligence Centre to further our analytics work, and that we have appointed Professor Simon Buckingham-Shum from the Open University in the UK as its Director. Everyone was very envious that we have lured Simon.
2. Kim Arnold chaired a panel on institutional analytics - her group is working on a Learning Analytics Readiness Instrument (LARI). They posit the readiness factors to be: Ability; Data; Culture and Process; Governance and Infrastructure; and Overall Readiness Perception.
They also
mentioned the Educause ECAR Maturity Index (http://www.educause.edu/ecar/research-publications/ecar-analytics-maturity-index-higher-education)
3. Nancy Law's keynote 'Is Learning Analytics a Disruptive Innovation?' was her usual high-quality, measured presentation. She talked about learning analytics as an invasive species in an education ecology, and hence I have categorised her presentation in the systems section of this report.
She began by reminding the audience that it was in fact Kodak who invented the digital camera. Who owns a Kodak digital camera now? Where is Kodak now? A good analogy!
But will higher ed go the way of Kodak - inventing online learning, losing first-mover advantage and then self-imploding?
Clayton Christensen's book, The Innovator's Dilemma, is regularly used to highlight this, and Nancy also drew on his work.
So,
now to the topic of the day - will Learning Analytics be a disruptive
innovation? Or will it be a transformative one and actually sustain higher
education?
"Adding
wings to caterpillars does not create butterflies. It creates awkward
caterpillars. It requires transformation" (Stephanie Marshall)
It reminded me of a quote of Seymour Papert's that was something like "you can't go from a stagecoach to a jet by strapping engines on the horses". The reason this resonates with me is that so many people think that by using whatever the latest technology is, they are transforming learning. Alas, they are merely creating awkward caterpillars and stagecoaches.
She
went on to mention other ‘innovations’ in the use of technology in education
that have either failed to result in any impact or failed to reach scale.
In her recent work (a book whose name I didn't catch) she identified five principles for sustainable innovation:
·
Diversity
·
Connectivity
·
Interdependence
·
Self-organisation
– mechanisms
·
Emergence
The challenge is how to make learning analytics part of the ecology of learning.
Other things of interest
Everyone was impressed that the Open University has eight 'data wranglers', but I also discovered a new job title - Data Griot - someone whose role is to tell stories about data within an organisation. See for example http://www.warrenellis.com/?p=13246
Some interesting links:
Doug Clow did his usual extraordinary job of live blogging the event. His posts go into much more detail than I have here and are well worth the time to read through: http://dougclow.org/lak14/
Graphic representation of Twitter activity
for #LAK14
Reports
The Beyond Prototypes report, by Professor Eileen Scanlon, Professor Mike Sharples, Professor Mark Fenton-O'Creevy, Professor James Fleck, Dr Caroline Cooban, Dr Rebecca Ferguson, Dr Simon Cross and Peter Waterhouse - an examination of the processes of innovation in technology-enhanced learning (TEL).
Innovating Pedagogy 2013, by Mike Sharples, Patrick McAndrew, Martin Weller, Rebecca Ferguson, Elizabeth FitzGerald, Tony Hirst and Mark Gaved - an alternative to the Horizon report.
Websites
Peer assessment project at Stanford
Projects
LACE - http://www.laceproject.eu
About - http://www.laceproject.eu/lace/
Book
Assessing the Educational Data Movement, by Philip Piety
Journal article
Cynthia Coburn, 'Research on Data Use: A Framework and Analysis', Measurement: Interdisciplinary Research and Perspective (2011)