Sunday 27 April 2014

LAK14

Warning - This is more of a conference report than a blog.
Learning Analytics and Knowledge Conference - Indianapolis, March 2014
Both of my flights: Sydney -> Los Angeles -> Indianapolis were 3 hours late, the first because of weather and a technical fault on the A380, the second, a mid-flight medical emergency resulting in an unscheduled stopover in Kansas City. Having handed the patient over to paramedics, the airline realised they had to fly in oxygen tanks to replace those used, and then, unbelievably, when we finally started taxiing - the plane hit something on the runway and we had to sit and wait for engineers to inspect the damage. Flying is such fun!
The conference was held downtown, in the middle of March Madness fever.  As luck would have it, my coat was red, which happened to be the colour of the local team :)
In the sessions I was able to attend there were three kinds of papers:
A.     lots of what I would call really interesting but small-scale learning analytics projects
B.     development of approaches to using technologies in learning
C.     discussion on the systems and policies needed to scale up analytics to the institutional level.
Learning Analytics is an emergent field and we need all of these to succeed in scaling up the work. We need the smaller scale projects to get some runs on the board, but we also need to work at the institutional and policy levels if we are to get the necessary attention and funding to scale up.
A. Of those sessions I attended, the standout individual projects were:
1.    Xavier Ochoa’s ‘Techniques for data-driven curriculum analysis’
One of his approaches is to determine the degree of difficulty of each subject by comparing students' grades in that subject with their overall GPAs. Where students' subject grades sit systematically below their GPAs, the subject is deemed difficult.
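The idea can be sketched in a few lines of Python. This is a minimal illustration of the comparison, not Ochoa's actual implementation – the function names, the 4-point scale and the sample data are all my own assumptions.

```python
# Sketch of the difficulty idea: a subject is flagged as difficult when
# students' grades in it sit systematically below those same students'
# overall GPAs.

def subject_difficulty(records):
    """records: list of (gpa, subject_grade) pairs, one per student.
    Returns the mean gap between GPA and subject grade; a positive
    value suggests the subject is harder than the student's average."""
    gaps = [gpa - grade for gpa, grade in records]
    return sum(gaps) / len(gaps)

def is_difficult(records, threshold=0.0):
    """True when the mean GPA-minus-grade gap exceeds the threshold."""
    return subject_difficulty(records) > threshold

# Hypothetical students on a 4-point scale
stats101 = [(3.2, 2.1), (3.8, 2.9), (2.9, 2.5)]  # grades well below GPAs
art101 = [(3.0, 3.6), (2.5, 3.0)]                # grades above GPAs
```

With this data, `is_difficult(stats101)` comes out true and `is_difficult(art101)` false. Ochoa's paper works with the full grade distributions rather than a single mean gap, so treat this only as the shape of the calculation.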
2.     Alyssa Wise’s e-listening project has a focus on understanding what it means to listen online (as well as the more common approach of studying speaking or contributions). The website for her project is at http://www.sfu.ca/~afw3/research/e-listening/

3.     Duygu Simsek won the best demo award for her project on analytics for academic writing. A poster illustrating her work is at
pic.twitter.com/OtGzNC13lD

and her slides are at: http://t.co/pFZYiakMDL
B. Graesser’s opening keynote was a good example of the use of technologies in learning. The basic idea is to develop software agents that help students learn. Sometimes the agent is represented by some kind of avatar; at other times it simply appears as recommendations on the screen. He is interested in the former.
Some interesting examples he showed that I hadn’t seen before included Betty’s Brain
This blog reports on the presentation better than I could:
In the Q&A afterwards, Graesser responded to a question on the worth of agents in learning when humans can clearly do so much better (since everything pretty much has to be pre-programmed for a software-based agent). He asked how often we see good educational practice vs telling – for example, where is modeling, coaching, fading; building on existing knowledge in f2f?
He showed a very amusing video demonstrating learning mediated via a talking fish.
And the standout projects at the university level were:
Jeff Grann is from Capella where they have 35,000 students, of whom 75% are female and have an average age of 40. They have done some impressive work on assessment with:
·      an automated scoring guide tool where each assessment criterion includes clickable descriptors for what constitutes: non-performance, basic, proficient and distinguished performance. There is also scope for the tutor to add further comments for each criterion, and the score is automatically calculated at completion.
·      Each student has access to a competency map (looks like a dashboard) showing where their performance currently stands for each competency.
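The scoring guide Jeff described can be sketched roughly as follows. The four level names come from his talk; everything else (the point values, function names and sample criteria) is my own assumption for illustration, not Capella's actual tool.

```python
# Hypothetical sketch of an automated scoring guide: the tutor selects a
# performance level per criterion (plus an optional comment), and the
# score is calculated automatically at completion.

LEVELS = {
    "non-performance": 0,
    "basic": 1,
    "proficient": 2,
    "distinguished": 3,
}

def score_assessment(selections):
    """selections: dict of criterion -> (level name, tutor comment).
    Returns the total score and a per-criterion breakdown."""
    breakdown = {}
    total = 0
    for criterion, (level, comment) in selections.items():
        points = LEVELS[level]
        breakdown[criterion] = {"points": points, "comment": comment}
        total += points
    return total, breakdown

total, detail = score_assessment({
    "argument": ("proficient", "clear thesis"),
    "evidence": ("basic", "needs more sources"),
})
```

Here `total` comes out as 3, with the per-criterion points and comments kept in `detail` – the same shape of information a tutor would see on the completed guide.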
C.
1.     I participated in a panel with Rebecca Ferguson and Doug Clow from the Open University in the UK, Leah McFadyen from the University of British Columbia in Canada, and Shane Dawson from the University of South Australia. We began by reviewing various frameworks for taking a systems approach to implementing learning analytics at the institutional level.
I then used UTS as a case study, outlining our systems approach to using analytics in all aspects of the university’s work, i.e. in research, in teaching and learning, and in administration. I also talked about our project to ensure that our staff and students are numerate, and our forthcoming Masters degree in Data Science and Innovation. Finally, the bombshell! I announced that we have established a Connected Intelligence Centre to further our analytics work, and that we have appointed Professor Simon Buckingham-Shum from the Open University in the UK as its Director. Everyone was very envious that we have lured Simon.
2.     Kim Arnold chaired a panel on institutional analytics – her group is working on a Learning Analytics Readiness Instrument (LARI). They posit the readiness factors to be: Ability; Data; Culture and Process; Governance and Infrastructure; and Overall Readiness Perception.

3.     Nancy Law’s keynote ‘Is Learning Analytics a Disruptive Innovation?’ was her usual high quality, measured presentation. She talked about learning analytics as an invasive species in an education ecology and hence I have categorised her presentation in the systems section of this report.
She began by reminding the audience that it was in fact Kodak who invented the digital camera. Who owns a Kodak digital camera now? Where is Kodak now? Good analogy!
But will higher ed go the way of Kodak – inventing online learning, losing first mover advantage and then self imploding?
Clayton Christensen’s book, The Innovator’s Dilemma, is regularly used to highlight this, and Nancy also drew on his work.
So, now to the topic of the day - will Learning Analytics be a disruptive innovation? Or will it be a transformative one and actually sustain higher education?
"Adding wings to caterpillars does not create butterflies. It creates awkward caterpillars. It requires transformation" (Stephanie Marshall)
It reminded me of a quote of Seymour Papert’s that was something like “you can’t go from a stagecoach to a jet by strapping engines on the horses”. The reason this resonates with me is that so many people think that by using whatever the latest technology is, they are transforming learning. Alas, they are merely creating awkward caterpillars and stagecoaches.
She went on to mention other ‘innovations’ in the use of technology in education that have either failed to result in any impact or failed to reach scale.
In her recent work (a book that I didn’t catch the name of) she identified 5 principles for sustainable innovation:
·       Diversity
·       Connectivity
·       Interdependence
·       Self-organisation – mechanisms
·       Emergence
The challenge is how to make learning analytics part of the ecology of learning.
Other things of interest
Everyone was impressed that the Open University has eight ‘data wranglers’ but I also discovered a new job title - Data Griot – someone whose role it is to tell stories about data within an organisation. See for example http://www.warrenellis.com/?p=13246
Some interesting links:
Doug Clow did his usual extraordinary job of live blogging the event. His blogs are in much more detail than I have put here and well worth the time to read through: http://dougclow.org/lak14/
Graphic representation of Twitter activity for #LAK14
Reports
The Beyond Prototypes report by Professor Eileen Scanlon, Professor Mike Sharples, Professor Mark Fenton-O’Creevy, Professor James Fleck, Dr Caroline Cooban, Dr Rebecca Ferguson, Dr Simon Cross and Peter Waterhouse.
An examination of the processes of innovation in technology-enhanced learning (TEL).
Innovating Pedagogy 2013
Mike Sharples, Patrick McAndrew, Martin Weller, Rebecca Ferguson, Elizabeth FitzGerald, Tony Hirst, Mark Gaved
An alternative to the Horizon report
Websites
peer assessment project at Stanford
Projects
Book
Assessing the educational data movement
Philip Piety
Journal article
Cynthia Coburn, ‘Research on Data Use: A Framework and Analysis’, Measurement: Interdisciplinary Research and Perspectives (2011)

Thursday 6 December 2012

Conference evaluation


Evaluation of SoLAR Southern Flare

Participants were invited to complete a questionnaire and the results have been collated in such a way that they might be useful to others who may wish to run a similar event.

The first question, together with an analysis of responses, was:

1. What were your main reasons for attending this event?

To learn about/ or satisfy a general interest in LA           27 responses
Network with others in the area                                      13
Find out where other institutions are up to                      11

The results for the following question were collated by reason for attendance:

2. Did you realise these goals?

To learn about/ or satisfy a general interest in LA   
            Yes                    26
            Mostly               1

Network with others                                                
            Yes                    9
            Pretty much       3
            No                     1

Find out where other institutions are up to                
            Yes                    10
            No                      1

Those who attended out of general interest and/or to find out where other institutions are up to were the most positive about having achieved their goals. Those who attended to network with others were mostly positive.

Again, responses to the next 2 questions were analysed in the context of the goals of attendance.

For those whose goals were:
 To learn about/ or satisfy a general interest in LA    

What helped you achieve these goals?
  •      Presentations 14 responses
  •       The workshop I attended 7
  •       Conversations 5
  •       The mix of formal and informal sessions/ structure of event 4
  •       Variety of view points + pros and cons 4
  •       The keynotes 3
  •       Good intimate conference 2
  •       The twitter stream
  •       Skillful moderators
  •       Summary from ACODE
  •       Panel session


Is there anything we could have done differently to help you achieve these goals?
  •      no  16
  •       more open discussion sessions 3
  •       some pre-reading for newbies 2
  •       for next event organise work groupings based on shared problems or analytics maturity
  •       more break time
  •       workshop could have been more focused
  •       don't put the keynote on at 4pm to open the event
  •       establish ‘home groups’ for timetabled smaller group discussions of presentations
  •       distribute presentations before the session
  •       don't know


Network with others                                                                      
What helped you achieve these goals?
  •      Organisation of the evening cocktail event 4
  •       discussions at workshops and papers 4
  •       unstructured time/ time to network 4
  •       Getting folk in the room

Is there anything we could have done differently to help you achieve these goals?
  •       less talking at us (too many papers) 2
  •       establish a “Moodle” like site beforehand so we can introduce ourselves and start connecting prior to the event  2
  •       longer lead times for papers
  •       more promotion
  •       more play & conversation
  •       list of attendees and contact details (available)
  •       presenters slides (done where supplied)


Find out where other institutions are up to               
What helped you achieve these goals?
  •      Breakfast/ lunch/ dinner conversations with others 5
  •       presentations 5
  •       The workshop I attended 2
  •       Chance for exposure to other Australians  2
  •       international perspective

Is there anything we could have done differently to help you achieve these goals?
  •     No 2
  •       scrape the Twitter stream on blogs
  •       aggregate URLs – beginnings of a repository of relevant data to Learning Analytics
  •       more student-focused presentations
  •       need a cross sector (not just ACODE) interest group to drive this agenda – should be multidisciplinary


My analysis – Depending on the motivation for attending, there were different ways in which the structure of the event helped and ways in which it could be improved.

For those whose aims were to learn more about LA or who had a general interest in the area, it was clear that the presentations, workshops and keynotes were instrumental in helping them achieve their aims. The majority of this group could not think of any way in which their experience could have been enhanced, although the suggestions about providing pre-readings and more discussions have been noted for next time.

For those whose aims were to network with others in the area, the built-in opportunities for discussion were highly valued. The decision to change the conference “dinner” to a stand-up cocktail event appears to have been a good one, as comments were made about the higher level of networking that was made possible by that arrangement. As one might expect, this group asked for more opportunities to make personal contact with others both online (before and after the event), and face-to-face during the conference.

For those whose aims were to find out what other institutions are doing, those needs were mainly met through conversations over meals and attendance at the presentations.

The suggestions about blogs, making URLs available, and setting up an interest group are already underway.
                                                                                                                
 5. What are you planning to do in terms of learning analytics when you return to your institution?
The majority of participants plan to discuss, share, report back, employ a common-sense approach and lessons learned, scope, plan, and get the right people in the room. This indicates that there will be a high level of transfer of learning from the event back to the institutions.

6. Are you interested in being part of a Learning Analytics network?
A significant number of people have already added their names to the Google Document and they will be contacted shortly.

7. Any other comments you would like to make?
·      Brilliant / Thank you/ well done                9
·      When is the next one?/ do it again             4
·      Good to amazing food                               3
·      Conference was expensive                        2
(note we broke even on the event so not sure how we could do it more cheaply)
·      Good organisation                                     2
·      I am concerned about ethics                      1
·      Next time make it more interactive


The great majority of comments were very positive, although there were some aspects that will be improved next time.

T H A N K S



Thursday 29 November 2012

Postscript

Once again my thanks to the Organising Committee:

Lori Lockyer from Macquarie University
Margaret Hicks and Shane Dawson from University of South Australia
Phil Long from The University of Queensland
Ron Oliver from Edith Cowan University (unfortunately called overseas and couldn't participate live)
Peter Tregloan from the University of Melbourne
Gabrielle Gardiner from UTS

Now the conference has closed, I thought it might be worth saying a little more about where we might go from here.

It was very pleasing to see the number of people who were interested in collaborating on further developments, and on the OLT grants to be announced in particular. Participants will receive an email with a link to a Google Doc - please complete that and indicate your area of interest in collaborating. We will put people together and go from there.

Finally, here is a list of other organisations around the world that I know of who are working in this area

Society for Learning Analytics Research

Solar Flare UK and Doug Clow's blog from the event

Solar Flare Purdue

The next international SoLAR event is in Leuven, Belgium 8-12 April 2013.

Afternoon of Day 2 of SoLAR


Second paper session



First steps in learning analytics with LearnTrak
Susan Tull

Unfortunately I couldn’t watch this talk – it was a Prezi, which always makes me dizzy – so instead I will refer to an external summary of the project from
http://www.educause.edu/ero/article/trak-first-steps-learning-analytics

which includes the following summary of major points:
  • A current focus at the University of Canterbury is to develop a culture of institution-wide accountability for the attraction, admission, support, retention, and success of students.
  • Lecturers are encouraged to become part of the early intervention process and develop skills using this first step of the system so that they can effectively engage with learning analytics.
  • LearnTrak software lets lecturers view data on student accesses to resources and activities in a user-friendly graphical form.
  • An organization using Moodle can adopt this tool and Canterbury's method of implementation as a first step toward implementing learning analytics.
There were lots of very positive comments on this paper so it was disappointing to miss it.

 Moving beyond a fashion: likely paths and pitfalls for learning analytics
David Jones, Colin Beer


This was a great presentation on paths and pitfalls – David admitted he had intended to be more negative, but changed his mind as a result of Susan's presentation. He also admitted to being a glass-half-empty person, so said we should bear that in mind when he says we are in the gold rush days of learning analytics.

After mentioning his previous Indicators Project, he posed his big (and refreshing) question: how can we stop Learning Analytics from becoming a fad? He said there is some value in Learning Analytics, but how can we prevent the fad behaviour which consists of big unrealistic expectations, poor implementation, then abandonment and moving on to the next fad?

He showed Birnbaum's (2000) model as an alternative to Gartner’s Hype Cycle.

One of the terrific images in his presentation was of a swamp - but he had some suggested paths through the swamp which he referred to as changing the game of education.

Do it to – (academics) dominant model – managing from the dark side. Data warehouses likely to fail or not used very much
Do it for – (academics and students) – change thinking and planning
Do it with – evolutionary development – failures of rationality - Have to embed learning analytics in the design of courses

Link to his paper on this
He predicts no-one will be talking about Learning Analytics in 5 years’ time.
David has posted his slides and references.

Exploring data from existing systems that are useful in assisting student learning: An Indonesian Perspective
Yasmin Erika Faridhan

PhD student – also using Prezi
Great to see a doctoral student here, and sorry I can't provide more detail of her Prezi talk. Instead here are excerpts from her abstract:

In Indonesia, extracting data for learning analytics purposes is not an easy job.  Although internet and information technology usage in Indonesian higher education has been growing fast in this twenty-first century, their integration in the form of student learning systems has only been surfacing recently. Compared to other countries in the Southeast Asia region, Indonesia is lagging behind Singapore and Malaysia in adopting educational technology to support student learning.

In this presentation I am going to introduce my PhD study on learning analytics in the Indonesian higher education environment.  I would be very interested in starting conversations with other researchers who have experiences with learning analytics, especially in an international context.  My work has a statistical focus and I would also be interested in finding out who else is working in this area.  This research is presented in order to obtain feedback that may provide insight into the next step of this project.

The final session was essentially about where to from here?


Rob Phillips gave an interesting report back from the Australasian Council for Open and Distance Education (ACODE) in terms of the interesting work being done by members, and the projects they have decided to undertake, including completing an annotated bibliography and briefing papers on LA.

There was a panel session involving 4 of us, either talking about what we are doing in our institutions or what we should be doing. Rather than what often happens at a gathering of early adopters, there was much discussion about ensuring we don't repeat the mistakes of the past - learning design and learning objects were mentioned as two areas about which there was great enthusiasm but which haven't lived up to their earlier promise.

One aspect of this was ethics, and there was discussion about whether students should be able to opt out of having their data included. A brilliant suggestion was made about the need to have a teaching and learning ethics committee which would mirror the research ethics committee.

The final talk was from Siobhan Lenihan, who is Director of Grants and Fellowships for the Office for Learning and Teaching (OLT). She ended the day on a good note by announcing that there is an indicative budget of $1.5M for 4 projects to be commissioned by the OLT in 2013, one of which will be in the general area of Learning Analytics. There seemed to be a general feeling in the audience that rather than these being competitive grants, we should try to find ways for everyone to collaborate.