UCISA Learning Analytics Event

On the 12th of April, I attended the UCISA event on learning analytics, a single-day event held in Birmingham. Due to travel issues, we unfortunately missed the first speaker, Sarah Porter, Co-Chair of the Higher Education Commission's National Inquiry into data in HE and author of its report, “From Bricks to Clicks”.

OU Analyse (ppt)

Zdenek Zdrahal, Knowledge Media Institute, The Open University

The second speaker was Zdenek Zdrahal from the Knowledge Media Institute at the Open University. He described the use of predictive analytics to identify at-risk students even before the first assignment had taken place, since 98.6% of students who failed the first assignment did not complete the programme. They used a combination of static data, including gender and educational history, and fluid data, notably VLE interactions. Prediction accuracy improved greatly even when using only a simple count of clicks in the VLE.

In practice, each item in the VLE was given a label, and paths through the materials were mapped to contrast the routes taken by successful students with those of typical failing students. The data analysis used four different modelling techniques; if a student was flagged by two of them, they were classified as at risk and the School put an intervention in place. Spreadsheets were sent to the School highlighting each student's risk level, and a dashboard has been created that highlights at-risk students and the data used.
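The two-of-four voting scheme described above can be sketched as follows. This is a minimal illustration, not the OU's actual pipeline: the four model choices and the synthetic data are my own assumptions, standing in for whichever techniques and demographic/VLE features the team actually used.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Illustrative stand-ins for the four modelling techniques;
# the talk did not name the OU's actual models.
models = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=100, random_state=0),
    GaussianNB(),
    KNeighborsClassifier(n_neighbors=5),
]

# Synthetic features standing in for static (demographic) and
# fluid (VLE click) data; label 1 = did not complete.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)

for m in models:
    m.fit(X, y)

def at_risk(student_features):
    """Classify as at risk if two or more models predict failure."""
    votes = sum(int(m.predict([student_features])[0]) for m in models)
    return votes >= 2
```

The appeal of the voting approach is robustness: no single model's quirks decide the intervention, and the threshold can be tuned to trade false alarms against missed students.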

A student dashboard has also been created that shows each student their “nearest neighbours”: students showing similar engagement behaviours, together with the predicted outcomes for those students. It also predicts whether the student will submit the next assignment, and automatically suggests activities to improve the student's predicted outcome. Students cannot currently access this dashboard, but releasing it is now a priority. I think the extra step of suggesting activities to students has real benefit for the learning experience; the key will be making sure those suggestions are accurate and relevant.
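A “nearest neighbours” view of this kind can be approximated with a standard k-nearest-neighbours query over engagement data. The engagement matrix, outcome labels, and choice of k below are illustrative assumptions, not the OU's implementation:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical engagement matrix: one row per past student,
# columns are weekly VLE click counts; outcomes are final results.
past_clicks = np.array([
    [40, 35, 50, 42],   # passed
    [38, 30, 45, 40],   # passed
    [10,  5,  0,  0],   # withdrew
    [12,  8,  3,  1],   # failed
    [45, 50, 48, 55],   # distinction
])
outcomes = ["pass", "pass", "withdraw", "fail", "distinction"]

nn = NearestNeighbors(n_neighbors=3).fit(past_clicks)

def nearest_neighbour_outcomes(current_clicks):
    """Return the outcomes of the 3 most similar past students."""
    _, idx = nn.kneighbors([current_clicks])
    return [outcomes[i] for i in idx[0]]

print(nearest_neighbour_outcomes([11, 6, 2, 0]))
# → ['withdraw', 'fail', 'pass']
```

Showing a student the real outcomes of their closest behavioural matches is what makes the dashboard persuasive: the prediction is grounded in people who clicked the way they are clicking now.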

They have released an anonymised open dataset: https://analyse.kmi.open.ac.uk/open_dataset

Newcastle University does not have a retention issue overall but there may be areas where the use of data in this way may be beneficial – for example, distance learning programmes.

JISC Learning Analytics (ppt)

Michael Webb, the Director of Technology and Analytics at JISC, talked through the history of learning analytics and some of the work JISC is carrying out in the field.

Michael described their work with learning analytics as “the application of big data techniques such as machine-based learning and data mining to help learners and institutions meet their goals.”

Predictive learning analytics as seen in the Open University presentation was defined as the “statistical analysis of historical and current data derived from the learning process to create models that allow for predictions that can be used to improve learning outcomes. Models are developed by ‘mining’ large amounts of data to find hidden patterns that correlate to specific outcomes.” JISC are confident that predictive learning models would be ‘fairly’ transferable between institutions.

JISC currently have a learning analytics project that has three core strands:

  • Learning analytics architecture and service
  • Toolkit
  • Community

Learning analytics architecture and service

JISC are looking to create a national architecture that would help data modelling become transferable. They are working with several institutions to develop core services that institutions would be able to implement at a much lower cost than if developing themselves.
JISC demonstrated their student app. It's based on fitness apps: users can view an activity stream with their (and their peers') activity, including performance against student-defined targets. JISC are also developing dashboards for students and for administrators.

They have released the JISC Code of Practice, which outlines some of the ethical considerations institutions should make before embarking on a learning analytics project. This was developed in consultation with the NUS, who have released their own guidance for University Students' Unions. Michael finished off by discussing possible future developments in learning analytics, including links to the Teaching Excellence Framework and personalised, next-generation e-learning.

Beyond the dashboard: the practical application of analytics (ppt)

Ben Stein, Director, Student Success, Hobsons
Ben from Hobsons spoke about the inevitable rise of student dashboards. While dashboards are an integral part of learning analytics, providing a way to display predictive statistics and recommended activities, it is crucial that support structures are in place to deliver positive interventions. Ben asked, “does a deeper understanding of the problem actually lead to a solution?” and stated, “ultimately it's what you do with the information that makes the difference.” He then demonstrated a Hobsons product that provides student and staff dashboards.

Personal tutor dashboards (ppt)

David Mutti, Head of Programme Management, University of Greenwich
David Mutti from the University of Greenwich showed the development of a personal tutoring system that pulls together various pieces of information about a tutee, including a very basic application of learning analytics. Although feedback from academics was very positive, actual use was minimal, and there was no compulsion to use the system. I thought the system was very well thought out, with some good features, but it was developed and promoted by the IT service. Would greater academic involvement, perhaps an academic lead, result in higher take-up?

Piloting learner analytics at the University of London International Programmes (ppt)

Dave Kenworthy, Head of Software Services, University of London Computer Centre and Tom Inkelaar, Head of Management Information, University of London
Tom described how the University of London International Programmes are using learning analytics to try to improve retention. The University has 50,000 distance learning students across 100 countries.
The University implemented Bloom Thrive analytics in partnership with ULCC and Altis, using student record data along with VLE usage to determine at-risk students. As part of this, they created a happiness block in their Moodle environments where students could indicate how happy they were and provide free-text comments.
The University found it relatively easy to determine which students were at risk, but intervention is more difficult when students are spread across such a wide geographical area. Another challenge was data protection, and whether the appropriate consent had been received for all students.

The conference was a very useful event to attend, and it demonstrated that although we are not currently implementing any centralised learning analytics, we are in a good position to do so as required. The data sets we hold could provide a rich learning experience for students.

Open Badges Presentation

I’ve been invited to speak at the “e-assessment question” conference next week at the America Square Conference Centre in London. I’ll be discussing the process that Newcastle University has recently gone through when exploring the potential of Open Badges, and will talk about the research I am undertaking into the implementation of Open Badges across Europe. I think it will be very different from the “standard educational conferences” I attend – Newcastle University is the only educational institution attending.

This is the Prezi I will use when presenting.

Implementation of Open Badges in Education

On the 23rd of June 2015 a survey was sent out to educational institutions via UK mailing lists, and globally via Twitter inviting institutions to answer three questions about their implementation of Open Badges. The three questions were:

  • What stage of the implementation of Open Badges are you at?
  • What areas are you considering using Open Badges for?
  • How are you planning to issue the Open Badges?

I am delighted with the 57 responses we received to the survey, and with the ongoing discussion on the JISC ALT list.

The findings of this exercise can be found in the attached PDF.

Implementation of Open Badges in Education

Centre For Recording Achievement – 4th International Seminar

I’ve recently returned from the CRA conference in Plymouth. The seminar was titled “Researching and Evaluating Personal Development Planning and e-Portfolio” and had attendees from as far afield as Tokyo and the United States.

Introductory Plenary and discussion

The conference started with an introduction from Rob Ward, the Director of the CRA, and from Pauline Kneale, the Pro-Vice Chancellor for Teaching and Learning at Plymouth University. The title of the first main session was “The global ePortfolio Research Forum – Joining the dots and identifying the gaps.” There were three stimulus contributions from:

1) Bret Eynon, LaGuardia Community College and Laura Gambino, Guttman Community College (CUNY)

This was an outline of work that had taken place across 24 campuses in the United States. The site offers data, practices and strategies, showing how ePortfolio can advance learning, deepen pedagogy and assessment, and support institutional change.

2) Beverley Oliver, Deakin University, Australia.

3) Igor Balaban, Project Co-ordinator, Europortfolio, University of Zagreb, Croatia.

Igor spoke about the work of EuroPortfolio, who are trying to co-ordinate research and good practice from around Europe to help provide strategic guidance for ePortfolio development.

We were encouraged to discuss with colleagues the dots and gaps in our own organisations, and what we might want to take away from the conference to work on. This was a good opportunity to reflect on the success we have had so far and the distance we still have to go, and to compare that with other institutions at the same time.

Researching our practice

The seminar then split into different strands, and I attended the strand themed “Researching our practice”.

The first session looked at some research around four main statements about reflection and ePortfolio.

The statements and results are available here: http://www.recordingachievement.org/images/pdfs/SEMINARS/kathi.pdf

The second session had John Peters describing the use of students as partners in research. This chimed with the keynote from the Durham conference by Abbi Flint from the HEA about students as partners, and many of the same references were made (Mike Neary, Mick Healey). The really interesting part of John's presentation was the section on student appreciative inquiries as a research and change-management approach. Appreciative inquiry methods look at what is currently done well and how it can be improved even further. He also described the 4D approach (discovery, dream, design and destiny) and how this guides the research. I think this is an excellent model and would like to conduct research at Newcastle University based on an appreciative inquiry methodology.

Research and project report papers

The next session was another split session, and I attended an interesting talk by Andy Howard from the University of Sussex on “Measuring and tracking the impact of curriculum based and co-curricular reflective Careers & Employability programmes.” While very engaging, I didn't feel I learned anything I can use at Newcastle University, though it was nice to see that their award scheme uses ePortfolio in a similar way to ours.

Dinner and Launch of Rapport

A fantastic dinner was held at the Barbican in Plymouth. As I had attended the conference on my own, this was an excellent opportunity to relax and make new contacts in the field of ePortfolio. The event also included the launch of Rapport, the new International Journal for Recording Achievement, Planning and Portfolios.

Day Two

An early start on day two introduced Kate Coleman, Project Manager of the OLT project “Curate, credential and carry forward digital learning evidence” at Deakin University, Australia. Her presentation was titled “Digital Portfolios and Open Digital Badges – friends or foes?” and outlined how badges are used at Deakin. You can view the video Kate made for the conference below:

As a long-time convert to digital badges, I was very interested in their implementation at Deakin University. I'm interested to see how long it will be before badges become an integral part of the educational process in Russell Group universities.


Alison James, Associate Dean, Learning and Teaching, London College of Fashion presented the morning keynote, an excellent talk about her work encouraging reflective practice in creative subjects. She discussed the use of Lego soft play, the different reactions from academics when introducing Lego, and the fantastic reflection it can bring out when a skilled facilitator is involved.

Ten Years and 12,000 ePortfolios Later

Agnes (Tracy) Hooper Gottlieb, Seton Hall University, NJ, USA: Ten Years and 12,000+ ePortfolios Later: A Transformative Project at Seton Hall University.

In 2005 Seton Hall University introduced a new one-semester, one-credit module that all new students had to undertake – approximately 1,200 students each year. The course was called “University Life” and was a reflective course exploring students' reasons for coming to university and the changes in their personalities.

This course is still run and is an integral part of the University's culture. Analytics show that 70% of students who get an A in “University Life” will graduate, while only 30% of students who get a C will.
The students have to blog at various points throughout the course, sharing each post with their tutor and, if they wish, with their peers. The first exercise in the first week is to reflect on why they have chosen to come to Seton Hall. Analysis of this exercise looked for things like connection to family: if family is not mentioned, the student is flagged as possibly lacking as strong a support mechanism as other students. This type of analysis has helped improve Seton Hall's retention figures, especially during the first semester.
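The family-mention check described above amounts to a simple keyword scan over each reflective post. The keyword list and student data below are illustrative assumptions; Seton Hall's actual analysis was not detailed in the talk:

```python
# Hypothetical keyword list; Seton Hall's real term set was not shared.
FAMILY_TERMS = {"family", "mother", "father", "mom", "dad", "parents",
                "brother", "sister", "grandmother", "grandfather"}

def mentions_family(post: str) -> bool:
    """True if the reflective post mentions any family-related term."""
    words = {w.strip(".,!?;:'\"").lower() for w in post.split()}
    return not words.isdisjoint(FAMILY_TERMS)

def flag_for_support(posts: dict) -> list:
    """Return student IDs whose first post never mentions family,
    i.e. students who may lack a strong support mechanism."""
    return [sid for sid, post in posts.items() if not mentions_family(post)]

posts = {
    "s001": "My parents always wanted me to study here.",
    "s002": "I chose Seton Hall for its business programme.",
}
print(flag_for_support(posts))  # → ['s002']
```

Even a crude scan like this is only a trigger for a human conversation, which is presumably why it works: the flag routes a tutor to the student rather than making any automated judgement.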

Accessing the student experience

Sarah Jeffries Watts, University of Birmingham, UK: Reflections of the unexpected: exploring student narratives on partnership working.

Helen Bowstead, Ricky Lowes, Emma Purnell, University Of Plymouth, UK: Getting students to talk back. Eportfolio based learning and the potential for dialogic feedback.

The University of Plymouth used PebblePad to deliver feedback to students on one of their modules. Students had the opportunity to reply to their feedback, and dialogue between the academic tutor and the student was encouraged. I found it interesting that students' expectations had to change to encourage them to engage in this discussion: they were used to receiving feedback via a “transmission” model of delivery, whereas this “dialogic” model was new to them.

The full programme and resources can be found at the following address: