Notes for the ALT Summer Summit Panel 27 August 2020
Learning Technology Beyond the Crisis: Policies for a sustainable future
On August 27th, I participated in a panel (1) discussion at the Association for Learning Technology (ALT) Summer Summit together with Mary Burgess (BCCampus), Laura Czerniewicz (CILT, UCT) and Anne-Marie Scott (Athabasca U). A big thanks to Keith Smythe (UHI) for chairing the session so well. What follows is a slight expansion of my preparation notes. I’ve added links to the programme where referenced, together with a few other references. Recordings of sessions are available from the Programme.
The first question I asked myself when I began to think about the pandemic, sustainability, and policy was “What is conjunctural and a facet of this specific crisis, and what is likely to be structural, or at least with us potentially for years?”. I don’t have any specific medical expertise or indeed a crystal ball, but looking at the emerging evidence - including some pretty scary visuals on infection rates and US College returns in the New York Times this morning - it seems prudent to expect this situation to be with us for some time.
It will perhaps come as no surprise, given my background, that the policy areas I want to focus on in these few minutes are around technology: specifically, the acquisition of software to support the academic mission, and some concerns about privacy. So I’m offering thoughts on a couple of pieces of the jigsaw.
We’ve seen a good many commercial-proprietary software vendors in the higher education space offer sweetheart licensing deals in the initial stages of the pandemic, as institutions responded rapidly to the move online. How long these licensing deals will be maintained remains to be seen. What happens when licensing costs rise, as seems likely to me, also remains to be seen, but those costs are going to be passed on to someone or something.
It strikes me that this should prompt a review of the policies and processes by which institutions acquire and maintain software and software services. And you’ll notice I didn’t use the word “procure” or “procurement” in that sentence - because often procurement processes themselves rule out open alternatives to commercial-proprietary software. So there’s an institutional policy level, and a jurisdictional policy level to consider there.
How do we deploy and support open software? I found Melissa Highton’s comments in yesterday’s opening panel (2) very interesting, especially when she described the sense of emerging solidarity between institutions in Edinburgh. It will be interesting to see how far that solidarity develops and matures. There are pointers from outside the UK here: in France (3) a consortium of universities is developing shared or mutualised services around open source code as platforms for innovation. That’s a model - supported in France by the Ministry of Higher Education and Research - that bears closer examination for the future.
In France, of course, there is a great focus on data protection, and that’s been a factor in prompting ministry support for a sector-wide approach. Which brings me to the last point I’d like to raise. There’s a danger in any situation such as this that, in focussing on supporting teachers and learners as they move wholly online very rapidly, basic safeguards will be set aside or sidelined. There are privacy concerns around data collected by major platform providers and what that data might be used for. That connects very directly with the topic of yesterday afternoon’s keynote session with Angela Saini (4) that Charlotte Webb (5) amplified this morning. How do we better expose the intrinsic bias of predictive algorithms?
This year’s A Level results chaos in England has opened a few eyes in the UK, and has already had some impact on local government use of algorithms to determine welfare and benefit provision (6). It strikes me we should seize this opportunity at an institutional and national policy level. We should demand, as a minimum, that algorithms and the code that surrounds them are open to meaningful inspection and interrogation. Algorithmic accountability should be a principle - but at present calls for it are also a useful tactic. We need to explore the detail of what might enable “meaningful interrogation” as a matter of urgency. Perhaps there’s a very specific role for our colleagues in Computer Science here. As an aside, exploring other approaches to making the impact of predictive algorithms more widely accessible, such as the use of counterfactuals, strikes me as very valuable. The Oxford Internet Institute has some interesting work on counterfactuals if you’re interested in reading further (7).
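To make the counterfactual idea concrete, here is a minimal sketch in Python. It assumes a toy linear scoring model with invented feature names and weights - nothing to do with any real awarding algorithm - and shows how a counterfactual explanation answers “what is the smallest change to my input that would have flipped the decision?” by querying the model rather than opening it up.

```python
# Toy illustration of a counterfactual explanation against a
# simple linear "pass/fail" model. All weights, features and the
# threshold are invented for illustration only.

def predict(features, weights, threshold=0.5):
    """Return (decision, score) for a weighted-sum model."""
    score = sum(w * x for w, x in zip(weights, features))
    return score >= threshold, score

def counterfactual_delta(features, weights, index, threshold=0.5):
    """Smallest change to feature `index` that moves the score
    exactly to the decision threshold (solves score + w*d = threshold)."""
    _, score = predict(features, weights, threshold)
    w = weights[index]
    if w == 0:
        return None  # this feature cannot change the outcome
    return (threshold - score) / w

# Example: two features, e.g. a coursework mark and some adjustment factor.
weights = [0.6, 0.4]
student = [0.5, 0.3]                     # score = 0.42, below the threshold
decision, score = predict(student, weights)
delta = counterfactual_delta(student, weights, index=0)
# The explanation to the student is then: "had feature 0 been `delta`
# higher, the decision would have been a pass" - interrogating the
# model's behaviour without inspecting its internals.
```

The point of the sketch is the shape of the explanation, not the model: for a black-box system the same question can be asked by search (perturbing inputs and re-querying) rather than by solving an equation.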
This community, and communities like it around the world, have begun to make open education a reality. I think the conversations at this event have demonstrated strength and inclusivity. Thanks for the invitation to participate.
(1) ALT Summer Summit 2020 Closing Panel: Learning Technology beyond the crisis: Policies for a sustainable future https://altc.alt.ac.uk/summit2020/sessions/learning-technology-beyond-th...
(2) ALT Summer Summit 2020 Opening Panel: https://altc.alt.ac.uk/summit2020/sessions/learning-technology-in-times-...
(3) ESUP-Portail https://www.esup-portail.org/
(4) ALT Summer Summit 2020 Q&A With Angela Saini https://altc.alt.ac.uk/summit2020/sessions/qa-with-angela-saini/
(5) ALT Summer Summit 2020 Keynote: Charlotte Webb https://altc.alt.ac.uk/summit2020/sessions/keynote-charlotte-webb/
(6) Guardian 24th August 2020 https://www.theguardian.com/society/2020/aug/24/councils-scrapping-algor...
(7) Oxford Internet Institute Blog 15th January 2018 “Could Counterfactuals Explain Algorithmic Decisions Without Opening the Black Box?” https://www.oii.ox.ac.uk/blog/could-counterfactuals-explain-algorithmic-...