TEASIG Webinar Series
Most recent webinar: Rudi Camerer from elc-European Language Competence on 19th February – ‘A CEFR Special – The new Companion Volume highlighted’
Details of previous webinars can be found on the Previous TEASIG webinars page.
TEASIG at Pearson London, November 1 & 2, 2018
“Technology for teachers in assessment – the immediate future”
David Booth – Integrated skills testing and automated scoring – where are we going?
Larry Davis & Veronica Timpe Laughlin – Assessing spoken interaction: How can technology help?
Peter Cormack – Teaching to the test – construct-relevant and construct-irrelevant approaches
Aylin Ünaldi – Automatic Text Analysis in Reading Research and its use in Reading Assessment
Technology, despite its extremely broad scope, is omnipresent in today’s teaching world, and its benefits are evident across many different teaching domains. The internet, tablets, laptops, mobile phones, learning apps, social media… the list of where technology has influenced and infiltrated teachers’ and learners’ worlds is endless. Learners can benefit from the use of technology like never before. Even the now-established expression “e-learning” distinguishes today’s teaching and learning from the older style of course book and whiteboard.
So much has changed in such a short space of time, but many teachers feel they are running just to stand still. The endless stream of new software and apps presents daily challenges that are difficult to meet. The very speed of technological change has led to widespread resistance.
Testing and assessment have not been spared the technological onslaught either. Testing at both larger and smaller scales has clearly benefitted from advances such as online testing, easier preparation of materials, more objective marking, and the ability to source and prepare authentic material, while audio and video software has made test material development far easier. But, as in learning, technology has attracted criticism and resistance in the world of testing and assessment. Among the major concerns are keyboard inputs in listening tests, where older students can struggle, adding construct-irrelevant variance. Using online testing to assess spoken interaction has been criticized for not replicating real-world human communication. Certain test service providers advertise online testing as a huge advantage without any clear evidence to support this. Additionally, test security can easily be compromised by the ability to record tests on hidden portable electronic devices.
Presentations from the event to download: