Responding to the “Arrival”: Essential Background Information and Strategies for Language Instructors in the Age of Human-Like Language Technologies (Machine Translation and Large Language Models)
Machine translation (MT) systems were once clunky and error-prone but have improved greatly in the last decade. Within the last two years, large language models (LLMs) like ChatGPT have achieved state-of-the-art performance at translation and multilingual text generation. The proliferation of LLMs and MT tools has created an existential crisis for K-16 educators, as these tools can produce plausible-sounding texts in a variety of languages with minimal prompting. This talk aims to equip language instructors with relevant background knowledge on the history and technical details of modern language technologies, including an intuitive description of transformers and neural networks as well as a summary of the latest research on the limitations of LLMs. Lastly, we will discuss and share best practices for creating language tech-proof and language tech-assisted curricula.
Dr. Joel Walsh
Software Engineer, Privateer Space
Joel Walsh is currently a Software Engineer at Privateer Space, where he develops machine learning software for a number of vision and language systems. Prior to this, he was an AI research intern at Finetune Learning, where he worked on integrating knowledge graphs into large language models. He received his Ph.D. in STEM Education and M.S. in Computational Science, Engineering, and Mathematics at UT-Austin. Both his dissertation and master's thesis explored applications of knowledge graphs and natural language processing in educational and space situational awareness contexts. Prior to returning to graduate school, he spent five years as a public high school teacher in South Los Angeles.