I’m delighted to be here today to attend this conference, and more than happy to be giving a brief introduction: making interdisciplinary connections around language and AI is exactly the right thing for us to be doing at Berkeley in 2024. Thank you, Kayla van Kooten, Kimberly Vinall, and Emily Hellmich, for organizing the conference. And a huge shoutout of appreciation to Sarah Fullerton for connecting me with the planners, and for her overall vision and collaborative spirit.
Since I’m so interested in what follows, I’ll keep my remarks brief: a few words to let you know about the Berkeley AI Community, followed by a short reflection on the importance of this conference and of the work of humanists, especially language experts, for AI.
I don’t want to miss this opportunity to let you know about the Berkeley AI Community. It is an inclusive, diverse group spanning many disciplines. Everyone is welcome: students, faculty, and staff. And, as the D-Lab so eloquently puts it, you do not need to be an expert to participate or engage with the group!
AI COMMUNITY
All the information you need to join, including the Google Group, is on the CTO website. Kara Ganter, whom some of you know, co-chairs the Berkeley AI Community with me, and we are both thrilled at how it has taken off. My original vision for the community was that it would become a hub for people engaging with all forms of AI: those critiquing it, those inventing it, and those exploring how to use it meaningfully. Since we kicked off late last year (2023), the community has held sessions to contribute feedback to the California Governor’s office on how the state should respond to GenAI, and we’ve held listening sessions, attended by more than 100 people, that helped us provide input as a community to the UC system on its contracting efforts with OpenAI for ChatGPT Enterprise. Another major purpose of the group is to accelerate knowledge sharing across the many groups at Berkeley looking at AI from different angles, and to help people with similar interests and challenges make connections. There is a fairly active Google Group, we are curating events, and we are working on an informational resource to track trends, research articles, and so on. I hope to see you there!
Now, to the conference.
The evolution and explosion of tools and techniques that support scholarship have changed what’s possible. I am not sure I will articulate this the right way, but watching the research and dialogue in various technical communities, both in academia and in the tech industry, it feels like a shift is happening.
It feels, in retrospect, like the technology focus of the last 50 years has been disproportionately about the tools and technologies themselves. It feels like we are shifting into an era where, yes, we now have these foundational capabilities and technologies. Okay, great.
The greater future value of technology no longer feels as inherent in, or centered on, the technology (or tool construction) itself. Technology needs wider interdisciplinary engagement to realize its fuller potential. In other words, with the growth of user experience and design, automation, and now generative AI, we’re collectively moving up the stack from infrastructure to enabling wider access and productivity for non-engineers. Moving from infrastructure to enablement is also pretty much how I think about a CTO’s job at Berkeley: how does the University remove barriers and make sure the technologies, services, and capabilities get into the hands of super smart, super creative, super curious people? That is, you, so you can interrogate the technologies themselves from your own disciplines and help us all see what’s possible by exploring the questions you think are important. This is a shift from a more limited STEM perspective to a wider lens. Note that this is not about excluding STEM; it’s about increasing participation from more disciplines, which is also a major recognition and focus of the University!
At a time when some of the narratives about AI seem pretty disempowering to people (or swing from super optimistic and perhaps wildly hyperbolic to downright dystopian and depressing!), we need the humanities. The human-centered disciplines involve the study of us, humans: our cultures and our languages.
So today there is an interesting and growing set of technological capabilities that happen to be controlled by language. Who better to study a technology guided by, and generative of, language than linguists and scholars of culture and language?
Bringing in deep expertise from outside the STEM disciplines at universities like ours, and from groups like the BLC and the Townsend Center, to collaborate, to engage with, interrogate, and develop AI, may afford just the sorts of fresh perspectives needed if we are to evolve both the technology and society’s collective response to it for the benefit of people.
Thank you.