
Inside AI at the UW: A conversation with Noah Smith (Part 3)

[Image of Noah Smith]

In a three-part interview, Noah Smith, vice provost and endowed chair for artificial intelligence, addresses the University of Washington’s role and responsibility in the development and ethical use of AI, preparing students for careers, and funding support for projects that explore how artificial intelligence can enhance teaching and learning.

In this third installment, Smith addresses how AI can enhance the student experience, how the UW can prepare students for careers, and the relationship between higher education and industry.

How can AI technologies enhance the student experience?
I think there are at least as many answers to this as there are students; ask a student and they’ll likely have an experience with AI that I wouldn’t have thought of. The one thing that stands out to me about language model-based AI is that it never shows impatience with the user, and it certainly never gets tired. In at least this one way, it’s superior to me and every teacher I’ve ever had (and I had some great ones). We all have limited patience and eventually grow tired and less effective.

If we establish a cultural norm that students are ultimately responsible for their learning and do not choose shortcuts, AI systems may function like an infinitely patient study partner that helps with mastery of subject material, offers challenging questions to help students engage more deeply and even makes connections across topics a student is studying.

An example: I’ve used AI to help me write code for a flashcard app for vocabulary in a (human, not programming) language I’m learning. I could have written the code on my own, but I’m rusty and the process would have been slow. I could have made paper flashcards like I did as a student, but I can barely read my own handwriting. It’s still on me to make time to use the flashcard app and learn the words.
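As an aside for the technically curious: a minimal flashcard quiz of this kind fits in a short Python script. The sketch below is purely illustrative; the interview doesn’t show Smith’s actual app, so the word list, function name and command-line format are all hypothetical stand-ins.

    import random

    # Hypothetical vocabulary; the language and word pairs are stand-ins.
    VOCAB = {
        "bonjour": "hello",
        "merci": "thank you",
        "livre": "book",
    }

    def quiz(vocab):
        """Present each word in random order and report a score."""
        pairs = list(vocab.items())
        random.shuffle(pairs)
        correct = 0
        for word, translation in pairs:
            answer = input(f"Translate '{word}': ").strip().lower()
            if answer == translation:
                print("Correct!")
                correct += 1
            else:
                print(f"Not quite. '{word}' means '{translation}'.")
        print(f"Score: {correct}/{len(pairs)}")

    if __name__ == "__main__":
        quiz(VOCAB)

A real app would likely load words from a file and track which ones need review; this stripped-down version just shows the shape of the task, and, as Smith notes, the learner still has to sit down and use it.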

How can we best prepare our students for jobs and careers?
Our job isn’t to chase every new AI tool. It’s to help students learn how to think about AI, learn with AI, talk sensibly about AI, take responsibility for continuing to learn and keep advancing their own judgment and creativity. If we can build that mindset, they’ll be equipped for whatever comes.

What skills do they need to enter a workforce in which AI is increasingly prevalent?
Everyone needs AI literacy: how the tools work (at some level of abstraction), what is still not fully understood about them, what is considered responsible use in their professional community, and how the outputs fundamentally differ from answers given by a human expert, another student, a web search, etc. And many skills our students need are the same as ever: communication, collaboration, problem-framing, ethical judgment and the ability to adapt as the landscape continues to change.

One way I think the UW can lead (and already is leading) is in answering the question, “What is AI literacy today, and how do we keep improving it tomorrow?”

With enormous power and potential to change our world, what jobs will AI replace?
AI won’t replace entire professions so much as it will reshape tasks within them. I expect that tasks that are routine, repetitive, rules-based and/or easy to check will see the most automation. Hopefully tasks that are dangerous or unhealthy for humans will become less so.

There’s a lot of talk about how workers need to adapt to take advantage of automation, but I also think our society’s organization of work will need to adapt. Ideally, the benefits of automation are shared widely. If 20% of everyone’s job is automated (that’s one day of a five-day week), why don’t we move to a four-day work week?

Along those lines, task automation can create bandwidth to do the things only humans can do: critical and strategic thinking. At a time when everyone is stretched thin, this technology can make space for us to do more of what inspires us in our careers. Such changes are not unprecedented in our society, and I can’t think of a better way to activate human creativity than to give everyone more time for their creative pursuits.

Industry operates in a highly competitive ecosystem, motivated by profit and urgency, while academia moves at a more measured pace, focused on the public good and transparency. Yet industry and academia need each other. Industry needs our graduates to fill jobs. Academia needs industry to get its innovations to the marketplace. How do we balance this symbiotic relationship and stay true to our values?

We find balance by being clear about each side’s strengths. Industry moves fast and builds “at scale”; academia takes the long view and protects the public interest. The partnership works best when we let those strengths complement each other. So let’s start by being explicit about the UW’s values around transparency, ethics and academic freedom.

Industry needs developed talent and the kind of new ideas that tend to emerge at universities. But academics need help translating those ideas into real-world impact. The key is setting the terms of collaboration: we need open research where possible, strong conflict-of-interest practices and a shared understanding that the university’s mission comes first.

I’ve seen many examples of productive, principled coordination and collaboration across the industry-academic divide. I’ve also seen the tensions. I don’t think AI fundamentally changes this, except maybe that industry, especially the tech industry, is feeling a sense of urgency and competitive pressure right now. But wise tech leaders know that they also can’t stop thinking about the long term, and this is exactly where academia, especially U.S. academia, has a proven track record as an incubator for creative minds.

Missed part two of this series? Please visit our News page to read how the UW is addressing concerns around privacy, ethics and the impact of AI on fields such as the humanities.