After spending childhood weekends devouring stacks of library books, Charles Lee Isbell Jr. knows that good science fiction isn’t really about the futuristic but rather about the eternal. Beneath the genre’s fantastical and otherworldly hallmarks lie stories of people grappling with inherently human questions of morals and ethics that are ubiquitous in our world.
Isbell’s career has been defined by the human essence of even the most technological advancements. As an accomplished computer scientist and decorated scholar of artificial intelligence and machine learning, Isbell is interested in building and sustaining smart systems, whether those systems exist within a computer or across a campus community. He comes to the UW from the Georgia Institute of Technology, where he most recently served as the John P. Imlay Jr. Dean of the College of Computing. As the new provost of UW–Madison, Isbell is the university’s chief academic officer, tasked with leading all 12 schools and colleges that make up the UW. It’s a role for which an understanding of intelligent systems is integral and for which recognition of the human factor is essential.
And for cutting red tape, he keeps a lightsaber on his desk.
You’ve defined AI as “building intelligent systems that not only learn from the environment but that learn from people.” Has this frame of thought influenced your approach to academic leadership?
Oh, absolutely. People are at the center of everything. … I’ll tie it into the way I think about research. I come from a place where computing is its own college, and the idea of that was that it was its own disciplined way of thinking, and that really matters. Ways of thinking really matter. The reason you have to take science and math and history and all these other things is not just to give people something to teach, but they’re all different perspectives on how to think and how people are. You have to build your world around that. For my own research, the reason I do the interactive AI stuff … is that if you think about what’s changed in computing and technology over the last 50 to 60 years — what’s changed in this way of thinking that suggests that models, languages, and machines are equivalent — is that little triangle has a person in the center of it, and that’s what gives it meaning: the fact that the human being is what’s making all of the rest of this worthwhile. It drives all kinds of interesting problems. And if you think about it that way, in terms of the way the research works and the way that I came to think about my scholarship, then you realize very quickly that that’s how you have to lead. That’s how you have to build organizations. You have to think about the people and where they fit and change the organizations to work around the people.
You’d been studying AI and machine learning since well before their recent entry into the public conversation. What did the field look like when you started studying it, and how did your scholarship follow its evolution?
I wanted to build Data [from Star Trek: The Next Generation], the droids R2-D2 and C-3PO [from Star Wars]. I wanted to build something smart. I remember the first time I wrote an actual program: I built a computer — or I stood next to an engineer who built a computer for me, but we pretended that I was building it — I was 12. And I remember writing a little program asking it to do something, and it did exactly what I asked it to do. That was compelling: understanding the difference between something that does what you tell it as opposed to what you want it to do.
What’s changed over the years is scalability of computing, scalability of data, and all that data is people data. … We’re actually getting more social data and interactions between people, which has all these second-order and third-order and fourth-order and fifth-order effects. That’s what’s changed, and that’s how my interest in it has changed over time. It’s moved from getting the machine to do what I want to do [to] getting the machine to understand the most important data in the world, which is surrounding people.
I was interested because I wanted to build something as smart as people, and it’s turned out that I’ve ended up in a place where all the smartness of people is all around us, and you can model and interact with that, and that is very cool.
There are a lot of conversations happening about AI’s role in education. How might AI influence academia?
It will both change nothing and change everything. All computers really do is they allow us to do the things that we already do much more efficiently, both the good things and the bad things. I am not worried about someone on the UW’s campus or anywhere else building Skynet and having it blow us all up, à la The Terminator. We don’t have to wait 20 years. We are already building the systems that are causing harm. At the center of that kind of technology and the sort of things we’re doing has to be responsibility and awareness of that responsibility. The way that plays out in academia is in three different ways. One, it affects the kind of research that we do and the way that we do that research. Two, it affects the way we can administer and build systems on the way the university itself works. And three, it affects the way the students interface with us and the rest of the world.
On the first front, computing and AI and all the rest of that is just going to drive the way we do all kinds of discovery. … The way we do our work will change, the way we do research will change, no one will be able to get away from this, and we will be better because of it at the end.
It also impacts the way that we could be serving people. If we have all of this data floating around, and it’s all human data, then why can’t we use that to help make people more successful? There are lots of universities out there … using data to actually figure out things like why people leave the programs, why they leave the university, where they fall off. And it turns out once you have that, you can design very simple interventions that make all the difference in the world. … We should be doing those kinds of things. It should be much easier for us to help students, not so much with different learning styles, but to see how they are interacting with the way that they’re learning. Some of what we do is automatable, and that can provide tools to support what’s fundamentally a human endeavor.
Regarding the third point, which is around how students interact with the world, what does ChatGPT mean for education? What is ethical? What is responsible? What is cheating? What is the tool? If you were to take a transcript of these words and ask ChatGPT to write a first draft for you, is that good or is that bad? I might have an opinion about it, but it’s an ethical question that is a man-made one. We have to ask the question, “What is good or bad?” What actually is cheating? What do we need to teach students about what are acceptable and unacceptable uses of these kinds of technology? … We have to figure out how to engage with the students to give them a sense of what’s okay and what’s not okay.
So, AI is touching everything and in ways that I think are more profound than is obvious if you do not reflect very carefully. I will point out that there’s a theme here: all of those are human things. It’s how humans interact with one another. It’s not making a slightly faster robot to mow the lawn ... but it’s how people interact with one another, how they see themselves interact with one another. That’s what it all is.
You’ve been the first and/or the only Black person in many of the spaces you’ve occupied and the leadership positions that you’ve held. How have these experiences influenced your philosophy on leading an institution?
I was the first Black professor in the College of Computing at Georgia Tech and remained that way — their first Black assistant professor, associate professor, professor, associate dean, dean, first Black dean of the College of Computing — certainly at an R1 [research institution] — and the first Black provost at the University of Wisconsin. A lot of firsts, which says more about the rest of the world than it says about me in particular. But what my experience has given me is an appreciation for how a variety of perspectives and experiences can change the conversation.
I’ve said that we don’t have to wait 20 years to see how technology is being misused. There are lots and lots of examples of it now, and so much of it boils down to not having the right people in the room to point out the silly thing that you’re doing because you’re just not thinking about it. There’s a reason why women are more likely to die of heart attacks: because so many of the studies on heart attacks have been for men. There’s a reason why men are more likely to die of breast cancer when they get it: because so many of the studies around breast cancer are for women. There’s no one else in the room to say, “Actually, what about this sort of perspective?” So, I hope I bring perspective as a Southerner to the Midwest. I hope I bring perspective as someone who grew up on the wrong side of the tracks, and as someone who has had a particular set of experiences as a Black male in the U.S.
To me, these kinds of questions boil down to questions of invisibility, but not the invisibility most people think of. People think of invisibility as, “You’re in front of me, and I don’t see you,” but that’s not the invisibility that matters. The invisibility that matters is that you are not in the room, and I don’t notice your absence. That’s the thing that matters. I appreciate that because of my own experiences and what I’ve been able to see others go through. So I hope that when I’m asked to make a decision or I’m in the room having a discussion, I notice the people who aren’t in the room as much as I do the people who are.
You once conducted an experiment from which you concluded that you need only observe someone for two days to predict what they’ll do next. If we were to observe you for two days, what might we be able to deduce about your plans for the university?
If you were to observe me over a two-day period, I think what you’d learn is two things. I am really interested in understanding the environment that I’m in — which is always true; I think it’s just interesting in and of itself. But of course, I have to serve this environment, so I need to know that. The second is that this is a great place. I try to only be in great places, but also places where I can bring something and help the people who are going to try to make it greater and make it greater faster than they would’ve otherwise. I am here to support the traditions, to keep what is fundamental about this place the same, but to help it to adapt as we move into 2025 and 2030 and 2040 and 2050 and to set the stage for whoever’s going to follow me, however many years in the future. … That’s what I care about: to effect change for good purpose.
What excites you about the UW?
There’s a lot that excites me. One is the city itself, actually. This is a very different place from where I’ve lived, at least for the last 20 years or so. But it’s a place that’s kind of fundamentally intertwined with this university, and those notions of community and connection really matter to me. … I care very much about the community and the environment that we’re in, and the UW appears to be quite intertwined in a way that I think is interesting and good on balance.
The other thing is that this place is really broad. I came from a place with six colleges, soon to be seven. Before that I was at a university with five colleges that became six. This is a place with an agricultural school and an education school and a law school, as well as engineering and letters & science and so many others. This is a broad place that touches everything that is a part of the human experience, and I am excited to learn what that really means and how a place like this gets along — very different styles of scholarship, very different views of what it means to be preeminent, very different views of what it means to succeed — but somehow all works together. That balance between the sort of diversity and the autonomy on the one hand, but also moving in the same direction at once — what an opportunity, just to be a part of an environment like that.
What is something from Georgia Tech that you’d like to bring to the UW?
One thing that I really like about [Georgia Tech] — remember, it’s my alma mater — is it is fundamentally entrepreneurial, and that has expressed itself in a lot of ways. In particular, my college [the College of Computing] grew really quickly, and it was unafraid to take risks. … There is no safe way to be great, and I would love to make certain that I keep with me the sort of entrepreneurial spirit, that sort of willingness to take risks — calculated risks but risks nonetheless — in order to achieve greatness. I hope that follows me everywhere I go for the rest of my life.
You’re something of a music aficionado. What will you be listening to when you’re walking around campus?
I’m trying to rediscover ’90s-style R&B right now, but my big thing is hip-hop and funk from the ’70s and ’80s.
When you’re not in Bascom Hall, where are you going to be unwinding?
I’ll be playing Ultimate Frisbee on weekends, and I’ll be playing racquetball during the week. When I went out recently and played, I already ran into somebody who’s at the university who’s a very good racquetball player. We ended up on the same doubles team. We won. (I will give him some credit for that.) I’ve seen a lot of people from the university playing Ultimate. So I’ll be out there on the fields playing, or I’ll be in the courts playing. Come say hi.