Q & A with Stephen Addison: April 2023

Dr. Stephen Addison

Stephen Addison, Ph.D.,

Professor of Physics

Dean, College of Natural Sciences and Mathematics

University of Central Arkansas

IN THE MIDST of the current Chatbot boom, it’s been hard for a lot of us to fully understand both its importance and its long-range implications. So this month we decided to call on one of this newsletter’s most frequent contributors, Dr. Stephen Addison of UCA. As usual, he puts things in perspective with wisdom, good humor, and even a dollop of pop-culture context.

————————————–

As the ultimate tech layman, I’ve been seeing all this sudden Chatbot coverage and thinking, Wow, are today’s college students going to be able to keep up with this stuff now?

We’ve been including AI and machine learning in our courses for years. But let me elaborate on the different kinds of artificial intelligence. There are really three kinds, and people tend to talk about “AI” in general without making any distinctions.

So, there’s artificial intelligence, and the Chatbots are an example of that. What we really mean by artificial intelligence is that you can use computers to get useful results quickly, but those systems have no memory that they ever gave you those results. And they have no understanding of what those results are or mean. For the most part, they’re based on language-predictive models. The bots are trained on large data sets, and what you get out is what you put in. By putting in different sorts of information, you get different qualities of answers.

Someone tried ChatGPT on some arithmetic. The bot added two numbers and got seven as the result. The person then said, “Well, my wife says the result’s eight, and my wife’s never wrong.” So, ChatGPT then just responded, “Well, the answer’s eight. I’m sorry. My training data only went up to 2021, so I must be out of date.” In other words, it used the claim that the person’s wife was never wrong to contradict itself. So, there’s no understanding in those systems.

But there are two other sorts of artificial intelligence that we need to talk about. One is called AGI, which means artificial general intelligence. An AGI system would be one that could respond exactly like a human, and would understand what it is doing. As yet, those systems don’t exist.

Then there’s the one that everybody fears: artificial superintelligence, or ASI. That refers to systems that would be beyond human capacity. But everybody is really influenced by that old Arnold Schwarzenegger movie The Terminator, in which a computer system became sentient. Well, an AI system is not going to become a sentient system, because it’s just a series of algorithms that can produce answers quickly.

But recently, people have said, “Well, maybe we’re seeing some of this”—because some people have found themselves being insulted by the Chatbot, so they thought it had developed human capabilities. Well, no, it hadn’t. It’s just that some of the training conversations were in fact snarky, and it used some of those conversations to insult the people who asked the questions. But the system had no idea what it was saying. It just basically works on probabilities in predicting what the next word needs to be. If it’s there in the training data, it can appear in the answer.
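
To make that concrete, here is a purely illustrative sketch in Python of the kind of next-word probability lookup being described. It is a toy bigram model built on a made-up three-sentence corpus, nothing like how ChatGPT is actually implemented, and every word in its “training data” is invented for the example; the only point is that each step is a weighted draw over words already seen in training, which is why snark in the training conversations can surface in an answer without any understanding behind it.

```python
# Toy illustration only: a tiny bigram "next-word" model, not ChatGPT's actual
# architecture. It shows the idea described above -- the system just picks a
# likely next word from patterns seen in its training text, with no
# understanding, and it can only produce words it has already seen.

import random
from collections import Counter, defaultdict

# Hypothetical miniature "training data" (invented for this example)
corpus = (
    "the robot adds the numbers "
    "the robot gives the answer "
    "the answer is wrong"
).split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(word: str) -> str:
    """Sample the next word in proportion to how often it followed `word` in training."""
    options = follows.get(word)
    if not options:
        return "<end>"  # nothing in the training data ever followed this word
    words, counts = zip(*options.items())
    return random.choices(words, weights=counts, k=1)[0]

# Generate a short "response": each step is just a probability lookup, not
# reasoning -- if a phrase is in the training data, it can reappear verbatim.
word, output = "the", ["the"]
for _ in range(6):
    word = next_word(word)
    if word == "<end>":
        break
    output.append(word)
print(" ".join(output))
```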

Even so, there appears to be a widespread fear of job loss because of the Chatbots.

Yes, but it’s not because of AI, it’s because of computers in the first place. Ten years ago, there was an Oxford University study that basically said that 47 percent of all current jobs in the United States job market would disappear by 2035. Is that happening? Absolutely.

But it’s nothing new. Jobs have always disappeared. The particular problem that we have now is with the pace of the jobs disappearing. In the past, change took place more slowly. It was possible for somebody to see out their career—it was just that their children couldn’t go into that career. Today, change is happening so fast that, yes, careers will be ending. And people will need to be retrained. But all sorts of new jobs will appear as a result of these things.

We have an extreme shortage of data scientists at this point. We have an extreme shortage of people in the computer industry. But the issue we have is that it’s tough to retrain many of the people whose jobs are disappearing. Because the easiest jobs to computerize are the repetitive jobs.

Is there a certain age group of people whose jobs are being outdated? These aren’t young people, generally, are they?

Well, there could be some. It depends what they went into—some people went into the wrong field. But it’s not so much about age, but about whether people have kept up with changes in the world. I’m 66. When I became a faculty member, I was the first faculty member on this campus who had an IBM PC machine in his office. I had the first faculty email address on this campus. And I constantly kept updating my skills. The people in danger are the people who have assumed that they could keep doing the same thing forever.

The people I was thinking about are those who’ve been going to a factory for years. But suddenly, that’s all going to be automated.

Right—but not “suddenly” at all. Computerization in those jobs has been going on for 30-plus years, so, in fact, lots of our factory workers are very computer savvy. And it’s not necessarily those people who have the most difficulty being retrained. Now, if you were making shirts in a shirt factory, then probably you weren’t using very many computers. But if you’re in an industrial factory, industrial robots have been a fact of life for years.

If you go to a Kimberly-Clark plant these days, you risk being run over by a robot out on the shop floor. But I visited a Kimberly-Clark plant in the 1990s, and the same was true back then. The robots were just less sophisticated.

You said you’ve been teaching AI for years. Tell me about that.

We teach people to use such systems, and to exploit them to do new things. When you see technologies like this, you can either fear them and think, “Oh, I’m never going to understand that.” Or you can embrace them, and figure out, “Okay, what’s the next step? What else can I do with this?” And that’s what the new jobs come from.

Just look at who some of the major employers are these days—Microsoft, Facebook, Twitter. All of those were jobs that wouldn’t have existed not very long ago. These are jobs built around algorithms. They employ a different sort of worker, but there are people at all levels, from data entry on up. So what we do in our classrooms is make sure our students are exposed to the latest technologies.

As you know, I’ve also been working with partners throughout the state to upgrade and standardize computer science programs in Arkansas high schools. It wasn’t that we wanted everybody to take a computer science course because we thought everybody would become a computer scientist. We wanted them to take a computer science course so they would understand the implications of computing. So, to recognize, “Okay. These are jobs that have a future. These are jobs that don’t have a future.”

As for artificial intelligence and our curriculum, we may not have called it AI when it was first introduced. But data science is a new name, too. So, while the actual name may not have been around, the ideas were there. People talked about the separate ideas separately before aggregating them under the AI rubric, and they’ve talked about things like neural nets for many years. And we’ve probably had courses under those names in our curriculum for at least 15 years, probably longer.

So, how big is this development today? Suddenly, the Chatbots are everywhere. A lot of people see them making huge changes in the world.

But those changes have been happening all along. And ChatGPT has actually been around a few years. It’s not even the current version; it’s the old version that they released to the world free, basically. There are more sophisticated versions out there at this point. Ultimately, they all provide us with more information, and we’ve got to develop methods of dealing with that increased amount of information.

But in terms of direct challenges at the moment, AI’s going to happen. We’re not going to pause it. We can’t. Because, if we were to decide, “Okay, we’re going to have a moratorium on it,” we would, a couple of years later, look up and find ourselves so far behind that we would never be able to catch up again.

If you were a career counselor these days, what would you tell some young person just out of high school who wants a tech career, but doesn’t know exactly which field to go into?

Well, I’d tell him or her that things haven’t really changed—what a tech professional needs to be able to do today is write and speak clearly. Communications skills are still key to getting a good job. You also need to be able to think critically, and you can learn to do that in a variety of majors. It’s just a matter of picking one where you do.

The other thing I would say is, you need some domain knowledge. But the domain knowledge is independent of those other skills that I mentioned. You should be able to think critically about any domain of knowledge. And you should be able to write and communicate in any domain that you happen to be working in.

You may change your domain specialization over your career, but you need all of those other skills to be able to adapt. So, you can follow your passion as long as you develop those skills. I’ve joked that people could follow their passions these days as long as they also attended the Arkansas Coding Academy—so they can learn those hard skills that will enable them to get a job.