From left to right: Dr. Cheong Wei Yang, Vice Provost of Strategic Research Partnerships; Anthony Tang, Associate Professor of Computer Science; Jiang Jing, Professor of Computer Science; Ngo Chong Wah, Professor of Computer Science
By Alvin Lee
SMU Office of Research – Artificial Intelligence, like almost all other technologies, can be used for both good and less-than-benign purposes. Fire provides warmth and light but can also be used to commit arson, while dynamite was originally invented to facilitate mining and construction but has been used for destructive ends.
“How well you use the technology, and find a way to use it, is as important as the technology itself,” noted Cheong Wei Yang, Vice Provost of Strategic Research Partnerships at SMU. Dr. Cheong was the moderator of a panel discussion on Artificial Intelligence during a Global Young Scientists Summit (GYSS) 2024 site visit to SMU on 8 January. The panellists were: Jiang Jing, SMU Professor of Computer Science and Director of the Artificial Intelligence and Data Science Cluster; Ngo Chong Wah, SMU Professor of Computer Science and Director of the Human-Machine Collaborative Systems Cluster; and Anthony Tang, SMU Associate Professor of Computer Science.
Generative AI, which has become almost synonymous with OpenAI’s ChatGPT, took centre stage, just as it has in public discourse of late. Elaborating on Dr. Cheong’s observation about technology’s dual uses, Professor Jiang pointed out that when ChatGPT was first released, one could ask, “How can I make a bomb?” and receive step-by-step instructions in response. OpenAI has since closed that loophole, but that has not stopped people from rephrasing the question to trick the software into giving up the information.
Related to that is the question: Who decides what is permitted and what is not?
“Right now it’s a small group of people at OpenAI, but they are not representative of the world population,” says Professor Jiang. “The training data has biases. Users may not be aware of these biases. How can you ensure that users won’t blindly accept the suggestions of such tools?
“People look at ChatGPT and see its power answering factual questions, and get the impression that AI knows everything. If we educate users that there are biases and such, they will understand the big picture in the same way children grow up to understand advertisements are advertisements, not factual information.”
Professor Ngo observed that few, if any, people fully understand how a large neural network such as ChatGPT works. He noted that most of us use mobile phones and watch television without fully understanding the underlying technology, and it makes little difference whether we do or not. But if those who create an AI system do not completely understand how it functions, that could pose problems.
“Right now, ChatGPT is software. One day, ChatGPT might be embedded into hardware, such as a robot. Robots can then talk to robots, and things might get out of control,” he says.
Despite such concerns, Professor Ngo gave examples of AI’s benefits. “I have students from places where English is not the native language, ranging from Vietnam to Mainland China. Sometimes they don’t understand each other. Right now they communicate by typing text in their native languages into ChatGPT to generate an image, and they take it from there.”
Professor Tang told the audience of 86 participants from over 50 universities worldwide that AI applications could change not only how content is produced but how it is consumed. “What if when I get a large email, I put it through ChatGPT to give me a summary and maybe generate a reply, but then my recipient does the same?” he says. “There are questions we need to ask ourselves, such as: What happens when this becomes a substitute for interacting with one another?”
During the Q&A session, when the panel was asked whether the seeming obsession with AI might be a fad, Professor Tang cast the collective eye back five years.
“Many of us sit on the review committees for research grants. Five years ago ‘metaverse’ was in every grant proposal. Now the phrase ‘generative AI’ has taken its place. While it is difficult to imagine it right now, one day, something else will take its place.”