
Northeast South Dakota News

Thursday, November 14, 2024

Artificial intelligence utilized for SDSU prof’s social media research


Update | Pexels by Anna Tarazevich

The use of artificial intelligence in everyday life is on the rise. Between voice assistants, search engines, spell check and even financial fraud detection, AI is becoming something that humans rely upon every single day.
Kaiqun Fu, an assistant professor in South Dakota State University's Jerome J. Lohr College of Engineering, has been using AI to assist in his research for the past couple of years. His projects, which include deep learning for urban perception, traffic impact analysis for smart cities, and emerging technology prediction, all rely on AI to help sort through and analyze mountains of data.

Social media mining

At Virginia Tech—where Fu earned his master's degree and doctorate—a focus of his graduate research work was on social media mining. That has carried over to SDSU, where Fu is working with Yangxiao Bai, a graduate assistant in the Department of Electrical Engineering and Computer Science, on a project that revolves around automatic storytelling from social media mining. 
"With social media mining, we want to find out—generally speaking—valuable inferences from a big chunk of user-generated text data," explained Fu, who teaches courses in the Department of Electrical Engineering and Computer Science. 
For this project, Fu and Bai wanted to gather data and learn what people were talking about in relation to the COVID-19 pandemic.
"We want to know what people are talking about because people may post about COVID on Twitter," Fu said. "In one minute, there might be thousands of posts, which is a good chunk of data but far too large of a size. We cannot just highlight the summary of what people are talking about with that much data."
COVID-19, one of the defining events of the 21st century, drew social media posts from millions of people around the globe. The number of posts to Twitter during the height of the pandemic was astronomical, far too many for any human to glean a coherent storyline or summary from.

This is where AI is utilized. Fu and Bai developed an algorithm that incorporates deep learning—a type of AI that imitates the way humans gain certain types of knowledge—to gather and read the large swath of posts.

"Basically, what the algorithm will do is that it will read large amounts of social media posts," Fu explained. "We want to think of something that can summarize this large chunk of tweets and give us some sort of storyline. We need to utilize artificial intelligence models." 

The AI model searches through historical Twitter data and collects tweets using a list of keywords associated with COVID. Two major models were used to collect and read the posts. The first, Latent Dirichlet Allocation (LDA), handled the topic modeling.
Topic modeling is a method for automatically searching, organizing and summarizing large swaths of data, often from electronic archives. LDA is one of the most popular topic modeling methods. 
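To make that pipeline concrete, a minimal sketch of keyword filtering followed by LDA topic modeling is shown below, using scikit-learn. The keyword list, sample tweets and topic count are invented for illustration; this is not Fu and Bai's actual code or data, and a real system would pull far larger volumes of historical tweets from the Twitter API rather than a hard-coded list.

```python
# Illustrative sketch: keyword filtering + LDA topic modeling on a handful of made-up tweets.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "Got my covid vaccine today, arm is a little sore",
    "Another lockdown announced, schools closing again",
    "Mask mandate lifted for outdoor events in our county",
    "Hospital ICU beds filling up with covid patients",
    "Vaccine appointments open for everyone over 18",
]
keywords = {"covid", "vaccine", "lockdown", "mask", "pandemic"}

# Keep only tweets that mention at least one pandemic-related keyword.
covid_tweets = [t for t in tweets if keywords & set(t.lower().split())]

# Turn the filtered tweets into a document-term matrix and fit LDA.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(covid_tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

# Print the top words for each discovered topic.
terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-5:][::-1]]
    print(f"Topic {i}: {', '.join(top)}")
```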
The second stage used a model known as Bidirectional Encoder Representations from Transformers (BERT), a deep neural network language model.
By using LDA and BERT, Fu and Bai were able to automatically generate a summary of historical COVID-related tweets. Bai presented this project, titled "PanTop: Pandemic Topic Detection and Monitoring System," at a top AI/computer science conference.
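The article does not describe exactly how BERT fits into PanTop's summarization step. One common pattern, sketched below, is extractive: embed each tweet in a topic with a pretrained BERT encoder and keep the tweets closest to the topic's centroid as its storyline. The model name, example tweets and selection rule here are assumptions used for illustration, not the PanTop implementation.

```python
# Illustrative extractive-summary sketch using BERT embeddings (not PanTop itself).
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    """Mean-pool BERT's last hidden state into one vector per tweet."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state   # shape: (n, seq_len, 768)
    mask = batch["attention_mask"].unsqueeze(-1)    # zero out padding tokens
    return ((hidden * mask).sum(dim=1) / mask.sum(dim=1)).numpy()

# Tweets that LDA assigned to one topic (made-up examples).
topic_tweets = [
    "Vaccine appointments open for everyone over 18",
    "Second dose scheduled for next week",
    "Long lines at the vaccination clinic downtown",
]

vecs = embed(topic_tweets)
centroid = vecs.mean(axis=0)

# Rank tweets by cosine similarity to the topic centroid; the top ones summarize the topic.
sims = vecs @ centroid / (np.linalg.norm(vecs, axis=1) * np.linalg.norm(centroid))
for i in np.argsort(-sims)[:2]:
    print(topic_tweets[i])
```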

Similar work

This project was similar to a previous research study of Fu's, which used AI to gauge sentiment among commuters who use WMATA, the Washington, D.C., metro transit service.
"When there's a malfunctioning elevator or a huge delay on the 'Orange Line' or 'Blue Line,' people will often post on social media about it," Fu explained. "We used AI to mine those posts and evaluate the sentiment level of the general commuter view about the current status of the transit service."
WMATA sponsored that project, as it had a vested interest in understanding complaints and undetected delays in its services. Fu's research, completed alongside a number of colleagues, showed that RISECURE, a system they developed that uses real-time social media mining, aids in the early detection of threats, incidents and events within a metro system.

