
N.C. A&T Researchers Study Deepfakes Detection, Impacts on Political Leaders
By Jamie Crockett / 04/24/2025 Research, College of Engineering
EAST GREENSBORO, N.C. (April 24, 2025) – Using artificial intelligence (AI) to create a professional headshot, swap faces with someone else or preview what a future child might look like has underscored how accessible the ability to alter images and video has become.
“Maybe six years ago you would need a computer scientist or a computer engineer to do what we are seeing today, but now anyone can use AI,” said Kaushik Roy, Ph.D., director of the Center for Cyber Defense at North Carolina Agricultural and Technical State University. “It easily becomes a huge problem when fake videos and images are out there.”
That is particularly true for politicians during election campaign season, when they become targets of bad actors who use software to manipulate images, videos and audio to incite fear and spread misinformation.
Roy, Kashifah Afroz, a student at the STEM Early College at N.C. A&T, and Ph.D. student Swetha Chatham studied types of “deepfakes,” how to detect them and how to determine whether content is authentic or fake.
“As part of my community engagement, I work with high school students interested in research and Kashifah reached out to me to get involved and started working with me when she was a high school junior,” said Roy. “This is great exposure at a very early stage - not just getting involved in research, but also taking the lead as a first author on a research article is impressive. Opportunities like this help students build their resumes and enhance their applications as they consider various colleges.”
In the paper “Understanding the Threat of Political Deepfakes,” first authored by Afroz and presented at the 4th annual IEEE Conference on AI in Cybersecurity at the University of Houston, the team referenced several examples of deepfake content, including “false speeches created by an AI tool from former president Barack Obama surfaced after his presidency.”
The team listed relevant studies and their authors and the datasets used, and detailed how each one helps improve deepfake detection success rates.
The three main types of deepfakes are manipulated audio, image and video content, each of which breaks down into various subtypes. For example, lip sync, face-swap and puppet master are the three subtypes of video manipulation.
“Lip sync uses a trained network to take an authentic video and map the mouthing and position of features to be consistent with synthesized audio,” the team explained in the paper. “This form can take segments from the original video to replace parts of the audio to create a smooth blend.”
The researchers noted in the paper that video deepfakes are the most common threat to a politician. When people consume this type of content, especially during wartime, it can “cause harmful effects that pose a threat to national security, such as mass panic among the citizens.”
Advances in technology have made once-useful cues, such as missing or inconsistent blinking, far less reliable for spotting deepfake videos, as most generation systems now produce more lifelike, “humanoid” output that factors in blinking. Because of these developments, most laypersons are unable to tell the difference.
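For readers curious how a blink-based cue can be checked in practice, below is a minimal sketch of one widely used heuristic, the eye aspect ratio (EAR), which dips when the eye closes. This is not the N.C. A&T team's method; the landmark ordering, thresholds and blink-rate figures are illustrative assumptions, and landmark extraction from video frames is assumed to happen upstream.

```python
# Minimal sketch: blink-rate analysis via the eye aspect ratio (EAR),
# a common heuristic from the computer-vision literature. Landmark
# extraction (e.g., with a face-landmark library) is assumed to happen
# upstream; the threshold values here are illustrative assumptions.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: (6, 2) array of landmarks ordered around the eye contour."""
    # Vertical distances between upper and lower eyelid landmarks.
    a = np.linalg.norm(eye[1] - eye[5])
    b = np.linalg.norm(eye[2] - eye[4])
    # Horizontal distance between the eye corners.
    c = np.linalg.norm(eye[0] - eye[3])
    return (a + b) / (2.0 * c)

def count_blinks(ear_per_frame, closed_thresh=0.21, min_closed_frames=2):
    """Count dips of EAR below a threshold lasting at least a few frames."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < closed_thresh:
            run += 1
        else:
            if run >= min_closed_frames:
                blinks += 1
            run = 0
    if run >= min_closed_frames:
        blinks += 1
    return blinks

if __name__ == "__main__":
    # Simulated 30-second clip at 30 fps with no blinks: an implausibly
    # low count could flag the clip for closer review, since people at
    # rest typically blink roughly 15-20 times per minute.
    rng = np.random.default_rng(0)
    ears = 0.30 + 0.01 * rng.standard_normal(900)
    print("Blinks detected:", count_blinks(ears))
```

As the article notes, modern generators now model blinking, so a low blink count is best treated as one weak signal that prompts closer review rather than as proof of manipulation.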
“AI is inevitable and it is everywhere,” Roy said. “Even to understand what is happening, a person would need to improve their AI literacy and seek out information.”
Roy suggested platforms like YouTube and even ChatGPT as resources to learn more about AI; however, users should know that not everything is trustworthy, so they should engage with a level of caution.
Media Contact Information: jicrockett@ncat.edu