Exposing how automation apps can spy—and how to detect it


A team of University of Wisconsin-Madison engineers and computer scientists has identified vulnerabilities in popular automation apps that can make it easy for an abuser to stalk individuals, track their cellphone activity, or even control their devices with little risk of detection.

After designing an AI algorithm to identify hundreds of automation sequences that could be used maliciously, the researchers are now developing an online service to find this covert abuse on digital devices.

The UW-Madison researchers—computer sciences Ph.D. student Shirley Zhang, Kassem Fawaz, an associate professor of electrical and computer engineering, and Rahul Chatterjee, an assistant professor of computer sciences—are presenting their work at the USENIX Security Symposium in August 2025, held this year in Seattle.

The researchers have alert students to thank for the discovery. Chatterjee's research group operates the Madison Tech Clinic, an initiative staffed by volunteer UW-Madison students and faculty to aid survivors of domestic and intimate partner violence and other forms of technology-facilitated abuse.

Clinic volunteers had often seen abusers using tools like spyware apps or stolen passwords to stalk, harass or embarrass survivors. Then they discovered some abusers used automation apps, like Apple Shortcuts, to quickly and easily take over digital devices. And because of the nature of these apps, the digital intrusions were much more difficult for users to detect.

“Because of all of the capabilities of these automation apps, you can do a suite of things that previously would have required more technical sophistication, like installing a spyware app or using a GPS tracker,” explains Fawaz. “But now, an abusive partner just needs a little time to set up these capabilities on a device.”

In recent years, tech companies have released an array of automation apps—including native apps like Apple’s Shortcuts and Bixby Routines on Samsung phones as well as third-party apps like Tasker and IFTTT—to help simplify digital tasks. Using nontechnical menus, users can “program” the apps to do things like automatically turn down a phone’s volume at school or work, sort photos into specific folders, set up routines for smart home devices like lights and thermostats, or launch a specific playlist when the user gets into their car.
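
To make that trigger-action pattern concrete, here is a minimal sketch in Python. The Automation class, the device-state dictionary, and the "quiet hours" routine are purely illustrative assumptions for this article; they are not any vendor's actual shortcut format or API.

```python
# Minimal sketch of the trigger-action pattern behind automation apps.
# All names and the device-state dictionary are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Automation:
    name: str
    trigger: Callable[[dict], bool]          # fires when device state matches a condition
    actions: List[Callable[[dict], None]]    # steps that then run without further prompts


def run_automations(automations: List[Automation], device_state: dict) -> None:
    """Evaluate every routine against the current device state and run matches."""
    for routine in automations:
        if routine.trigger(device_state):
            for action in routine.actions:
                action(device_state)


# Benign example: mute the phone when the user arrives at work.
mute_at_work = Automation(
    name="Quiet hours",
    trigger=lambda state: state.get("location") == "work",
    actions=[lambda state: state.update(volume=0)],
)

state = {"location": "work", "volume": 7}
run_automations([mute_at_work], state)
print(state["volume"])  # -> 0
```

The same structure is what makes abuse possible: swap the benign action for one that forwards messages or shares location, and the routine runs silently whenever its trigger condition is met.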

On the flip side, an abuser who has access to another person’s phone for even a few minutes can set up automation routines that share location or texting information, or enable them to overload or control a phone, take unauthorized videos, and impersonate them, among other activities.

Each of these automations acts like a mini-app. However, since they are housed within the larger automation app, phones and tablets do not treat them like individual apps.

In other words, unlike the sounds, badges or banners that alert users when, for example, they receive a text, automations don’t trigger notifications when they’ve been activated or are running. That means malicious automations may go undetected.

To make device access even easier, abusers can find many of these malicious automation shortcuts on social media or other public platforms.

Chatterjee first posed the issue to students in his seminar course, CS 782: Advanced Computer Security and Privacy. Zhang and computer sciences student Jacob Vervelde took on the task, first finding out how perpetrators could exploit automation apps. After the class ended, Zhang continued the investigation as part of her graduate research in Fawaz’s lab.

She next surveyed public shortcut-sharing repositories, finding 12,962 automated tasks of all sorts for Apple iOS alone. The researchers then developed an analysis system assisted by an AI large language model (LLM) to detect shortcuts with the potential for abuse. In the end, they found 1,014 shortcuts that, if placed on someone's device, could enable abusive behavior.
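
As a rough illustration only, an LLM-assisted triage of shortcut definitions could be structured like the Python sketch below. The ask_llm helper, prompt wording, and abuse categories are assumptions made for this example; they are not the researchers' actual pipeline.

```python
# Hypothetical sketch of LLM-assisted screening of automation shortcuts.
# The helper, prompt, and categories are illustrative, not the real system.
from typing import List

ABUSE_CATEGORIES = [
    "location sharing", "message forwarding", "device lockout", "impersonation",
]


def ask_llm(prompt: str) -> str:
    """Stand-in for a call to whatever LLM endpoint is actually used."""
    raise NotImplementedError("wire this to a real model before running")


def flag_suspicious(shortcuts: List[dict]) -> List[dict]:
    """Return the shortcuts the model judges usable against the device owner."""
    flagged = []
    for shortcut in shortcuts:
        prompt = (
            "Could the following automation be used against the device owner "
            f"for any of: {', '.join(ABUSE_CATEGORIES)}?\n"
            f"Name: {shortcut['name']}\n"
            f"Actions: {shortcut['actions']}\n"
            "Answer yes or no."
        )
        if ask_llm(prompt).strip().lower().startswith("yes"):
            flagged.append(shortcut)
    return flagged
```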

Next, they used test devices to confirm that it is indeed possible to use those 1,014 shortcuts to perform activities such as sending malicious emails from another person’s account, overloading a phone so it is unusable, locking a user out, turning on airplane mode, and stealing photos—all without leaving obvious, detectable traces.

Zhang says the team notified tech companies about these issues. “One company told us that users are responsible for their own devices, and they should create strong passwords and make sure the devices aren’t accessible to other people,” she says. “But that doesn’t reflect reality; that’s not how things work in the abusive relationships we see.”

The researchers’ analysis also showed that conventional security and detection strategies were of little use: Permissions settings apply to apps and not individual automations, notifications can be easily turned off, and third-party malware detectors don’t scan for malicious automations.

That’s why the researchers decided to turn their LLM-based evaluation tool into its own app—an online service people can use to detect these malicious recipes, says Fawaz.

While AI may be the current solution to the issue, the team is also concerned that AI will enable even more digital abuse: Combining AI assistants and automation apps, for example, will likely make it even easier for abusers to cook up recipes for malicious digital tools.

In the meantime, Zhang, Fawaz, Chatterjee and their collaborators will continue to be on the lookout for emerging forms of digital abuse and ways to mitigate it.

“This project is a strong example of the Wisconsin Idea and the ‘circle of research’ in action,” says Chatterjee. “It began with a community-outreach initiative, grew through our course curriculum, and was brought to life by Kassem’s Wisconsin Privacy and Security research team. Ultimately, it will give back to the community by providing a tool designed to prevent the abuse of automation apps and help protect survivors.”
