
The intelligent gatekeeper of your private data

By Michelle Lee Twan Gee

SMU Office of Research & Tech Transfer – We have all done it.

We need an app. We download the app. Then a message appears: "Do you agree to grant this app access to your files/location/contacts/photographs?" We hesitate for a moment but quickly click yes because, hey, we need the app.

App permissions control what an app can do and what data or hardware it can reach. By clicking "Yes", you could be giving it access to stored data such as contacts and media files, or to hardware such as the camera or microphone. Once you have granted permission, the app can use that feature however it likes from that point on.
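To make that mechanism concrete, here is a minimal sketch of the standard Android runtime-permission flow behind that dialogue; the activity name and the choice of the contacts permission are illustrative only.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class ExampleActivity : AppCompatActivity() {

    private val contactsRequestCode = 42  // arbitrary code to match the callback below

    // Ask the system to show the familiar "Allow access to your contacts?" dialogue.
    fun requestContactsAccess() {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_CONTACTS)
            != PackageManager.PERMISSION_GRANTED
        ) {
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.READ_CONTACTS), contactsRequestCode
            )
        }
    }

    // Once the user taps "Allow", the permission stays in force and the app can
    // read contacts whenever it likes, with no further prompts.
    override fun onRequestPermissionsResult(
        requestCode: Int, permissions: Array<out String>, grantResults: IntArray
    ) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults)
        val granted = requestCode == contactsRequestCode &&
                grantResults.firstOrNull() == PackageManager.PERMISSION_GRANTED
        // `granted` is true if the user agreed; from here on, contact access is unrestricted.
    }
}
```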

It is not your fault

It is not your fault, says Professor Jiang Lingxiao, Associate Professor of Information Systems and Deputy Director of the Research Lab for Intelligent Software Engineering at SMU.

It is the fault of app developers, he explains, because their apps are designed with permission requests that most users cannot understand. You might call it a form of legalese.  

However, that split-second decision can have huge consequences. Information leaked to a malicious third party can be used for phishing. Harvested information can be sold to advertisers. Your identity can be stolen.

Now, an SMU team he is leading is seeking to rebalance the power equation with a solution that will put the controls firmly back in the hands of consumers. The team calls it the Auto Privacy Model (APM).

The solution not only makes smart decisions for you about what access to allow, but also gradually learns your permission-granting preferences, so that the decisions it makes eventually become finely attuned to your priorities and your security concerns.

The team started by assuming that most users get it right about which access requests are legitimate and which are not, and that most users base that judgement on the app's interface elements, or GUI (graphical user interface).

For example, the GUI element could be a search bar in the context of a messaging app. "Search" probably means searching for a contact or some message content. It therefore makes sense for the app to process the private contact and message data it needs to perform that search.

Another example: if the interface element is a profile picture, it probably makes sense to allow the app access to your photo gallery so you can choose your profile photo.

“The general idea is we can infer the access needs to private data based on the user interface element,” states Professor Jiang.  
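As a rough illustration of that inference, and not the team's actual model, one might imagine a simple rule that maps what the user can see on screen (and the app's broad functionality) to the private data a request could plausibly serve; the element labels and data categories here are hypothetical.

```kotlin
// Categories of private data an app might request.
enum class PrivateData { CONTACTS, MESSAGES, PHOTOS, LOCATION, MICROPHONE }

// Very simplified stand-in for "infer the access needs from the visible GUI element".
// A real system would analyse the GUI programmatically; these rules are purely illustrative.
fun expectedAccessFor(guiElement: String, appFunctionality: String): Set<PrivateData> =
    when {
        guiElement == "search_bar" && appFunctionality == "messaging" ->
            setOf(PrivateData.CONTACTS, PrivateData.MESSAGES)   // searching chats and contacts
        guiElement == "profile_picture" ->
            setOf(PrivateData.PHOTOS)                           // picking a profile photo
        else ->
            emptySet()                                          // no visible reason for access
    }
```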

The team also looks at the functionalities of different apps, since an app's functionality defines the private data it appropriately needs.

Explains Professor Jiang, “For example, I want to write an email. It probably doesn’t make sense to send out a lot of contact information to third parties. You only need one user’s email address.”

The team next analysed large datasets of apps and of permission-granting behaviour to determine the common features of apps and the permission-granting decisions made by the majority of their users. This rests on the team's hypothesis that the decision made by the majority is the right decision.

The team then built a relational model between app functionalities, the elements on the GUI and the majority decisions made by users in specific contexts.

The result will be a consistent model of access-granting decisions across the broad spectrum of apps. Any request that deviates from the accepted norm will be flagged as possibly illegitimate.
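A minimal sketch of how such a relational model might be queried to flag deviations is shown below; the record structure, the allow-rate table and the 50 per cent threshold are assumptions made for illustration, not the team's published design.

```kotlin
// One entry in the relational model: within this app functionality, with this GUI element
// visible, this fraction of users allowed access to this category of private data.
data class AccessContext(
    val functionality: String,   // e.g. "messaging"
    val guiElement: String,      // e.g. "search_bar"
    val dataCategory: String     // e.g. "contacts"
)

class RelationalModel(private val allowRate: Map<AccessContext, Double>) {

    // Hypothetical rule: a request is treated as normal only if the majority of users
    // in the same context allowed it; anything else is flagged as possibly illegitimate.
    fun isLikelyLegitimate(ctx: AccessContext, threshold: Double = 0.5): Boolean =
        (allowRate[ctx] ?: 0.0) >= threshold
}
```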

Fine control that is intelligently automated

But how exactly does the solution make things easier for consumers?

The APM solution is designed to be an app that users install on their phone. This solution works like a wrapper around the application programming interface (API) of the mobile phone system, such as Android. The API is a gateway for apps to access data and functionality in the system. By wrapping around the API of the mobile system, the solution will be able to monitor and manage apps' access requests to private data.

The solution does not change the underlying system; it is just one layer that sits on top of the system.

With the solution, when a user launches an app, it will be launched through the wrapper rather than the mobile system directly. In effect, the solution will act as a gatekeeper.

And it is a gatekeeper that is on guard 24/7.

When an app makes a request to the mobile phone system for private data via the API, the solution intercepts the call and makes a decision based on the relational model of the app's functionality, its graphical elements and the decision most users made on that request in that particular context.

When the user installs the solution, all they need to do is decide whether the solution should feed fake data to an illegitimate request or simply block the request.

Each time the user interacts with the app, the detection of possibly illegitimate access happens in real time for each access to private data by the app. And the solution responds automatically – usually without having to consult the user.

"It is a very fine-grained control because it looks at each GUI element and functionality in the app to make context-sensitive decisions based on what is visible to the user and how the user interacts with the app at the time, and it is real-time control. And decisions are automated by the solution."

If a request has some likelihood of being legitimate but also a real chance of not being legitimate, the solution will prompt the user: hey, do you really want to allow access?
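Putting those behaviours together, the gatekeeper's decision step might look roughly like the sketch below; the class name, the score thresholds and the direct method call are all invented for illustration, since the real solution intercepts API calls inside its wrapper layer rather than being invoked like this.

```kotlin
// What the gatekeeper can do with an intercepted request for private data.
enum class Verdict { ALLOW, SUPPLY_FAKE_DATA, BLOCK, ASK_USER }

class PrivacyGatekeeper(
    // The one choice the user makes at install time: fake data or an outright block.
    private val fakeDataInsteadOfBlock: Boolean
) {
    // Called each time the wrapped app asks the system API for private data.
    // `legitimacyScore` stands in for the model's estimate that the request matches
    // normal behaviour in the current GUI/functionality context.
    fun decide(legitimacyScore: Double): Verdict = when {
        legitimacyScore >= 0.8 -> Verdict.ALLOW                  // clearly looks legitimate
        legitimacyScore <= 0.2 ->                                 // clearly deviates from the norm
            if (fakeDataInsteadOfBlock) Verdict.SUPPLY_FAKE_DATA else Verdict.BLOCK
        else -> Verdict.ASK_USER                                  // borderline: prompt the user
    }
}
```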

And the solution will get smarter over time. Built on deep learning techniques, it will get better at anticipating its user's permission-granting preferences and will customise its decisions accordingly.
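The article does not describe the learning mechanism in detail; the sketch below substitutes a simple per-context tally for the deep-learning component purely to illustrate the feedback loop by which the user's own choices reshape future decisions.

```kotlin
// Toy stand-in for the learning loop: whenever the user overrides the gatekeeper's verdict,
// record that feedback so future decisions in the same context lean the user's way.
class PreferenceLearner {
    private val allows = mutableMapOf<String, Int>()
    private val denies = mutableMapOf<String, Int>()

    // Record whether the user allowed or denied access in this context.
    fun recordUserChoice(contextKey: String, userAllowed: Boolean) {
        val table = if (userAllowed) allows else denies
        table[contextKey] = (table[contextKey] ?: 0) + 1
    }

    // Shift the population model's score towards whatever this user has tended to choose.
    fun personalisedScore(contextKey: String, modelScore: Double): Double {
        val a = allows[contextKey] ?: 0
        val d = denies[contextKey] ?: 0
        if (a + d == 0) return modelScore
        val userBias = a.toDouble() / (a + d)
        return 0.5 * modelScore + 0.5 * userBias   // blend majority behaviour with personal history
    }
}
```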

Reducing the burden on consumers, raising the bar for app developers

The breakthrough aspect of the solution is the automation of decision-making based on a relational modelling of three factors: app functionality, GUI elements perceivable by users, and the decisions made by most users.

“From the user’s point of view, the solution is just a normal application. It is just a launcher. Because it is just a matter of launching apps from the virtualised wrapper layer, it doesn’t impose any sort of burden on the user. And because decision-making is automated, the user just has to give instructions to the solution the first time the solution is installed and during the first few usages of the solution only. Following that, the user doesn’t have to spend time figuring out whether to allow permission or not.”

Ultimately the SMU team hopes to push app developers to be more aware of the need to protect consumers’ privacy.

“Hopefully the solution will raise the bar in terms of what are the permissions really needed for certain functionalities. When enough users complain about apps’ attempts to gain access to their private data without providing user-perceivable clues and when enough blocks are applied to the apps due to illegitimate requests to private data, app developers will be pressured to improve privacy protection in their app.”

The project is supported by the National Research Foundation Singapore, under SMU’s National Satellite of Excellence in Mobile Systems Security and Cloud Security (NSoE MSS-CS). 

Hard at work since October 2019, the team of five plans to conduct a small-scale user trial in mid-2021 to evaluate how accurately the APM solution adjudicates apps' requests for access to data.

The team is excited about what is to come, and they have cause to be. An empirical study has shown promising results: a number of the classification models on which the solution is based have been able to differentiate malicious apps from normal apps accurately.
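How accurately such classifiers separate malicious from normal apps is conventionally summarised with precision and recall; the counts in the sketch below are placeholders, not figures from the SMU study.

```kotlin
// Standard way to summarise how well a classifier separates malicious from normal apps.
fun main() {
    val truePositives = 90   // malicious apps correctly flagged (placeholder count)
    val falsePositives = 5   // normal apps wrongly flagged (placeholder count)
    val falseNegatives = 10  // malicious apps missed (placeholder count)

    val precision = truePositives.toDouble() / (truePositives + falsePositives)
    val recall = truePositives.toDouble() / (truePositives + falseNegatives)

    println("precision = %.2f, recall = %.2f".format(precision, recall))
}
```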

“That means our auto privacy model idea can work on a larger scale,” states Professor Jiang.

The team hopes to eventually offer the solution as a free app on Google Play for deployment on Android systems.
