Google's smart algorithms to make Android apps less intrusive
Machine learning will help Google identify potential privacy breaches in Android apps.
If you use a smartphone running Android 6.0 Marshmallow or later, you will have noticed that new apps ask for permissions at runtime. Google added this feature to give users more control over apps: you decide which parts of the device an app can access, which makes for a safer ecosystem.
Over time, however, app developers have cleverly made it necessary for users to grant access to almost every part of the device in order to serve their own interests. This has flooded the Android ecosystem with bloatware and largely undermines Google's attempt to keep the platform clean.
Google, though, hasn't given up: it is now using artificial intelligence to deal with rogue apps that demand permission to every parameter of your smartphone, drawing on its machine learning expertise to single out the malicious ones.
A new algorithm creates clusters of mobile apps with similar capabilities, sorting them into peer groups based on metadata such as the app description and number of installs. These peer groups are then used to identify anomalous, potentially harmful signals related to privacy and security in each app's requested permissions and its observed behaviours.
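The peer-group idea can be illustrated with a minimal sketch: group apps by a shared attribute, then flag any permission an app requests that few of its peers request. This is a hypothetical toy example, not Google's actual implementation; the app names, categories, permissions, and the 50% rarity threshold below are all made up for illustration.

```python
# Hypothetical sketch of peer-group permission anomaly detection.
# All app data and the rarity threshold are illustrative assumptions,
# not Google's real algorithm or data.
from collections import defaultdict

apps = [
    {"name": "FlashlightA", "category": "tools",
     "permissions": {"CAMERA"}},
    {"name": "FlashlightB", "category": "tools",
     "permissions": {"CAMERA"}},
    {"name": "FlashlightC", "category": "tools",
     "permissions": {"CAMERA", "READ_CONTACTS", "ACCESS_FINE_LOCATION"}},
    {"name": "MessengerA", "category": "social",
     "permissions": {"READ_CONTACTS", "CAMERA"}},
    {"name": "MessengerB", "category": "social",
     "permissions": {"READ_CONTACTS"}},
]

def flag_anomalies(apps, rarity_threshold=0.5):
    """Flag permissions an app requests that fewer than
    `rarity_threshold` of its peer group (same category) also request."""
    # Count, within each peer group, how many apps request each permission.
    group_sizes = defaultdict(int)
    perm_counts = defaultdict(lambda: defaultdict(int))
    for app in apps:
        group_sizes[app["category"]] += 1
        for perm in app["permissions"]:
            perm_counts[app["category"]][perm] += 1

    # An app is flagged for every permission that is rare among its peers.
    flags = {}
    for app in apps:
        group = app["category"]
        rare = {p for p in app["permissions"]
                if perm_counts[group][p] / group_sizes[group] < rarity_threshold}
        if rare:
            flags[app["name"]] = rare
    return flags

print(flag_anomalies(apps))
# FlashlightC stands out: its peers (flashlight-style tools) don't
# request contacts or precise location, so those two permissions are flagged.
```

A real system would of course build peer groups from richer metadata (descriptions, install counts) and score observed runtime behaviour too, but the core intuition is the same: what is normal for a messenger is suspicious for a flashlight.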
The correlation between different peer groups and their security signals helps teams at Google decide which apps to promote and which deserve a closer look from security and privacy experts. Google will also use the data to help interested developers improve the privacy and security of their apps.
It is satisfying to see Google putting its best technology to work on privacy issues. Machine learning will go a long way towards creating an Android ecosystem that's safe and sound.