New Samsung smartphone showing security leak allowing signing of Android malware apps

Major Android Security Leak: Manufacturer Signing Keys Used To Validate Malware Apps

A security leak involving platform signing keys from major device manufacturers (such as LG and Samsung) has created a path for malware apps to reach user devices in the guise of legitimate updates.

These malware apps can give an attacker full access to an Android device, as the operating system grants complete system-level access to any app signed with one of these keys. The attack would not necessarily require the end user to download a new app; malware could be delivered as an update to an existing app on the device. It would not matter whether the app had originally been installed via the Play Store, a manufacturer-specific outlet such as the Galaxy Store, or had been independently sideloaded onto the device.
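
To make that trust relationship concrete: an app signed with a manufacturer's platform certificate can share the operating system's own identity (the system UID). Below is a minimal Kotlin sketch, assuming an on-device Android context and an illustrative package name, of how one might check whether an installed package is signed with the same certificate as the OS itself (the "android" package) or is already running under the system UID; the function and variable names are ours, not from any vendor tooling.

```kotlin
import android.content.pm.PackageManager
import android.os.Process

// Sketch: two quick on-device checks for a suspicious installed package.
// 1) Is it signed with the same certificate as the OS (the "android" package)?
//    checkSignatures() is deprecated in newer SDKs but still functional.
// 2) Has it been assigned the shared system UID (uid 1000)?
fun looksPlatformPrivileged(pm: PackageManager, pkg: String): Boolean {
    val signedWithPlatformCert =
        pm.checkSignatures(pkg, "android") == PackageManager.SIGNATURE_MATCH
    val runsAsSystemUid =
        pm.getApplicationInfo(pkg, 0).uid == Process.SYSTEM_UID
    return signedWithPlatformCert || runsAsSystemUid
}
```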

Security leak involves keys from several manufacturers

The security leak was revealed by Google, which did not name the manufacturers involved. However, independent researchers were able to identify some of the affected companies from subsequent listings on VirusTotal: Samsung, LG, MediaTek (one of the world’s largest chip designers), RevoView (a manufacturer of network devices and cameras), and SZROCO (manufacturer of the Walmart-exclusive Onn budget tablet line).

Though Google has only recently revealed the security leak to the public, it says that Samsung, LG and all of the other known impacted companies had remediated the issue as of May 2022. However, third-party Android app archiving site APKMirror reports that malware apps signed with Samsung’s keys have been uploaded very recently. Another concerning aspect is that VirusTotal lists some exploits involving signed malware apps dating back as far as 2016.

Google says that Android offers several layers of protection against these malware apps, including active scanning by the Google Play Protect service and “mitigation measures” implemented independently by each device manufacturer. The potential damage is also minimized if manufacturers regularly rotate the keys that they use, though there is no real way for the general public to know if this is being done. Google also says that it has not detected any of these signed malware apps available on the Google Play Store. Given all of this, the greatest risk appears to come from sideloaded apps downloaded from an independent site such as APKMirror.
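
For readers who want to confirm that Play Protect scanning is actually active on a device, here is a hedged Kotlin sketch using the SafetyNet "Verify Apps" client from Google Play services; the dependency (play-services-safetynet) and response fields are assumptions based on the published SafetyNet client, and Google has been steering developers toward newer integrity APIs, so treat this as illustrative only.

```kotlin
import android.content.Context
import android.util.Log
import com.google.android.gms.safetynet.SafetyNet

// Sketch: query Google Play Protect ("Verify Apps") via the SafetyNet client.
// Assumes the play-services-safetynet dependency is available to the app.
fun checkPlayProtect(context: Context) {
    val client = SafetyNet.getClient(context)

    // Is Play Protect's app verification turned on for this device?
    client.isVerifyAppsEnabled()
        .addOnSuccessListener { result ->
            Log.i("PlayProtect", "Verify Apps enabled: ${result.isVerifyAppsEnabled}")
        }
        .addOnFailureListener { e -> Log.w("PlayProtect", "Status check failed", e) }

    // Ask Play Protect for any installed apps it has flagged as harmful.
    client.listHarmfulApps()
        .addOnSuccessListener { result ->
            result.harmfulAppsList.forEach { app ->
                Log.w("PlayProtect", "Flagged: ${app.apkPackageName} (category ${app.apkCategory})")
            }
        }
        .addOnFailureListener { e -> Log.w("PlayProtect", "Harmful-app scan failed", e) }
}
```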

The involved manufacturers say that they have remediated the issue within their own environments, but it is impossible to know whether other manufacturers were impacted (and what their current status might be). Ivan Wallis, Global Architect at Venafi, notes that it is crucial for any manufacturer with these signing keys to take immediate action: “This is a great example that showcases the lack of proper security controls over code signing certificates, in particular the signing keys for the Android platform. These certificate leaks are exactly related to this, where these vendor certificates made it into the wild, allowing for the opportunity for misuse and the potential to sign malicious Android applications masquerading as certain ‘vendors’. Bad actors can essentially gain the same permissions as the core service. The lack of the who/what/where/when around code signing makes it difficult to know the impact of a breach, because that private key could be anywhere. At this point it must be considered a full compromise of the code signing environment and key/certificate rotation must happen immediately.”

Malware apps spotted amidst downloads available via sideloading sites

Though these signing keys are sometimes used to sign manufacturer apps, their primary purpose is to verify the legitimacy and status of the version of Android a device is running. That is how malware apps can end up with complete administrative access over a device when one of these keys is abused. Manufacturers are supposed to carefully secure their certificates, but as Samsung’s recent streak of cybersecurity woes demonstrates, this is hardly a foolproof system.
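
As a rough illustration of how a researcher might audit a device, the Kotlin sketch below (API 28+, helper names are ours) computes the SHA-256 fingerprints of each installed package's signing certificates and flags packages that share a certificate with the "android" package, i.e. the device's platform certificate; the same fingerprints could be compared against digests published on VirusTotal. Note that on Android 11+ package visibility rules may limit which packages are returned without extra permissions.

```kotlin
import android.content.pm.PackageManager
import java.security.MessageDigest

// Sketch (API 28+): SHA-256 fingerprints of a package's signing certificates.
fun signerFingerprints(pm: PackageManager, packageName: String): Set<String> {
    val info = pm.getPackageInfo(packageName, PackageManager.GET_SIGNING_CERTIFICATES)
    val signers = info.signingInfo?.apkContentsSigners ?: return emptySet()
    val sha256 = MessageDigest.getInstance("SHA-256")
    return signers
        .map { sig -> sha256.digest(sig.toByteArray()).joinToString("") { "%02x".format(it) } }
        .toSet()
}

// Flag installed packages signed with the same certificate as the OS build itself
// (the "android" package is signed with the device's platform certificate).
fun platformSignedPackages(pm: PackageManager): List<String> {
    val platformCerts = signerFingerprints(pm, "android")
    return pm.getInstalledPackages(0)
        .map { it.packageName }
        .filter { pkg -> signerFingerprints(pm, pkg).any { it in platformCerts } }
}
```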

A malware app usually has to trick the user into granting it elevated permissions in some way, such as by clicking on a file, link or prompt. This security leak is more severe than usual because simply installing a malicious app update could fully compromise the device. Google recommends that manufacturers limit the use of these keys to as few apps as possible, but some (such as Samsung) use them in hundreds, including highly sensitive apps like Samsung Pay and Samsung Account. A threat actor would need access to a manufacturer’s internal network or app store to slip a malicious update into official channels, making that route very unlikely, but users may want to avoid direct downloads or updates from Samsung and LG until the impacted companies can verify they have remediated the issue.

The security leak could grow to include more manufacturer keys; the best source of information going forward is likely VirusTotal, as Google seems to have decided against naming the involved parties. Android devices that exclusively get their apps from the Play Store are likely insulated against these malware apps, however. Manufacturers can negate the exploit by rotating the affected signing keys, but this is more difficult for apps signed under the older “v2” APK signature scheme, where replacement keys must be shipped as part of a device security update; the newer “v3” scheme allows keys to be rotated on the fly. Newer devices that still receive security updates are thus better protected against this exploit as well.
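
For sideloaders who want to inspect an APK before installing it, the sketch below uses Google's apksig library (the library behind the apksigner tool) to report which signature scheme versions verify and to print each signer certificate's SHA-256 digest for comparison against fingerprints circulating on VirusTotal. Treat the dependency coordinate noted in the comment and the command-line file path as assumptions.

```kotlin
import com.android.apksig.ApkVerifier
import java.io.File
import java.security.MessageDigest

// Sketch: off-device APK inspection with the apksig library
// (Gradle coordinate assumed to be "com.android.tools.build:apksig").
fun main(args: Array<String>) {
    val result = ApkVerifier.Builder(File(args[0])).build().verify()

    println("Signature verifies: ${result.isVerified}")
    println("Uses v2 scheme:     ${result.isVerifiedUsingV2Scheme}")
    println("Uses v3 scheme (supports key rotation): ${result.isVerifiedUsingV3Scheme}")

    // SHA-256 of each signer certificate, for comparison against published digests.
    val sha256 = MessageDigest.getInstance("SHA-256")
    result.signerCertificates.forEach { cert ->
        val digest = sha256.digest(cert.encoded).joinToString("") { "%02x".format(it) }
        println("Signer cert SHA-256: $digest")
    }
}
```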

Tony Hadfield, Sr. Director of Solutions Architects at Venafi, suggests that better internal documentation of how signing keys are handled is necessary to head off future security leaks of this nature: “This is a great example of what happens when organizations sign code without a plan to manage code signing keys. If the keys fall into the hands of an attacker, it can lead to catastrophic breaches. The only way to prevent this kind of problem is to have an auditable, ‘who/what/where’ solution: how do you control signing keys, where are they stored, who has access to them, and which kind of code gets signed? You need this information to protect your keys and also respond quickly to a breach by rotating your public and private keys.”