We recently discovered that Android devices from multiple major brands sign APEX modules—updatable units of highly-privileged OS code—using private keys from Android’s public source repository. Anyone can forge an APEX update for such a device to gain near-total control over it. Rather than negligence by any particular manufacturer (OEM), we believe that unsafe defaults, poor documentation, and incomplete CTS coverage in the Android Open Source Project (AOSP) were the main causes of this issue.

Google assigned the issue CVE-2023-45779, and most affected OEMs have now fixed it. Any device that ships with the Play Store and advertises a Security Patch Level (SPL) of 2023-12-05 or later is no longer vulnerable.


APEX modules allow OEMs to update certain files in an OS image without issuing a full OTA. To do so, each updatable unit (e.g. Bionic or ART) lives in its own ext4 filesystem image inside an .apex ZIP file, which gets mounted under /apex. An initial version of each APEX is preinstalled in /system/apex (or /vendor/apex, etc.) during the OS build, but those versions can be superseded by updates installed later in /data/apex.
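The container layout described above can be inspected with ordinary ZIP tooling. As a minimal Python sketch (the helper name is ours; the entry names are the standard ones found inside .apex files, such as apex_payload.img and the bundled AVB public key apex_pubkey):

```python
import zipfile

# Typical entries inside an .apex container (which is a plain ZIP archive):
#   apex_manifest.pb   - APEX name and version
#   apex_payload.img   - ext4 filesystem image, AVB-signed
#   apex_pubkey        - AVB public key the payload must verify against
#   META-INF/...       - standard APK signature over the container
def list_apex_entries(apex):
    """Return the entry names of an .apex container.

    `apex` may be a filesystem path or a file-like object, as accepted
    by zipfile.ZipFile. (Illustrative helper, not part of any Android API.)
    """
    with zipfile.ZipFile(apex) as z:
        return z.namelist()
```

Running this against a real .apex file shows the payload image and key sitting alongside the APK-style signature metadata.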

To ensure APEX updates are trustworthy, Android checks that each one is signed with the same keys as the preinstalled version of that APEX. APEXes carry both a standard APK signature and an AVB signature on their interior filesystem, and both are checked in this way. So to create a valid APEX update, one must possess both the APK and AVB private keys that were used to sign that APEX when the OS was built.
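The "same keys" requirement can be roughly illustrated by comparing the AVB public key bundled in an update against the one in the preinstalled APEX. This is only a sketch of the key-matching idea under the assumption that both files carry the standard apex_pubkey entry; the platform's real verification checks the APK signature and the AVB hashtree of the payload, not mere key equality:

```python
import zipfile

def same_avb_pubkey(preinstalled_apex, update_apex):
    """Compare the bundled AVB public keys of two .apex files.

    Illustrative only: real verification (in the platform) validates the
    signatures themselves rather than just comparing the bundled keys.
    Arguments may be paths or file-like objects.
    """
    def pubkey(apex):
        with zipfile.ZipFile(apex) as z:
            return z.read("apex_pubkey")
    return pubkey(preinstalled_apex) == pubkey(update_apex)
```

If the keys differ, an update could never verify against the preinstalled APEX; if they match AOSP's public test keys, anyone can produce a verifying update.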

When it comes to signatures, though, an “OS build” isn’t just one step. Android’s core build system signs every APEX, APK, and OTA image it produces with a fixed set of “test keys”, and there’s no way to change that. Test keys are public in AOSP’s source tree: for example, this is the test key that signs the filesystem of the com.android.art APEX.

As described in AOSP’s documentation, the job of re-signing a build with OEM-held “release keys” falls to a Python script called sign_target_files_apks, which unpacks a built image, replaces all the signatures, and repacks it. Re-signing as a separate step has several benefits, but it also introduces the risk that not every test signature will get replaced. And that’s exactly what seems to have happened for several OEMs.

Vulnerability details

We analyzed OS images of recent Android devices from 14 reputable brands (listed below) and found that seven of those devices contained at least one preinstalled APEX signed only with AOSP test keys, for which anyone can produce an update.

Every vulnerable device we found had one highly-privileged vulnerable APEX in common—com.android.vndk. This APEX holds shared libraries that HALs in the /vendor partition link against. Thanks to the existence of Same-Process HALs, those libraries get transitively loaded into most processes on an Android system, including:

  • zygote64, from which System Server and every app process is forked;
  • surfaceflinger, through which all screen contents pass;
  • and of course all the HALs.

We demonstrated code execution in all of the above via a malicious com.android.vndk.v31 APEX update for a vulnerable device, a Lenovo Tab M10 Plus (Gen 3, Wi-Fi) running Android 13. You can find our proof-of-concept, as well as a script to check for vulnerable APEXes, here.

As an aside, com.android.vndk is somewhat odd because there are multiple copies of it—one for each Android API level /vendor can target, of which there are several thanks to Project Treble. Each copy has a different APEX name (e.g. com.android.vndk.v33 for Android 13), and it’s only useful to exploit the one /vendor actually uses. On all the devices we tested, every copy was equally vulnerable.

Attack scenarios

Fortunately, Android tightly controls who can install APEX updates. Although APEXes look like APKs and are installed via PackageManager, the INSTALL_APEX flag is restricted to packages that hold android.permission.INSTALL_PACKAGES or android.permission.INSTALL_PACKAGE_UPDATES, both of which are signature|privileged permissions and cannot be obtained by third-party apps. And even packages holding one of those permissions must additionally do one of the following:

  • run as the system user
  • run as the shell user
  • be designated as “module installer” in /{system,vendor,product,odm}/etc/sysconfig/

In practice, that limits the exploitability to four attack scenarios:

  1. A user uses adb shell to exploit their own device, gaining nearly everything a typical “root” would get them. Since access is distributed across many SELinux contexts, root-aware tools and apps won’t work unmodified. On the other hand, root detections won’t trip: to them, the malicious APEX will be indistinguishable from legitimate OS code.
  2. A malicious actor with physical access to an unlocked device uses adb shell to install persistent malware without the user’s knowledge and gains long-term access to all data and activity on the device. The malware will likely go undetected by on-device scanners for the same reasons a conventional root would.
  3. A malicious actor chains this exploit to one that gets them code execution in com.android.vending (the “module installer” on all Project Mainline devices) or a system UID app to escalate their privileges and gain persistence.
  4. A malicious actor who gains access to whatever Google Play backend serves APEX updates remotely exploits devices en masse. We do not know the details of Project Mainline’s infrastructure so cannot assess how feasible this is. For example, it becomes far more plausible if OEMs are allowed to upload APEX updates to Google Play than if only Google is, as there are more credentials for an attacker to compromise. In that case, depending on the specifics of update targeting, a malicious OEM could potentially even exploit other OEMs’ devices.

Root cause

Why did so many OEMs make the exact same mistake? Recall that AOSP comes with instructions to re-sign a build, a script to perform that re-signing, and—as a last line of defense—its extensive Compatibility Test Suite, which enforces various compatibility and security guarantees. Shouldn’t one of those have warned OEMs of vulnerable APEXes?

In answering that question, we uncovered a number of deficiencies in AOSP that, together, make it far easier to create a vulnerable build than a secure one.

Incomplete CTS coverage

Most critically, we found that several APEXes are not checked by CTS at all. Two CTS tests look for test keys: PackageSignatureTest checks both APKs and APEXes for insecure APK signatures, while ApexSignatureVerificationTest checks APEXes for insecure AVB signatures. But both tests hardcode lists of test keys, which over time have diverged from those actually in use. As a result, several vulnerable APEXes are not caught by either test.

The nature of APEXes makes such divergence inevitable: unlike APKs, which share just a few test keys, APEXes all have different test keys, meaning each new APEX is a new opportunity for divergence. In our view, the only fix is for CTS to check signatures against the source of truth—the build system. In fact, the build system already records which test keys it uses to a file called apexkeys.txt, which plays a part in…

Unsafe defaults

sign_target_files_apks, which re-signs a build, doesn’t guarantee replacement of every test signature. On the contrary, it doesn’t replace any test signatures by default! Run without arguments, it signs each APEX and APK using the very same test keys the build system did, which it finds by parsing apexkeys.txt and apkcerts.txt respectively. We assume this default was intended as a starting state for arguments like --key_mapping, but we’re unsure why the absence of such arguments doesn’t result in at least a warning.
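Since apexkeys.txt records exactly which keys the build used, a build can in principle be audited for leftover test keys directly. A minimal sketch of that idea—the name="value" line format matches what the build system emits, but the path heuristics below are our assumptions and would need adapting to a given source tree:

```python
import re

# apexkeys.txt records, per APEX, the keys the build signed it with, as
# lines of name="value" pairs. This heuristic flags entries whose key
# paths still point at well-known AOSP test-key locations (an assumption;
# adjust the hints for your own tree).
LINE_RE = re.compile(r'(\w+)="([^"]*)"')
AOSP_TEST_KEY_HINTS = ("testkey", "build/make/target/product/security")

def find_test_signed(apexkeys_text):
    """Return the names of APEXes whose recorded keys look like test keys."""
    flagged = []
    for line in apexkeys_text.splitlines():
        fields = dict(LINE_RE.findall(line))
        if not fields:
            continue
        key_paths = " ".join(v for k, v in fields.items() if k != "name")
        if any(hint in key_paths for hint in AOSP_TEST_KEY_HINTS):
            flagged.append(fields.get("name", "<unknown>"))
    return flagged
```

A check like this, run against the re-signed output rather than a hardcoded key list, would not suffer the divergence problem described above.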

Here too, a risk that was low for APKs—forgetting to specify a test key’s replacement—became a near certainty once APEXes appeared. Because each APEX has its own test keys, each must be mapped to a release key individually. And although enumerating every APEX in Android is clearly error-prone—especially as new Android versions regularly add and remove APEXes—that’s exactly what AOSP’s documentation instructs OEMs to do. And speaking of that documentation…

Poor documentation

AOSP’s “Sign builds for release” article begins by declaring that Android uses signatures in “two places”, APKs and OTA updates. There is no mention of APEX signatures in the introduction, nor anywhere prior to a section titled “Advanced signing options”, which gives the guidance above.

Furthermore, neither the names nor the locations of APEX test keys themselves (e.g. this one) make it clear that they’re test keys. In contrast, APK test keys all live in a single directory alongside a notice that they should “NEVER be used to sign packages in publicly released images”.

Without documentation to the contrary, an OEM might believe that APEX re-signing is optional or unimportant until CTS failures arise. They then might address those failures and think nothing more of the matter, confident that CTS and sign_target_files_apks know what needs signing. And who could fault them for that?

Other factors

Although far less important than the three issues above, these two details may have also misled OEMs:

  1. In Android.bp files, some APEXes, including com.android.vndk, are marked updatable: false. That might lead OEMs to believe that such APEXes cannot be updated and so don’t need secure keys, but in fact all it means is that the build system will not “enforce additional rules for making sure that the APEX is truly updatable”.
  2. com.android.vndk is not in Google’s documented list of APEXes. Additionally, the AVB test key files for com.android.vndk end in .pubkey rather than the more common .avbpubkey, which could cause naive enumeration strategies to miss them.


We reported our findings privately to Google on September 19th, 2023. Google acknowledged our report immediately, and the Android Security Team confirmed it as valid within a week. On September 25th, Google issued Partner Security Advisory 2023-11, advising its OEM partners of the issue and how to fix it. Around the same time, Google individually contacted each affected OEM we identified.

To ensure that OEMs re-signed vulnerable APEXes, Google added a test to their proprietary Build Test Suite (BTS), through which all updates to Play Protect certified devices pass. The test warned of vulnerable APEXes starting November 1st and, starting December 4th, rejected any build that contained one and claimed a December 2023 patch level or higher.

Google has also fixed the deficiencies we identified in CTS. We have not seen those fixes, which won’t be made public until the release of Android V. Nonetheless, we believe the vast majority of real-world risk is now gone: OEMs have had ample time to patch their devices since Google’s initial advisory, and a spot check we performed on January 25th revealed that most have done so.

CVE-2023-45779 was made public by Google on December 4th but contained no details. This post and our accompanying disclosure are the first public descriptions of the issue. Google plans to update the CVE with more detail after we publish this post.

Tested devices

Entry format

  • Device name
    Where we obtained the OS image
    Vulnerable APEXes, if any


NOTE: On every device we checked, either all versions of the VNDK APEX were vulnerable or none of them were. For brevity, we’ve only listed the VNDK version that’s actually used by each vulnerable device.

  • Asus Zenfone 9
    Official ASUS OTA ZIP
    com.android.vndk.v32, com.android.uwb, com.android.wifi
  • vivo X90 Pro
    dumps.tadiphone.dev, vivo/v2219
    com.android.vndk.v33, com.android.rkpd, com.android.uwb, com.android.virt, com.android.wifi
  • Nokia G50
    dumps.tadiphone.dev, nokia/phr_sprout
  • Microsoft Surface Duo 2
    Official Microsoft OTA ZIP
    com.android.vndk.v30, com.android.appsearch, com.android.wifi
  • Lenovo Tab M10 Plus (Gen 3, Wi-Fi)
    Physical device
    com.android.vndk.v31, com.android.uwb, com.android.wifi
  • Nothing Phone 2
    dumps.tadiphone.dev, Nothing/Pong
    com.android.vndk.v32, com.android.uwb, com.android.wifi
  • Fairphone 5
    NOTE: Fairphone did their own investigation in response to our report and discovered that Fairphone 3, 3+, and 4 were also vulnerable. You can read their statement here.
    dumps.tadiphone.dev, fairphone/fp5
    com.android.vndk.v30, com.android.uwb, com.android.wifi

Not vulnerable

  • Google Pixel 5
    Physical device
  • Samsung Galaxy S23
    Official Samsung OTA ZIP, fetched with samloader
  • Xiaomi Redmi Note 12 4G
    Official Xiaomi OTA ZIP
  • OPPO Find X6 Pro
    dumps.tadiphone.dev, oppo/op528bl1
  • Sony Xperia 1 V
    dumps.tadiphone.dev, sony/pdx234
  • moto razr 40 Ultra
    dumps.tadiphone.dev, motorola/zeekr
  • OnePlus 10T
    dumps.tadiphone.dev, oneplus/op5552l1

Appendix: disclosure timeline

  • September 6th, 2023: We notice that an Android device we use for testing has APEXes signed with test keys, which prompts us to check other devices.
  • September 13th, 2023: We complete our survey and conclude that the issue is widespread enough that Google should coordinate the response.
  • September 19th, 2023: We report our findings to Google, who passes them to the Android Security Team.
  • September 25th, 2023: Google releases an Android Partner Security Advisory to their OEM partners detailing the issue. The next day, they respond to us, rating the issue High Severity.
  • September 28th, 2023: Google informs us of the Partner Advisory and indicates they’ve contacted affected OEMs directly.
  • October 18th, 2023: Google updates the Partner Security Advisory with details on remediation, stating that the issue will be part of the December 4th Android Security Bulletin and detailing the BTS enforcement schedule.
  • October 26th, 2023: We ask Google if the December ASB will contain enough details to warrant simultaneous release of this post, even if most OEMs haven’t released a fix. Google replies that the bulletin text won’t contain “specific technical information” but that they do “consider [the issue] publicly disclosed” at that point.
  • November 1st, 2023: BTS purportedly begins warning OEMs when a build has vulnerable APEXes.
  • November 6th, 2023: We notify affected OEMs that we’ll name them in this post on December 4th, as we believe the ASB will include CTS patches indicating APEXes have been signed with test keys. We offer to publish statements from them. We receive an automated acknowledgement from Nothing, and Nokia Corporation tells us they’ve passed the email to HMD Global, who makes Nokia-branded phones but “don’t have [a] similar responsible disclosure program”.
  • November 7th, 2023: Google updates the Partner Security Advisory to add the CVE number and a note that only “builds … claiming the 2023-12-05 SPL or higher” will be subject to BTS enforcement on December 4th.
  • November 13th, 2023: Lenovo confirms receipt of our email and says they don’t yet have a statement.
  • Week of November 13th: Multiple OEMs create keys to re-sign their vulnerable APEXes, as evidenced by metadata in the updates they subsequently released.
  • November 15th, 2023: Google asks permission to share our detection tooling with OEMs who want an offline way to find vulnerable APEXes. We grant it.
  • November 15th, 2023: We ask Google explicitly if the December ASB will include CTS patches, as our plan to disclose on December 4th relies on that assumption.
  • November 21st, 2023: Google sends us a generic update which formally shares the CVE ID and states in part that they “will be releasing a patch for this issue in an upcoming bulletin”.
  • November 27th, 2023: We notice that the December ASB partner preview, which Meta has access to but most researchers don’t, contains no APEX-related CTS patches. We ask Google to confirm that the ASB will expose details of the issue via a patch. Google replies that CTS patches actually won’t become public until Android V and that they support us giving OEMs more time.
  • November 28th, 2023: Lenovo asks us if we plan to disclose on December 4th, like our notice claimed, or on December 18th (90 days from our initial report), like Google’s Partner Advisory claimed. We reply that it’ll be the 4th, but that we’re considering postponement given the new information from Google.
  • December 1st, 2023: Lenovo follows up on their question. We opt to officially postpone disclosure until January 30th, 2024 and notify all OEMs of the change. At this point, to our knowledge, no OEM had yet released a fix.
  • December 4th, 2023: The December ASB comes out. As promised, it contains no details an attacker could use to discern the issue.
  • December 7th, 2023: Fairphone thanks us for the postponement, says they intend to provide a statement, and asks permission to credit us in their own disclosure. We accept, and a couple weeks later we exchange statements and links where our respective disclosures will appear.
  • January 16th, 2024: Google offers us a $7,000 bounty for our report, which we ask them on January 25th to donate to charity. (Google, like Meta, doubles bounties paid to charity.)
  • January 25th, 2024: Lenovo asks for confirmation that January 30th is still our planned disclosure date, which we give.
  • January 30th, 2024: This post, our disclosure, our PoC code, and Fairphone’s post all go live.