Summary: Make active use of the secure storage APIs each operating system provides. On Apple platforms such as iOS and macOS there is Keychain; on Android, Keystore; and on Windows, CNG (Cryptography Next Generation). But once you dig into the details, there are still trade-offs and design considerations.
How should we protect information?
Summary: If you want to prevent user data from leaking, simply storing a key in a file is not enough.
Anyone who has done software development has probably had thoughts like these at least once:
- “How should I store authentication tokens?”
- “Where should I store user information that must not be exposed?”
These kinds of development requirements can be summarized as follows:
- Reject unauthorized reads — or more broadly, unauthorized access itself, including modification.
- Detect unauthorized modifications if they occur.
The most obvious answer is encryption. In fact, encryption exists precisely to achieve goals like these. An outside attacker who does not know the key cannot derive any meaningful information even if they read the data, and, provided an authenticated encryption mode is used, any tampering with the data is detected because decryption fails. So if the key itself can be stored safely, important information can be protected from attackers.
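To make the two goals concrete, here is a minimal Python sketch of encrypt-then-MAC using only the standard library. The hash-based keystream is for illustration only; real code should use a vetted authenticated cipher such as AES-GCM through a platform API or library:

```python
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key + nonce + counter (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    """Encrypt for confidentiality, then append an HMAC tag for integrity."""
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in
               zip(plaintext, _keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def decrypt(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    """Verify the tag before decrypting; any modification makes this fail."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("ciphertext was modified or the key is wrong")
    return bytes(a ^ b for a, b in zip(ct, _keystream(enc_key, nonce, len(ct))))
```

Note that this sketch still assumes `enc_key` and `mac_key` are stored somewhere safe, which is exactly the problem the rest of this article is about.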
And that is exactly where the problem begins:
How do you store the key safely?
If the key only ever lives in memory, the problem is relatively simple. Most operating systems, with hardware support, isolate each process's memory from other processes. Of course, applications sometimes need additional protections on top of that, but implementing such mechanisms yourself usually ends up being more trouble than it is worth, and to place them under truly complete control you would practically have to write code that operates below the application layer, close to the OS itself. In ordinary cases, trusting the operating system's memory protection is good enough.
But now think about things like authentication tokens mentioned earlier. What if the user had to log in again every single time the application launched? That might make the engineer’s life easier, but users would hate it. And from a business perspective, that would be a serious problem.
Then what about keeping the application running all the time so that memory contents are never lost? You cannot force desktop users to keep their computers on forever, and mobile operating systems generally do not allow this kind of behavior anyway. From the OS’s point of view, such an app is basically a memory-hogging tumor. And do not forget: preserving memory contents usually requires continuous power consumption. So this is not a realistic solution.
A more realistic option is to store such information in long-term storage that cannot be casually accessed. But that leads to another question: Where exactly do you find such a place?
Long live Secure Storage
Summary: Security standards backed by hardware features are sufficiently safe. For ease of explanation, I’ll use Apple’s implementation as the example.
Fortunately, Android and Apple operating systems provide standards for exactly this purpose. On Android, it is called Keystore; on Apple platforms, it is called Keychain.
In Android’s case, the operating system provides the standard, but the actual hardware-specific implementation differs from vendor to vendor. So let’s start with Apple, which controls both hardware and software.
Apple’s key security model is implemented through the following:
- Hardware security integrity verification: represented by processor execution-code verification and the Secure Enclave
- Memory security of the operating system itself
- File permission model
- Application signing
- Cryptographic acceleration
The Keychain API provided by Apple is essentially a database dedicated to small secrets such as keys, passwords, and tokens. You might wonder where this information is actually stored. Apple does not disclose every detail, but in outline: the Secure Enclave has a small amount of protected non-volatile state, and since not all sensitive information can fit there, the Keychain database itself lives encrypted on normal device storage. The keys that encrypt it, however, are rooted in the Secure Enclave, so the data cannot be decrypted without going through it. In addition, access to this database is restricted by both application and user scope: unless the request comes from the designated app and the designated user, reading or modifying a stored item is fundamentally impossible. In other words, this data lives on a completely different layer from the ordinary files users deal with.
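To make the scoping rule concrete, here is a deliberately tiny Python model of a key database whose items are addressable only by the (app, user) pair that stored them. This is a toy illustration of the access rule, not Apple's implementation; all names are invented:

```python
class ScopedKeyStore:
    """Toy model of keychain-style access control (illustration, not real)."""

    def __init__(self) -> None:
        # Items are keyed by (app_id, user_id, item_name).
        self._items: dict[tuple[str, str, str], bytes] = {}

    def put(self, app_id: str, user_id: str, name: str, secret: bytes) -> None:
        self._items[(app_id, user_id, name)] = secret

    def get(self, app_id: str, user_id: str, name: str) -> bytes:
        try:
            return self._items[(app_id, user_id, name)]
        except KeyError:
            # A caller with the wrong app or user scope learns nothing,
            # not even whether the item exists.
            raise PermissionError("no such item for this app/user scope") from None
```

The point of the model: the identity of the caller is part of the lookup key itself, so there is no code path by which another app or user can even address someone else's item.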
So what happens if someone tries to access that hardware in an unauthorized way? Naturally, once the allowed number of attempts is exceeded, the storage locks down completely. Even if someone tries brute-forcing the Secure Enclave, entering the wrong key too many times triggers a hardware-level kill switch. Unless you have hands with nanometer-level precision, you should give up on that fantasy. Samsung Knox implements similar behavior as well. The only way to undo this is to initialize the Secure Enclave, and in that process, all stored data is wiped out. It is like the data saying, “If I’m going to be looted, I’d rather die with everyone else.”
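The lockout behavior can be sketched as a failed-attempt counter plus a one-way switch that only a full wipe resets. Again a toy Python model, not real firmware; the attempt limit and names are invented:

```python
class LockingStore:
    """Toy model of a try-limit kill switch with wipe-on-reset."""

    MAX_ATTEMPTS = 10  # illustrative limit, not any vendor's real number

    def __init__(self, secret: bytes, passcode: str) -> None:
        self._secret = secret
        self._passcode = passcode
        self._failed = 0
        self.locked = False

    def unlock(self, passcode: str) -> bytes:
        if self.locked:
            raise PermissionError("store is locked; only a full reset recovers it")
        if passcode != self._passcode:
            self._failed += 1
            if self._failed >= self.MAX_ATTEMPTS:
                self.locked = True  # the "kill switch": no more attempts, ever
            raise ValueError("wrong passcode")
        self._failed = 0
        return self._secret

    def factory_reset(self) -> None:
        # The only way back: everything stored is destroyed.
        self._secret = b""
        self._failed = 0
        self.locked = False
```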
You might wonder whether it is possible to disassemble the device, mix and match parts, reassemble it, and attack it that way. Wake up from that dream too. The Secure Enclave was designed with this in mind and performs hardware integrity checks: put simply, if the combination of hardware identifiers is not one authorized by Apple, it refuses to operate. You may have seen warning messages after unofficial repair shops replace parts in an iPhone; if components critical to the security system are among them, the entire device can stop functioning. In the end, if the Secure Enclave is locked, the only remaining option is to wipe all storage. Even then, Activation Lock state such as iCloud Lost Mode is not erased, so if the device was stolen, the only way to reactivate it is on the device itself, entering the recovery key to release the Secure Enclave's lock.
The operating system strikes again
Summary: The OS, together with the hardware security features it requires, is sufficiently safe. Use them well.
So we now know that accessing that storage area in any way that defies common sense is effectively impossible. What attack scenarios remain?
An attacker could try to interfere with the operating system itself, modify the logic of an app that is allowed to access the target key, or invade the memory space of the app while it is running.
The memory-space attack is fairly straightforward to handle. The processor and operating system already defend against unauthorized memory modification by default. If unauthorized modification occurs, the app crashes or the system triggers a kernel panic. In severe cases, the Secure Enclave may lock as well. Interfering with the operating system itself is of course even harder, because additional software and hardware protections are layered on top of the same mechanisms used for app memory protection.
Then what about modifying the app itself? After all, the app retrieves the key through the Keychain API and holds it somewhere in memory while using it. Couldn't an attacker inject code that sends the key off-device or stashes it elsewhere? Unfortunately for the attacker, this is blocked by app signing. Every app distributed through the App Store carries verification values that reveal whether it was altered after the developer built it. Since the signing keys are held by the developer and Apple, a hacker cannot modify the app's logic without somehow obtaining those keys, and if the app is modified anyway, the operating system simply refuses to execute it. This is one of the main reasons jailbreaking was required to run apps from outside the App Store: on jailbroken iOS, this protection is disabled or weakened.
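The signing check can be approximated as a digest manifest: at signing time a hash of every file is recorded, and at launch the OS recomputes and compares before allowing execution. Real code signing additionally covers the manifest itself with an asymmetric signature from the developer and Apple; this stdlib Python sketch omits that step and shows only the tamper-detection half:

```python
import hashlib

def build_manifest(files: dict[str, bytes]) -> dict[str, str]:
    """At signing time: record a digest for every file in the app bundle."""
    return {path: hashlib.sha256(data).hexdigest() for path, data in files.items()}

def verify_before_launch(files: dict[str, bytes], manifest: dict[str, str]) -> bool:
    """At launch time: refuse to run if anything was added, removed, or changed."""
    if files.keys() != manifest.keys():
        return False  # a file was injected or deleted
    return all(hashlib.sha256(files[path]).hexdigest() == digest
               for path, digest in manifest.items())
```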
Implementations by other companies
Summary: Both Windows and Android have sufficiently safe alternatives to Apple’s information-protection mechanisms. On Windows, however, extra care is needed regarding app tampering.
Android’s Keystore works in broadly the same way as described above. The main difference is that the detailed software implementation in firmware and the specific hardware components used to support it differ by manufacturer. Samsung Knox is a well-known example: like the Secure Enclave, it provides hardware integrity checks, kill switches, and cryptographic acceleration.
Windows is a bit more complicated. For legacy support, there are older mechanisms such as Protected Storage (dating back to the late 1990s and read-only since Windows Server 2003) and DPAPI, the Data Protection API (since Windows 2000). Unlike the systems discussed above, these store information as ordinary files, allow all of a user's data to be decrypted with a single master key derived from the account password, and provide no per-app access restrictions. So compared to Apple's and Android's modern approaches, the security feels weaker. This is not because Microsoft is careless about security; Windows carries such a long legacy that Microsoft cannot simply abandon apps and hardware environments built before modern security policies were established.
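The single-master-key property is easy to see in code. The sketch below derives one key from the account password with PBKDF2; the function name, SID string, and iteration count are illustrative stand-ins, not DPAPI's real internals:

```python
import hashlib

def dpapi_style_master_key(password: str, user_sid: str) -> bytes:
    """Sketch of the weakness: one password-derived key protects everything.

    Any process running as the user (or anyone who learns the password)
    derives the same key and can decrypt every protected blob; there is
    no per-app scope.
    """
    return hashlib.pbkdf2_hmac("sha256", password.encode(), user_sid.encode(), 100_000)
```

Two different apps running under the same account end up sealing their blobs under the identical key, which is the per-app isolation gap the paragraph above describes.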
That said, Windows does provide the Cryptography Next Generation API, and if a TPM is installed, Windows can use it as part of that API’s operation. To protect app logic and assets, developers can distribute apps using the packaging model introduced in Windows 8, or they can use Windows AppContainer with traditional Win32 applications to gain similar security benefits. Apps installed through the Windows Store are stored in the C:\Program Files\WindowsApps directory, which is blocked from outside access by default. Apps also cannot freely access one another’s files, and they run inside a virtualized sandbox. It is not full VM-level isolation, but from a security perspective it provides more than enough separation.
If that route is not possible, then the next best option is to sign the app with a certificate, just like in Apple’s app environment, in order to prevent logic tampering. That costs money, though. App stores for each OS effectively handle this more cheaply by issuing and managing certificates once the developer registers. In any case, if you can defend against app tampering and then use CNG or similar platform features, you can store and read information safely enough. The key point is to use features tied as closely as possible to hardware security standards — TPM-related mechanisms being the most representative example. Windows has a huge variety of development environments, so you more or less have to pick the right one yourself.
Even so, it is true that compared to other operating systems, Windows security policy can feel less strict, which might make Microsoft seem incompetent in the security space. That is not really the case. The TPM stores important keys and accelerates encryption and decryption, Secure Boot verifies the integrity of the boot chain, and both have been supported since Windows 8. On top of that, Microsoft designed its own hardware security component, Pluton, which corresponds to Apple's Secure Enclave or Samsung Knox. Its best-known home is the Xbox line from the Xbox One onward.

And Pluton is on a different level even compared to the Secure Enclave or Samsung Knox, both of which have seen at least some successful bypass research. How do we know? The Xbox 360, which lacked modern hardware security, was only ever partially compromised, which says a lot about Microsoft's software security expertise; since Pluton arrived with the Xbox One, nobody has managed to install or run pirated apps or extract even trivial data. Put simply, it is an even nastier beast than the others mentioned above. So Microsoft is not incompetent. If it really wanted to, it could sacrifice some legacy support and bulldoze everything into a modern security model fully under its control; it simply has not done so because it still tolerates legacy compatibility.
References
Native Environments
- Apple Keychain: Keychain services | Apple Developer Documentation
- Android Keystore: Android Keystore system
- DPAPI: How to: Use Data Protection
- Protected Storage: PStore – Win32 apps | Microsoft Learn
- Windows AppContainer: Public Preview: Improve Win32 app security via app isolation
- Windows Cryptography Next Generation: Cryptography API: Next Generation – Win32 apps | Microsoft Learn
  - Supported on Windows Server 2008 and later. As of now, this is the most recommended way to securely store information on Windows.
- Windows TPM storage provider: TPM Base Services
  - In most situations, though, it is better not to use this directly: TPM key constraints are fairly restrictive. Prefer the Cryptography Next Generation API above instead, which uses the TPM under the hood by default when it is available and enabled; if not, it falls back to software-based alternatives, though that is naturally not as strong as having the TPM enabled. To exaggerate a little, unless you work for the CIA, you probably will not need this directly.
- UWP Cryptography Data protection: Windows.Security.Cryptography.DataProtection Namespace – Windows UWP applications | Microsoft Learn
Packages for cross-platform native apps
- Flutter Secure Storage: flutter_secure_storage | Flutter Package (pub.dev)
  - An abstraction over each platform's secure storage API.
- React Native Keychain: react-native-keychain – npm (npmjs.com)
  - Similar to the package above.
- Ionic Secure Storage: Secure Storage (ionic.io)
  - Similar to the packages above.
- Electron safeStorage: safeStorage | Electron (electronjs.org)
- MAUI Secure storage: Secure storage – .NET MAUI | Microsoft Learn
Unfortunately, I could not find packages for Avalonia or Tauri. If anyone finds one or lets me know about one, I'll add it to the article.
Closing remarks
This ended up being a pretty dense and technical write-up. Still, I hope everyone keeps security in mind when handling sensitive user data.