
Despite the presumption that apps in the Apple App Store are secure, your data can be exposed to bad actors because of poor programming practices.
Imagine this: you renovate your house, equip it with the best security system money can buy, and then one of the contractors you hired leaves your key and alarm code under the doormat for anyone to find.
Unfortunately, many mobile apps, including ones you’re using, may keep your keys under a virtual doormat. Rather than adding a proper layer of security, developers embed the credentials for their APIs or databases directly in the app’s code. Sometimes this is done as a placeholder for testing and later forgotten in a frenzy of code rewrites, debugging, and releases. Other times, it’s a pure cost-saving measure.
A few years ago, we explored the impact of hard-coded secrets on apps in the Google Play Store. However, Apple prides itself on device security, and the App Store is known for a much stricter app submission process (believe me, I once saw Apple reject a submission because of a missing Finnish language file). So, with all that scrutiny, are iOS users safe from hard-coded secrets exposing their real-life secrets? Unfortunately, not really. Keep reading to find out why.
Hard coded, easy to crack
Despite what its name suggests, hard coding doesn’t mean an elevated level of security. It simply means that data is written directly into the software’s source code, unlike soft coding, which stores the data in a separate location outside the source code.
So, for example, hard-coded credentials giving access to an API (application programming interface) might look something like this:
API_ADDRESS = "https://api.yourapp.com"
API_KEY = "i-dont-care-about-security-so-i-hardcoded-this"
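What makes this dangerous is that a literal like this survives compilation: the string sits inside the shipped binary as readable bytes, so nobody needs your source code to find it. As a rough illustration (a toy sketch, not any researcher’s actual tooling), here’s a short Swift function that scans a file for runs of printable ASCII, much like the classic Unix strings utility:

import Foundation

// Toy illustration: string literals survive compilation, so scanning a
// compiled binary for printable ASCII runs is often enough to surface
// hard-coded secrets. A bare-bones take on the Unix `strings` utility.
func printableStrings(in url: URL, minLength: Int = 8) throws -> [String] {
    let data = try Data(contentsOf: url)
    var results: [String] = []
    var current: [UInt8] = []

    func flush() {
        if current.count >= minLength,
           let s = String(bytes: current, encoding: .ascii) {
            results.append(s)
        }
        current.removeAll()
    }

    for byte in data {
        if byte >= 0x20 && byte < 0x7F {  // printable ASCII range
            current.append(byte)
        } else {
            flush()
        }
    }
    flush()  // don't drop a run that ends at the end of the file
    return results
}

Point it at a binary and filter the output for telltale prefixes (AWS access key IDs, for instance, start with "AKIA"), and hard-coded credentials tend to fall right out.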
A soft-coded key requires a few more lines of code and an outside service that grants temporary, limited access to your API or database: credentials that expire after a short time and can be monitored and revoked. I won’t bore you with every detail, but it involves a few callbacks, some extra functions, and a separate codebase outside the main source code to handle the credentials.
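To make the contrast concrete, here’s a minimal Swift sketch of the soft-coded approach. Everything here is hypothetical, including the endpoint, field names, and token format; the point is that the app ships with no secret at all and instead asks a backend service for short-lived credentials:

import Foundation

// Hypothetical response from a token-issuing backend. The app never
// stores a long-lived key; it receives a credential that expires.
struct TemporaryToken: Decodable {
    let value: String
    let expiresAt: Date
}

// Exchange the user's session for a short-lived API token. The URL and
// parameters are made up for illustration; a real setup would use your
// own auth service (or a managed one such as AWS STS or Firebase Auth).
func fetchTemporaryToken(session: String) async throws -> TemporaryToken {
    var request = URLRequest(url: URL(string: "https://auth.yourapp.com/token")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(session)", forHTTPHeaderField: "Authorization")

    let (data, _) = try await URLSession.shared.data(for: request)
    let decoder = JSONDecoder()
    decoder.dateDecodingStrategy = .iso8601
    return try decoder.decode(TemporaryToken.self, from: data)
}

Each API call then attaches the temporary token instead of a baked-in key, so even if one leaks, it expires on its own and can be revoked server-side.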
Why do companies hard-code secrets?
So, if this practice is so dangerous, why do companies risk your data and their security? There are a few reasons, ranging from simple mistakes to cynical calculations.
The simplest and most common reason is that developers may not realize the danger of hard-coded secrets. After all, not every coder is a cybersecurity expert, and many apps are made by tiny studios, so developers may simply not understand the implications of leaving an API key or database credentials in their software.
Similarly, even at more prominent companies, hard-coded secrets may be left in as a cost-saving measure during development, with the intention of switching to soft coding before release, and then simply forgotten. Trust me, I’ve worked at mobile app development companies, and when things get hectic, mistakes happen.
The last reason is even simpler, if more frustrating: some companies just don’t care. They believe the potential cost of an exposed hard-coded secret is lower than the cost of the infrastructure required to maintain a working soft-coded solution, so they save money while putting your data at risk. Many of them are probably right, and the hard coding won’t bring about a massive leak. But even if it does, many customers won’t understand the cause and will blame it all on cybercriminals.
What data is exposed?
It’s hard to say exactly what data could be exposed, as it all depends on the setup used by each app. After all, a social media app holds far more of your data than a hypercasual game. The data most commonly at risk of a leak includes:
- Contact information (e.g., name, email address, home address, etc.)
- Login information (e.g., username, email address, password, etc.)
- Usage information (e.g., for a fitness app, your pulse, exercise locations)
- Data uploaded by you to the app (e.g., photos, videos)
Hard-coded secret leaks
This is not a theoretical scenario, either. Over the years, the world has seen many hard-coded secret leaks, some from mobile apps, others from website source code or desktop apps. Here are a few examples.
Sisense (2024)
While you may not know what Sisense is, the developers of some of your favorite apps and services do. Sisense delivers data analytics and business intelligence solutions to companies across the globe. Despite its size, Sisense left credentials to its Amazon S3 account sitting in its source code. Malicious hackers managed to access its GitLab repositories and subsequently stole terabytes of customer data, including access tokens, passwords, and personal information, prompting the US Cybersecurity and Infrastructure Security Agency (CISA) to get involved.
Toyota (2022)
One of the world’s biggest car manufacturers accidentally left its keys in the glove compartment. Or should I say, in the source code of its T-Connect app? The breach, which went undetected for five years, was caused by a subcontractor who left hard-coded database credentials in the code, exposing the personal information of nearly 300,000 people. Toyota didn’t learn its security lesson either, as a subsequent leak exposed a further 2 million customers, this time due to a misconfigured database.
Samsung (2022)
2022 also saw Samsung suffer a massive data breach. While the breach itself wasn’t caused by a hard-coded secret, the stolen source code revealed plenty of them: almost seven thousand API keys, AWS credentials, database connection strings, and more.
Unfortunately, many companies don’t see these leaks as serious and continue to risk their security and, worse, your data for various reasons.
iOS apps exposed for hard-coding secrets
The examples above aren’t directly related to iOS. However, security researchers from Symantec have found iOS applications with hard-coded secrets in their code. While no leak has been confirmed outright, you should definitely secure your account if you’ve used any of these.
Crumbl
Crumbl is one of the top Food & Drink apps in the US, delivering desserts and baked goods to customers nationwide. With over 20 million lifetime downloads, you’d think it keeps its code extra secure. Unfortunately, researchers found hard-coded Amazon Web Services credentials in its code, which could have allowed bad actors to intercept communication between users and the app, potentially capturing sensitive data.
Given that the app has had over 10 updates since Symantec exposed the vulnerability in October, its codebase has likely been fixed. Still, given the data potentially exposed, I’m surprised I couldn’t find any statement from the company on the topic.

Eureka
Eureka isn’t the most popular app; still, thousands of people use it every day to earn money by filling out surveys. Like Crumbl, Eureka left its AWS credentials hanging out in its code for anyone to play around with. And just as with Crumbl, we don’t know whether this was used to leak any user data, but given the nature of the app, any leak would be catastrophic.
The company hasn’t published any response. While I understand the unwillingness to expose your own bad coding practices, transparency usually goes a long way in helping people feel their information is secure.

Videoshop - Video Editor
The last iOS app named in Symantec’s report is a less popular free video editing app that has nevertheless been downloaded over 20 million times from the App Store. Once again, AWS access is hard-coded, this time pointing directly to an S3 bucket and, with it, users’ data and videos.
Even more worrying, the app hasn’t been updated since Symantec’s article came out. Symantec likely gave the team a heads-up before publishing, but the app’s patch notes don’t mention any security fixes.

The apps shown above are just examples. As our Android research showed, many more apps likely contain hard-coded secrets, and some are probably on your iPhone or iPad as you read this.
Why doesn’t Apple do more to protect your data?
This might make you wonder why Apple doesn’t take a more proactive approach to tackling hard-coded secrets. After all, most apps are installed through the App Store, over which it has full control.
First, it’s a question of intellectual property. App Store submissions don’t contain source code, only compiled binaries. A hard-coded secret therefore can’t be spotted quickly, and hunting for one would require potentially illegal reverse engineering by Apple’s team. Instead, Apple relies on the developer taking full responsibility for the contents of their app when they sign Apple’s terms and conditions.
Another reason is simple: it would cost too much. Even if Apple required source code, the logistics of keeping it secure, analyzing it for hard-coded secrets, and making decisions based on the results would be extremely costly. Apple would have to raise App Store submission fees or take a loss on many apps. Money talks, and here it says revenue is worth more than widespread security.
That said, Apple does react… retroactively. Whenever it’s tipped off that a company has left hard-coded secrets in its software, Apple requires the company to take appropriate steps to secure iOS users. By then, though, whatever damage has been done is largely irreversible.
How to protect yourself
As you can see, it’s very likely that you’re exposed to the risks of hard-coded secrets every time you install a new app. Luckily, there are steps you can take to reduce the impact of any resulting leaks.
- Never reuse passwords. Use a password manager to keep them all saved and secure.
- Use two-factor authentication (2FA). That way, even if someone gets your password, they won’t be able to log in. The most secure 2FA methods are authenticator apps on your phone (e.g., Authy, Google Authenticator) and a 2FA security key (e.g., YubiKey).
- Change your password whenever you’re notified of a security breach.
Final thoughts
Hard-coded secrets are an unfortunate example of how either a simple mistake or a cynical omission can impact end users without them ever knowing.
The unfortunate truth is that whenever we download a new piece of software to our devices, we expose our data just that tiny bit more to the outside world. Sadly, many software developers are more than happy to gamble with your data to save some money on development. What I find even more ironic is that, when faced with court cases and regulatory challenges around the world, Apple often argues that keeping third-party app stores off its devices is the only way to keep your data safe. Yet the App Store still puts the onus of security on app developers, an approach with the regrettable side effect of creating a false sense of security for users.
In a world where corporate profits are king, you must count on yourself to secure your data. Companies will leave your keys under that virtual doormat, so you have to ensure that cybercriminals won’t be able to do anything with the data they steal. I know it’s daunting and frustrating to have to nail down every part of your virtual life just because some companies are too incompetent, lazy, or greedy to write a few extra lines of code. Still, if we can’t change how they act, we must adjust our lives to protect ourselves.