Sarah is careful about privacy. She denied Facebook access to her camera years ago. She never uses free Wi-Fi without a VPN. She assumed she had things under control.
Then she learned that her phone’s operating system had been collecting her location data every 30 seconds, sharing her contact graph with three advertising analytics firms, and that her “private” browsing history was being indexed to build a behavioral profile — all through first-party OS features she never explicitly enabled.
She had been protecting herself from yesterday’s threats. The 2026 landscape looks very different.
Mobile privacy has undergone a significant shift over the past two years. The changes are technical, legal, and behavioral — and most people have not caught up. This article covers what has actually changed, what it means for your private data, and what you need to do right now.
The Shift from App Tracking to OS-Level Tracking
For years, the major mobile privacy story was third-party app tracking. Apps would embed advertising SDKs that tracked your behavior across apps and websites. Apple’s App Tracking Transparency (ATT) framework, introduced in 2021, significantly disrupted this model by requiring user permission for cross-app tracking identifiers.
The advertising industry adapted. Instead of relying on third-party app SDKs to collect data, attention shifted to first-party data collection at the operating system level, by the platform owners themselves.
Apple and Google are not third parties to your device. They built the OS. They do not need to ask your permission to observe what happens on a device running their software — at least not for data categories that fall outside current legal requirements. Both companies collect substantial behavioral data about how you use your device, which apps you open, how long you spend in them, what you search for, and increasingly, what your camera and microphone observe.
On Android, Google’s advertising services are deeply integrated into the OS. Android 15 introduced a new Privacy Sandbox architecture designed to replace the GAID (Google Advertising ID) with a set of “privacy-preserving” APIs. The framing is that these APIs give you more control. The reality is more nuanced: the APIs still enable highly targeted advertising based on your on-device behavioral profile — the data is just analyzed on-device rather than sent to servers. Your profile still exists. The advertisers still receive targeting signals. The difference is in where the profiling happens, not whether it happens.
On iOS 18, Apple extended its own first-party data collection through Apple Intelligence features. Siri now processes far more on-device context, including emails, messages, calendar entries, and screen content, to power contextual suggestions. Apple’s position is that this processing happens on-device and is privacy-preserving. The concern is that “on-device” does not mean “not used for profiling” — it means the profiling runs locally. For data that Apple’s services need to process in the cloud, the company introduced a Private Cloud Compute architecture that it describes as verifiable and auditable. Independent security researchers have reviewed this architecture and found the claims credible — but “credible” is different from “verified.”
The practical upshot: the same companies that say they protect your privacy are also the largest beneficiaries of understanding your behavior. Read their privacy policies with that tension in mind.
Android 15 and iOS 18 Privacy Features: What Actually Changed
Both major mobile operating systems shipped meaningful privacy improvements in their 2024-2025 cycles. Not all of them are as strong as the marketing suggests.
Android 15 Privacy Changes
Partial screen sharing controls: Android 15 lets you share a single app window during screen share rather than your entire screen. This is a genuine improvement for when you share your screen on video calls — the other person cannot see your notifications or other apps.
Health Connect expansion: Android 15 expanded Health Connect as the centralized repository for health and fitness data, with better permission granularity. Apps can request specific data types rather than blanket health access. This is a meaningful improvement if you use health apps.
Theft protection features: Android 15 introduced Theft Detection Lock, which uses on-device AI to recognize the motion pattern of a phone being snatched from your hand and locks the screen immediately. It is paired with Offline Device Lock, which locks the screen when the device is taken offline in suspicious circumstances. For protecting vault apps from physical theft, both are genuinely helpful.
Private Space: Android 15 introduced a Private Space — a second, separate profile area on the device that can be locked with a separate PIN and hidden from the main profile. This is conceptually similar to the functionality vault apps have provided for years, but at the OS level. It is a sign that Android is acknowledging the legitimate need for private spaces on devices — something Calculator Hide App users have known for years.
What did not change: Android’s relationship with Google’s advertising infrastructure remains complex. The Privacy Sandbox APIs are in full rollout, but the fundamental data flow — your device’s behavior informing advertising targeting — has not changed in principle, only in architecture.
iOS 18 Privacy Changes
Locked and Hidden Apps: iOS 18 added the ability to lock and hide individual apps directly in the OS. A locked app requires Face ID or a passcode to open. A hidden app disappears from the home screen and search, appearing only in a locked folder. This is a direct acknowledgment of what vault app users have always needed. It also means iOS now offers a basic version of the app hiding functionality that vault apps have provided.
The important caveat: iOS’s built-in app hiding does not encrypt the app’s data. It obscures the app’s presence but does not add an encryption layer to the files within it. A vault app with AES-256 encryption provides meaningfully stronger protection.
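The distinction between hiding and encrypting shows up in key handling. An encrypted vault never stores a usable key on disk; it derives one from your passcode each time you unlock. Below is a minimal sketch of that derivation step using Python's standard library (the function name, iteration count, and parameters are illustrative assumptions, not Calculator Hide App's actual scheme):

```python
import hashlib
import secrets

def derive_vault_key(passcode: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches a short passcode into a 256-bit key.
    # A real vault would feed this key into AES-256; the high iteration
    # count makes brute-force guessing of short PINs expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 600_000, dklen=32)

salt = secrets.token_bytes(16)   # random per-vault salt, stored beside the data
key = derive_vault_key("1234", salt)

assert len(key) == 32                          # 32 bytes = 256 bits
assert derive_vault_key("1234", salt) == key   # same passcode, same key
assert derive_vault_key("1235", salt) != key   # wrong passcode, unrelated key
```

The point of the key-stretching step is that every guess at a stolen vault's passcode costs real compute. OS-level app hiding involves nothing like this: the files underneath remain readable plaintext.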
Contacts access granularity: iOS 18 lets you share specific contacts with an app rather than your entire contacts list. This addresses a longstanding concern about apps harvesting contact graphs.
Pasteboard access notifications: The system now notifies you when an app accesses the clipboard. This was partially addressed in iOS 14, but iOS 18 extended it more broadly.
Apple Intelligence privacy: The processing model for Apple Intelligence features is genuinely complex. On-device processing happens in the Secure Enclave architecture. Cloud processing uses Private Cloud Compute. Apple has made stronger privacy claims here than it has historically, and the architecture has received serious independent review. It is among the more privacy-conscious implementations of AI features on a major platform.
What did not change: iCloud backup still means Apple can access your backed-up data unless you have opted into Advanced Data Protection (end-to-end encryption for iCloud). That opt-in rate remains low, meaning most iCloud users’ backed-up data is accessible to Apple and by extension to legal demands served on Apple.
The Rise of Data Broker Apps
One of the most significant and underreported privacy developments of 2025-2026 is the proliferation of data broker applications masquerading as legitimate utilities.
These apps — often presented as free VPNs, free phone cleaners, free battery optimizers, or free photo organizers — request broad permissions, harvest whatever data those permissions expose, and sell it to data brokers. The data broker industry is large and largely unregulated in most US states. Data brokers aggregate location data, behavioral data, device identifiers, and personal information to sell targeting profiles to advertisers, insurers, employers, and anyone willing to pay.
The problem has worsened because legitimate-sounding app categories have been colonized by data collectors. A “free vault app” that requests access to your contacts, location, microphone, and full photo library is not providing a free service out of generosity.
We have written about this in detail in our article on the dangers of generic vault apps. The short version: an app that offers vault functionality for free, requests broad permissions, and comes from a developer you cannot verify is likely monetizing your data.
Data broker activity also extends beyond apps. The major data brokers aggregate information from public records, social media platforms, loyalty programs, and purchase histories to build profiles. Your private phone behavior is one input. Your digital life broadly is the product.
What New Smartphone Sensors Can Now Detect
Modern smartphones carry an array of sensors that can infer information about you beyond what you share directly. The 2025-2026 generation of devices has expanded this capability.
Ultrasound tracking: Some advertising networks use ultrasound beacons embedded in TV commercials and retail environments. Apps with microphone access can pick up these inaudible tones and report back which TV ads you watched or which stores you visited. This cross-device tracking mechanism has existed since 2016 but became more widespread in 2025 as it was incorporated into more advertising SDKs.
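Part of why this fits so easily into an advertising SDK is that detecting a known tone takes only a few lines of signal processing. Here is a toy sketch in pure Python using the Goertzel algorithm; the 19 kHz beacon frequency is an assumption for the example, and real beacon protocols encode identifiers across multiple tones:

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    # Goertzel algorithm: the power of one known frequency in a signal.
    # Cheaper than a full FFT when only a few beacon tones matter.
    n = len(samples)
    k = round(n * target_hz / sample_rate)
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

rate = 44_100    # common phone microphone sample rate
beacon = 19_000  # near-ultrasonic: inaudible to most adults, recordable by any mic
signal = [math.sin(2 * math.pi * beacon * t / rate) for t in range(4096)]

# Energy at the beacon frequency dwarfs energy at an unrelated audible frequency.
assert goertzel_power(signal, rate, beacon) > 1000 * goertzel_power(signal, rate, 12_000)
```

This is also why the mitigation discussed later works: without microphone permission, there are no audio samples to feed the detector.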
Motion-based inference: Accelerometer and gyroscope data does not require special permissions on most platforms. Researchers have demonstrated that motion data alone can be used to infer your location, your mode of transport, and in some cases even what you typed on a physical keyboard while your phone sat on the same desk.
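As a toy illustration of how little data this takes, the sketch below classifies synthetic accelerometer traces using nothing but the standard deviation of the magnitude readings. The threshold and signal shapes are invented for the example; real inference systems use far richer features:

```python
import math
import random
import statistics

random.seed(0)  # deterministic synthetic data for the example

def classify_motion(accel_magnitudes):
    # Crude activity inference from accelerometer magnitude alone. Even
    # the standard deviation separates "still on a desk" from "carried
    # by a walking user"; the 0.5 m/s^2 threshold is invented.
    return "walking" if statistics.stdev(accel_magnitudes) > 0.5 else "stationary"

# Synthetic magnitude traces in m/s^2, sampled at 50 Hz: gravity plus
# sensor noise, with a ~2 Hz step oscillation added for the walking case.
still   = [9.81 + random.gauss(0, 0.05) for _ in range(200)]
walking = [9.81 + 2.0 * math.sin(2 * math.pi * 2 * t / 50) + random.gauss(0, 0.3)
           for t in range(200)]

assert classify_motion(still) == "stationary"
assert classify_motion(walking) == "walking"
```

An app can run this kind of analysis continuously, because motion sensors require no permission prompt at all.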
Network-based location inference: Even without GPS access, apps can infer your location from the Wi-Fi networks your device detects (not connects to — just detects). This has been used by apps with no explicit location permission to track user movements.
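The mechanism here is a lookup against a surveyed database: commercial geolocation services map Wi-Fi access point identifiers (BSSIDs) to coordinates. A toy sketch with an invented two-entry database makes the idea concrete:

```python
# Hypothetical hard-coded BSSID-to-coordinates database. Commercial
# geolocation services hold hundreds of millions of surveyed access points.
BSSID_DB = {
    "aa:bb:cc:11:22:33": (40.7580, -73.9855),
    "aa:bb:cc:44:55:66": (40.7590, -73.9845),
}

def infer_location(visible_bssids):
    # Average the surveyed coordinates of whatever networks the device
    # merely *sees*. No GPS permission and no connection are required.
    hits = [BSSID_DB[b] for b in visible_bssids if b in BSSID_DB]
    if not hits:
        return None
    return (sum(lat for lat, _ in hits) / len(hits),
            sum(lon for _, lon in hits) / len(hits))

seen = ["aa:bb:cc:11:22:33", "aa:bb:cc:44:55:66", "ff:ff:ff:00:00:00"]
lat, lon = infer_location(seen)
assert abs(lat - 40.7585) < 1e-6 and abs(lon + 73.9850) < 1e-6
```

Two visible networks are enough to place a device within a city block, which is why modern OS versions gate Wi-Fi scan results behind location permissions.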
Camera-based inference: The front-facing camera, when active, can be analyzed to infer emotional state, health indicators, and attention. Some advertising platforms have experimented with attention-based ad pricing that charges more if your camera data suggests you actually watched an ad.
None of this is hypothetical. These are documented techniques in use in the advertising industry. The gap between “we don’t request your location permission” and “we don’t know your location” is significant.
AI Photo Scanning: How Google Photos and Apple Intelligence Changed Things
The introduction of AI-powered photo analysis has created a new category of privacy concern that deserves specific attention.
Google Photos uses on-device and server-side AI to analyze the content of your photos for organizational features (face recognition, scene detection, object tagging). It also uses photo content to inform Google’s understanding of your interests and activities for advertising purposes. The processing that happens on Google’s servers operates on photos that Google can decrypt — because Google holds the decryption keys for standard Google Photos storage.
In 2025, Google expanded Google Photos’ AI features to include automatic highlight creation from your photos and memory albums. These features process your photos in ways that go beyond basic organization, creating new content derived from your private moments.
Apple Intelligence in iOS 18 introduced on-device AI features for photos, including semantic search, memory movie creation, and image generation. Apple’s framing is that these features happen on-device, which is true for the local processing. Where cloud processing is required, it goes through Private Cloud Compute. Apple has been more explicit than Google about what data leaves the device and when.
The concern with both is not that the features are malicious. The concern is that AI analysis of your photos creates a profile of your life, relationships, health indicators, and activities — and that profile is increasingly difficult to control once it exists. Your gallery app is not a neutral storage container. It is an analysis engine.
This is precisely why encrypted vault apps matter. Files stored in Calculator Hide App are not scanned by the OS’s AI features. They are encrypted blobs to the operating system. The AI analysis that happens on your camera roll does not happen in your vault. If you want to understand more about what your gallery app actually does with your photos, our article on whether your gallery app is private goes deep on this topic.
FISA Warrants and Phone Data: What Changed in 2025
The Foreign Intelligence Surveillance Act (FISA) Section 702 was renewed in April 2024 with expanded authority. The renewal included provisions that critics argued significantly expanded the government’s ability to compel American companies to assist with foreign intelligence surveillance — with implications for data held on behalf of US users.
In practical terms, FISA 702 warrants can be served on US telecommunications and technology companies without the public disclosure requirements that accompany standard warrants. Companies cannot publicly confirm whether they have received FISA demands. The targets of FISA surveillance are not notified.
The relevance to mobile privacy is direct: data stored in standard cloud services operated by US companies is potentially subject to FISA demands. This includes email, cloud-stored photos, backup data, and other content held by major US tech companies.
End-to-end encrypted services, where the provider holds no readable data, are significantly more resistant to FISA demands — the same reasoning that applies to standard legal demands. A demand served on a company that holds only encrypted ciphertext produces only encrypted ciphertext.
This is not a reason to be paranoid. FISA surveillance targets foreign intelligence threats and is not used to investigate ordinary citizens’ private photo libraries. But it is a reason to understand that “your data is with a major US company” does not mean “your data is protected from government access.”
For the large majority of users, the more realistic concerns are device theft, nosy family members, and data broker profiling — not FISA warrants. But understanding the full picture means you can make informed decisions rather than decisions built on false confidence.
The Private Browser Situation in 2026
Incognito mode and private browsing modes in standard browsers remain widely misunderstood. They do not hide your browsing from your internet service provider, your employer’s network, the websites you visit, or your device’s OS.
Chrome’s Incognito mode was the subject of a major class action lawsuit that settled in 2024, with Google acknowledging that it collected data from users who believed themselves to be browsing privately.
The practical alternative for genuinely private browsing is a private browser built into a vault app — one that does not log history to the device, does not share browsing data with OS-level analytics, and does not sync with any account. Calculator Hide App’s built-in private browser provides this. Our article on private browser vs. incognito mode explains the technical difference in detail, and our hands-on guide on how to use the private browser inside Calculator Hide App walks through practical usage.
5 Things to Do Right Now for Better Mobile Privacy
After all that context, here is the practical action list.
1. Review your app permissions this week.
Go through every app on your phone and check what permissions it holds. On iOS: Settings > Privacy & Security. On Android: Settings > Security & privacy > Permission manager (the exact path varies slightly by manufacturer). Revoke any permission that an app does not genuinely need to function. A flashlight app does not need microphone access. A recipe app does not need contacts access.
Pay particular attention to location permissions. Deny background location to every app that does not have a compelling reason for it. Most apps that request “always on” location can function perfectly well with “only while using” or no location at all.
2. Enable Advanced Data Protection on iOS (or review your Google account backup settings).
On iOS 18, go to Settings > Your Name > iCloud > Advanced Data Protection. This enables end-to-end encryption for iCloud backups, meaning Apple cannot read your backed-up data. The recovery key requirement is serious — store it safely — but the privacy improvement is significant.
On Android, review your Google account backup settings and understand what is being synced. Consider whether sensitive files should be excluded from device backups.
3. Move your sensitive files to an encrypted vault.
Files sitting in your camera roll are accessible to the OS’s AI analysis, included in device backups, and visible to anyone who picks up your phone. Move private photos, videos, and documents into Calculator Hide App. The AES-256 encryption means the OS sees only encrypted data — it cannot scan, analyze, or profile what it cannot read.
4. Audit your data broker exposure.
Services like DeleteMe, Privacy Bee, and similar data removal services can audit and remove your information from data broker databases. This does not address in-phone tracking, but it addresses the aggregated profile that data brokers compile from public records and third-party data purchases. Running this kind of audit annually is a reasonable baseline practice.
5. Use a password manager and stop reusing passwords.
This is not new advice, but it bears repeating in the context of 2026: credential stuffing (using breached username/password combinations to break into accounts) is now highly automated and scales to billions of accounts. If you reuse a password anywhere, a breach of any service using that password creates a path to every other service. A password manager with unique, randomly generated passwords for every account is the single highest-impact change most people can make.
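Generating strong unique passwords is the easy half of the job; remembering them is what the manager is for. As a sketch of what "unique and randomly generated" means, here is an example using Python's `secrets` module, which draws from the OS's cryptographic random source:

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation  # 94 symbols

def generate_password(length: int = 20) -> str:
    # secrets.choice draws from the OS's cryptographic random source.
    # 20 characters over a 94-symbol alphabet is roughly 131 bits of
    # entropy, far beyond any credential-stuffing wordlist.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

pw1, pw2 = generate_password(), generate_password()
assert len(pw1) == 20
assert pw1 != pw2   # reuse is the vulnerability; every account gets its own
```

Any reputable password manager does the equivalent for you; the sketch just shows why a generated password never appears in a breached-credentials list.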
The broader point that ties all five of these together: privacy in 2026 is not a one-time setup. It is a practice. The threat landscape evolves, the platforms change, and your habits need to evolve with them.
Ready to take the most immediate step? Move your sensitive files to an encrypted vault that the OS cannot read, the AI cannot scan, and casual access cannot reach. Download Calculator Hide App and make that change today.
Frequently Asked Questions
Is Android 15 or iOS 18 private by default?
Neither is private by default in the sense of preventing the platform owner from collecting behavioral data. Both iOS 18 and Android 15 include meaningful privacy improvements over previous versions — better permission controls, on-device processing for some AI features, stronger theft protections. But both platforms also continue to collect significant behavioral data for advertising and product improvement purposes. “Private” requires active configuration, not just updating to the latest OS.
Does iPhone’s Locked and Hidden Apps feature replace a vault app?
For basic app hiding, it provides a built-in option. But it does not encrypt the app’s data. If someone has access to your device at the file system level — through a forensic tool, a deep backup extraction, or OS-level access — the app’s data can still be read. A vault app with AES-256 encryption adds a cryptographic layer that the OS’s own hiding feature does not provide.
Are free VPN apps safe to use for privacy?
Many free VPN apps are data collectors. The economics of a VPN service require significant infrastructure, and a free service needs a revenue model. For many free VPNs, that revenue model is selling user data. Some free VPNs have been caught logging traffic they claimed not to log, or selling browsing data to data brokers. Paid VPN services from reputable providers with public audits are significantly more trustworthy than free alternatives.
What is data broker profiling and why does it matter?
Data brokers aggregate information about individuals from multiple sources — public records, purchase data, location data, social media activity, app behavior — to build profiles that they sell to advertisers, insurers, employers, and others. The relevance to mobile privacy is that apps you use collect behavioral data that ends up in these profiles. Your location history, your app usage patterns, and your purchase behavior are all inputs to profiles that exist about you and that you have limited ability to access or correct.
How does AI photo scanning affect my privacy in practice?
Google Photos and Apple’s Photos app use AI to analyze the content of your photos — recognizing faces, places, objects, and activities. This creates a structured record of your life based on your photo library. For Google, this analysis is one input to the behavioral profile that drives advertising. For Apple, the stated use is limited to on-device features. In either case, photos that contain sensitive personal information — health matters, private relationships, financial documents — are being analyzed and categorized. An encrypted vault app keeps those photos away from AI analysis entirely.
What is FISA and does it affect my private photos?
FISA Section 702 allows US intelligence agencies to compel US tech companies to provide data on foreign intelligence targets. It is not used to investigate ordinary citizens’ private photos. However, data stored with major US tech companies exists within the legal framework where such demands are possible. End-to-end encrypted storage, where the provider holds no readable data, provides more resistance to this type of demand than server-side encrypted storage. For everyday users, the more relevant concern is data breaches and device theft, not FISA warrants.
Is incognito mode in Chrome actually private?
Incognito mode prevents your browsing history from being saved to your device’s browser history. It does not hide your activity from your internet service provider, your employer’s network, the websites you visit, Google itself, or your device’s OS. The 2024 settlement in the Google Incognito lawsuit required Google to delete billions of data records collected from users in Incognito mode and to clarify what Incognito does and does not do. For genuinely private browsing, a dedicated private browser that does not report to any account or cloud service is the appropriate tool.
Should I be worried about ultrasound tracking?
Ultrasound tracking is a real technique used by some advertising networks. The mitigation is straightforward: deny microphone access to apps that do not genuinely need it. A shopping app, a weather app, or a game does not need microphone access. Revoking microphone access from these categories of apps removes the sensor the tracking technique requires. The scenario where you are meaningfully affected by ultrasound tracking is fairly narrow — you would need to have an app with microphone access in the background while near an ultrasound beacon. But given the easy mitigation, it is worth auditing microphone permissions.
What should I prioritize if I can only make one privacy change today?
Review and tighten your app permissions, with particular emphasis on location and microphone access. This addresses the most common and ongoing data collection mechanisms immediately. The second change, which you can make in the same session, is moving your sensitive photos and files to an encrypted vault like Calculator Hide App so that OS-level AI scanning and device backups do not include your most private content.