As vehicles become rolling computers, the same systems that promise safer, more personal driving are raising urgent questions about consent, surveillance and control over personal data.
The modern car is learning to recognize its driver. It can adjust the seat before the engine starts, recommend a route based on habits, warn when eyelids look heavy, unlock through a phone, call for help after a crash, suggest charging stops, monitor tire pressure, remember music preferences and collect performance data for maintenance. In the most advanced models, cameras, microphones, radar, biometric sensors, navigation systems and connected apps work together to make the vehicle feel less like a machine and more like an attentive assistant.
For many drivers, this is progress. A car that understands the person behind the wheel can reduce friction, improve safety and personalize a daily routine that once depended on buttons, mirrors and memory. For automakers, it is the next stage of competition. Electric powertrains, software updates and advanced driver-assistance systems have turned vehicles into digital platforms. The smartest car is no longer only the one with the most horsepower or best fuel economy. It is the one that knows what the driver needs before the driver asks.
But the same intelligence that makes a car useful also makes it intimate. A connected vehicle can know where someone lives, works, worships, shops, exercises and spends the night. It can record how fast they drive, how hard they brake, whether they take late-night trips, which phone is paired, which contacts are called, what entertainment is played and sometimes who is sitting inside. If cameras and driver-monitoring systems are active, the vehicle may also process images of faces, eye movement, posture and attention. The car has become one of the most data-rich environments in everyday life.
That transformation is forcing a new question onto consumers and regulators: when a car understands its driver, who else understands the driver?
Privacy advocates have warned that the automotive industry is collecting far more data than most consumers realize. Mozilla Foundation’s widely cited review of 25 major car brands concluded that all failed its privacy standards, describing modern vehicles as among the worst consumer product categories it had examined for privacy. The group said car companies collected broad categories of personal information and often gave drivers limited control over how that information was used or shared. The report became a turning point because it framed connected cars not as a future concern, but as a present-day privacy problem.
Regulators have begun to respond. In the United States, the Federal Trade Commission warned in 2024 that connected cars can collect sensitive information, including location and biometric data, and that improper use or disclosure can harm consumers financially and personally. In early 2025, the FTC announced a settlement with General Motors and OnStar over allegations that the company used a misleading enrollment process for connected services and shared precise geolocation and driving behavior data with third parties, including consumer reporting agencies. GM and OnStar were barred for five years from disclosing certain driver data to consumer reporting agencies without affirmative consent.
California’s privacy regulator has also focused on the connected-vehicle ecosystem. The California Privacy Protection Agency announced a review of data practices by connected-vehicle manufacturers and related technologies, looking at the collection and use of information such as location, biometric and behavioral data. In 2025, the agency reached a settlement with American Honda Motor Co. over alleged California privacy law violations, signaling that car data is no longer a niche issue for legal specialists. It is becoming a mainstream enforcement priority.
Europe has treated the issue through a data-protection lens for years. The European Data Protection Board’s guidelines on connected vehicles state that data produced by a vehicle can concern drivers, passengers and even people outside the car, and may qualify as personal data when it can identify someone directly or indirectly. The guidance recognizes that connected cars sit inside a complex ecosystem involving automakers, service providers, insurers, mapping companies, entertainment platforms and mobility apps. In other words, the privacy risk is not only what the car collects, but where the data travels.
The convenience is real. Driver-monitoring systems can detect drowsiness or distraction and may prevent crashes. Usage-based insurance can reward careful drivers. Predictive maintenance can identify mechanical problems before breakdowns occur. Over-the-air updates can fix software bugs without a dealership visit. Navigation systems can optimize routes based on traffic, battery range and charging availability. Emergency services can receive crash information faster. For families, fleet operators and elderly drivers, connected features can offer reassurance and practical value.
The problem is not intelligence itself. It is opacity. Many drivers do not know what data their cars collect, how long it is stored, whether it stays inside the vehicle, whether it is transmitted to the manufacturer, whether it is shared with data brokers, insurers or advertisers, and how easily it can be deleted. Consent is often buried inside long privacy notices, app permissions or dealership enrollment flows. A person may believe they are signing up for a safety feature or maintenance alert without understanding that driving behavior or location data could be used for other purposes.
Insurance is one of the most sensitive areas. In theory, telematics can make pricing fairer by rewarding safe driving rather than relying only on age, ZIP code or credit-related factors. In practice, drivers may not know when their behavior is being scored, who receives the score or how it affects premiums. Late-night driving, hard braking or rapid acceleration may look risky in a database even if there is a reasonable explanation. A system designed for personalization can become a system of judgment.
Law enforcement access raises another concern. Location history from a connected vehicle can reveal movements with extraordinary precision. U.S. lawmakers and privacy groups have questioned whether automakers provide such data to authorities without warrants or adequate notice to vehicle owners. Even when access is lawful, the issue highlights how cars now generate records that older vehicles never created. A private trip can become a searchable data trail.
There is also cybersecurity risk. A connected car is not only a transportation product; it is a networked device. Weak security can expose personal data or, in extreme cases, vehicle functions themselves. Automakers have improved security practices, but the attack surface is expanding as vehicles integrate smartphones, cloud services, charging networks, third-party apps and semi-autonomous features. The smarter the car becomes, the more important it is to secure every digital doorway.
Consumers face an uncomfortable trade-off. Opting out of data collection may limit features that buyers paid for. Some vehicles require connectivity for navigation, remote start, battery management, software updates or advanced driver-assistance functions. A driver may technically have a choice, but the choice can feel unrealistic if privacy protection means degrading the product. True consent requires more than an “accept” button. It requires meaningful alternatives.
The industry’s challenge is to earn trust before regulators force the issue. Automakers should treat privacy as a safety feature, not a legal footnote. That means collecting only data necessary for a clear purpose, explaining data use in simple language, separating essential safety functions from optional data-sharing programs, giving drivers easy controls, allowing deletion, limiting retention and refusing to sell or share sensitive information without explicit permission. Cars already display tire pressure, battery range and blind-spot warnings. They could also display data status just as clearly: what is being collected, where it is going and how to stop it.
Design matters. A driver should not need a law degree to understand a car’s privacy settings. A family should be able to delete previous phone contacts and navigation history before selling a vehicle. A rental customer should be able to clear paired-device data after returning a car. A passenger should know if cabin cameras or microphones are active. A teenage driver should not be tracked by multiple parties without clear household rules. Privacy should be visible, reversible and practical.
The future of intelligent vehicles will not be decided only by engineers. It will be decided by whether consumers believe the car is working for them or watching them. The most successful smart cars may not be those that collect the most information, but those that use the least data necessary to deliver the most benefit. In an era of digital fatigue, restraint could become a competitive advantage.
Smart cars can make driving safer, easier and more personal. They can help people avoid accidents, maintain vehicles better and move through cities more efficiently. But convenience becomes dangerous when it depends on invisible surveillance. A vehicle that understands the driver should not become a vehicle that exposes the driver.
The question is not whether cars should become smarter. They already are. The question is whether the people inside them will remain in control of what their cars know.

