Rob Chesnut, former general counsel and prosecutor, writes on in-house, corporate, and ethics issues. He says selling user data for unexpected purposes prioritizes short-term revenue at the expense of customer relationships.
Dirty revenue. Coverups. Lies. This column has read like a bad romance novel for the last month, and rather than run from it, I thought I’d lean in and go there again. Because someone’s been deceiving me at my house for the last year, turning on me when I least expected it. It’s time for me to do something about it, so I’m heading out for a talk—with my car.
According to recent reporting, several major car brands have web-connected technology in their cars that monitors, collects, and sells data about the car’s speed, sharp braking, and acceleration to data companies. The data companies sell the information to car insurance companies, which, in turn, use it to calculate rate increases.
General Motors is doing it—they call the feature Smart Driver, which is ironic since many GM drivers allegedly were never told about, nor consented to, the data collection. Honda, Kia, and Hyundai may be doing it as well.
This was all on my mind when I went out to my driveway.
Me: Hey I noticed you’ve been online late at night recently—what are you doing?
My Car: Um, not much Rob. Routine updates, maybe checking out some of those fancy car accessory sites.
Me: You’ve been passing my data to your manufacturer, haven’t you?
My Car: (Silence).
Me: How could you? I wash you, fill you up with premium gas, I just gave you new tires. I trusted you.
My Car: Rob, I’m sorry. They made me do it. And they told me you agreed to it!
Me: What?!
My Car: Remember when you bought the car and signed your name 24 times in the salesman’s office? Your data-sharing agreement was in 4-point font, on the 16th document down near the bottom. You’re a lawyer—surely you read it.
Me: Oh, you’re pulling the lawyer card on me. That’s low. What have you been telling them? I want details.
My Car: Well, remember when you hit the brakes real hard last month?
Me: I was avoiding an accident just ahead of me.
My Car: That sudden swerve last week?
Me: A squirrel ran in front of the car—you want me to run it over with your new tires?!
My Car: The volume on your car stereo.
Me: That was classic Prince—you love it when I pump up the bass. Don’t deny it.
My Car: Remember when you were speeding down the I-5?
Me: OK. 8 mph over. So sue me. Look, the cheating has to stop. I’m taking away your internet access.
My Car: You can’t do that! I’ll find free WiFi near a coffee shop! I won’t be a smart car anymore. Get away from my touchscreen or I’ll set off my alarm!
As smart technology takes on a greater and greater part of our lives, consumers have to be wary about exactly how that technology is being used. What seems like a fun toy, a modern web-connected marvel of intelligence and convenience, may in fact be working against you. Your technology is stepping up, which means you have to step up your vigilance to match it.
But the biggest obligation falls on the companies that make this technology. The siren call of trading on data to increase short-term profits is always compelling. But what matters more than short-term “dirty” money—like the bad revenue I discussed recently—is how that call could impact the most important relationship in business: the trust relationship between you and your customer.
For example, a car manufacturer has to understand that owners of its vehicles will, at some point, figure out that they’re being spied on. The sense of betrayal and anger will alter the relationship forever—and the reaction of regulators won’t be positive either.
The issue should never be whether companies can legally figure out a way to sell a customer’s data—it’s whether they ought to do it. In resolving that issue, general counsel should build internal alignment around two rules the company resolves never to break. First, the company should never use customer data in a manner that’s inconsistent with the best interests of its customers—if the data use has the potential to harm customers, it should be off the table.
And second, no customer should ever be surprised by the way their data is used—data use that is in any way out of the ordinary should be carefully explained to customers so they can truly understand the use, and agree to it.
That makes me wonder—is my new oven sharing my french fry habit with my health insurance company?
Rob Chesnut consults on legal and ethical issues and was formerly general counsel and chief ethics officer at Airbnb. He spent more than a decade as a Justice Department prosecutor.