When corporate decides to do something online, it falls to IT developers to turn those wacky ideas from marketing or the line-of-business bosses into bits-and-bytes reality. Developers have always favored grabbing as much data as practical, in case someone needs it in six months, on the time-honored rationale that if it isn't captured initially, the data is gone forever.
The problem is that capturing data when it’s not needed — especially sensitive, personal data — is asking for trouble. Walmart, Starbucks, Delta Air Lines, Facebook, and Walgreens have all recently learned that lesson with data leaks from their mobile apps. At least Walgreens should have learned that lesson. Seems that they didn’t.
What lessons should have been learned?
In January, Starbucks was found to be storing customers' passwords on its iOS app in cleartext. A thief who grabbed the phone didn't even need the PIN: the password was readily available and, along with it, free stuff from the victim's Starbucks Card. Worse, because consumers routinely reuse passwords, the potential damage went well beyond Starbucks.
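To see why cleartext-at-rest is so indefensible, here is a minimal sketch in Python (the scenario is hypothetical, and on iOS the real answer is the Keychain): if an app only needs to verify a password locally, it never has to store the password at all, only a salted, deliberately slow hash of it.

```python
import hashlib
import secrets

# ANTI-PATTERN: roughly what the Starbucks app did. The secret sits
# on disk, readable by anyone holding the device.
def store_cleartext(path: str, password: str) -> None:
    with open(path, "w") as f:
        f.write(password)

# Safer sketch, assuming the app only needs local verification:
# persist a random salt and a PBKDF2 digest, never the password.
def store_verifier(password: str) -> tuple[bytes, bytes]:
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking timing information.
    return secrets.compare_digest(candidate, digest)

salt, digest = store_verifier("latte-lover-42")
assert verify("latte-lover-42", salt, digest)
assert not verify("wrong-guess", salt, digest)
```

Even a thief with full filesystem access gets only the salt and digest, which must be brute-forced rather than simply read.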
This was quickly followed by the discovery that Delta's Android mobile app had issues. The company was wise enough to encrypt its customers' passwords, but it dropped the ball big-time by also saving the encryption key itself in cleartext. When called on it, Delta declared the problem fixed, when it had merely moved the unencrypted key from one easy-to-find place to another.
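Delta's mistake is easy to reproduce in miniature. The sketch below uses a toy XOR cipher purely as a stand-in for real encryption (a real app would use AES through a platform crypto library); the point is key placement, not the algorithm. If the key lives beside the ciphertext, the encryption protects nothing.

```python
import secrets

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" standing in for real encryption; applying it
    # twice with the same key recovers the original bytes.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(16)
ciphertext = xor_encrypt(b"hunter2", key)

# The flaw, in miniature: the "encrypted" password and the key are
# saved side by side in app storage. Whoever can read one reads both.
app_storage = {"password_enc": ciphertext, "key": key}

# An attacker with the device needs no cryptanalysis at all:
stolen = xor_encrypt(app_storage["password_enc"], app_storage["key"])
assert stolen == b"hunter2"
```

The fix is to keep the key out of reachable app storage entirely, for example in the Android Keystore, which is designed so that key material never leaves protected hardware or system storage.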
Walmart made its iOS mobile app password available through automated iTunes backups, and it joined Starbucks in saving, and thereby exposing, an extensive history of its customers' movements, as captured by geolocation records, along with a similarly lengthy history of purchases.
Then Walgreens' mobile app was found to encourage customers to photograph their prescription labels through the app; and, yes, the app then made those images available to anyone who accessed the phone.
Beyond the security and privacy implications, the problem with the geolocation, pill bottle, and purchase history data captures is that the companies had no need for that data. They saved it “just in case,” which I am arguing is becoming an impressively ill-advised programming strategy.
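The programming-level alternative to "just in case" is explicit data minimization. As a sketch (the field names and 90-day retention window here are hypothetical, not taken from any of these companies), it amounts to two habits: an allow-list of fields the business function actually requires, and a purge of anything past its retention window.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical order record as a "grab everything" pipeline might store it.
RAW_ORDER = {
    "order_id": "A-1001",
    "items": ["sunscreen"],
    "total": 12.99,
    "geolocation": (33.74, -84.39),   # not needed to fulfill the order
    "device_id": "f3c9",              # not needed either
}

# Allow-list: only fields the business function actually requires
# are ever persisted; everything else is dropped at the door.
NEEDED_FIELDS = {"order_id", "items", "total"}

def minimize(record: dict) -> dict:
    return {k: v for k, v in record.items() if k in NEEDED_FIELDS}

# Retention: delete rows older than policy allows, instead of
# keeping them forever "just in case".
def purge_expired(rows: list, now: datetime, max_age_days: int = 90) -> list:
    cutoff = now - timedelta(days=max_age_days)
    return [r for r in rows if r["stored_at"] >= cutoff]
```

Data you never stored, or already deleted, cannot leak, cannot be subpoenaed, and never needs to be backed up, encrypted, or monitored.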
That brings us back to Walgreens. Last week (April 21) the company announced a seemingly innocuous program. Called Destination-Specific Travel Health Services, the idea is to encourage vacationers and business travelers to turn to their local Walgreens when they plan global travel. The chain could then provide the latest recommended shots and other medicinal items appropriate to the destination.
This makes all the sense in the world coming out of marketing. It's also exactly the kind of program that begs for developers and others in IT, especially security, to slam on the brakes and ask some pointed questions: “Do we really want to store the travel plans (dates, destinations) of our customers?”
Setting aside for the moment the critical question of whether the parties would ever again use such data, let's imagine how many bad things could happen if someone evil accessed it.
Getting back to that critical question, what could we possibly use that data for that could be worth that risk?
Companies need to start thinking through the information they are collecting, putting on the green-tinted glasses of a cyberthief or someone far worse. What could a burglar do with such information? A terrorist looking to identify potential U.S. kidnapping victims? A corporate spy wanting to know an executive’s itinerary?
Then there are the marketers, of everything from airlines and hotels to sightseeing tours. How will a Walgreens customer feel if a visit to the chain for travel shots unleashes a sea of pitches for foreign travel services?
That's another element of the problem. Save it initially and IT is inclined to try to save it long-term. Save it initially and marketing will try to think up ways to sell and otherwise monetize it. Save it and that's one more piece of data that you need to back up, encrypt, protect, and monitor.
Developers need to be the face of rationality, prudence, and balance.
(OK, it's best that you not look at your cubicle mates as you think about this.) Marketing dreams up the ideas, but it won't likely consider the security, privacy, and logistical implications. Developers are good at pointing out the easy realities, along the lines of “You realize that creating such an application would likely require six months of testing and a thousand hours of coding, at a cost to your department of $XXXXXXX?”
Thinking through the other implications of data collection is now your responsibility. Why? No one else seems to be doing it. And when the leak happens and the problems explode, someone in marketing will find a way to blame you. They might not even need to reach beyond “Why didn't you warn me?”