One non-regulatory way to help correct the excesses of Big Tech is for us to become smarter users. Say, hypothetically, an app offers a glimpse at what you might look like several decades down the road, all in exchange for one picture. It seems innocent enough.
Before forking over your image, however, consider the totally random possibility that a Russian company would be granted “a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.” Hypothetically, of course.
FaceApp’s CEO told the Washington Post this week that the company doesn’t “sell or share any user data with any third parties,” although the Post reporter helpfully noted that’s aside “from what it shares with trackers from Facebook and AdMob.”
It’s unrealistic to expect users to scour every app’s terms of service before signing up. And FaceApp may, indeed, have no nefarious plans for the trove of data it’s collected. But come on. We’ve been through this before. Remember all those Facebook quizzes? There’s a reason apps offer these services for free, and it’s not just in the name of fun and games. As Sen. Marsha Blackburn (R-Tenn.) said in a Thursday press release, “Your data is the bedrock of tech companies’ revenue.”
Blackburn and other legislators, such as Sen. Josh Hawley (R-Mo.), have proposed regulatory action in response to Big Tech’s questionable data collection practices. I’m sympathetic to that impulse. But one way to limit the federal government’s intervention would be for users to take measures on an individual basis that disincentivize this type of exploitation.
That, of course, isn’t to minimize the shady data collection practices on which many tech companies rely, or to argue against the potential need for regulation. It’s simply to say that we have enough information at this point to start relying on ourselves to protect our data as well.