The likely demise of Roe v. Wade is putting a new spotlight on privacy rights and personal data. But even as some big tech companies are beginning to try to limit how much data their existing products collect, the industry keeps rolling out new waves of devices and services that scoop up even more personal info.
Why it matters: Any trove of data will sooner or later end up at the other end of a request, or order, to be shared with law enforcement. The newest generation of gear, including autonomous vehicles and always-on cameras, could provide the state with a persistent and omnipresent method of surveillance.
The big picture: “Data minimization” is an ethical guideline, encoded in the EU’s GDPR privacy framework and other regulations, urging organizations to collect only the data they actually need and to keep it only as long as they need it.
Google has taken a number of steps in this direction, including processing more data on devices rather than in the cloud and allowing users to set information to be deleted automatically.

Apple has made data privacy a core principle, encrypting messaging data and limiting access to some other types of data so that only the user with the device can access it.
Be smart: Broader shifts in computing threaten to overshadow those efforts with new classes of products that depend on massive collection of data.
One big trend is building devices around sensors and cameras that are always recording, such as doorbell cameras and autonomous vehicles.

The machine learning algorithms that underlie everything from search engines to speech recognition only work when trained on mountains of data, creating another incentive for companies to build info stockpiles.

“Not only will more data be collected but exponentially more,” says Evan Greer, deputy director of Fight for the Future. “It’s inevitable that mountain of additional data being collected will be abused.”
Driving the news: This week, Vice reported details of the San Francisco Police Department seeking footage from GM-owned Cruise, one of several autonomous car companies whose vehicles are constantly driving around the streets of the city.
The SFPD acknowledged to Axios that it makes requests for footage to investigate specific crimes, but denied it wants to use them for ongoing surveillance.
Civil rights groups say the stakes of collecting such data have also increased, especially in the U.S.
“Those data sources have long been vulnerable to government demands,” says Matt Cagle, a staff attorney at the ACLU of Northern California. However, the rise of AI is “pouring fuel” on the fire, Cagle said, “particularly in this moment when states are criminalizing abortion and gender affirming care.”

“In a world where the Supreme Court might overturn Roe v. Wade, that dystopian hypothetical is no longer hypothetical,” Cagle told Axios.
Of note: Vast amounts of data are also available for purchase online, allowing law enforcement agencies to acquire information they may not be able to get through a court order.
Between the lines: While customers have some say over how their own data is stored on their devices, in many cases these newer technologies are sweeping up tons of data about bystanders and third parties.
With a doorbell camera, it’s the homeowner or renter who buys the device, but the video can capture everyone who passes by. Autonomous vehicles owned by large tech companies like Cruise and Waymo are capturing footage of anyone in their path.
Yes, but: Data stockpiles sometimes serve a public good, especially in documenting human rights abuses.
One example: Four congressional Democrats on Thursday called on Facebook and TikTok to preserve evidence of potential war crimes in Ukraine.
Flashback: Makers of the Flip camera, a portable video device that was popular before smartphones could capture large amounts of video, promoted the idea that more people recording more of their lives would make the world safer. This 2009 video explains it pretty well.