The system, which is just a proof of concept, alarms privacy advocates who worry that even prudent surveillance could easily lead to government overreach or, worse, unauthorized use. It relies on two tools developed independently at Purdue. The Visual Analytics Law Enforcement Toolkit superimposes the rate and location of crimes and the location of police surveillance cameras. CAM2 reveals the location and orientation of public network cameras, like the one outside your apartment. You could do the same thing with a search engine like Shodan, but CAM2 makes the job far easier, which is the scary part. Aggregating all these individual feeds makes the system potentially much more invasive.
Purdue limits access to registered users, and the terms of service for CAM2 state that “you agree not to use the platform to determine the identity of any specific individuals contained in any video or video stream.” That is a reasonable step toward protecting privacy, but a difficult one to enforce (though the team promises the system will have strict security if it ever goes online).
“I can certainly see the utility for first responders,” says Dave Maass, an investigative researcher with digital rights group EFF. “But it does open up the potential for some unseemly surveillance.”
Beyond the specter of universal government surveillance lies the risk of someone hacking the system. To Maass, it brings to mind the TV show Person of Interest and its band of vigilantes who tap government cameras to predict and prevent crimes. This is not so far-fetched. Last year, the EFF discovered that anyone could access more than 100 “secure” automated license plate readers. “I think it becomes a very tempting target,” says Gautam Hans, policy counsel at the Center for Democracy & Technology. “Thinking about security issues is going to be a major concern.”
Granted, the system does not tap private feeds, nor does it peer into private spaces like someone’s home. But aggregating this data and mapping it against specific crimes or emergencies is troubling. Hans says there’s no way of knowing when someone violates the terms of service and targets an individual, and the patchwork of regulations governing how agencies can use such technology is no guarantee against government overreach.
Still, Hans is pragmatic and realizes the Purdue researchers have a noble goal. “At a certain level there’s only so much you can do to prevent the march of technology,” he says. “It’s not the best use of our time to rail against its existence. At a certain point we need to figure out how to use it effectively, or at least with extensive oversight.”