Data Privacy Archives - San Francisco Public Press
Independent, Nonprofit, In-Depth Local News

Exploring Privacy of Coronavirus Exposure Notification System (Jan. 27, 2021)

Millions of Californians have gotten a push notification asking them to opt in to CA Notify, which warns them if they have been in close proximity to someone who later tested positive for the coronavirus. Gennie Gebhart, the activism director with the digital civil liberties group Electronic Frontier Foundation, explained to “Civic” how the system works and what information is exchanged.

CA Notify, in the form of an app that Android users must download and an interface that iPhone users can activate without downloading anything, uses Bluetooth to estimate how close devices are to one another. Phones with the system activated don’t send out identifying information, but instead broadcast a rotating string of random characters. If someone with the system active on their phone tests positive, they report the infection to the CA Notify system with a health care provider’s help. Other devices that had recorded one of the character strings that person’s phone had been broadcasting would then alert their users to the potential exposure.

While Bluetooth is not a new technology and was not designed for this purpose, Gebhart said using it, rather than location data, protects privacy.

“Location data from GPS or cell towers isn’t good at seeing how close you are to someone. But it’s very good at exposing where you’ve been, and what you might have been doing and who you might have been with,” she said. “So it is a big benefit that the system is using Bluetooth, really the most appropriate and promising approach to figuring out if you’ve been meaningfully close to people without exposing where you’ve been.”

Another privacy-protecting aspect of the system is that any information exchanged is not stored in a central database. Instead, the anonymous device identifiers are stored locally on the devices they have been close to.

“There’s not one central authority where all the identifiers are getting sent all the time where, for example, law enforcement or ICE could have access and pick through where people have been or who they’ve been with,” Gebhart said. 
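The exchange Gebhart describes can be pictured with a short sketch. The Python below is purely illustrative: it captures the decentralized idea of rotating random identifiers and on-device matching, not the actual Google/Apple Exposure Notification framework that CA Notify is built on, which derives its rotating identifiers cryptographically from daily keys. The class and function names here are hypothetical.

```python
import secrets

# Illustrative sketch only: each phone broadcasts short-lived random tokens
# and keeps a local list of the tokens it has heard from nearby phones.

def new_rolling_identifier() -> str:
    """A random, non-identifying token broadcast over Bluetooth."""
    return secrets.token_hex(16)

class Phone:
    def __init__(self) -> None:
        self.sent_tokens = []    # identifiers this phone has broadcast
        self.heard_tokens = []   # identifiers heard nearby, stored only on this device

    def broadcast(self) -> str:
        token = new_rolling_identifier()
        self.sent_tokens.append(token)
        return token

    def hear(self, token: str) -> None:
        # Nothing is uploaded to a central database; the device keeps its own list.
        self.heard_tokens.append(token)

    def check_exposure(self, published_positive_tokens: set) -> bool:
        # After a positive test, a user (with a provider's help) publishes only
        # the tokens their own phone sent; every phone checks that list locally.
        return any(t in published_positive_tokens for t in self.heard_tokens)

# Usage: two phones spend time near each other, and one user later tests positive.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())                  # Bluetooth proximity exchange
positive_tokens = set(alice.sent_tokens)     # Alice reports a positive test
print(bob.check_exposure(positive_tokens))   # True -> Bob is notified
```

The privacy-relevant design choice is that matching happens on the listener’s own phone: the only data ever published is the list of tokens a consenting, infected user’s phone sent out, which by itself identifies no one.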

Around 8 million of the state’s 40 million people had opted into the system as of early January. Though more research is needed to determine how much adoption affects effectiveness, more participation would likely increase the system’s ability to help slow the spread of the virus. And Gebhart said it is important to consider access to technology and services when evaluating the system as a tool to help curb the virus. Some of those who have been most affected by the pandemic, like low-income families and elderly people, are also less likely to have consistent access to a smartphone. An exposure notification may also be useless to a person who does not have access to coronavirus testing or is unable to quarantine. 

“This is not a techno-magic bullet that’s gonna save us,” Gebhart said. “This is the cherry on top of those health care, public health fundamentals and basics that you need to get right before this can be useful.”

A segment from our radio show and podcast, “Civic.” Listen at 8 a.m. and 6 p.m. Tuesdays and Thursdays at 102.5 FM in San Francisco, or online at ksfp.fm, and subscribe on Apple, Google, Spotify or Stitcher

Neighborhood Anti-Crime Surveillance Effort Prompts Privacy, Equity Concerns (Dec. 11, 2020)

On several streets in the Mission, you can spot sticky notes in the windows of some homes. They’re blank, but they send a message: the residents want to participate in a neighborhood effort to address crime, trash and visible homelessness. Part of this coordinated effort between neighbors and police is the suggestion that residents install surveillance cameras from Ring, an Amazon company. Nuala Bishari reported on the somewhat secretive initiative for the San Francisco Public Press. She talked with “Civic” about what she found and how she learned it.

“I think one of the big issues, really, when we’re looking at private surveillance is the fact that legislation is still catching up to protecting the rights and civil liberties of people. And this is something that we see a lot in San Francisco, that the tech appears and then we kind of scramble a couple years later to figure out a way to legislate around it. And Ring is a really perfect example for that.”

— Nuala Bishari

A segment from our radio show and podcast, “Civic.” Listen at 8 a.m. and 6 p.m. Tuesdays and Thursdays at 102.5 FM in San Francisco, or online at ksfp.fm, and subscribe on Apple, Google, Spotify or Stitcher

Why Law Enforcement Should Publicize Surveillance Policies, Procedures (Aug. 6, 2020)

OPINION

Most of the time, the video cameras we walk past every day are innocuous fixtures in the background. But because they’re so prevalent, when law enforcement officials gain access, webs of security cameras can be used as surveillance networks to monitor the everyday behavior of a massive number of city residents – and in many cases, the public would know little to nothing about it.

Studying the surveillance technology in use by law enforcement in the Bay Area has led us to believe camera registries and networks are so prevalent that residents could rightly question whether their purpose is for surveillance instead of security. But uncovering how and when these cameras and other technologies are being used is not easy.

While law enforcement agencies are technically complying with a recently enacted transparency law, three of the region’s largest police forces disclose their capabilities in formats that make it difficult for a layperson to learn which technologies are in use, and how, without already knowing where to look.

Listen to the authors of this piece at a recent Public Press Live webinar.

Camera surveillance ubiquitous

Camera networks and registries are pervasive in the Bay Area, which means people going about their everyday lives in communities that employ these tools may not realize how likely it is that their activities are recorded and retained by a system designed to fight crime and enforce laws.

Camera networks vary in ownership. Camera registries, through which residents voluntarily register their personal video surveillance systems with local law enforcement, are rarely properly documented under Senate Bill 978, a recently enacted transparency law. Other networks are managed directly by law enforcement.

We counted 36 camera networks in the Bay Area, of which 29 are directly run by law enforcement agencies. Some are provided and run by private entities. One such example was detailed in a recent New York Times article on a wealthy San Francisco resident who bankrolled an expanding network of privately owned security cameras. The Electronic Frontier Foundation revealed in July that the San Francisco Police Department was granted remote access to this private network to surveil protesters — counter to local privacy law, which would have required city permission, the foundation alleged.

The camera network operated by the Moraga Police Department in Contra Costa County is an example of smaller, more targeted use of surveillance cameras. According to documents published by the Moraga Community Foundation in April 2017, cameras were installed at strategic locations and accessed by police investigators only in relation to solving a crime.

Most law enforcement agencies fail to specify when they will delete the data they collect through these networks. For example, the Fairfield Police Department states in its policy manual that data should be retained for a minimum of 30 days. The Cloverdale Police Department states recordings must be stored for a minimum of one year. Neither policy manual specifies a limit for how long the agencies can hold recordings that are not being used in legal proceedings.

Camera registries, which law enforcement agencies can only access by calling on a resident who has voluntarily registered their video system, are often framed as an ideal example of community-law enforcement agency partnerships and are touted as strengthening the relationship between citizens and law enforcement.

In our search, we were unable to locate lists of participants to understand who is signing up for such partnerships, or how many people participate. Just three of the 32 agencies that we know use camera registries have disclosed policies detailing how these programs are governed, as SB 978 requires. That means residents under the jurisdiction of the 29 agencies with no published governing policies have no way to learn more about the camera footage that could be used to surveil them.

Orinda Police Department’s website for their camera registry program. Captured July 19, 2020.

Body cameras, license plate readers top list of surveillance technologies

To address this lack of transparency, California Governor Jerry Brown signed Senate Bill 978 into law in October 2018. It requires, by January 1, 2020, that each of California’s local law enforcement agencies conspicuously post their standards, policies, procedures and training materials on their official websites, to provide their communities with a comprehensive understanding of what capabilities and actions they should expect from their police forces.

While this information was not secret before SB 978, access required specific and individual requests through the California Public Records Act. Senator Steven Bradford (D-Gardena) authored SB 978, stating that he hoped, among other benefits, the law would “build mutual trust as well as improve police accountability.”

We used both open source media like newspapers and law enforcement agencies’ own SB 978 documentation to examine and record the surveillance techniques and capabilities employed by law enforcement agencies in the Bay Area. We studied 80 law enforcement bodies, including some that are part of universities or community benefit districts, which are not subject to SB 978’s disclosure requirements. Raw data from all 80 agencies provide an aggregate view of how law enforcement operates throughout Alameda, Sonoma, Solano, Contra Costa and San Francisco counties. We created a spreadsheet and map meant as a tool for anyone doing specialized research on police surveillance in this region.

We found that the most widely used technologies are body-worn cameras, automated license plate readers, camera networks and camera registries, as seen in the table below. Of the 80 agencies studied, 85% employ camera networks, camera registries or both. We also found that three of the region’s largest police forces disclose their capabilities in ways that make finding relevant information tedious and difficult.

Agencies using each technology:
Body-worn cameras: 55
Automatic license plate readers: 41
Camera networks: 36
Camera registries: 33
Drones: 16
Gunshot detection: 9

Disclosures bring only limited transparency

Some agencies’ disclosure documents comply with the letter, but not the spirit, of SB 978. The majority — 51 of the 67 law enforcement agencies examined — simply published their manuals in Portable Document Format (PDF) on their websites, with the text recognizable to a browser’s search function, making it easy to search specific terms like “Body Camera” or “Automated License Plate Reader.” A digital search for key terms encompassed the entire publication and required no familiarity with naming conventions or file format.

But 16 agencies sectioned their manuals into separate PDF documents and published links to each section. This requires an outsider to correctly guess which section to open for searches, and ultimately results in repetitive searches across all the published sections. In order to view the Oakland Police Department’s policy on body-worn-cameras, for example, a user must successfully navigate a hierarchy of folders and specialized headings.

Oakland PD’s partitioned interface. Each icon leads to a searchable PDF or another folder. Berkeley’s and San Francisco’s police departments are similarly formatted.
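For researchers confronting partitioned manuals like these, the keyword searches described above can be automated once the policy documents are downloaded. The sketch below is a rough illustration, not the procedure used for this analysis: it assumes the PDFs sit in a local folder and contain extractable text (scanned documents, like the San Francisco Sheriff’s, would first need optical character recognition), and it uses the third-party pypdf library; the folder and file names are hypothetical.

```python
# Rough sketch: scan every downloaded policy PDF for surveillance-related terms,
# so a partitioned manual does not require guessing which section to open.
# Requires: pip install pypdf
from pathlib import Path
from pypdf import PdfReader

KEYWORDS = ["body-worn camera", "body camera", "license plate reader", "drone"]

def search_policy_pdfs(folder: str) -> None:
    for pdf_path in sorted(Path(folder).glob("*.pdf")):
        reader = PdfReader(str(pdf_path))
        for page_number, page in enumerate(reader.pages, start=1):
            text = (page.extract_text() or "").lower()
            for term in KEYWORDS:
                if term in text:
                    print(f"{pdf_path.name}, page {page_number}: '{term}'")

# Example with a hypothetical folder of downloaded SB 978 disclosure sections:
# search_policy_pdfs("oakland_pd_policy_sections")
```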

The San Francisco Police Department, too, partitions its required publication. Meanwhile, San Francisco also has a Committee on Information Technology that is required by city ordinance to publish a page listing all of the surveillance technologies employed by all city departments. The committee’s list of technology used by SFPD is extensive but explicit and includes a reference to Pen-Link, a communications company that provides law enforcement agencies with technology that intercepts electronic communications. But we were unable to find any explanation of this technology in the SFPD’s policy publications, although we may have missed it in our manual search through its partitioned interface.

Oakland’s and Berkeley’s police publications are similarly arranged, and thus similarly opaque. The San Francisco Sheriff’s Department is considerably worse. The sheriff’s policies, pictured below, are both partitioned and stored as scanned, non-searchable images without a usable index or table of contents, requiring any researcher to manually scan more than 650 pages of content.

San Francisco Sheriff’s publication: more than 650 non-searchable scanned pages, partitioned into two PDF documents. Note that the table of contents lists dates instead of page numbers.

We have criticized the law enforcement agencies of San Francisco, Oakland and Berkeley for how they have complied with Senate Bill 978 because these agencies could easily improve how they present their policies to their communities.

For greater trust and accountability, police departments that rely on surveillance camera footage should articulate how their officers obtain that footage, how they use it, and who gets access. Furthermore, we recommend that all agencies publish their policies in a way that an uninitiated outsider can navigate. These are simple revisions that might bring greater accountability and trust to about 1.5 million Bay Area citizens.

Shelby Perkins and Craig Nelson are graduate students at Stanford University’s Freeman Spogli Institute, where they study international policy. The views expressed here are their own.  

S.F. Police Accessed Private Cameras to Surveil Protesters, Digital Privacy Group Reveals (July 28, 2020)

When a tech executive helped bankroll a private network of security cameras in San Francisco, it was touted as crime-fighting technology that would not be directly in the control of law enforcement. But a report from the Electronic Frontier Foundation, a digital privacy advocacy group, shows that the San Francisco Police Department gained remote access to this private camera network for days at a time during protests in late May and early June. The privacy group says that access was a violation of San Francisco law.

The camera network in question is managed by the Union Square Business Improvement District. Emails obtained by the foundation show that the district received, and approved, a May 31 request from SFPD for 48 hours of remote access to the cameras. Two days later, police personnel made a request to extend that access “through the weekend,” when demonstrations were planned that “we worry will turn violent again.”

While the emails don’t offer further details about the kind of access officers were granted or how they used it, on June 10 an officer wrote district staff an email thanking them for “assisting us with our request for the use of your cameras during this period of civil unrest and rioting.”

Critics describe such cooperation as problematic. “There’s a history in this country of surveillance technology being not just surveillance technology, but surveillance in general being used to chill dissent,” said Dave Maass from the foundation in an interview with “Civic” earlier this month about law enforcement agencies surveilling protesters.

Reports by the New York Times and ABC7 have probed the relationship between private camera networks like this one and the police department, suggesting that private networks might be more palatable to privacy advocates who balk at extended law enforcement surveillance capacity. Both outlets spoke with Chris Larsen, co-founder and executive chairman of the cryptocurrency firm Ripple and one of the network’s financial backers.

“The police can’t monitor it live, it’s actually against the law in San Francisco,” Larsen told ABC7. “They have to make a request just like anybody else.”

SFPD may not have inherent, on-demand access to live feeds, but their request for access was granted in a timely manner. And Maass and Matthew Guariglia from the Electronic Frontier Foundation argue that this is a violation of a privacy law passed in San Francisco last year requiring law enforcement to engage in a public process to get approval to access surveillance technology managed by third parties.

Hear Dave Maass speak at a recent Public Press Live webinar on surveillance.

“SFPD’s unfettered and indiscriminate live access for over a week to a third-party camera network to monitor protests was exactly the type of harm the ordinance was intended to protect against,” Maass and Guariglia wrote.

Sergeant Michael Andraychak said the police department needed time to research a request for comment on allegations the camera access violated the law and could not respond by the time of publication.

Caches of footage are also being used by the business district and the SFPD to pursue criminal charges. The foundation reported that the police department asked for, and received, 12 hours of footage from all the cameras in the Union Square Business Improvement District’s network during the night of May 30 to the morning of May 31, when looters damaged several businesses in Union Square. On June 2, the San Francisco Business Times reported that the business improvement district was using this footage in collaboration with the police department to pursue some 36 criminal cases against suspected vandals.

In his interview with “Civic,” Maass discussed the prevalence of digital surveillance. “This is something that happens in our society, even if it’s not legal, even if it feels like it’s unconstitutional,” he said. “It’s important to keep an eye on it so that when it does happen, you can stop it or hold the perpetrators accountable.”

“Civic” spoke with Cyrus Farivar, a reporter on the tech investigations unit of NBC News in San Francisco and author of “Habeas Data,” about law enforcement surveillance of protests. He said one concern about expanded surveillance capacity is that people engaged in perfectly lawful activity get caught up in broad data gathering by tools like automated license plate readers and cameras. Someone doing something entirely legal — like protesting — may still be uncomfortable with the government having a record of that activity.

“This type of technology, surveillance technology, is getting more sophisticated by the day,” Farivar said. “As time goes on, I think it’s incumbent on all of us to ask questions, whether we are trained journalists or not, to find out what kinds of surveillance are being used in our cities and our counties and our states and our nation.”

— Additional reporting from Ciara Long

Tracking Surveillance Tech Used by Bay Area Law Enforcement (July 9, 2020)

From automated license plate readers to drones to devices designed to identify gunshots, law enforcement agencies use a variety of tools to gather data. Many are visible, if not immediately obvious to casual passersby. Dave Maass, senior investigative researcher with the digital privacy nonprofit Electronic Frontier Foundation, has been keeping a close eye on the proliferation of surveillance technology and helps educate the public on how to identify it. 

Stanford students Craig Nelson and Shelby Perkins have been researching which law enforcement agencies in the Bay Area use which technologies and mapping the results. Nelson and Perkins have also been tracking whether and how well agencies are complying with a state law that requires them to publish their standards, policies, procedures and training materials online.

“When people are going out into the world, we are now constantly surrounded by surveillance technology, and it has become somewhat invisible to us even though it’s just right there in front of our eyes. There’s a history in this country of surveillance technology being used to chill dissent, whether that’s the FBI going after Martin Luther King or police in the Los Angeles area sending undercover cops into college classrooms to spy on students and professors. This is something that happens in our society even if it’s not legal, even if it feels like it’s unconstitutional. It does happen, and it’s important to keep an eye on it.”

— Dave Maass 

Law Enforcement Monitors Protesters, Reporting Shows (July 8, 2020)

With hundreds of thousands of people taking to the streets in recent weeks to protest police killings and racism, the law enforcement response has been highly visible. But in less obvious ways, law enforcement officers also gather information about protesters both online and in public. Cyrus Farivar, a reporter on the tech investigations unit of NBC News in San Francisco and author of “Habeas Data,” has covered some recent cases in which law enforcement surveillance of social media posts about protests has resulted in real-life enforcement actions, including arrest by the FBI. Read Farivar’s reporting at NBC News.

“I think for most of us we understand, like, OK: The police are looking for one criminal mastermind and they’re taking extraordinary measures to go after one person. That’s one thing. It’s an entirely different thing when people are peacefully assembling and the government is using something like a license plate reader or a drone or something else that can monitor who is there, whether that’s through facial recognition, whether that’s through a device that’s gathering data off of cell phones, whether that’s something else that we can’t even imagine yet.”

— Cyrus Farivar

A segment from our radio show and podcast, “Civic.” Listen at 8 a.m. and 6 p.m. Tuesdays and Thursdays at 102.5 FM in San Francisco, or online at ksfp.fm, and subscribe on Apple, Google, Spotify or Stitcher

Consumer Reporting Firms Fought for a Year to Exempt Data From California Privacy Law (Feb. 13, 2020)

Even though federally regulated consumer reports were already exempted from California’s ambitious new privacy law, the companies that sell them spent much of the last year engaged in an as yet unsuccessful lobbying effort to prevent individuals from opting out of sharing their own data from the firms’ databases.

That’s in part because they have diversified beyond consumer reports and credit scores and into the creation of personal profiles based on online information that is less well regulated and critics of the industry call intrusive.

Records of lobbying by the industry show these kinds of requests continued throughout 2019, up to the eve of the law’s implementation on Jan. 1 this year. While legislation the companies backed to pare back consumers’ rights to their data failed in 2019, they have argued that they can refuse to disclose information about consumers in part to protect “the security of the business’s systems or networks.”

The arguments they made, in correspondence among the thousands of pages of response to the law, also exposed a little-known aspect of the business: Consumer reporting agencies are transforming themselves into large-scale vendors of unregulated categories of personal information. Those activities now dwarf what was for decades their core business model. They have argued that allowing users to control this data could threaten their ability to provide those services to protect against hackers and thieves.

Much of the lobbying focused on “fraud prevention services,” a category whose definition is highly contested. Debates in 2020 about amendments to the law are likely to grapple with this question as well.

Emory Roane, policy counsel for the San Diego-based Privacy Rights Clearinghouse, said the term fraud prevention sets “an overbroad and unclear standard that gives businesses far too much leeway to refuse to comply with consumer requests.”

The landmark 2018 legislation, the California Consumer Privacy Act, promises to give consumers greater control over their personal information in a variety of ways. It will let people find out what information companies with large databases possess about them, and compel those companies to delete or stop selling their most intimate details. It sets penalties for noncompliance — though the office of California Attorney General Xavier Becerra has said it might not have the staffing to pursue more than a small handful of prosecutions against companies each year.

Last February, Eric Ellman, an executive with the Consumer Data Industry Association in Washington, D.C., flew to Sacramento for a hearing hosted by the attorney general’s office. Ellman proposed that business practices including fraud prevention be clearly exempted from the act, which comes into full force in phases throughout 2020.

Eric Ellman, senior vice president for public policy and legal affairs at the Consumer Data Industry Association, told California officials that the state should give businesses that offer fraud-protection services an exemption from new privacy regulations. Photo: CDIA.

The law already appeared to protect all companies from consumer requests to delete data if they claimed a fraud-prevention exemption. But consumer reporting firms argued that consumers also should be deprived of the right to opt out of having their data shared with third parties. Why? Because allowing opt-outs would create security holes in their databases that would erode everyone’s security.

“This would affect not only the consumer who requested opt-out, but all consumers, as effective fraud detection requires a large volume of data,” said the submission from Ellman’s association to the attorney general.

Jason Engel, an executive with Experian North America, one of the three largest consumer reporting agencies, expressed a concern similar to Ellman’s in a Dec. 6 letter to Becerra’s office commenting on draft regulations. Engel wanted the state to clarify the “scope” of fraud data that would be spared from deletion.

But Ellman and Engel’s work had consequences for more than the industry’s ability to guard against fraud. Publicly available corporate filings show consumer reporting agencies have developed enormous revenue streams that do not come from federally regulated consumer reports, yet use similar types of data.

Like Experian, two other consumer reporting giants, TransUnion and Equifax, are becoming large-scale data brokers, a term describing technology-based companies that use warehouses full of proprietary and publicly available digital records to create profiles of hundreds of millions of people. These profiles are sold to just about anyone for the purposes of targeted advertising, marketing, fundraising or tracking down individuals — raising just the kinds of privacy concerns that the California law is intended to regulate.

This is a far cry from the activities commonly associated with these companies, and which extensive federal regulations have applied for decades: determining credit, housing, employment or other risk factors on hundreds of millions of consumers and selling them in standardized reports.

Mission creep

Consumer reporting agencies now engage in open trade in a broad array of data collection and resale beyond those reports, activities representing as much as 80% of the industry’s overall revenues of more than $10 billion, said Pam Dixon, executive director of the World Privacy Forum, a privacy watchdog group.

While each company is slightly different, Dixon said, they “are typically now hybrid entities that have separate business units” performing services “which can be in many cases more profitable than the consumer reporting agency side of the business.”

Experian’s main market is the United States, and the company has a large presence in California, with its North American regional headquarters in the Orange County city of Costa Mesa. The company’s global CEO, Brian Cassin, said in an earnings call with investors last May that the company was seeing near-record earnings through its expanded business practices utilizing personal data.

“It’s been a very good year for Experian, one of our best in history in fact,” Cassin said. He added that he saw identity verification as “a large and growing market opportunity,” since more businesses across the economy are moving online, increasing their vulnerability to sophisticated fraud techniques.

Brian Cassin, Experian’s global CEO, called identity verification — outside the company’s traditional trade in federally regulated consumer reports — “a large and growing market opportunity.” Photo: Experian.

Consumer reporting agencies have been working for years to diversify the consumer data they sell, said Chi Chi Wu, an attorney for the Washington, D.C.-based National Consumer Law Center and an expert on the federal law that regulates consumer reports.

“The mission creep incentive is to sell more data,” Wu testified at a House Financial Services committee hearing last February.

Ed Mierzwinski, senior director of the federal consumer program at the Washington, D.C.-based U.S. Public Interest Research Group, argued that these companies were improperly trying to attribute their broader data-collection practices to consumer reporting or fraud prevention — anything that’s already exempt from disclosure.

“There are several sets of companies and interest groups that are trying to drive a hole in the privacy protections of the CCPA,” he said, referring to the California privacy law, “and the credit bureaus are the most disingenuous, probably.”

Lee Tien, a senior staff attorney specializing in privacy at San Francisco-based Electronic Frontier Foundation, said that “fraud prevention” might be a catch-all, justifying the collection of more than what’s necessary.

“The word fraud is so broad,” Tien said. “It’s very hard to pin down and yet it’s hard to oppose.”

But Lydia F. de la Torre, adjunct professor of privacy law at Santa Clara University, was more cautious. She said that at least part of the companies’ central activities is legitimately intended to protect the public rather than invade their privacy for profit. “A lot of cybersecurity relies on accessibility to threat intelligence,” she said.

For sale: everything about you

One of Experian’s leading products sold to businesses is a tool named CrossCore. The company calls it its “first open fraud and identity” platform, and it is marketed as a comprehensive package to financial firms and other businesses. CrossCore, according to its website, is supported by nearly 300 experts globally. Experian doubled its users in the last fiscal year and has 133 companies using the platform, it said in its most recent annual financial statement.

Experian is also pushing two other services: PowerCurve and Ascend. PowerCurve reportedly makes predictions on how consumers may react to products, while Ascend makes inferences on consumer behavior with machine learning and artificial intelligence. Experian said PowerCurve alone saw a 60% growth compared with the year before.

Neither Ellman nor representatives for Experian returned any of more than a dozen phone and email requests for an interview. Experian spokesman Jordan Takeyama said in an email that beyond comments shared by Ellman, the industry representative, “we do not have anything to add at this time.”

TransUnion and Equifax also did not respond to requests for clarification on their business services, products or public financial statements.

Experian collects and sells a wide range of consumer data to client marketers, who can use it to craft “campaign messaging that truly resonates with each” of their target groups or customers.

One Experian brochure for marketers touts its data on more than 300 million individuals and 126 million households and businesses, enabling clients to “reach niche markets from children to grandparents, mobile homes to mansions.”

According to Experian’s web site, the company sells basic snapshots of consumers that include estimated mortgage amounts and household spending budget estimates. A snapshot with information on 1,000 consumers costs $137.

Another of its catalogs sorts American households into 71 categories falling into 19 broad groups. It contains data on consumer marital status, income range, age range and presence of a child in the household. It also claims to reveal a consumer’s ethnic group, status as a homeowner or renter, employment status, level of education and current and past addresses.

The catalog is chock full of colorful labels representing distinct target markets: “Flourishing Families,” “Booming with Confidence,” “Singles and Starters,” “Babies and Bliss,” “Golf Carts and Gourmets,” “Colleges and Cafes,” “Small Town Shallow Pockets” and the dourly euphemistic “Economic Challenges.”

A catalog from Experian for a service called Mosaic explains how the company uses its vast databases to sell consumer profiles to marketers, grouping them into colorful categories such as “American Royalty,” “Aging of Aquarius” and “Birkenstocks and Beemers.”

Experian also groups consumers into less fancy-sounding categories based on the data the company collects, including “Mid-Scale Medley,” “Dare to Dream” and “Small Town Shallow Pockets.” Critics say consumer reporting firms have essentially become data brokers, selling marketing data to the highest bidder.

The company also sells the names of expectant parents and families with babies under 3 years old, under a product called Newborn Network in collaboration with Princeton-based marketing data firm ALC. Pricing for the list was not posted on Experian’s website, but for comparison, another marketing list service sold a “pregnant women email/postal/phone mailing list” for $185 for every 1,000 names, with extra charges to filter by income, child’s age and other factors.

Experian even tracks a bevy of seemingly mundane details about consumer habits and inclinations, according to a company document found online: type of motor vehicle owned, hobbies, frequency and destination of travel, preference for movies and TV shows, cellphone usage, number of children and their approximate ages, preference for exercise, dieting patterns, food choices such as vegetarianism, loyalty to brands, preference for frozen and fast food, organic products, political affiliation, position on abortion rights and proclivity to compost food waste.

How they profile consumers

Another Experian catalog for marketers claims that the company’s algorithms can infer whether a consumer has clinical depression or suffers from heart disease, takes a certain brand-name drug or has a dog or cat. It also offers lists based on inferences about whether a consumer has dry or oily skin, or wears contact lenses or glasses.

How do Experian and other data brokers know so much about us? Consumer reporting agencies ingest personal data from social media platforms, said Brett Horn, a senior equity analyst for investment advisory firm Morningstar. They can derive gender and ethnicity from information found on users’ Facebook accounts, including pictures, he said. If such information is used as a metric to extend credit “it could be a minefield,” he said, but the same information used for marketing purposes instead was “not such a problem.”

What they know will surprise you: Experian promotes its ability to sell marketers targeted information about consumers based on a broad swath of demographic information, including taste in food and drink, investment behaviors, buying habits and the presence of children in the home. Image: Experian.

Social media is just one type of information source that data brokers use to build consumer profiles, said Robert Gellman, a Washington, D.C.-based privacy and information policy consultant. Much of that information comes from consumers themselves.

Customers might fill out surveys in exchange for coupons and other free information they find useful. If a consumer visits WebMD to research a medical condition, Gellman said, data brokers “can track your purchases, they can track your web activities.”

“You will find that you can buy a list of people by disease,” he added. “You can get a list of people who aren’t diabetics. You can get a list of people by virtually every disease you’ve ever heard of.”

In 2012, the New York Times reported how retailer Target concluded that a teenager was pregnant based in part on what she was buying. “That kind of inference can be done all the time,” Gellman said.

Industry pressure

The lobbying by the consumer reporting industry seems to have been taken seriously in the state Capitol.

In 2019, Experian spent close to $51,000 on lobbying efforts for over a dozen bills including Assembly bills 1416 and 1355, which were amendments to California’s privacy law.

AB 1416, proposed in February 2019 and sponsored by Sacramento-area Democratic Assemblyman Ken Cooley, would have exempted all data used to create fraud prevention services from public disclosure or deletion. The bill would have let “a business do essentially anything in the name of cybersecurity protections,” said Roane from Privacy Rights Clearinghouse.

AB 1355, sponsored by Assemblyman Ed Chau, a Democrat from the Los Angeles area, patched up broad carve-outs by consumer reporting agencies to California’s privacy law, Roane said. Equifax paid a firm called California Advocates almost $88,000 in 2019 to lobby on this bill and others.

Consumer reporting agencies lobbied well into September, negotiating language on AB 1355 “up until the last minute,” Roane said. The consumer reporting agencies argued that they were “already regulated” and therefore should not need more restrictions, he said.

In the end, neither lobbying effort weakened consumers’ rights to control their data. AB 1416 failed to leave a Senate committee for a full vote. AB 1355 was signed by Gov. Gavin Newsom in October, though it merely added language to the new privacy law confirming that consumer reporting agencies had the same restrictions with the new California law as they did under federal law, Roane said.

“The world’s largest companies have actively and explicitly prioritized weakening” the new California privacy law, said Alastair Mactaggart, cofounder and chair of Californians for Consumer Privacy, in a statement he posted to the web in September to encourage improved legislation. In a 53-page white paper he penned in November, he proposed to strengthen the privacy law through a fall 2020 ballot initiative called the California Privacy Rights and Enforcement Act.

The new ballot initiative will allow consumers to stop companies from using sensitive personal information, such as sexual orientation or precise geolocation “unless it’s necessary to deliver a product or service to you,” Mactaggart said in a recent sit-down interview. Consumers will be able to tell companies “look, you actually can’t use that to advertise. You can’t use that for any reason unless it’s actually necessary to deliver a product that I actually have asked for.”

Messy responses to privacy requests

Since the new privacy law took effect in January, this reporter tried to have her data disclosed, to opt out of its sale and to have it deleted by all three major consumer reporting agencies.

The process was difficult, and it left unclear how each of the companies interpreted its duty to consumers.

Experian made it the easiest to apply online, but after a month, Experian delivered an ambiguously worded statement: “Experian may have shared” personal information with other entities including law enforcement, “Travel, Leisure & Entertainment companies” and “Other.” The company “may have collected” the data from telecommunication firms, “Consumer Inquiries About Experian Products/Services,” and “Other Product Companies Not Categorized.”

Equifax’s online process did not work, and it took three hours on the phone to apply. (Two and a half hours was consumed waiting for an available Equifax representative on a hotline it established for California consumers needing help with their privacy rights. When asked why it took so much time, representative Gabriel replied, “We’re experiencing high call volume.”)

TransUnion’s application process was the worst. It allows consumers to choose disclosure, opt-out or deletion only one at a time. After a request early in January to opt out of sales, a TransUnion representative said in a follow-up call that the company had no record of the opt-out registration.

New State Law Pits Privacy Against Free Speech, Public Records and Data Brokers (July 2, 2019)

For five years, since her doctor raped her at home, one woman in Los Angeles has had no luck getting “people search” sites such as MyLife and Spokeo to stop posting her address. Like many victims of rape, other violent crimes or harassing ex-spouses, she remains traumatized and fearful.

Although she has been circumspect about where she lives, these information brokers continue to post and sell her whereabouts and other details she considers personal and private.

“You don’t want a guy to go on the internet to look up where you live, and you try to protect your safety and you can’t, because you’re completely, easily Google-able,” said the woman, who lives alone and asked not to be identified. She just wants to vanish from the internet, a right that citizens of the European Union enjoy under the General Data Protection Regulation.

But even in the wake of last year’s passage of the California Consumer Privacy Act, the first of its kind in the United States, she and many others with reason to avoid public exposure will remain visible to the world. Though consumers may ask companies to delete or stop collecting data about them, the First Amendment and open-records statutes may thwart their efforts to get these information brokers to delete data after the law takes effect in January.

Personal Data That’s Quite Public

These businesses argue that the information comes from government entities, such as the state and municipal courts and motor vehicle agencies, and is publicly available.

“We actually don’t have an obligation to remove those records,” MyLife.com founder and CEO Jeff Tinsley said in an interview. The exception is for judges and law enforcement. Other removal requests, he said, are considered individually.

That may surprise Californians.

“Many people are under the misconception that there are laws that specifically protect them from having their personal information published online,” said Paul Stephens of the San Diego-based nonprofit Privacy Rights Clearinghouse. “For the most part, there are no such laws.”

‘Unconstitutional Speech Restrictions’

The Software and Information Industry Association, a Washington, D.C.-based trade group representing about 700 companies and organizations, argues that the state privacy law violates the U.S. First Amendment because it contains “unconstitutional speech restrictions.” In a memorandum filed with the California attorney general, the lobbying group said that the law restricts “the dissemination of accurate, publicly available information,” and that if not amended, “it is highly likely to be invalidated in court.”

The Legislature is considering adding language to exempt government records. The sponsor of A.B. 874, Assembly member Jacqui Irwin, a Democrat from Thousand Oaks, said she wanted to make sure the law did not impose an “unconstitutional limitation on the use of public records, which the state and local governments make available to support transparency in government.”

Any limitation on what constitutes a public record must be scrutinized, said David Snyder, executive director of the San Rafael-based First Amendment Coalition. “I don’t think everything ought to be public, but I think you head down a dangerous road when you start saying, ‘Well, it’s a public record except in these circumstances.’”

Other privacy advocates agreed.

“Laws that protect consumer data privacy should be calibrated to not unduly burden free speech,” said Adam Schwartz, senior attorney at the San Francisco nonprofit Electronic Frontier Foundation.

New Law Silent on Emergency Requests

Though Californians will have expanded personal-data protections, information brokers will continue to sell and make available other facts about their lives, said Alastair Mactaggart, a businessman whose self-funded petition drive spurred legislators to propose and pass the new law.

That’s a big weakness, one professional data-scrubber said.

“Oftentimes individuals need to be removed from sites due to doxxing and/or direct threats to their physical harm,” said Will McAdam, founder and CEO of Los Angeles-based Privacy Duck, a fee-based service to remove people from information broker sites.

California’s privacy law has “no provisions for emergency removal demands and no provisions for emergency removal demands without having to turn over evidence such as police reports to private third-party people-search companies, creating more exposure,” McAdam said.

Loophole Over Selling or Displaying Data

He pointed out that there is no clear distinction between the “selling” of personal information and the “display” of it on dozens of sites. “This creates fine lines and loopholes for the people-search companies to take advantage of,” he said.

Federal law may need to be changed.

A starting point would be the Driver’s Privacy Protection Act, which allows the one-time sale of motorists’ information to private investigators or to any company claiming to update its records. State vehicle departments have been the main source of information for companies harvesting and selling people-search information, said McAdam, whose clients include public figures, activists and victims of crime, including identity theft.

New technology brings new forms of crimes, said Erica Johnstone, an attorney specializing in online harassment for Ridder, Costa & Johnstone in San Francisco. Sites listing family members and addresses can be used as “starter blocks” for ruthless cyberstalking campaigns, said Johnstone, who served on the Cyber Exploitation Task Force when U.S. Sen. Kamala Harris was California attorney general. She’s found harassment cases often extending beyond the targeted individual and spreading to family and friends.

Information Reposted After Opt-out Requests

Misinformation is another big problem.

“It’s important to remember that the information these companies compile can be incorrect, out of date or misleading, which may mean you lose opportunities just because someone does a simple search on your name,” said Meghan Land, executive director of Privacy Rights Clearinghouse.

Other privacy advocates pointed to the problem of information brokers reposting or “repopulating” data after being asked to take it down.

Rob Shavell, co-founder and CEO of a Massachusetts privacy firm called Abine, which offers the fee-based service DeleteMe, said “millions” of their opt-out requests showed that MyLife, Spokeo, PeopleFinders, Whitepages, PeekYou, Intelius and others frequently repost information several months after opt-outs.

MyLife Maintains ‘Suppression List’

He said his company, which works with the National Network to End Domestic Violence, has seen data brokers sell lists of abuse victims based on police reports. “Of course no company will admit it and you cannot audit them,” Shavell said.

Tinsley said that to comply with local laws, MyLife.com maintains a state-by-state “suppression list” of people to exclude. He denied that his site reposted data after opt-outs were received.

But McAdam and Shavell challenged Tinsley’s claims. McAdam said that in the past nine years he’s seen “tens of thousands” of Mylife.com profiles reappearing.

A Spokeo spokesperson said in an email that “we pride ourselves on providing one of the easiest opt-outs in the industry.”

“While we cannot comment specifically on CCPA given that the legislature is currently considering amendments, we agree with the spirit behind privacy laws like GDPR and CCPA regarding data transparency and control,” said the representative of the Pasadena-based company.

Background Reports and Reputation Scores

Tinsley launched his business in Los Angeles in 2002 as Reunion.com and later changed the name to MyLife.com. He said its mission is to make dating, home services or other marketplace sites safer and more trusted through background reports and reputation scores.

“What we actually do is we’re focused on helping making people feel safer,” he said.

Privately held MyLife, which is not subject to detailed public disclosures about its business, culls information from state and municipal government sources nationwide that “make data available to others including groups like us,” Tinsley said. “We just pull it all together to make it easier for consumers.”

Information may include age, birthday, past and current home addresses, phone numbers, court records, traffic citations, email addresses, employers, education, photographs, relatives, political affiliations, a mini profile and a section for MyLife’s registered members to rate each other.

MyLife automatically generates a public page for everyone 18 years and older, and says it has more than 300 million such pages — a “complete Wikipedia-like biography on every American.” The Census Bureau estimated the total U.S. population on June 30 at more than 329 million. About a quarter were younger than 18, leaving roughly 250 million adults, tens of millions fewer than the number of pages MyLife claims.

No Explanation For How Details Are Removed

Regardless of membership, these “public pages” cannot be deleted, MyLife’s policy states, adding, “Though if you have extenuating circumstances call our customer care department.”

Tinsley would not explain the guidelines for taking down information. He said victims of rape, domestic violence or other crimes did not have to produce police reports.

Tinsley could not explain how a rape victim’s information still appeared on MyLife.com. “I would need to see proof” that her file had been suppressed and later reposted, he said. “I don’t see how this would happen in our service.”

In a follow-up email he added that “there are a lot of features and benefits that are part of the premium membership which include the ability to have more controls over your Reputation Profile (think of it as a personal homepage that makes you look good) and the ability to help facilitate removal of your entries from other sites that you can’t control at all.” Monthly plans cost $9.95 to $15.95.

“The value of our services is validated by the public,” Tinsley said. “Over the past 12 months our MyLife and our services has been used by more than 160 million people, to learn about others and help manage the way they look online. Most members love our services.”

Complaints and Legal Troubles

The company’s public record shows a history of complaints and legal troubles with consumers and authorities, including allegations of inaccurate information, false claims and advertising, fraudulent credit-card charges and duping users into providing personal information. In 2015, the company paid a $1 million settlement to the City of Santa Monica and consumers for allegedly violating a California law regulating automatic subscription renewals.

The Better Business Bureau reports nearly 7,000 complaints about the company in the past 12 months and more than 10,000 over the past three years. Since 2016, MyLife.com’s rating has risen from F to B-, but the company is not accredited by the bureau.

Though the new privacy law does not cover the removal of crime victim data, the California secretary of state’s “Safe at Home” program offers legal remedies for victims of domestic violence, rape and stalking, said Tara Gallegos, a spokeswoman for California Attorney General Xavier Becerra.

If a home address and home phone number registered with the state program are not removed after a written request, the victim may file a lawsuit and request a court order, she said. Victims can complain to the attorney general or their local district attorney.

Stephens, of the Privacy Rights Clearinghouse, noted that people-search sites or information brokers can be sued for a “deceptive practice” if their privacy policies state they will remove information upon request but do not.

That’s little comfort to the woman from Los Angeles, who said she feels that MyLife and its competitors are victimizing her again.

“All the info sites suck,” she said. “I want to be unknown and be able to fade away. The world we live in today does not allow this. I spend my life constantly combatting it and removing myself. That’s my fight.”

MyLife, formerly Reunion.com, says it has 300 million publicly available pages — a “complete Wikipedia-like biography on every American.”
The last of several screens before viewing MyLife’s report on Donald Trump.
Jeff Tinsley, MyLife founder and CEO. Photo via Twitter
Assemblymember Jacqui Irwin, D-Thousand Oaks, has sponsored legislation that would amend the California Consumer Privacy Act to exempt public records from removal requests.

California Attorney General Plans Few Privacy Law Enforcement Actions, Telling Consumers to Take Violators to Court
https://www.sfpublicpress.org/california-attorney-general-plans-few-privacy-law-enforcement-actions-telling-consumers-to-take-violators-to-court/
Wed, 15 May 2019 21:01:06 +0000

Attorney General Xavier Becerra says his office is ill equipped to prosecute violations of the state’s landmark data-privacy law, which takes effect in January. Only a handful of the most egregious cases will be handled per year. Instead, he wants aggrieved consumers to take violators to court on their own.

Under the California Consumer Privacy Act, which takes effect in 2020, consumers will have the right to opt out of the sale of their personal data to third parties, and request that businesses delete their information. The law calls for noncompliant companies to be prosecuted by the California attorney general’s office and fined up to $7,500 for each violation.

But after months of strenuous lobbying by technology companies, which are increasingly clashing with privacy advocates in Sacramento, it appears California consumers may end up having to fend for themselves. That’s because the office of Attorney General Xavier Becerra says it is ill equipped to prosecute data privacy, and predicts it may be able to handle only a handful of the most egregious cases per year.

Consumers have for years suffered blatant privacy abuses from companies that claim to be responsive to their requests for anonymity.

After seeing their names with birthdays, addresses, phone numbers and speeding tickets displayed online for the world to see, thousands of American consumers have been taking their complaints to the Better Business Bureau to try to remove their information from people-search sites such as MyLife.com, Spokeo and Whitepages. One irate contributor to the bureau’s forum, Mark Perna, 45, said he called MyLife.com, a Los Angeles-based company that has tallied 9,242 complaints, requesting that his information be removed. “If someone wanted to, they could stalk someone,” said Perna, a DJ living in San Diego. After getting no satisfaction, he reported the company to the state attorney general.

More Staff for Just a Few Cases

But he might not get much help from California regulators. Stacey Schesser, supervising deputy attorney general for consumer protection, testified in a state Senate hearing in April about how limited her office’s enforcement capabilities were. She said that even after the planned expansion of her privacy team to 23 people under the governor’s proposed $4.7 million budget for the department, she would still have the ability to prosecute only three cases a year.

That calculation came from reviewing the number of hours spent in the past on similar types of cases. The new staff will have a “multitude of responsibilities” — handling investigations, litigating violations and proposing adjustments to the regulations themselves. Enforcement of the new privacy law can stay effective only if it responds as technology evolves.

The law will cover businesses that fall into any of three categories: those that earn more than half their revenue from the sale of consumers’ personal information; those with more than $25 million in gross revenue; and those selling or sharing personal information of 50,000 or more consumers, households or devices.
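Those three tests are disjunctive: meeting any one of them brings a business under the law. The snippet below is a minimal, illustrative sketch of that applicability check, not an official compliance tool; the class and field names are invented for this example.

```python
# Minimal illustrative sketch (not an official compliance tool) of the
# three-part CCPA applicability test described above: a business is covered
# if it meets ANY one of the thresholds. The class and field names below
# are invented for this example.

from dataclasses import dataclass

@dataclass
class Business:
    gross_revenue_usd: float                   # annual gross revenue
    revenue_from_selling_personal_info: float  # revenue derived from data sales
    records_sold_or_shared: int                # consumers, households or devices

def ccpa_applies(b: Business) -> bool:
    """Return True if the business falls into any of the three categories."""
    majority_data_revenue = (
        b.gross_revenue_usd > 0
        and b.revenue_from_selling_personal_info / b.gross_revenue_usd > 0.5
    )
    large_revenue = b.gross_revenue_usd > 25_000_000
    many_records = b.records_sold_or_shared >= 50_000
    return majority_data_revenue or large_revenue or many_records

# A small firm that earns most of its money selling personal data is still covered.
print(ccpa_applies(Business(2_000_000, 1_500_000, 10_000)))  # True
```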

“The issue here is that there’s 40 million Californians, and the scope of businesses that the CCPA applies to is significantly large,” said Schesser.

But there is wide disagreement about the number of firms that will have to comply with the California Consumer Privacy Act, to say nothing of more than a dozen other proposals in various states of development in the Legislature. There is no official list of companies, and thus no estimate of the portion of the state’s economy that could be affected when the law takes effect.

Records Request Denied

In response to a Legislative Open Records Act request for lists of companies expected to come under its regulatory scrutiny, the attorney general’s office replied that it had no such records. The office also refused to release any documents or correspondence under a request that would shed light on the office’s strategy toward specific companies, including large tech firms and people search companies. Among other reasons, it said “regulation-related work product is confidential and exempt from disclosure.”

Non-governmental estimates of the scope of the regulations suggest that compliance could be a major headache not just for a multitude of California-based companies, but also for those in other states, given the broad wording of the regulation — something industry lobbyists are working hard to change.

A survey by the International Association of Privacy Professionals estimated that half a million U.S. companies could be affected by California’s privacy law, based on Census Bureau data from 2015.

California’s privacy law “is effectively a national law,” said Colin O’Malley, principal of consultancy Lucid Privacy Group and co-founder of privacy app Ghostery. “It’s extremely difficult to carve out a market as large as California. Many folks in the digital media space are going to be treating everybody as a Californian.”

From an international perspective, it’s clear that California has its work cut out for it. Deputy Attorney General Schesser compared the size of the staff she will oversee with that of her Irish counterpart. With a population of 4.8 million — one-eighth of California’s — Ireland maintains a data privacy unit of 140 officers to enforce Europe’s General Data Protection Regulation, enacted last year.

Schesser said the attorney general’s office is “way understaffed” to handle California’s new law, which she called “groundbreaking” because it is the most ambitious privacy legislation enacted in the United States to date.

‘Private Right of Action’ Proposed

The attorney general’s office is in a tough situation because implementing a law “is a totally different skill set” from the investigations or litigation that are the norm, said Travis LeBlanc, partner at Palo Alto-based law firm Cooley LLP and former chief of the Federal Communications Commission’s enforcement bureau. The attorney general’s office is “fundamentally a law enforcement agency,” he said, “not a regulatory agency that is designed around promulgating policy.”

To mitigate its inability to pursue many violators of the new law, the attorney general’s office has endorsed a proposal to allow individuals to challenge violators in the courts.

To ensure that consumer “opt out” requests are not ignored, Becerra worked with state Sen. Hannah-Beth Jackson, a Democrat from Santa Barbara who is chair of the Judiciary Committee, to draft a bill to permit a so-called private right of action in data privacy cases.

“The attorney general is your basic criminal law enforcement and civil rights officer in the state,” Jackson said, adding that she was seeking ways other than to “put the entire burden on the attorney general.”

The bill, SB 561, was voted down in a committee hearing May 16. It would have embraced a key concept in Europe’s privacy law: the ability for consumers to file civil suits. It would also have removed language allowing businesses to avoid fines as long as they “cure” privacy violations within 30 days of being notified that they are not following the rules.

Giving Consumers Power

The original intent of the California Consumer Privacy Act was to give consumers “the ability to control how their information is used and whether or not a company can retain that,” Schesser said, so giving consumers the right to take action against companies represents “fundamental fairness.” She said private right of action would “work in parallel” with prosecution by the attorney general.

Many nonprofit advocacy groups, such as the American Civil Liberties Union, the San Francisco-based Electronic Frontier Foundation and the Sacramento-based Consumer Federation of California, support the idea of expanded rights to civil action.

“For a consumer protection law to have real teeth, it should have a private right of action, which allows for consumers to file a class action,” said Richard Holober, executive director of the Consumer Federation of California. “That is a generally much more impactful deterrent against corporate misbehavior than penalties” that would be enforced by the government.

But SB561 faced an uphill battle. Many organizations, including industry groups, opposed the bill, and it was so contentious that last summer the private right of action was stripped from the original draft of the main privacy law before the Legislature approved it.

‘Unfair Burden on Businesses’

SB561 was stopped in its tracks in the Senate Appropriations Committee. Jackson said she was evaluating her options and considering next steps.

The business and tech industry was pleased to hear the bill was defeated.

“Allowing direct consumer private actions could become a big issue for companies,” said Dan Rosler, vice president for business opportunities at advertising technology firm Flashtalking, based in San Francisco. “Companies could literally be driven into bankruptcy regardless the merits of the cases.”

Jackson had kept the bill alive up to that point by offering to sit down and work out its language with opponents such as the California Chamber of Commerce and the Internet Association, which represents tech giants including Facebook, Google and Amazon.

“This bill would roll back the most critical agreement leading to the CCPA — that this complex, new law would be enforced by a regulator,” Sarah Boot, policy advocate for the California Chamber of Commerce, said at the April hearing. “SB561 would allow thousands of trial attorneys to test a business’ ability to perfectly comply with the complexities of this new law.” Boot called the bill a “significant and unfair burden on businesses.”

Other senators have been skeptical of the bill. Sen. Andreas Borgeas, Republican from Fresno, called it “red meat for trial lawyers” and said that “a ravenous frenzy” of privacy cases brought forth by consumers would have a chilling effect on business.

Sen. Bob Wieckowski, Democrat from Fremont, represents a district that is home to major Bay Area tech companies, including electric automobile maker Tesla. Wieckowski said he was “not comfortable” supporting the bill, although he generally supported a private right of action.

“There are a lot of people who don’t want to let people be able to access their own lawyers” to address privacy violations, Jackson said.

She said she would work with all sides to create an enforcement mechanism that is effective for the “94 percent of Californians who want their privacy rights protected.”

Meanwhile other lawmakers, including state Sen. Thomas J. Umberg, Democrat from Santa Ana, have proposed that district attorneys and city attorneys in large cities support the state in enforcing the new privacy law. Both the attorney general’s office and Jackson said they were looking into such possibilities.

Writing the Rules on Data Privacy in S.F. Could Disrupt the Disrupters
https://www.sfpublicpress.org/writing-the-rules-on-data-privacy-in-s-f-could-disrupt-the-disrupters/
Mon, 25 Mar 2019 15:00:00 +0000

As city officials this spring craft a “privacy-first policy” mandated by voter-approved Proposition B, supporters hope its lofty ambitions will start to become a reality this summer. Already there are signs that the city could move to the forefront of enforcing limits on data collection and reshaping our relationship with technology companies.

Bit by bit, San Francisco has become a place where it’s assumed that tech companies — many of which are based here — are tracking your every move. Until now, it has largely been up to you to find ways to stay off their collective corporate radar.

But instead of having to dig deep into the privacy settings of your phone or an app to opt out of sharing your personal information to do something as pedestrian as, say, rent one of the now-ubiquitous electric scooters, imagine never having to opt in in the first place.

As city officials this spring craft a “privacy-first policy” mandated by voter-approved Proposition B, supporters hope its lofty ambitions will start to become a reality this summer. The local regulations — among the first emerging in cities nationwide — could fill potential holes in the separate landmark statewide privacy law that takes effect in 2020. San Francisco’s initiative might also accelerate efforts to pre-empt cities and states through the adoption of federal standards, which the technology and business lobbies could be expected to water down.

Already there are signs that the city could move to the forefront of enforcing limits on data collection and reshaping our relationship with technology companies.

Disrupting the Disrupters

City leaders have made bold, but so far not very specific, claims about their ability to limit the personal-information free-for-all that is at the heart of the business model for data brokers, many startups and other digital enterprises.

Since passage of the ballot measure in November, supervisors have introduced legislation to restrict or ban surveillance and facial recognition. But the idea of being able to use online services in San Francisco without providing any personal details is the most controversial proposal. It is also potentially the most transformative and disruptive to an industry that has reaped riches from “disruption.”

Previously: S.F. Voters Want Tough Data Privacy Rules, But Obstacles Loom

Related: Why Privacy Needs All of Us

As officials work to add teeth to the larger privacy framework, critics warn that aggressive regulations could chase companies that make their money mining Big Data out of San Francisco, or into the courts to battle regulators.

“I think it’s going to be very difficult to change the business model that underlies a lot of current services, because it enables things to be free,” said Kelsey Finch, an attorney at the Future of Privacy Forum, a nonprofit think tank and advocacy group based in Washington, D.C. “It’s hard to have both free products and services and not provide any data. And that will be a logistical hurdle. It may be a political hurdle.”

Brian Hofer, who chairs the Oakland Privacy Advisory Commission, agrees. “If the restrictions are perceived as being too cumbersome for the private industries, I think there’s going to be a lot of pushback,” he said.

Strict regulations would have a “far-reaching impact in a lot of different technologies,” said Cynthia Cole, a privacy attorney with Palo Alto-based Baker & Botts, which represents more than half the Fortune 500 companies. She cited geolocation and biometric data as rapidly expanding fields that could be drastically affected and whose stakeholders could respond accordingly.

“I think you could have a dissuasive impact on companies saying, ‘Are we going to put our offices in San Mateo or Oakland, or are we going to put it in San Francisco?’” Cole said. “And they might say, ‘OK, well, San Francisco has these laws that make it even more stringent, even stricter for us with respect to our business model. So, we’re going to just implant ourselves right outside San Francisco instead.’”

Challenging Big Tech

The first warning shot came just a month after city voters endorsed the 11 “guiding principles” in Proposition B, which are the foundation of the privacy policy. The measure covers data collected by city agencies, anyone with a city license, permit or contract, and anyone receiving city funding.

In December, District 1 Supervisor Sandra Lee Fewer introduced an ordinance to require stores to post signs and notify the city if they are using surveillance cameras to monitor, track or collect data on shoppers. Standard security cameras were exempted.

The proposal, which included unspecified “administrative penalties” for noncompliance, was sparked by a new breed of stores like partially automated Amazon Go, which have cameras that “can sense if a customer reaches out and grabs an item and puts it back down,” said Fewer aide Angelina Yu.

“We felt that it was important to be transparent and upfront that that type of data was being collected and analyzed,” she said. “It’s inferring a lot more data than just a simple transaction.

“We’re hoping this will dovetail with a lot of the efforts around ‘privacy first,’” said Yu, adding that Fewer’s office planned to meet with Amazon. “We are at this time where it’s kind of a Wild West and there’s an entirely new range of technologies and spaces where privacy questions are really impacting the real world and our everyday movements and transactions.”

In February, Fewer backed off advancing her measure “at this time” but did not say why.

Taking Aim at Surveillance

District 3 Supervisor Aaron Peskin, who sponsored Proposition B, followed Fewer’s proposal in late January by introducing the “Stop Secret Surveillance Ordinance.” It would require that the Board of Supervisors approve any city department request to acquire new surveillance technology, or use current surveillance equipment in a new way to monitor or track residents and visitors. It would also ban facial recognition by city departments, setting up San Francisco to be the first U.S. city to do so.

The proposal is in line with what Sameena Usman, of the Bay Area chapter of the Council on American-Islamic Relations, was hoping for last year, when her group endorsed Proposition B. Usman told the Public Press in October that she saw the city Charter amendment as “a stepping stone” toward a surveillance ordinance. Though Peskin’s plan does not directly cite the privacy-first policy, a fact sheet distributed by his office called it “an implementation vehicle” of Proposition B.

Peskin legislative aide Lee Hepner called it a way to “build on the mandate of the voters last November to protect their personal information, to protect them from unwarranted intrusion into their private worlds.” Down the line, he said, supervisors might also look at banning facial recognition at private events that receive city permits for the use of public space.

Once the privacy policy is in place, it is expected to provide a legal platform on which supervisors might introduce dozens of other transformative regulations applying, as Proposition B calls for, to city agencies, contractors and any entity that receives permits or payments from the city.

Taking Aim at Business Models

But privacy advocates in City Hall envision an expansive agenda moving forward, which if pursued could become a significant headache for business, especially local tech giants that grow by making acquisitions. Hepner said Peskin’s office is “trying to look at data privacy regulation as somewhat of an antitrust tool.” The goal would be to prevent the accumulation of massive amounts of data by large companies that purchase startups solely for that purpose.

“If you are a permittee of the City and County of San Francisco and you are acquired by Amazon or Google or Uber or Lyft, that should force a reevaluation of your permit … to see how policies are changing and shifting relative to the exchange and use of people’s personal information,” Hepner explained.

The city’s policy “does absolutely threaten the business models of some companies,” said the Electronic Frontier Foundation’s director of grassroots advocacy, Shahid Buttar, who evaluated Proposition B as it was being drafted last year. “Companies will have to let the users decide how they are to participate and how they are to be monetized, as opposed to imposing their own interests on users that want to use their platforms.”

The threat of relocation could be private industry’s primary negotiating tactic.

“There are ways to put pressure on cities that you don’t have that option on a federal level or less so even on a state level,” said attorney Cole, who has numerous gig-economy and other tech clients that expect to be affected by San Francisco’s regulations.

Seattle’s Experience

Seattle has developed one of the nation’s more robust citywide privacy regimes. In 2015, it created privacy principles to guide city government and companies with city contracts.

Seattle’s chief privacy officer, Ginger Armbruster, said that occasionally the city has to tell a vendor “this just doesn’t work for us if you want to do business with the city.” But unlike what San Franciscans envision, Seattle’s privacy policy is just advisory and not legally binding. That means with tech giants like Google and hometown Amazon, the city has had to bend.

Some private entities have an attitude of “we take data, we use it, we monetize it, we share it, we do what we want with it because we’re a large company and we don’t have to negotiate that,” Armbruster said.

“It becomes complicated when we deal with large companies who say, ‘These are our data practices, love it or leave it.’ There’s really no room for discussion,” she added. “There are times when some of those data practices can be at odds with what we’re hoping to do with public data. If we had a choice, we would rather do X, but here we are doing Y because we need to provide the service.”

Companies Over-Collect Data

Armbruster said transportation-focused companies are especially guilty of over-collecting data not relevant to the services they provide. She said some of her conversations with startups have bordered on comical.

“We ran into that all the time. Like, ‘Why are you collecting this information?’ ‘Well, we just want it.’ ‘Well, I can understand why you want it, but, you know, help me with what you’re doing with it, we have to be specific,’” Armbruster recalled. “And you know, we didn’t feel like we have such a mature process for privacy that we should be talking to a private entity that doesn’t have that figured out. But those are the questions you have to ask, and you have to make some real trade-off decisions about that.”

Oakland’s privacy commission, which advises the city on collection methods and storage of residents’ data, has seen the same patterns in which a private partnership “promises to make our life better, whether it’s measuring parking spaces or traffic … and there’s always some sort of trade-off,” said Hofer.

“Usually they offer us this service for free or at very low cost, but it’s in exchange for our data, and we want to know how that data’s being used, where it goes, what the purpose is, and put some reasonable guidelines into place,” he added.

“Some of these proposals come to us pretty thin,” Hofer said. “There’s not a lot of analysis. They haven’t really studied the impact, they’re just trying to figure out how to get large volumes of data and go sell it to their customers, to third parties. The tech mentality is just ‘if we can collect it, let’s collect it,’” he said.

“Whereas we’re coming a little bit more from the legal civil liberty side: ‘Should we collect it?’”

Hofer said he hopes privacy-first policies cause companies to ask themselves, “What data do I need to achieve that purpose, and what is the minimum amount of retention to hold the data until I can delete it safely?”
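As a concrete illustration of those two questions, here is a minimal sketch of a purpose-based minimization and retention rule. It is not drawn from any city policy or vendor system; the parking-count purpose, the field names and the 90-day window are all assumptions made for the example.

```python
# Illustrative sketch of a purpose-based data-minimization and retention rule
# in the spirit of Hofer's two questions. Everything here is hypothetical:
# the parking-count purpose, the field names and the 90-day retention window
# are assumptions made for the example, not taken from any real policy.

from datetime import datetime, timedelta, timezone

NEEDED_FOR_PARKING_COUNTS = {"space_id", "occupied", "timestamp"}  # assumed purpose
RETENTION = timedelta(days=90)                                     # assumed window

def minimize(record: dict) -> dict:
    """Keep only the fields required for the stated purpose."""
    return {k: v for k, v in record.items() if k in NEEDED_FOR_PARKING_COUNTS}

def expired(record: dict, now: datetime) -> bool:
    """True once the record is older than the retention window and should be deleted."""
    return now - record["timestamp"] > RETENTION

now = datetime.now(timezone.utc)
raw = {
    "space_id": "A-12",
    "occupied": True,
    "timestamp": now - timedelta(days=5),
    "license_plate": "7ABC123",  # not needed to count occupancy
    "driver_photo": b"...",      # not needed either
}
kept = minimize(raw)
print(sorted(kept))        # ['occupied', 'space_id', 'timestamp']
print(expired(kept, now))  # False; deletable once the window elapses
```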

Big Data, Big Profits

For tech companies, of course, data equal profit. Facebook, Google, Experian, CoreLogic, Acxiom and other data harvesters have been joined by thousands of large and small brokers that buy and sell, or give third parties access to, personal information. International Data Corp., which provides market research for technology and telecommunications companies, has projected that in 2020, Big Data and business analytics will generate more than $210 billion in revenues worldwide.

Users’ personal information can be the most valuable asset some startups have and the foundation of their business model to attract investors. Hepner said one major goal of San Francisco’s privacy policy is to limit the jackpot their founders are seeking.

“If you’re providing a service to the public — an alternative transit service, for instance — that’s great and we want you to provide that service,” said Hepner. “But if the profit motive of your business is the acquisition of people’s personal information, or the potential misuse and abuse of that personal information, and your business model is actually like a data broker and facilitating data-broker monopolies, then that’s something that we definitely want to put up some safeguards.”

In his February State of the State Address, Gov. Gavin Newsom congratulated the Legislature for passing the nation’s first state privacy law, saying that “companies that make billions of dollars collecting, curating and monetizing our personal data have a duty to protect it. Consumers have a right to know and control how their data is being used.”

He then proposed a “data dividend” for Californians, “because we recognize that your data has value and it belongs to you.”

“California’s consumers should also be able to share in the wealth that is created from their data,” he said.

A push for standardized, potentially more conservative, national legislation could also be a way for technology giants to rein in far-reaching local or state regulations.

“My hope is that we will get to federal legislation before we get 48 different privacy state laws,” Kalinda Raina, head of global privacy at LinkedIn, said at a January event in San Francisco marking national Data Privacy Day. “It’d be wonderful to see one law that is operated at the federal level that is consistent and brings the ability for companies to plan throughout the U.S. as to how they’re going to comply.”

Cole’s clients are waiting to see the texts of both the San Francisco and California laws. She’s also tracking the evolution of, and differences among, local and state privacy laws across the country, including in San Jose, Denver and New York. “It’s starting to get a little difficult to keep up,” she said.

To date, 11 states are considering privacy laws similar to California’s. In Washington state, one piece of legislation calls for restricting companies that use personal data for profiling and facial recognition.

The Question of Enforcement

Proposition B directs San Francisco’s city administrator to draft the privacy policy by May 31. As of early February, that office was still months away from writing any “meaningful content,” said spokesman Bill Barnes.

“We already have a lot of laws in San Francisco that govern different types of information,” he said, highlighting the voter-approved Proposition D in 2006, which strengthened restrictions on when the city could disclose private information. “The city’s treatment of private information is generally fairly robust.”

The city administrator is also working in the shadow of the state attorney general, who is hammering out the details of the California Consumer Privacy Act, which becomes law on Jan. 1, 2020.

Buttar, of the Electronic Frontier Foundation, which defends civil liberties in the digital realm, said that although “Big Tech is quite active in pushing back against the regulatory schemes that the state Attorney General’s Office is building,” he hasn’t heard about similar lobbying efforts at the city level, at least not yet.

He pointed out one crucial detail of the state law that has yet to be determined: Whether individuals will have a “private right of action” to sue, and who will enforce the law — the attorney general or the courts.

“If the private enforcement goes off the table and the AG’s office is then the sole source of enforcement, it will likely get inundated by reports of violations and abuse from users around the state,” Buttar said.

San Francisco has to wrestle with a similar question: Will enforcement fall to the city attorney?

“How will the compliance mechanisms work? What will the complaint processing look like, what kinds of remedies will or will not be offered?” said the Future of Privacy Forum’s Finch, who built a tool Seattle uses to evaluate privacy risks for its open-data program. “It’s very important to be clear with the people of San Francisco what those will be.”

The city administrator’s spokesman said it was too early to comment. “As we draft this with the City Attorney’s Office, we’ll obviously look at everything,” said Barnes.

To Buttar, “The most important aspect of both of these reforms at the state and the municipal level is the empowerment of users and their agency and autonomy, and their right to choose how their information is used. If that’s all we get out of these measures, even that would be a pretty big step forward.”

Brian Hofer, Oakland Privacy Advisory Commission. Photo by Sharon Wickham // San Francisco Public Press
Illustration by Reid Brown // San Francisco Public Press
District 1 Supervisor Sandra Lee Fewer, right, with legislative aide Angelina Yu in their City Hall office. Photo by Sharon Wickham // San Francisco Public Press
District 3 Supervisor Aaron Peskin with legislative aide Lee Hepner. Photo by Sharon Wickham // San Francisco Public Press
Kalinda Raina, head of global privacy at LinkedIn, addresses participants at an event marking national Data Privacy Day at the company’s offices in downtown San Francisco. Photo by Sharon Wickham // San Francisco Public Press
