In 2019, a government contractor and technologist named Mike Yeagley began making the rounds in Washington, DC. He had a blunt warning for anyone in the country’s national security establishment who would listen: The US government had a Grindr problem.
A popular dating and hookup app, Grindr relies on the GPS capabilities of modern smartphones to connect potential partners in the same city, neighborhood, or even building. The app can show how far away a potential partner is in real time, down to the foot.
In its decade of existence, Grindr had not only accumulated millions of users worldwide but had also become a fundamental part of gay culture. To Yeagley, though, Grindr was something else: one of the many mobile apps recklessly leaking vast volumes of data to the murky world of online advertisers. That data, he knew, was easily accessible to anyone with a little technical savvy. So Yeagley, an experienced tech consultant then in his late 40s, built a PowerPoint presentation laying out how this data represented a grave national security risk.
Yeagley discovered a concealed but prolific opening to access Grindr users’ geolocation data: digital ad exchanges. These exchanges, which present a stream of miniature digital banner ads on top of Grindr and virtually every other ad-supported mobile app or website, operate via almost instantaneous auctions called real-time bidding. This process is rich with surveillance capacity. That ad appearing to stalk you across the internet? It is not only tracking you, but also, in some instances, revealing your location in near-real time to advertisers and to experts like Mike Yeagley who find value in unique data sets for government agencies.
Yeagley started using Grindr data to design geofences around buildings housing government agencies involved in national security. This enabled him to track which devices were present in specific buildings at certain times, and their subsequent movements. Specifically, he identified devices that belonged to daytime Grindr users in government office buildings. If a device was frequently found in the Pentagon, the FBI headquarters, or the National Geospatial-Intelligence Agency building at Fort Belvoir, its owner likely worked in one of those agencies. Tracing the movement of these devices using Grindr data, he found some of them stationary at highway rest stops across the DC area at the same time and in close proximity to other Grindr users, sometimes during the workday or while traveling between government facilities. From there, Yeagley could potentially determine where other Grindr users resided, their travel patterns, and possibly their dating partners.
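The core of a geofencing analysis like Yeagley's is simple: test whether commercially purchased GPS pings fall inside a radius drawn around a sensitive building. The sketch below illustrates the idea only; the device IDs and pings are invented, the radius is arbitrary, and real analyses use far messier data.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

# A circular "geofence": a center point plus a radius in meters.
# The Pentagon's coordinates are public; the 400 m radius is made up.
PENTAGON = (38.8719, -77.0563, 400)

def devices_inside(pings, fence):
    """Return the advertising IDs whose pings fall inside the fence."""
    lat, lon, radius = fence
    return {ad_id for ad_id, p_lat, p_lon in pings
            if haversine_m(lat, lon, p_lat, p_lon) <= radius}

# Invented pings: (advertising ID, latitude, longitude).
pings = [
    ("aaaa-1111", 38.8718, -77.0560),  # a few dozen meters from center
    ("bbbb-2222", 38.9072, -77.0369),  # downtown DC, well outside
]
print(devices_inside(pings, PENTAGON))  # {'aaaa-1111'}
```

Repeat the membership test across many days, and the devices that show up inside the fence every weekday are, with high probability, employees' phones.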
Intelligence agencies have a long and unfortunate history of trying to root out LGBTQ Americans from their workforce, but this wasn’t Yeagley’s intent. He didn’t want anyone to get in trouble. No disciplinary actions were taken against any employee of the federal government based on Yeagley’s presentation. His aim was to show that buried in the seemingly innocuous technical data that comes off every cell phone in the world is a rich story—one that people might prefer to keep quiet. Or at the very least, not broadcast to the whole world. And that each of these intelligence and national security agencies had employees who were recklessly, if obliviously, broadcasting intimate details of their lives to anyone who knew where to look.
As Yeagley showed, all that information was available for sale, for cheap. And it wasn’t just Grindr, but rather any app that had access to a user’s precise location—other dating apps, weather apps, games. Yeagley chose Grindr because it happened to generate a particularly rich set of data and its user base might be uniquely vulnerable. A Chinese company had obtained a majority stake in Grindr beginning in 2016—amping up fears among Yeagley and others in Washington that the data could be misused by a geopolitical foe. (Until 1995, gay men and women were banned from having security clearances owing in part to a belief among government counterintelligence agents that their identities might make them vulnerable to being leveraged by an adversary—a belief that persists today.)
But Yeagley’s point in these sessions wasn’t just to argue that advertising data presented a threat to the security of the United States and the privacy of its citizens. It was to demonstrate that these sources also presented an enormous opportunity in the right hands, used for the right purpose. When speaking to a bunch of intelligence agencies, there’s no way to get their attention quite like showing them a tool capable of revealing when their agents are visiting highway rest stops.
Mike Yeagley saw both the promise and the pitfalls of advertising data because he’d played a key role in bringing advertising data into government in the first place. His 2019 road show was an attempt to spread awareness across the diverse and often siloed workforces in US intelligence. But by then, a few select corners of the intel world were already very familiar with his work, and were actively making use of it.
Yeagley had spent years working as a technology “scout”—looking for capabilities or data sets that existed in the private sector and helping to bring them into government. He’d helped pioneer a technique that some of its practitioners would jokingly come to call “ADINT”—a play on the intelligence community’s jargon for different sources of intelligence, like the SIGINT (signals intelligence) that became synonymous with the rise of codebreaking and tapped phone lines in the 20th century, and the OSINT (open source intelligence) of the internet era, of which ADINT was a form. More often, though, ADINT was known in government circles as adtech data.
Adtech uses the basic lifeblood of digital commerce—the trail of data that comes off nearly all mobile phones—to deliver valuable intelligence information. Edward Snowden’s 2013 leaks showed that, for a time, spy agencies could get data from digital advertisers by tapping fiber optic cables or internet chokepoints. But in the post-Snowden world, more and more traffic like that was being encrypted; no longer could the National Security Agency pull data from advertisers by eavesdropping. So it was a revelation—especially given the public outcry over Snowden’s leaks—that agencies could just buy some of the data they needed straight from commercial entities. One technology consultant who works on projects for the US government explained it this way to me: “The advertising technology ecosystem is the largest information-gathering enterprise ever conceived by man. And it wasn’t built by the government.”
Everyone who owns an iPhone or Android phone has been assigned an “anonymized” advertising ID by Apple or Google. That ID is used to track our real-world movements, our internet browsing behavior, the apps installed on our phones, and much more. America’s largest corporations have invested substantial sums in this system. Faced with such an expansive and intricate data repository, governments around the world are increasingly opting to purchase this information on everyone rather than acquire it through hacking or secret court orders.
Consider an example: a woman named Marcela. She owns a Google Pixel phone with the Weather Channel app installed. Heading out for a jog under an overcast sky, she opens the app to check whether rain is expected.
When Marcela clicks on the Weather Channel’s blue icon, a whirlwind of digital activity ensues with the aim of delivering her a tailored advertisement. This process begins with an entity known as an advertising exchange, essentially a massive marketplace where countless mobile devices and computers relay information about available ad space to a central server.
Almost instantaneously, the Weather Channel app shares a ton of data with the ad exchange: from the IP address of Marcela’s phone, its Android version, her service provider, and an array of technical details about the phone’s configuration, down to the display resolution. Most prized of all, the app shares the exact GPS coordinates of Marcela’s phone and the Anonymized Advertising ID number that Google has assigned to her, known as an AAID. Apple devices use a similar identifier referred to as an IDFA.
For the average person, an advertising ID is a meaningless string of characters, something like bdca712j-fb3c-33ad-2324-0794d394m912. But to marketers, it’s a treasure trove. They know that the device with the ID bdca712j-fb3c-33ad-2324-0794d394m912 is a Google Pixel with the Nike Run Club app installed. They know that bdca712j-fb3c-33ad-2324-0794d394m912 regularly visits runnersworld.com and has been eyeing a pair of new Vaporfly racing shoes. These insights rest on the fact that Nike, runnersworld.com, and Google all participate in the same advertising ecosystem, with a shared goal of understanding consumer interests.
Advertisers utilize this information in crafting and distributing their ads. Imagine both Nike and Brooks, another running shoe company, are aiming to reach female running fans within certain income brackets or in specific zip codes. Based on the extensive data they can gather, they could create an “audience”—in essence a sizable collection of ad IDs representing customers known or suspected to be shopping for running shoes. Then, in a real-time, automated auction, advertisers inform a digital ad exchange about the price they’re willing to pay to reach such customers every time they open an app or a webpage.
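In practice, the information in the auction travels as a structured bid request; real exchanges broadly follow the IAB’s OpenRTB schema. The simplified sketch below shows how a bidder can match a request against a stored audience of advertising IDs. The values are invented (the ad ID is the article’s illustrative one), and a real request carries far more fields.

```python
import json

# A simplified bid request, loosely modeled on the IAB OpenRTB schema.
# Every bidder on the exchange receives this, whether or not it wins.
bid_request = {
    "id": "auction-001",
    "app": {"bundle": "com.weather.Weather"},  # the app showing the ad
    "device": {
        "os": "Android",
        "ifa": "bdca712j-fb3c-33ad-2324-0794d394m912",  # advertising ID
        "geo": {"lat": 40.7359, "lon": -73.9911},       # precise GPS fix
    },
}

# An "audience": ad IDs an advertiser believes are shopping for running shoes.
running_shoe_audience = {"bdca712j-fb3c-33ad-2324-0794d394m912", "ffff-9999"}

def bid_for(request, audience, price_cpm):
    """Bid only when the device is in our audience.

    Either way, the bidder has now seen the device's ID and location.
    """
    ifa = request["device"]["ifa"]
    return {"price": price_cpm, "ifa": ifa} if ifa in audience else None

response = bid_for(bid_request, running_shoe_audience, price_cpm=2.50)
print(json.dumps(response))
```

Note that the decision *not* to bid changes nothing about what the bidder learned, which is the loophole the rest of this story turns on.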
There are some limits and safeguards on all this data. Technically, a user can reset their assigned advertising ID number (though few people do so—or even know they have one). And users do have some control over what they share, via their app settings. If consumers don’t allow the app they’re using to access GPS, the ad exchange can’t pull the phone’s GPS location, for example. (Or at least they aren’t supposed to. Not all apps follow the rules, and they are sometimes not properly vetted once they are in app stores.)
Furthermore, ad exchange bidding platforms conduct negligible due diligence on the hundreds or possibly thousands of entities that maintain a presence on their servers. That means even unsuccessful bidders still receive all the consumer information contained in the bid request. An entire profit model has been built on this system: extracting data from the real-time bidding networks, repackaging it, and reselling it to help businesses understand consumer behavior.
Geolocation is arguably the most valuable piece of commercial data to come off these devices. Understanding the movement of mobile phones is now a multibillion-dollar industry. The data can be used to deliver location-based targeted ads, as when a restaurant chain targets nearby customers. It can be used to measure consumer behavior and the effectiveness of advertising campaigns: how many people saw an ad and subsequently visited a store? And the analytics can inform planning and investment decisions: Where is the best location for a new store? Will foot traffic be sufficient to support the business? What do fluctuating customer numbers mean for a retailer’s stock price?
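That attribution use case is, at bottom, a join between ad-impression logs and location-derived store visits, keyed on the advertising ID. A toy version, with invented IDs and timestamps:

```python
from datetime import datetime, timedelta

# Invented impression log: which ad IDs were shown the campaign, and when.
impressions = {
    "aaaa-1111": datetime(2024, 1, 5, 12, 0),
    "bbbb-2222": datetime(2024, 1, 5, 12, 5),
}

# Invented store visits, derived from location pings: (ad ID, visit time).
store_visits = [
    ("aaaa-1111", datetime(2024, 1, 6, 18, 30)),  # visited the next day
    ("cccc-3333", datetime(2024, 1, 6, 19, 0)),   # never saw the ad
]

def attributed_visits(impressions, visits, window_days=7):
    """Count devices that saw the ad, then visited a store within the window."""
    window = timedelta(days=window_days)
    return sum(1 for ad_id, t in visits
               if ad_id in impressions
               and timedelta(0) <= t - impressions[ad_id] <= window)

print(attributed_visits(impressions, store_visits))  # 1
```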
However, this kind of data has another use: it holds enormous surveillance potential. Why? Because what we do in the world with our devices cannot truly be anonymized. The fact that advertisers know Marcela only as bdca712j-fb3c-33ad-2324-0794d394m912 while monitoring her behavior online and in the real world offers her very little in the way of privacy protection. Taken together, her habits and routines are unique to her. Our real-world movements are highly specific and personal. For many years, I lived in a small 13-unit apartment building in Washington, DC, and I was the only person who left that address each morning for The Wall Street Journal’s offices. Even if I were just an anonymized number, my behavior was as unique as a fingerprint, even among hundreds of millions of other people. There was no way to anonymize my identity in a geolocation data set. Where a phone spends most of its nights is a good proxy for where its owner lives. Advertisers know this.
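The nights-at-home heuristic is easy to demonstrate. One common broker technique, sketched below with invented pings, is to take each advertising ID’s most frequent nighttime location, snapped to a coarse grid, as its probable home; from there the “anonymized” ID can be joined to a street address.

```python
from collections import Counter

def probable_home(pings, night=(0, 5)):
    """Most frequent coarse grid cell (~100 m) a device occupies at night."""
    start, end = night
    cells = [(round(lat, 3), round(lon, 3))
             for hour, lat, lon in pings if start <= hour < end]
    return Counter(cells).most_common(1)[0][0] if cells else None

# Invented pings for one advertising ID: (hour of day, latitude, longitude).
pings = [
    (1, 38.91702, -77.03654),   # overnight, at home
    (2, 38.91699, -77.03651),
    (3, 38.91705, -77.03660),
    (14, 38.89954, -77.03210),  # midday, at an office
]
print(probable_home(pings))  # (38.917, -77.037)
```

Rounding to three decimal places collapses GPS jitter into one cell, so a handful of overnight pings is enough to pin down a residence.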
Governments know this too. And Yeagley was part of a team that would try to find out how they could exploit it.
In 2015, a company called PlaceIQ hired Yeagley. PlaceIQ was an early mover in the location data market. Back in the mid-2000s, its founder, Duncan McCall, had participated in an overland driving race from London to Gambia across the land-mine-strewn Western Sahara. He had eschewed the usual practice of hiring an expensive Bedouin guide to help ensure safe passage through the area. Instead, he found online a GPS route that someone else had posted to a message board a few days earlier. McCall was able to download the route, load it into his own GPS device, and follow the same safe path. On that drive through the Western Sahara, McCall recalled dreaming up the idea for what would become PlaceIQ: capture all of the geospatial data that consumers were emitting and generate insights from it. At first the company used data from the photo-sharing website Flickr, but eventually PlaceIQ started tapping mobile ad exchanges. It would be the start of a new business model, one that would prove highly successful.
Yeagley was hired after PlaceIQ got an investment from the CIA’s venture capital arm, In-Q-Tel. Just as it had poured money into numerous social media monitoring services, geospatial data had also attracted In-Q-Tel’s interest. The CIA was interested in software that could analyze and understand the geographic movement of people and things. It wanted to be able to decipher when, say, two people were trying to conceal that they were traveling together. The CIA had planned to use the software with its own proprietary data, but government agencies of all kinds eventually became interested in the kind of raw data that commercial entities like PlaceIQ had—it was available through a straightforward commercial transaction and came with fewer restrictions on use inside government than secret intercepts.
While at PlaceIQ, Yeagley saw the potential value of the data to the government. But PlaceIQ, though willing to sell its software to the government, wasn’t ready to sell its data. So Yeagley moved to another company, PlanetRisk. As a government defense contractor, PlanetRisk theoretically offered a safer environment than a civilian company like PlaceIQ for the kind of work Yeagley had in mind.
PlanetRisk operated in both the corporate and government contracting spheres, producing products that helped customers understand the relative danger of various locations worldwide. A company looking to open an office or store somewhere in the world, for instance, might ask PlanetRisk to analyze the geographic distribution of crime, civil unrest, and severe weather.
PlanetRisk hired Yeagley in 2016 as vice president of global defense, essentially a sales and business development role. His job was to build out his adtech capability inside the contractor and then sell it to various government agencies. Yeagley brought some government funding with him, through his connections in the defense and intelligence research communities.
PlanetRisk’s initial sales demonstration focused on Syria: quantifying the mass exodus of refugees fleeing years of civil war and advancing ISIS forces. From a commercial data broker called UberMedia, PlanetRisk acquired location data on Aleppo, the contested Syrian city that was the epicenter of some of the most intense fighting between government troops and US-backed rebels. It was an experiment in what could be achieved. Was it even possible to collect location information on mobile phones in Syria? It was hard to believe that a war zone could be a hot spot for mobile advertising.
But to the company’s surprise, the answer was yes. There were 168,786 mobile devices present in the city of Aleppo in UberMedia’s data set, which measured mobile phone movements during the month of December 2015. And from that data, they could see the movement of refugees around the world.
The discovery that there was extensive data in Syria was a watershed. No longer was advertising merely a way to sell products; it was a way to peer into the habits and routines of billions. “Mobile devices are the lifeline for everyone, even refugees,” Yeagley said.
PlanetRisk had sampled data from a range of location brokers—Cuebiq, X-Mode, SafeGraph, PlaceIQ, and Gravy Analytics—before settling on UberMedia. (The company has no relation to the rideshare app Uber.) UberMedia was started by the veteran advertising and technology executive Bill Gross, who had helped invent keyword-targeted ads—the kinds of ads that appear on Google when you search a specific term. UberMedia had started out as an advertising company that helped brands reach customers on Twitter. But over time, like many other companies in this space, UberMedia realized that it could do more than just target consumers with advertising. With access to several ad exchanges, it could save bid requests that contained geolocation information, and then it could sell that data. Now, this was technically against the rules of most ad exchanges, but there was little way to police the practice. At its peak, UberMedia was collecting about 200,000 bid requests per second on mobile devices around the world.*
Just as UberMedia was operating in a bit of a gray zone, PlanetRisk had likewise not been entirely forthright with UberMedia. To get the Aleppo data, Yeagley told UberMedia that he needed the data as part of PlanetRisk’s work with a humanitarian organization—when in fact the client was a defense contractor doing research work funded by the Pentagon. (UberMedia’s CEO would later learn the truth about what Mike Yeagley wanted the data for. And others in the company had their own suspicions. “Humanitarian purposes” was a line met with a wink and nod around the company among employees who knew or suspected what was going on with Yeagley’s data contracts.) Either way, UberMedia wasn’t vetting its customers closely. It appeared to be more eager to make a sale than it was concerned about the privacy implications of selling the movement patterns of millions of people.
When PlanetRisk produced a demo of its phone-tracking product, Yeagley’s young daughter helped devise the name: Locomotive, a fusion of “location” and “motive.” A small demo of the program was funded with $600,000, provided entirely by Pentagon research budgets. As the PlanetRisk team tested and explored Locomotive’s data, they uncovered a series of fascinating stories.
For instance, they noticed a device regularly moving between Syria and the West, which raised suspicions: ISIS was notorious for recruiting, training, and deploying Westerners for terrorist attacks. On closer examination, though, the device’s activity suggested it most likely belonged to a humanitarian aid worker. It was tracked to UN facilities and a refugee camp, places Islamic State fighters do not typically frequent.
It became evident that Locomotive could track the movements of world leaders as well. Having acquired a data set on Russia, the team could follow phones that moved in tandem with the Russian president, Vladimir Putin. The devices clearly did not belong to Putin himself; Russian state security and counterintelligence were too good for that. The team surmised they belonged to his entourage: the drivers, security personnel, political aides, and other support staff around the Russian president, whose phones were traceable in the advertising data. That gave PlanetRisk valuable insight into Putin’s movements and the makeup of his convoy.
Among other unusual findings, one data set revealed a phone repeatedly traveling between the United States and North Korea. The device checked in at a Korean church in the United States on Sundays and appeared to have connections to a GE factory. That aroused suspicion: North Korea is hardly a tourist destination, and the pattern suggested a possible interest in the intellectual property and technology of a prominent American corporation. PlanetRisk debated whether to bring the matter to US intelligence or to the company itself, but ultimately chose inaction, not wanting word of its phone-tracking tool to spread. The mystery remained unsolved.
Most shockingly, evidence of the US military’s own operations began to emerge in the data from PlanetRisk’s Locomotive project. Mobile devices would show up at American military bases like Fort Bragg in North Carolina and MacDill Air Force Base in Tampa, Florida—bases populated by elite US special forces from the Joint Special Operations Command and other units. These devices then made their way through third-party countries such as Turkey and Canada before landing in northern Syria. Specifically, they clustered at the out-of-use Lafarge cement factory near the town of Kobane.
The PlanetRisk team soon realized these devices likely belonged to US special forces gathered at a yet-to-be-disclosed military base. In due time, their hypothesis was confirmed publicly; the US government eventually admitted that this site was a forward operating base for troops involved in the fight against ISIS.
Worryingly, Locomotive was producing practically real-time data. The feed from UberMedia, the source of Locomotive’s information, was generally updated every 24 hours, but sometimes it captured movements from as recently as 15 or 30 minutes earlier. These elite special forces, operating from an undisclosed base, had their exact, ever-changing locations exposed in advertising data. And though Locomotive was a closely held government project, UberMedia’s data was available for purchase by anyone who could present a persuasive justification. A shell company created by the Chinese or Russian government, armed with a convincing backstory, could have accessed the same sensitive information just as Mike Yeagley had.
At first, PlanetRisk was buying data on a country-by-country basis, but it soon occurred to the team to ask what it would cost to acquire the entire world. The answer from UberMedia’s sales representative: a few hundred thousand dollars a month for a worldwide feed of every phone within the company’s reach. For the military and intelligence world, whose annual intelligence budget in 2020 alone was $62.7 billion, those figures were a rounding error: an affordable yet potent intelligence resource.
The initial version of Locomotive, completed in 2016, greatly impressed Pentagon officials. During one demo, a government official insisted the rest of the briefing be carried out inside a SCIF, a secure government facility for classified discussions. The official didn’t quite grasp what PlanetRisk was doing, but presumed it must be classified. A PlanetRisk staffer at the briefing was puzzled. “This is just stuff we’ve seen commercially,” they explained. “We just licensed the data.” After all, how could marketing data be classified?
PlanetRisk’s capabilities were so striking that the company was asked to keep Locomotive quiet. It wouldn’t be classified, but the company was asked to tightly control knowledge of the capability, giving the military time to exploit public ignorance of this kind of data and turn it into an operational surveillance program.
An executive also recalled a moment on the way out of a meeting with a different government official. In the elevator, the official asked, “Could you figure out who is cheating on their spouse?”
Yeah, I guess you could, the PlanetRisk executive answered.
But Mike Yeagley wouldn’t last at PlanetRisk.
As the company looked to turn Locomotive from a demo into a live product, Yeagley came to believe his employer was taking the wrong approach in building a data visualization platform for the government. Yeagley thought it would be better to provide the raw data to the government and let agencies visualize it however they chose. Rather than charge by the number of government users holding a software license, Yeagley wanted to sell the government the data itself for a flat fee.
So Yeagley and PlanetRisk parted ways. He took his business relationship with UberMedia with him. PlanetRisk moved on to other lines of work and was eventually sold off in pieces to other defense contractors. Yeagley landed at a company called Aelius Exploitation Technologies, where he set about trying to turn Locomotive into an actual government program for the Joint Special Operations Command, the terrorist-hunting elite special operations force that killed Osama bin Laden and Abu Musab al-Zarqawi and spent years dismantling ISIS.
Locomotive was rebranded as VISR, short for Virtual Intelligence, Surveillance, and Reconnaissance. The tool was incorporated into an interagency program and made available across the US intelligence community to help generate leads.
By 2019, when Yeagley began alerting various security agencies about Grindr, VISR had already been used domestically, at least for a brief period, while the FBI assessed its usefulness in domestic criminal cases. The FBI ended its involvement in 2018. The Defense Intelligence Agency, which also had access to the VISR data, acknowledged that it had used the tool five times to search within the US in intelligence-related investigations.
Today, VISR is just one of many products offering adtech data to intelligence agencies. The Department of Homeland Security has been a particularly keen customer: three of its components (US Customs and Border Protection, US Immigration and Customs Enforcement, and the US Secret Service) have procured more than 200 licenses from commercial adtech vendors since 2019. The data has been used to locate border tunnels, track undocumented immigrants, and investigate domestic crimes. In 2023, a government inspector general criticized DHS over its use of adtech, finding it lacked adequate privacy safeguards, and recommended that the department stop using the data until proper policies were drafted. DHS told the inspector general it would continue using the data anyway, citing its importance to the ICE investigative process: it complements other information to fill knowledge gaps and generate investigative leads.
Intelligence agencies of other governments have access to this type of data too. Several Israeli companies (Insanet, Patternz, and Rayzone) have reportedly built tools parallel to VISR and sell them globally to national security and public safety organizations. Rayzone has even developed a method to deliver malware through targeted ads, according to Haaretz.
Which is to say, none of this is an abstract concern—even if you’re just a private citizen. I’m here to tell you if you’ve ever been on a dating app that wanted your location or if you ever granted a weather app permission to know where you are 24/7, there is a good chance a detailed log of your precise movement patterns has been vacuumed up and saved in some data bank somewhere that tens of thousands of total strangers have access to. That includes intelligence agencies. It includes foreign governments. It includes private investigators. It even includes nosy journalists. (In 2021, a small conservative Catholic blog named The Pillar reported that Jeffrey Burrill, the secretary general of the US Conference of Catholic Bishops, was a regular user of Grindr. The publication reported that Burrill “visited gay bars and private residences while using a location-based hookup app” and described its source as “commercially available records of app signal data obtained by The Pillar.”)
If you cheated on your spouse in the past few years and you were careless about your location data settings, there is a good chance there is evidence of that in data that is available for purchase. If you checked yourself into an inpatient drug rehab, that data is probably sitting in a data bank somewhere. If you told your boss you took a sick day and interviewed at a rival company, that could be in there. If you threw a brick through a storefront window during the George Floyd protests, well, your cell phone might link you to that bit of vandalism. And if you once had a few pints before causing a car crash and drove off without calling the police, data telling that story likely still exists somewhere.
We all have a vague sense that our cell phone carriers have this data about us. But law enforcement generally needs to go get a court order to get that. And it takes evidence of a crime to get such an order. This is a different kind of privacy nightmare.
I once met a disgruntled former employee of a company that competed against UberMedia and PlaceIQ. He had absconded with several gigabytes of data from his former company. It was only a small sampling of data, but it represented the comprehensive movements of tens of thousands of people for a few weeks. Lots of those people could be traced back to a residential address with a great deal of confidence. He offered me the data so I could see how invasive and powerful it was.
What can I do with this—hypothetically? I asked. In theory, could you help me draw geofences around mental hospitals? Abortion clinics? Could you look at phones that checked into a motel midday and stayed for less than two hours?
Easily, he answered.
I never went down that road.
Adapted from Means of Control: How the Hidden Alliance of Tech and Government Is Creating a New American Surveillance State by Byron Tau, to be published February 27, 2024 by Crown, an imprint of the Crown Publishing Group, a division of Penguin Random House LLC; Copyright © 2024 by Panopticon Project LLC.
If you buy something using links in our stories, we may earn a commission. This helps support our journalism. Learn more.