Palantir’s God’s-Eye View of Afghanistan

The company’s software can sift through enormous amounts of data, and the resulting analyses can be used to make life-or-death decisions.
This is an excerpt from the book First Platoon, by Annie Jacobsen, about the US Defense Department’s quest to build the most powerful biometrics database in the world: a system that can tag, track, and locate suspected terrorists in a war zone. But as the world continues to battle a deadly pandemic, these big-data surveillance systems are playing an increasingly prominent, and troubling, role in our daily lives.

In a steel container inside an American outpost in southern Afghanistan, an aerostat operator named Kevin was watching a man defecate in a farmer’s field. The man was wearing a purple hat. It was 2012, and Kevin was serving as mission director for the Persistent Ground Surveillance System team stationed here at Combat Outpost Siah Choy, located in the heart of volatile Zhari District.

The PGSS (pronounced pee-jiss) team spent 24 hours a day, seven days a week, watching an area that included 20 or so small Afghan villages. Their job was twofold. First, they watched the four individual platoons of American soldiers who’d been deployed to this area, including the approximately 30 young men who made up First Platoon. Whenever one of these platoons stepped off base to conduct a patrol, the PGSS team “covered down” on the soldiers, keeping an eye out for indicators of a pending attack. The rest of the time, the team observed locals under suspicion of terrorist activity, which is why Kevin was watching the man in the purple hat. The question at hand: Was he squatting down to go to the bathroom, or to bury an IED?

An aerostat is a giant surveillance balloon. Its onboard cameras and sensors suck up vast amounts of data on what’s happening on the ground. That raw data gets processed, organized, and aggregated into an army intelligence product thanks to software developed by Palantir Technologies. Launched almost two decades ago with seed money from the CIA, the Silicon Valley startup had managed to solve a problem plaguing the Pentagon: After years of accumulating surveillance video captured by drones, airships, and aircraft flying over Iraq, the armed forces had, quite literally, millions of hours of footage sitting in archives taking up space. “We’re going to find ourselves in the not too distant future swimming in sensors and drowning in data,” Lieutenant General David Deptula warned colleagues in 2009. In a single year, the Air Force alone had collected more video footage in Iraq than a person could watch 24 hours a day, seven days a week, over the course of 24 continuous years. What to do with all that information? Palantir’s software could sift through volumes of raw, or unstructured, data, then organize and structure it in a way that made search and discovery features possible. Search for, and discovery of, say, a man in a purple hat.
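
It is worth pausing on what turning raw footage into something searchable means in practice. Below is a minimal, purely illustrative Python sketch of the general idea: annotated sensor detections become structured, queryable records. Every field name and record here is invented for the example; this is not Palantir’s actual code or schema.

```python
# Illustrative only: invented records and fields, not Palantir's schema.
from dataclasses import dataclass

@dataclass
class Detection:
    timestamp: str      # when the sensor logged the sighting
    location: tuple     # (latitude, longitude) of the detection
    attributes: dict    # analyst- or machine-assigned tags

# Raw footage, once annotated, becomes structured records like these.
detections = [
    Detection("2012-05-01T06:10", (31.62, 65.47), {"hat_color": "purple"}),
    Detection("2012-05-01T09:45", (31.63, 65.48), {"hat_color": "white"}),
    Detection("2012-05-02T06:05", (31.62, 65.47), {"hat_color": "purple"}),
]

def search(records, **criteria):
    """Return every record whose attributes match all given criteria."""
    return [r for r in records
            if all(r.attributes.get(k) == v for k, v in criteria.items())]

# Search for, and discovery of, a man in a purple hat.
for hit in search(detections, hat_color="purple"):
    print(hit.timestamp, hit.location)
```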

“I could see everything,” Kevin says, referring to the aerostat’s technology-enabled omniscience, sometimes called the God’s-eye view. “The only way I didn’t see something was if I wasn’t looking at it.”

Kevin is an expert in what’s called pattern-of-life analysis, an esoteric discipline that involves establishing a person’s identity based on his or her cumulative habits, many of which are captured via overhead surveillance. The man going to the bathroom was deemed a person of interest, and Kevin was working to establish his pattern of life in pursuit of a new methodology called activity-based intelligence, or ABI. The first, fundamental premise of activity-based intelligence: You are what you do.

The PGSS aerostat that Kevin was in charge of was a 72-foot-long balloon called a 22M (for its length in meters) in contractor parlance. It was not a dirigible, meaning it was not steerable and did not navigate through the air on its own power. The 22M was tethered to a mooring station inside the combat outpost at Siah Choy, attached by a 2,700-foot cable made of fiber optics, rubber, and Kevlar wrap. The flatbed surface on the mooring station reminded Kevin of a merry-go-round because it could rotate 360 degrees. “It could swivel back and forth to allow for wind relief, [which] mattered in the summer months, when the 120 Days of Wind kicked in,” he said, referring to Afghanistan’s strong seasonal winds. (He would later say they reminded him of the Santa Anas in Southern California, where he grew up.)

The equipment attached to the balloon afforded Kevin a clear view of the soldiers, their outpost, called Strong Point Payenzai, and its environs. For the most part, First Platoon’s soldiers were unaware that an aerostat with a suite of electro-optical-infrared high-definition sensors and cameras was able to watch them as they walked around their area of operations—through grape rows, down footpaths, and into the villages on their twice-daily patrols.

“The idea was, do not let anyone know we exist,” Kevin says. “Occasionally one of the Joes”—contractor vernacular for soldiers—“an NCO usually, would use us as a way of saying ‘We are watching you.’ And we’d say, ‘No, no, no, don’t do that.’ We’d end up with some villager at our combat outpost saying, ‘I know you’re watching us. Tell me who stole my goat.’ This actually happened.”

The imaging system attached to the underbelly of the aerostat, dubbed the MX-15, was roughly the size of a beach ball. It weighed 100 pounds and carried an array of cameras for intelligence, surveillance, and reconnaissance purposes. Its ability to see people and objects on the ground was extraordinary; it could make out an unusual modification on the buttstock of an AK-47 from 2 miles away, Kevin recalls. This image feed was livestreamed to several 40-inch monitors inside the steel container where the PGSS team worked. Separately, the data went to Palantir’s database for processing.

Working as a pattern-of-life expert at Siah Choy, Kevin spent hours a day, seven days a week, watching all kinds of people go about their lives, with the goal of separating out the insurgents from the civilians. Pattern-of-life analysis means watching seemingly innocent behavior hour after hour, eyes focused for when the behavior of a person of interest might take a unique turn. Machines can’t yet perform this function; only humans have conscious reasoning.

The PGSS team had reason to believe the person of interest in the purple hat was a terrorist. Because purple was an uncommon color for a hat in Zhari District, it had become a unique marker for the man. After watching him for weeks, they’d determined he was a bomb emplacer, meaning he buried IEDs for the Taliban. The team had established his bed-down location: He lived across the Arghandab River, on the south side. Like many of the people Kevin and his team tracked, this insurgent was still anonymous to them.

“He would get up every morning, turn on an IED belt, a series of IEDs strung together by det [detonation] cord, to protect himself and his perimeter,” Kevin recalls. “We elevated him to 429 status through his actions.”

429 status is what happens when a person of interest completes three “interactions with the ground.” These are actions that allow that individual to be moved out of civilian status and into insurgent status—to be targeted and killed legally according to army rules of engagement.

The three interactions with the ground were specific: “If I see him interacting with the ground, and then I see the pressure tank going in, or [if] I see the charge going in, and him stringing the lamp cord out to install his pressure plate or his battery connection … That’s one, two, and three for me,” Kevin says. This is activity-based intelligence acquired through persistent surveillance from above.
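
The escalation rule, as Kevin describes it, is mechanical in outline: three qualifying interactions with the ground move a person of interest out of civilian status. Here is a hedged Python sketch of that counting logic; the class, names, and messages are invented for illustration and are not drawn from any real targeting system.

```python
# Illustrative sketch of the three-strikes escalation rule described above;
# names and thresholds are invented for clarity, not taken from any real system.
REQUIRED_INTERACTIONS = 3  # per the account: three "interactions with the ground"

class PersonOfInterest:
    def __init__(self, label):
        self.label = label            # e.g. "man in the purple hat"
        self.interactions = 0         # qualifying ground interactions observed
        self.status = "civilian"      # everyone starts in civilian status

    def record_interaction(self, description):
        """Log one observed interaction; escalate after the third."""
        self.interactions += 1
        print(f"{self.label}: {description} "
              f"({self.interactions}/{REQUIRED_INTERACTIONS})")
        if self.interactions >= REQUIRED_INTERACTIONS:
            self.status = "429"       # eligible for targeting under the ROE described

poi = PersonOfInterest("man in the purple hat")
poi.record_interaction("digging at roadside")
poi.record_interaction("placing pressure tank")
poi.record_interaction("stringing lamp cord to pressure plate")
print(poi.status)  # -> "429"
```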

But activity-based intelligence as a methodology is predicated on a more radical idea. “By understanding [a person’s] pattern of life, analysts can build models of potential outcomes and anticipate what may happen,” explains Patrick Biltgen, a senior mission engineer who worked on the persistent surveillance system ARGUS-IS (shorthand for Autonomous Real-Time Ground Ubiquitous Surveillance Imaging System), glorified by some for its mind-blowing technological achievements and criticized by others as paving the way for an all-seeing, digital panopticon.

Activity-based intelligence began in the war theater with the presumption You are what you do, but it is now being pushed into a new realm, says Biltgen, one that asserts: Because we know what you did, we think we know what you are going to do next. “Just like [the film] Minority Report predicted in 2002.” Today, some of these same methodologies are being used by the US federal government, including the Department of Health and Human Services, to tag, track, and locate people who might carry the coronavirus. This raises concerns among legal scholars about a burgeoning biometric cybersurveillance state.

In Afghanistan in 2012, when not covering down on a specific mission, the PGSS team at Combat Outpost Siah Choy would watch persons of interest like the man in the purple hat, waiting for the three key interactions with the earth that would allow 429 status to be assigned. As soon as the criteria were met, the PGSS team would notify the army’s S2 intelligence officer, a lower-echelon intelligence officer working in the tactical operations center, but with profound, direct influence over what might happen next. The S2 would monitor the situation by watching the full-motion video feed. At the same time, one of the aerostat’s flight engineers would begin reviewing the feed from minutes before.

“Rolling it back in time,” Kevin explains, “in order to take snapshots of the three interaction-with-the-earth events.”

The 429 package, which allows an insurgent to be killed in an air strike, must meet legal requirements. The full-motion video gets snapshotted as evidence. While this is going on, the PGSS operator quickly generates a PowerPoint containing all the data, which goes to the S2. The S2 reviews it, then sends the information to the battle captain.

“He takes that info,” Kevin explains, “and he washes it through Palantir.” Although Kevin carries a top secret clearance, as a PGSS operator he would not be able to access Palantir’s database. “That’s an S2 function,” he explains, meaning the classified data being aggregated by Palantir is proprietary. The job of a pattern-of-life expert is to find out “who is who” and “who is doing what.” As defense contractors, PGSS operators do not have the legal authority to decide who gets to kill whom.

“The military application of Palantir is awesome,” Kevin says. Palantir is capable of mining and aggregating data on individual people in a manner that would astonish almost anyone. But he thinks the growing movement among law enforcement agencies in the United States to use Palantir’s software programs domestically is cause for alarm. For example, in 2017, a joint effort by the Department of Homeland Security, Immigration and Customs Enforcement, and Health and Human Services relied on Palantir Technologies to tag, track, locate, and arrest 400 people in an operation targeting family members and caregivers of unaccompanied migrant children. Human rights organizations and privacy advocates cried foul.

“The fact that there’s other moves afoot to actually use Palantir in the United States, I think that’s very, very bad, because of the type of 360 [degree] metrics that are collected,” Kevin warns. “I’m not kind of saying, ‘Hey, I’m scared of Big Brother.’ That’s not my view. But that is exactly what Palantir is capable of.”

In Afghanistan in 2012, there was a geospatial intelligence gap as far as biometrics was concerned. PGSS operators were able to physically locate and track individual persons of interest who were still anonymous—meaning they were fighters whose identities were not yet known. These individuals were being watched because of what they did. Separately, the Defense Department maintained its Automated Biometric Identification System, or ABIS, database, which contained the biometric profiles of millions of individuals in Afghanistan, some of whom had already been classified as most-wanted terrorists. Biometrics means fingerprints, facial images, iris scans, and DNA—body measurements taken wittingly, by American soldiers on presence patrols, as well as unwittingly, having been lifted off captured bomb parts and weapons caches. In 2012, there was no technology-based way to bridge this gap: the MX-15 camera on the aerostat could not cross-reference what it saw with biometric information from the ABIS database. On occasion Kevin would participate in a go-around.

“I would get a slice of data from Palantir [via S2] saying, ‘Hey, this is this guy we’re interested in.’ The request would be ‘Try and locate him.’” Included in the slice of data from Palantir would be an image of the man’s face. “I’d get a picture of him,” Kevin says. “I’d also get, maybe, one or two degrees of people that he knows, and areas that he’s known to travel in.” When Kevin says “degrees of people,” he means individuals the person of interest is linked to, associates with, or has been determined to know. The power of Palantir lies in the connections it can make between people: searching vast amounts of data, analyzing patterns, and making connections that would take humans a huge amount of time to figure out.
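
The “degrees of people” Kevin mentions is, in essence, graph traversal: people are nodes, known associations are edges, and a “slice” contains everyone within one or two hops of the person of interest. A minimal sketch of that idea, using an invented toy graph and standard breadth-first search rather than anything from Palantir’s software:

```python
# Toy association graph; names and links are invented for illustration.
from collections import deque

associations = {
    "person_of_interest": ["associate_a", "associate_b"],
    "associate_a": ["person_of_interest", "associate_c"],
    "associate_b": ["person_of_interest"],
    "associate_c": ["associate_a"],
}

def associates_within(graph, start, max_degree):
    """Breadth-first search: everyone within max_degree hops of start."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if seen[node] == max_degree:
            continue  # do not expand beyond the requested degree
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen[neighbor] = seen[node] + 1
                queue.append(neighbor)
    del seen[start]
    return seen  # {name: degree of separation}

print(associates_within(associations, "person_of_interest", 2))
# -> {'associate_a': 1, 'associate_b': 1, 'associate_c': 2}
```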

Because Palantir’s algorithms could gather data about a person’s past activities, by 2012 the machines were “learning” how to make predictions about that same person’s future activities. In addition to the images of the associates, Kevin would often get predictions about “a general area where [the person] could be traveling.”

Once the PGSS team located who they thought was the actual person of interest, “we’d kind of do a self-check, to follow him.” Meaning the initial hunt began with a computer, but it was now fact-checked by a human. “This is basically what I do. I follow his bed-down location. I track every building that he walks to. I determine his daily pattern of life. When does he pray? When does he eat? When does he go to the bathroom? When does he wake up? When does he sleep? The data cuts from Palantir are like a bread-crumb trail for me to go down. At the same time, if I see something, then that’s me generating a report. And that becomes data in Palantir.”

Once an individual is determined to be a known IED emplacer, like the man in the purple hat, and he has been designated a “429 package,” then one of two things happens. “If there is an asset available, if CAS [close air support, like attack helicopters and drones] is in the vicinity, then it is time to take the target out.” If there’s no air support available, then the person of interest remains marked for death in the system. “The moment there is a target of opportunity to take him out, I call it in. I don’t have to go back through the approving process,” Kevin says. “The 429 package stands. That’s why it’s called a Target of Opportunity. When you have the opportunity, you strike the target.” You kill the man.

One morning, Kevin came into the ops center. The overnight team was excited. One of them said, “We’re about to kill the man in the purple hat.”

Kevin had personally watched this man bury IEDs and train others how to emplace bombs. He leaned in close to the screen. “Where is he?” he asked his colleague.

The colleague pointed to the screen. “Here,” he said, “talking to this other farmer,” and he pointed to a man seated on a tractor.

Kevin examined the image feed. The man on the tractor was talking to an old man, who appeared to be another farmer. Kevin stared at the man in the purple hat.

“That’s a Massey Ferguson tractor he’s sitting on,” Kevin said, pointing at the screen.

“Yep,” the colleague agreed.

Kevin explains what went through his mind in 2012. “I’d burned a lot of time and effort trying to locate and kill this guy, because he was a terrorist cell leader. I knew his face. I knew his gait. I knew his build. I knew what he looked like, and I knew he wore a purple hat. I knew he wore white and black man-jams [traditional outfit]. I knew the color of his shawl, his little body wrap, and I knew where he lived.”

Standing in the C2 shelter at Siah Choy, in front of the video screens, the colleague said, “We’re getting ready to hit him now. CAS is on the way.”

“That isn’t him,” Kevin said. “That is absolutely not him.”

Kevin was certain of this. “I thought, wow, that looks like him. But something just gave me a tickle that that wasn’t him. For a lot of different reasons. Number one, he’s not a worker. He’s a bad guy. Bad guys don’t tool around on tractors and play farmer. They are bad guys.” The tractor was legitimate and expensive, one only a farmer would have. “Why is he on a tractor?” Kevin asked himself. “Why is he talking to this old man in this field?”

The more Kevin looked at the man in the purple hat, the more he realized something was wrong. “I became confused. I said to myself, ‘Well, I mean, fuck, it looks like him, but I don’t think it is him.’”

Then he became very stressed out, he recalls. “Hands-down, I wanted the man in the purple hat dead. I still do to this day. But we’re talking about killing someone.” Metaphorically, he says, he had his finger on the button. “If that kills an innocent civilian? I don’t want that.”

Kevin ran out of the operations center, across the outpost and into the tactical operations center. “I told the S2 they had to call off the air strike. It’s not him,” Kevin told the battle captain.

The tactical operations center spun into action. One of the S2 intelligence officers confirmed that Brigade Headquarters, located a few miles north at Forward Operating Base Pasab, had already authorized the air strike. That close air support was on the way.

“I said, ‘I’m certain it’s not him,’” Kevin remembers. The battle captain said to him, “Well, you’ve got five minutes to figure that out and prove me wrong.” Kevin said that’s what he’d do.

Kevin ran back to the C2 shelter. “I [moved] the camera over to his actual bed-down location. He lived right across the river. I waited and waited. It felt like half an hour. It was probably more like a few minutes. Finally he came out. I recognized him right away.”

Kevin was looking at the man in the purple hat. The insurgent whose pattern of life he’d been tracking for hundreds of hours.

“He walked out of where he slept to go to the bathroom, wash his hands, stretch. I had visual positive identification on him.”

The S2 called off the air strike.

“Had a computer done the algorithm on the guy on the tractor, as far as the computer was concerned, that was him. The insurgent in the purple hat,” Kevin says. “But because I had already been watching this guy for months, I knew that it wasn’t.” Humans are still the ultimate recognizers. “We humans have the ability to recognize faces. It’s part of our genetics. Of however many thousands of years of being a hunter-gatherer. Of being able to spot recognizable features. I knew his face. I doubted the computer. I was right.”

How was the farmer on the tractor misrecognized as the cell leader in the purple hat in the first place? After the air strike was called off, and the man was spared execution, the PGSS operators rolled back the videotape to review what had happened. To see what they could learn.

“It was his hat,” Kevin explains. “There’s a window of time, around dawn, as the sun comes up,” he says, when colors are “read differently” by the imaging system than how it sees them during the day. In that window, the farmer’s hat was misidentified as purple, setting off a series of linkages that were based on information that was erroneous to begin with.
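
The failure mode is plausible in simple terms. Automated color tagging often reduces to thresholds on hue, and the shifted illumination around dawn changes the pixel values a camera records, which can push an ordinary hat across a “purple” threshold. The toy Python sketch below illustrates the idea; the thresholds and pixel values are invented, and real imaging pipelines are far more sophisticated.

```python
# Toy illustration of hue-threshold color tagging going wrong at dawn.
# All numbers are invented; real imaging pipelines are far more involved.
import colorsys

def tag_color(r, g, b):
    """Naive tagger: call anything with a purple-ish hue 'purple'."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return "purple" if 0.7 <= h <= 0.85 else "other"

daylight_pixel = (128, 120, 125)  # grayish hat under midday light
dawn_pixel = (110, 60, 140)       # same hat under pre-dawn light (invented shift)

print(tag_color(*daylight_pixel))  # -> other
print(tag_color(*dawn_pixel))      # -> purple: the erroneous linkage begins here
```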

But what if the S2 shop had killed the farmer in the purple hat in error? And what if, out of fear of backlash over yet another civilian casualty, the data that showed otherwise was deleted so that it would never become known? This invites the question: Who has control over Palantir’s Save or Delete buttons?

“Not me,” says Kevin. “That’s an S2 function.”

Who controls what data gets saved as potential evidence, and what data gets deleted—including data that could potentially serve in someone’s defense? What happens to the rule of law when individual citizens are persistently surveilled without knowledge of, or access to, the information that is being collected on them?

The Department of Defense won’t answer these questions on the grounds that its war-fighting systems are classified. But persistent surveillance systems similar to the PGSS are now being used to watch and collect data on Americans back home, always under the premise of rule of law. Privacy issues regarding persistent surveillance are being debated in the courts at a snail’s pace, while advances in machine learning are moving forward at science-fiction-like speed. Palantir cofounder and CEO Alex Karp sees that as an existential challenge for the company. “The present and the future ability to control the rule of law and its application will be determined by our ability to harness and master artificial intelligence and its precursor, machine learning,” Karp says.

The global pandemic has pushed the use of military-grade surveillance technologies on American citizens to an alarming degree: On April 10, 2020, the US Department of Health and Human Services (HHS) entered into a no-bid contract with Palantir Technologies to track the spread of the coronavirus. The goal of the HHS Protect Now program, explains former CIA officer Christopher Burgess, is to “bring disparate data sets together and provide better visibility to HHS on the spread of Covid.” HHS confirmed that the data Palantir is now mining includes “diagnostic testing data, geographic testing data, [and] demographic statistics,” meaning information about individual American citizens’ health, location, family, and tribe. The initial HHS announcement said Palantir would have access to 187 data sets; that number has since grown to 225. Unknowns abound: What data is going into the Palantir system? How is it shared, with whom, and for how long? What safeguards are in place to prevent HHS from sharing identifiable personal data with its federal law enforcement partners—just as it did in 2017, with ICE?

“Given how tight-lipped both HHS and Palantir have been over the program, we don’t fully know,” says Lauren Zabierek, executive director of the Cyber Project at Harvard Kennedy School’s Belfer Center. Zabierek is a former Air Force officer who also served as a civilian analyst with the National Geospatial-Intelligence Agency (NGA) in three war zones, including Kandahar in 2012. “I sincerely hope that HHS Protect Now will do nothing resembling finding and fixing certain entities,” she says, using military nomenclature for locating and killing IED emplacers in the war zone. “I hope that the data sets will only be used to understand the spread of the virus in the aggregate.” But of course, how could we ever be sure of that? Machines make mistakes, the implications of which are both known and unknown. Just ask the man in the purple hat.


Adapted from First Platoon: A Story of Modern Warfare in the Age of Identity Dominance, by Annie Jacobsen. Copyright © 2021 by Annie Jacobsen. Published by arrangement with Dutton, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC.

