Brandon was accepted to Harvard Business School for the fall of 2015, but he already had an idea of what he wanted to do. When he was overseas, he spent time working with sensors and inexpensive computers. “When I realized that, used together, the two could reason and take action,” he said, “my mind started racing with a sense of new possibilities.” He had come to believe that certain battlefield tasks could be accomplished with artificial intelligence, and this, he felt, would save lives.
He’d identified a specific problem, one he believed was solvable: the physical act of searching structures, which had bedeviled troops in the urban combat that characterized so much of the post-9/11 wars.
“No one was really working on this,” Brandon said, so as he entered business school he took his idea to Ryan. At 31, Ryan was already a proven entrepreneur. He had founded and sold a wireless charging company, WiPower, to Qualcomm, and had started a time-lock container company, Kitchen Safe, that had led to “the most enthusiastic pitch ever” on Shark Tank (at least according to Business Insider). When Brandon hit up his brother, Ryan was between ventures (though he did have a dishwashing robot in development). Brandon, who is the gregarious T-shirt-and-jeans-wearing counterpoint to his brother’s more analytical, collared-shirt-and-khakis persona, initially encountered some skepticism from Ryan. “I assumed this was a solved problem, that we were already doing this,” said Ryan, explaining his initial hesitation. “Also,” he joked, “the idea was coming from my little brother.”
Brandon managed to convince Ryan that his idea was viable and that the component technologies already existed, so in the spring of 2015 they set about finding an engineer who could take it on. “Everyone we talked to,” Ryan recalled, “kept mentioning this guy Andrew.” That was Andrew Reiter, a chemical engineer turned roboticist who had cycled through prestigious research programs at Northwestern and Harvard and was currently at Draper Laboratory, in Cambridge, Massachusetts, working on camera-based navigation in autonomous robots.
“They sent me an email out of the blue,” Andrew said, “and I also thought, isn’t the military already doing this?” Although university labs had experimented with quadrotor autonomy, and a few high-profile small-drone projects had dabbled with military applications, AI-driven drones had yet to be put to use. That is partly because applying artificial intelligence to real environments remains a difficult feat: Machine learning is good at predictable and repetitive tasks, but the real world is insanely unpredictable. Over the past two decades, the military had come to rely on human-controlled drones for everything from intelligence collection to air strikes. Despite numerous conceptual papers imagining the role that systems powered by artificial intelligence would play in the future of warfare, the military had yet to field a single autonomous drone.
The brothers flew to Cambridge to meet Andrew in person. Within six hours the three had the outlines of a business plan: They would create an AI-powered quadcopter (they won’t say much about technological specifics) to solve the problem of room-clearing. Their goal was to then expand the use of the AI—what they later branded Hivemind—and apply it to other military problems. A month later, Andrew moved to San Diego and took up residence in Ryan’s guest room for about a week.
By late August 2015 the three had a proposal in hand, and in a two-week period they’d scheduled 30 meetings with potential investors in Silicon Valley. Twenty-nine passed. The investor who bit had no interest in saving lives on the battlefield; instead, they wanted to develop a selfie-snapping drone. The capital was there, but the mission wasn’t. When I asked whether they considered going in a different direction, Brandon said, “We were building a company to make a dent in this mission.”