Faculty, a British AI consultancy known for its government ties, is developing artificial intelligence technology for military drones. The company has worked closely with the UK government on AI safety, the NHS, and education. According to Hadean, a defense industry partner, Faculty has “experience developing and deploying AI models on to UAVs”, or unmanned aerial vehicles.
The two companies are collaborating on “subject identification, tracking object movement, and exploring autonomous swarming development, deployment and operations.”
Faculty gained prominence after working on data analysis for the Vote Leave campaign under Dominic Cummings, who later served as Boris Johnson’s adviser. During the pandemic, Johnson’s government assigned work to Faculty and included its CEO, Marc Warner, in meetings of its scientific advisory panel. A Faculty spokesperson said: “We help to develop novel AI models that will help our defence partners create safer, more robust solutions.” They added that the company has “rigorous ethical policies and internal processes” and follows the Ministry of Defence’s ethical guidelines on AI.
However, Faculty did not answer questions on whether it was working on drones capable of applying lethal force, citing confidentiality agreements. Many experts and politicians have called for caution before introducing more autonomous technologies into the military.
In 2023, a House of Lords committee called for the UK government to set up a treaty or non-binding agreement to clarify the application of international humanitarian law regarding lethal drones. Faculty continues to work closely with the AI Safety Institute (AISI), potentially influencing UK government policy. In November, the AISI contracted Faculty to survey how large language models “are used to aid in criminal or otherwise undesirable behaviour.”
Albert Sanchez-Graells, a professor of economic law at the University of Bristol, warned that the UK is relying on tech firms’ “self-restraint and responsibility in AI development.” He added that companies supporting AISI’s work need to avoid organizational conflicts of interest arising from their work for other parts of the government and broader market-based AI business.
The Department for Science, Innovation and Technology declined to comment, stating it would not go into detail on individual commercial contracts.