
'Man replaced by machine': Is the use of AI undermining the IDF's intelligence capabilities?

Unit 8200 served as a testing ground for advanced AI systems, shaping its military strategy for years before October 7; a Washington Post report raises concerns about increased civilian casualties and weakened human intelligence under the unit’s AI-driven approach

Ynet
Following Hamas’ surprise attack on October 7 last year, the IDF launched an extensive aerial campaign in Gaza, leveraging years of meticulously gathered intelligence on addresses, tunnels and critical infrastructure tied to the terrorist group. However, as the target bank began to dwindle, the military turned to an artificial intelligence system known as “The Gospel” to rapidly generate hundreds of new targets, according to a detailed Washington Post investigation.
The report examines how the use of AI technologies impacted the prolonged war, including its influence on civilian casualties and the quality of Israeli intelligence. According to two individuals familiar with the intelligence-gathering operation, the deployment of AI allowed the IDF to maintain a relentless pace of airstrikes.
Unit 8200 troops
(Photo: IDF)
The Post describes the software as one of the most advanced applications of military AI to date, central to Israel’s ongoing operations. It also sheds light on internal debates within the IDF over whether reliance on AI weakened traditional intelligence capabilities and if the technology’s recommendations received sufficient scrutiny. Discussions on the use of AI began years before the October 7 attack, according to the investigation.
Critics within the military have argued that the AI system contributed to a significant rise in civilian casualties in Gaza. The Post bases its findings on interviews with over a dozen individuals familiar with the system, most of whom spoke anonymously due to national security concerns, as well as on internal documents obtained during its investigation.
“What’s happening in Gaza is a forerunner of a broader shift in how war is being fought,” said Steven Feldstein, senior fellow at the Carnegie Endowment, who researches the use of AI in war. He noted that the IDF appeared to have lowered its threshold for the acceptable civilian casualty rate during the Gaza war. “Combine that with the acceleration these systems offer — as well as the questions of accuracy — and the end result is a higher death count than was previously imagined in war.”
The IDF said claims that its use of AI endangers lives are “off the mark.”
“The more ability you have to compile pieces of information effectively, the more accurate the process is,” the IDF said in a statement to The Post. “If anything, these tools have minimized collateral damage and raised the accuracy of the human-led process.”

A week's work in 30 minutes

A major transformation within Unit 8200, Israel’s elite intelligence division, has been revealed, showing a dramatic shift toward engineering and technology roles under the leadership of Yossi Sariel.
According to a Washington Post investigation, by October 7, 2023, 60% of Unit 8200 personnel worked in engineering and technological capacities — double the share of a decade earlier. The shift came at the expense of traditional roles, such as Arabic-language specialists, and involved the dismissal of officers critical of adopting artificial intelligence.
Unit 8200 commander Brig. Gen. Yossi Sariel
(Photo: Moti Kimchi)
The report highlights the integration of AI tools like “The Gospel,” a system developed using hundreds of predictive algorithms designed to process vast quantities of data—referred to by the IDF as “the pool.” The software generates recommendations for military targets, including tunnels, rockets and other infrastructure, which are reviewed by analysts and ultimately approved by senior officers for inclusion in the target database.
A military source familiar with the systems told the Post that the AI’s image recognition capabilities allow soldiers to identify subtle patterns in satellite imagery, such as minor terrain changes indicating newly buried rocket launchers or tunnels. This level of analysis compressed work that previously took a week into just 30 minutes.
However, the rapid adoption of AI has raised concerns among some officers. Critics argue the technology’s speed conceals its limitations, such as inaccuracies in Arabic-language processing, where slang and keywords were reportedly misunderstood.
A former senior military official noted that intelligence reports presented to commanders often did not clarify whether data was derived from human sources or AI, complicating the evaluation process. “Everything was treated as the same,” another former senior official said. “I’m not even sure the person preparing the report knew the difference between the pieces of information.”
The Israeli military utilizes advanced AI systems, including "The Gospel" and "Lavender," to predict potential civilian casualties during operations. These tools rely on data mining software that combines image recognition from drone footage with tracking data from smartphones connected to cellular antennas to estimate the number of civilians in a targeted area.
In 2014, the IDF’s accepted civilian-to-combatant casualty ratio was one civilian per senior terrorist, said Tal Mimran, a former legal advisor to the military. According to Breaking the Silence, a group citing testimonies from IDF soldiers, that ratio has risen during the current war to 15 civilians per low-ranking terrorist and exponentially higher for mid- and senior-level operatives. The New York Times recently reported the figure may be closer to 20 civilians per terrorist.
The IDF has maintained that its assessments of collateral damage comply with international law, which requires distinguishing between civilians and combatants and taking precautions to minimize harm to non-combatants.

Internal debate: Man vs. machine

Supporters of these technologies argue that aggressive implementation of AI innovations is essential for the survival of a small nation facing determined and powerful adversaries. “Technological superiority is what keeps Israel safe,” said Blaise Misztal, vice president for policy at the Jewish Institute for National Security of America, who was briefed by the IDF’s intelligence division on its AI capabilities in 2021. “The faster Israel is able to identify enemy capabilities and take them off the battlefield, the shorter a war is going to be, and it will have fewer casualties.”
Unit 8200 base in Glilot
(Photo: Yariv Katz)
However, concerns about the quality of AI-generated intelligence have sparked internal divisions within the IDF. According to three sources, the reliance on such technology has shifted priorities, favoring technical capabilities over traditional practices.
For example, Unit 8200 has long allowed junior analysts to bypass their immediate commanders and relay warnings directly to senior officials. Some worry that the reliance on AI systems may have undermined this practice, potentially weakening the IDF's decision-making processes.
Sariel has resigned from the military, citing his responsibility for the intelligence failures leading to Hamas’ surprise attack on October 7. The resignation was submitted to IDF Chief of Staff Lt. Gen. Herzi Halevi and Military Intelligence Chief Maj. Gen. Shlomi Binder.
According to two former senior officers interviewed by the Post, the unit’s growing reliance on AI was a key factor in Israel’s lack of preparedness. They claim an overemphasis on technological findings hindered analysts from effectively conveying warnings to top commanders. “This was an AI factory,” said one former military leader, speaking on the condition of anonymity to describe national security topics. “The man was replaced by the machine.”
Sariel’s name surfaced after a security breach reported by The Guardian. A book he authored, The Human Machine Team, published on Amazon, revealed a digital trail linked to a personal Google account under his name. The account contained unique identifiers and maps tied to his work. The book outlines Sariel’s vision of integrating AI into defense, emphasizing how it could revolutionize the relationship between humans and machines in military operations.
Before taking command of Unit 8200, Sariel spent a sabbatical year at the Pentagon-funded National Defense University in Washington. A professor from the institution described Sariel’s radical vision for AI on the battlefield, which he shared in his book. Sariel proposed that AI would transform all aspects of defense, including border security, turning Israel’s borders into "smart borders" using advanced surveillance technologies. He also predicted that within five years, AI could replace 80% of intelligence analysts specializing in foreign languages.
When Sariel returned to Israel, he implemented his vision. Appointed commander of Unit 8200 by then-IDF Chief of Staff Aviv Kochavi in the summer of 2020, Sariel faced internal concerns from former commanders who privately expressed worries about what they described as a "religious attitude toward AI" developing under his leadership.
Unit 8200, renowned for its cutting-edge cyber technologies and online intelligence gathering, has long held a reputation for technological excellence. However, the sheer volume of information often overwhelmed analysts.
For instance, Hamas operatives frequently used the word “watermelon” as code for a bomb, but the system struggled to differentiate between discussions about actual watermelons and encrypted messages. “If you pick up a thousand conversations a day, do I really want to hear about every watermelon in Gaza?” one insider told the Washington Post.
Under Sariel, the unit intensified its data mining efforts and restructured intelligence operations. New technologies were introduced to run algorithms rapidly over incoming data in anticipation of a potential war with Hezbollah. One such innovation was an app called "Hunter," which allowed soldiers in the field to access intelligence data directly in real time.
Long before Hamas’ surprise attack, Unit 8200 had been meticulously constructing a target bank, requiring analysts to verify findings with at least two independent sources and to continuously refresh the data, according to three individuals familiar with the program.
Before a target could be added to the "bank," it had to be "validated" by a senior officer and a military lawyer to ensure compliance with international law. Senior intelligence officials, including Sariel, believed machine learning could dramatically accelerate this process.
“They really did believe with all the sensors they had all around and above Gaza, I won’t say total informational awareness, but that they had a very good picture of what was happening inside,” said Misztal, who leads an organization focused on security cooperation between the United States and Israel.
Following the events of October 7, Unit 8200 has reportedly increased efforts to recruit additional Arabic-speaking analysts, including those tasked with evaluating and critiquing AI systems, according to three sources cited by the Post. Meanwhile, Israeli officials are notably more restrained in their public remarks about the use of artificial intelligence.