HRC Prompts CBC To Acknowledge IDF Claims That It Doesn’t Use Artificial Intelligence To Identify Terrorists

April 8, 2024

On the evening of April 4, CBC News.ca published a report by Alexander Panetta entitled “U.S. warns Israel to change course in Gaza — or else,” which gave a platform to egregious and unsubstantiated allegations that the Israeli military used artificial intelligence to identify suspected Hamas operatives, allegations which Israel strenuously denies.

Despite Israel publicly rejecting the allegations, which were contained in a report originally published by the UK newspaper The Guardian, this CBC News report failed to acknowledge Israel’s denial and presented the claims at face value.

Panetta stated the following in his report:

“On a related note, Kirby declined to comment on news reports that purport to identify one of the reasons for a high number of civilian casualties in Gaza. Israeli and British outlets this week reported on an artificial intelligence program, called Lavender, that the Israeli military has used to identify suspected Hamas operatives.  The list purportedly grew to 37,000 at one point — but was eventually scaled back — and those people were targeted for bombing, even if they were surrounded by civilians. Military personnel rarely questioned the AI before approving strikes, according to the reports.”

In sharp contrast, Reuters (a wire service of which CBC is a client) has reported that “The Israeli Defense Forces denies that AI was used to identify suspected extremists and targets.”

Reuters’ article stated the following: “‘The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process,’ the IDF said in a statement. The statement added that IDF directives mandate analysts to conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in line with international law and Israeli guidelines.”

To read a detailed statement that the Israel Defense Forces made in response to the Guardian’s report, click here.

Given the serious allegations being made, CBC was duty-bound to prominently acknowledge that Israel denies that its armed forces use AI to identify terrorists.

Within an hour of the report being published, HonestReporting Canada filed an official complaint with CBC News executives calling for corrective action. We are pleased to note that the article has since been amended and now prominently features Israel’s stated denials.

The report now states the following:

“On a related note, Kirby declined to comment on extraordinary allegations about one reason for the high number of civilian casualties in Gaza. Israeli and British news outlets this week reported on an artificial intelligence program, called Lavender, that the Israeli military has allegedly used to compile a list of people who might be Hamas operatives. The list purportedly grew to 37,000 at one point; it eventually shrank, as the search criteria were adjusted. The people on that list were allegedly targeted for bombing, even if they were with numerous civilians. Military personnel rarely questioned the AI before approving strikes on entire residential dwellings, said the reports.

The Israeli military rejected the allegations.

In a statement, the Israel Defence Forces said it does not rely on artificial intelligence to pick targets. The IDF said it merely uses the technology described in the reports to create a database of names that it consults as an additional resource. It also said it carefully assesses potential for casualties resulting from its strikes on a case-by-case basis.”

On top of including Israel’s rejection of these serious and unsubstantiated allegations, CBC made the following wording changes to its report in response to our complaint:

  • “news reports that purport to identify” to “extraordinary allegations”
  • “that the Israeli military has used to identify suspected Hamas operatives” to “that the Israeli military has allegedly used to compile a list of people who might be Hamas operatives.”
  • “even if they were surrounded by civilians. Military personnel rarely questioned the AI before approving strikes, according to the reports” to “even if they were with numerous civilians. Military personnel rarely questioned the AI before approving strikes on entire residential dwellings, said the reports.”

While we commend CBC for taking corrective action, we are deeply concerned that its own editorial oversight did not detect these problems before publication. Furthermore, in our view, CBC’s failure to append a formal correction notice to the article reflects poorly on our public broadcaster’s commitment to journalistic accountability and transparency.

This comes on the heels of an eye-opening column recently published in The Toronto Sun by journalist Warren Kinsella, who claimed that CBC has a secret committee overseeing its coverage of the Hamas-Israel war, coverage that has been overwhelmingly negative toward Israel since October 7. CBC denies the existence of a secretive committee to “oversee the network’s reporting on Israel.”

As well, it was only a couple of days ago that CBC Music host Jarrett Martineau shared with listeners a song called “River 2 the Sea,” by artist Hussein Elnamer, which Martineau introduced as being “in sonic solidarity with the people of Palestine.”

While CBC claims to produce reporting about Israel that is fair, accurate, balanced and not anti-Israel, its own actions contradict those claims.
