- cross-posted to:
- hackernews@lemmy.bestiver.se
Archive: [ https://archive.ph/cJ53M ]
Some of the department’s leaders worried about the accuracy of these algorithms. One audit of a language-processing technology revealed that the software’s predictions were less accurate than a human officer would have been, according to two of the people.
Others worried that predictions from the software were being given too much weight. Typically, the research division would produce daily intelligence reports for senior commanders to review potential targets. But though an individual analyst could double-click to see the information that led to the prediction, senior commanders were not informed whether a recommendation was derived through an algorithm or through human sourcing.
“Everything was treated as the same,” another former senior official said. “I’m not even sure the person preparing the report knew the difference between the pieces of information.”
Two former senior military leaders told The Post the emphasis on technology eroded 8200’s “culture of warning,” where even low-level analysts could easily brief top commanders about concerns. This shift, they added, is a significant reason Israel was surprised by the Oct. 7 attack: An experienced female analyst who had surfaced Hamas’ battlefield plans for breaking into Israel’s borders was unable to get a meeting with the unit’s top commanders in time.
they built the prototype; the real skynets are getting built in the united states and they’re called “cop cities”

the one in atlanta will be the first, and i’m willing to bet the farm that the drones we’ve been seeing are test flights for skynet’s eyes in the sky.