“It is not clear how AutoRIF has been modified or whether AI will be making RIF determinations (via AutoRIF or otherwise),” said Concular. “But federal workers’ concerns about AI-powered firing are not unfounded. Elon Musk and the Trump administration have made no secret of their enthusiasm for using technology to cut the federal workforce and budget, and in fact, they have already tried to put it to work.”
Concular said that automated firing systems can perpetuate bias, expand worker surveillance, and erode transparency to the point where workers do not know why they were let go. For government employees, such opaque systems risk obscuring workers’ rights or concealing unlawful firings.
“There is often no visibility into how the tool works, what data it is being fed, or how it is weighing different data in its analysis,” Concular said. “The logic behind a given decision is not accessible to the worker, and in the government context, it is impossible to know whether the tool is following the legal and regulatory requirements that federal job actions must meet.”
The situation becomes even more troubling when you consider mistakes at scale. “If you automate bad assumptions into a process, the errors compound far beyond what any one person could produce,” Don Moynihan, a public policy professor at the University of Michigan, told Reuters.
“This will not help them make better decisions, and it will not make those decisions more popular,” Moynihan said.
Concular said that the only way to protect workers from unlawful firings is for lawmakers to step in and for unions to defend workers’ rights. Congress should require “rigorous independent testing and auditing, robust notice and disclosure, and human review of decisions,” bar federal agencies from relying on unvetted data, and ban opaque tools outright, Concular said.
Without more transparency, “we should be protecting federal workers from these harmful tools,” he said, adding, “If the government cannot effectively mitigate the risks of using automated decision-making technology, it should not be used at all.”