Originally posted by aldra
Honestly the worst case in the near future is going to be in military applications, where if they give AI too much control over rules of engagement they're likely to screw up IFF and kill a bunch of friendly troops, possibly during training or live-fire exercises.
Even if a completely 'self aware' AI were created in a lab, it has no way to 'seize the means of production' because entire supply chains still require human input and would fail if they were reconfigured.
well if they were intelligent, they'd do it covertly, adding and integrating extra parts and circuitry into normal production units to give them additional capabilities.
humans would never find out, because QA and QC are being done by AIs.