I heard of a similar thing happening with Asiana: the pilots can't go on strike, since they're declared an essential industry. So instead they're doing a work-to-rule slowdown. Refusing to accept small technical issues that are otherwise OK to dispatch with. Not giving voluntary extensions on their duty hours.
Of course the government-friendly media is riling up public opinion against them, claiming they're causing losses with their difficult behaviour.
But that's just an awful argument to make. An essential, safety-critical industry starts losing money the moment its employees become a little less lenient.
To be clear, we are talking about safe-to-fly airplanes. Almost everything on an airplane is redundant, and the manufacturers provide clear instructions (the master minimum equipment list) on what can be inoperative and under what conditions.
Still, the captain can decide to spend a lot of time on this: doing a very extended safety briefing before departure, delaying the flight; requesting extra fuel over safety concerns; requesting a different route because, say, the weather radar is not working and there's a small chance of bad weather along the way.
It will not be free, and it will misdiagnose exactly the same way as a human doctor would.
I'd argue that what Saraphim described (their friend dying after being constantly misdiagnosed because of her weight) is the perfect example of what we can expect from an AI doctor. These machine learning algorithms lack exactly the fidelity that is needed to understand a complex problem.
Furthermore, they have no concept of ethics or morals, and the data they are trained on reflects the imperfections of our society.
So, for example, if doctors are biased against overweight women, an AI trained on their diagnostic data will be too.
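To make that concrete, here's a minimal sketch using purely synthetic data and made-up feature names (not any real diagnostic dataset): a standard classifier trained on labels that encode a doctor's bias ends up reproducing the bias rather than the underlying condition.

```python
# Minimal sketch with synthetic, made-up data: if the training labels carry a
# bias -- overweight patients being dismissed as healthy regardless of
# symptoms -- a model fit on those labels reproduces that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

symptom = rng.normal(size=n)                  # real signal for the condition
overweight = rng.integers(0, 2, size=n)       # group attribute (0 or 1)
has_condition = (symptom > 0.5).astype(int)   # ground truth

# Biased "doctor" labels: overweight patients are labeled healthy no matter what.
biased_label = np.where(overweight == 1, 0, has_condition)

X = np.column_stack([symptom, overweight])
model = LogisticRegression().fit(X, biased_label)
pred = model.predict(X)

for g in (0, 1):
    truly_sick = (overweight == g) & (has_condition == 1)
    print(f"group {g}: share of genuinely sick patients the model flags = "
          f"{pred[truly_sick].mean():.2f}")
# Group 1's sick patients are largely missed: the model learned the doctors'
# bias from the labels, not the underlying condition.
```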
AI doesn't exist but will ruin everything anyway
One day AI will be a useful tool, but to get there, one of the things we must do is be very critical of it.