
I can imagine people using these new capabilities to diagnose skin conditions. Should dermatologists be worried?


They should be worried about what they're gonna do with all their free time, now that they have a tool that helps them identify skin conditions much faster than ever before.

Same as programmers and artists.

It's a tool.

It must be used by humans.

It won't replace them, it will augment them.


This is a good point, but I might replace "with all their free time" with "as a job".

I love everything we can do with ML, but as long as people live in a market economy they'll get paid less when they are needed less. I hope that anyone in a career which will be impacted is making a plan to remain useful and stay on top of the latest tooling. And I seriously hope governments are making plans to modify job training / education accordingly.

Has anyone seen examples of larger-scale foresight on this, from governments or otherwise?


A new tool was released. People will choose whether to learn it, whether to use it, and how to use it. If they don't do so of their own volition, market forces might dictate they HAVE to learn it and use it to stay competitive, if it turns out to be such a fundamental tool.

For example (with made-up numbers), a dermatologist might choose to rely solely on an AI that catches 90% of cases in 10 seconds. Another might choose to never use it and just check from experience, catching 99% of cases but taking 10x as much time. Another might use it to double-check himself, etc.

Which one is "correct"? If a dermatologist relies exclusively on AI out of laziness, he opens himself to the risk of malpractice, but even that risk can be acceptable if it means seeing 10x as many patients in the meantime.

That is to say, the use of AI by humans is purely a subjective choice dictated by context. But in no case is there a sentient AI that completely replaces a dermatologist. As you said, the only thing that can happen is that those who use AI will be more efficient, and that is hardly ever a negative.

This also applies to programmers, artists, and anyone else who is "threatened" by AI. A human factor is always necessary, and will be for the foreseeable future, even just to have someone to point fingers at when the AI inevitably fucks up enough to involve the law.


They should be thrilled, they can spend more of their time treating people who need it and less time guessing about who those people are.



