Yes, the details matter.
AI that can scan x-rays, analyze bloodwork, evaluate my poop for life-threatening conditions, or otherwise augment a doctor's treatment? AI models that look at millions of possible treatment plans and find the ones most likely to be successful? Wonderful.
AI systems that remove the human connection? AI that evaluates treatment not on medical efficacy but on cost models? AI used to make healthcare cheaper without making outcomes better? Do not want!
A very real issue is the deskilling of doctors who rely too heavily on AI. Studies have found that doctors who used AI assistance during colonoscopies became measurably worse at detecting abnormalities on their own after getting used to the tools. Reliance on the AI eroded the very skill it was meant to augment.
Use of AI in some cases and for some conditions produces far better outcomes for patients. In some cases it augments what a skilled doctor can do. In other cases it harms patients. And in still others it adds no medical value while raising both risk and cost, such as transcription errors that go uncaught, or case summaries that are wrong in critical ways.