

We’ll need to make sure this bias against female (and also male) patients isn’t adopted by the AI. We already don’t properly test medicine on both sexes, and medical textbooks often list conditions as more or less common in one sex. That may well be true, but if the training data isn’t properly screened, we’re just moving the problem into the model. Data can exist and still be wrong for many reasons, and we should address that urgently, because it’s bad for everyone. I think it’s plausible an AI could have reached the same conclusion here, given how many mental health problems are considered far more common in women. Has anyone ever checked where that data originally comes from? Some of it, I’m sure, hasn’t been rechecked in the last 50 years.
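
To be concrete about what I mean by “screening the data”: even something as simple as comparing diagnosis rates between sexes before training would at least surface the skews you’d otherwise bake into the model. This is just a toy sketch with made-up records and column names (`sex`, `diagnosis`), not any real pipeline:

```python
# Toy audit of a training dataset for sex-based skew before it goes near a model.
# The records and the 20-percentage-point threshold are made up for illustration.
from collections import Counter

records = [
    {"sex": "F", "diagnosis": "anxiety"},
    {"sex": "F", "diagnosis": "anxiety"},
    {"sex": "F", "diagnosis": "cardiac"},
    {"sex": "M", "diagnosis": "cardiac"},
    {"sex": "M", "diagnosis": "cardiac"},
    {"sex": "M", "diagnosis": "anxiety"},
]

def diagnosis_rates_by_sex(rows):
    """Return, per (sex, diagnosis) pair, the fraction of that sex's records with the diagnosis."""
    totals = Counter(r["sex"] for r in rows)
    pairs = Counter((r["sex"], r["diagnosis"]) for r in rows)
    return {(sex, dx): count / totals[sex] for (sex, dx), count in pairs.items()}

# Flag any diagnosis whose rate differs a lot between sexes. That's not proof of
# bias by itself, but it's a prompt to go check where the skew in the source data
# actually came from before training on it.
rates = diagnosis_rates_by_sex(records)
for dx in {r["diagnosis"] for r in records}:
    f_rate = rates.get(("F", dx), 0.0)
    m_rate = rates.get(("M", dx), 0.0)
    if abs(f_rate - m_rate) > 0.2:
        print(f"{dx}: F={f_rate:.0%} vs M={m_rate:.0%} — check the source")
```

A flagged gap might be a genuine physiological difference, or it might be 50-year-old received wisdom that nobody ever rechecked. The point is that you only find out by asking, and the model won’t ask.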

