Perhaps a result of the growing autonomy of youth, or of the accessibility of specialist information, self-diagnosis is on the rise. Upward of 30 percent of Americans have used the Internet to reach a medical conclusion, and that number is only growing. But what are the merits of giving yourself a diagnosis?
A broken arm is a case in which someone could be well justified in making a household diagnosis. But this kind of self-assessment, based on a layman's understanding of the body, doesn't hold up for something subtler, like a venereal disease, or for something internal that is understandable only through ambiguous symptoms. Self-diagnosis becomes even less credible when it comes to emotional disorders.
Self-diagnosis has an essential place in over-the-counter treatment of minor ailments (like menstrual cramps or allergies), but diagnosing oneself with a severe mental disorder and parading the conclusion as legitimate is foolish. Not only does it devalue the expertise of a doctor or psychiatrist, but it can end the pursuit of a real conclusion right then and there. And although the Internet is a great source of information (and can often replace other sources), much of it is unvetted, and the chance of collecting symptoms that point to a false diagnosis is great.
By the loosest clinical definition, almost any degree of emotional distress counts as "mental illness." But genuine illnesses, chronic or genetic, are much harder to pin down. Bipolar disorder is notoriously difficult to diagnose, and that's a good thing: a small degree of emotional flip-floppiness shouldn't be treated with prescription medicine. In the United States, ADHD diagnoses have gone up 42 percent over the past eight years, and critics question whether all of these are genuine or whether the bar for the label has dropped too low.
An example of poor self-diagnosis concerns gluten: celiac disease (an autoimmune disorder in which gluten damages the small intestine) affects only 1 percent of people in the United States, yet now it seems everybody and their dog is gluten-free. Meanwhile, non-celiac gluten sensitivity, or gluten intolerance, the new justification for the diet, is the subject of serious debate over whether it even exists. Yet so many people, without guidance from medical professionals, have made the decision for themselves; there seems to be a "fad factor" at play.
Again, consider anxiety. I dated a girl who would wake up in the middle of the night with cold sweats and heavy breathing, and I would have to coax her back down to security and reassurance. Now, though, anxiety is a self-diagnosed label almost omnipresent among confused youth, especially on breeding grounds for histrionic behavior like Tumblr.
Everyone remembers middle school, when the emo wave romanticized depression. That wave seems to have been replaced by one that romanticizes anxiety without coming anywhere near experiencing it.
None of this criticism on the mental health front should be confused with criticism of other fields of identity. Self-labeling can be a useful endeavor and a just expression of human creativity; words like "sapio-" or "demisexual" are useful for individual expression and for refining the vast, complex field of human sexuality. These aren't diagnoses per se but self-exhumed categorizations of thoughts and feelings, unlike the steadfast designations of mental health. Besides, the individual is always going to be more alert to their own sexuality than a doctor could ever be.
I understand the struggle of having an undiagnosed illness, and I understand the desperate need to attach a clinical label. Last year I finally received a conclusive diagnosis – and then the same one, twice right after, from equally authoritative sources. Expediency lost out to certainty, and this is a good thing: a broken arm is not the same thing as a fractured psyche.
When people self-diagnose, they not only risk false conclusions but also muddy the mental health vocabulary for everyone else.
William Rein can be reached at [email protected] or @toeshd on Twitter.