Error Message Atrophy
In the earlier days of the Internet and of computing in general, software tended to give more specific and descriptive error messages.
Sometimes error messages were scary and contained unfamiliar language. But if we used the system long enough, we would begin to recognize terms, and maybe we would even research them and gain a deeper technical understanding of what they meant. Users were given the information they needed to succeed in their new digital environment.
[TODO: Get some screenshots of common errors across multiple time periods. Browser errors like 404 and DNS failures are good examples].
[TODO: Research, and see if we can find any specific proponents of the new school, like design guidelines from Google, etc.]
If we received an error from a web browser, for example, the error would usually include a title and a message which let us know exactly what had failed. Maybe DNS failed to resolve. Maybe DNS resolved, but the host was unresponsive. Maybe the host refused connection. Maybe we connected, but encountered a 404 error. It was very easy to tell, at a glance, exactly what error had occurred.
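The failure stages listed above can be checked programmatically, too. Here is a minimal sketch (using only the Python standard library) of how a client might classify a web request failure into the same distinct categories those old browser errors reported; the helper name and message wording are my own, for illustration:

```python
import socket
import urllib.error
import urllib.request

def describe_failure(url, host, port=80, timeout=5):
    """Classify a web request failure stage by stage,
    the way old browser error pages used to."""
    # Stage 1: DNS resolution
    try:
        socket.getaddrinfo(host, port)
    except socket.gaierror:
        return "DNS lookup failed: the hostname could not be resolved."
    # Stage 2: TCP connection
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except ConnectionRefusedError:
        return "Connection refused: the host is up but not accepting connections."
    except OSError:
        return "Connection failed: the host did not respond."
    # Stage 3: HTTP request
    try:
        urllib.request.urlopen(url, timeout=timeout)
    except urllib.error.HTTPError as e:
        return f"HTTP error {e.code}: the server responded but refused the resource."
    return "Request succeeded."
```

Each stage either succeeds or yields a distinct, specific message, which is exactly the at-a-glance diagnosability that the old error pages provided.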
Sometime between 2010 and 2015, error messages seemed to grow less and less descriptive, or the descriptions were tucked away in tiny text, as in the Chrome browser. For the sake of this discussion, and since I’ve never seen a specific term for this phenomenon, let’s call it “error message atrophy.”
We can see examples of this in web applications and mobile apps, too. We are in the era of “Something went wrong” and “Try again.” Rarely are we told what exactly went wrong.
The implication seems to be that the application designers think we are too dumb to understand what is happening on our own hardware. I have long suspected that this coincided with a deliberately planned onboarding of huge user bases in the developing world, or from older demographics, who had less computing exposure than the previous user demographic, which was composed of early adopters in the most literal sense. The tech industry was bringing the unwashed masses online, and it intentionally treated them as consumers and sheep rather than savvy computer users (who, after all, might someday become competitors). The collective attitude of the big tech companies towards their users grew increasingly paternalistic. I don’t blame any single culprit, but we can look at Apple and Google as trendsetters. Microsoft definitely deserves a mention here as well.
To be fair, sometimes there are legitimate technical reasons for “something went wrong” style error messages. Lack of detailed error information might be a natural consequence of the application architecture. Quite often, applications are a mix of frontend and server-side frameworks, and errors might be caught exceptions triggered by a mishap in a very distant part of the application stack. We might reasonably criticize such an application for failing to implement its own robust error handling and reporting framework, but a robust implementation might be as complex as the actual domain logic of the application. So keep in mind that there might be “innocent” examples of error message atrophy, which arise organically rather than as part of any deliberate design choice.
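To make the trade-off concrete, here is a minimal sketch of the kind of centralized error-reporting layer described above. The exception-to-message table and the function name are hypothetical, but the structure shows why the generic fallback is so tempting: every specific message requires someone to anticipate and map a specific failure mode.

```python
# Hypothetical mapping from low-level exception types to
# specific, user-facing messages. Every entry here is work
# that the generic fallback lets a team skip.
USER_MESSAGES = {
    ConnectionError: "Could not reach the server. Check your network connection.",
    TimeoutError: "The server took too long to respond. Try again shortly.",
    PermissionError: "You don't have permission to perform this action.",
}

def user_facing_message(exc: Exception) -> str:
    """Translate a low-level exception into a specific message,
    falling back to the generic one only when no mapping exists."""
    for exc_type, message in USER_MESSAGES.items():
        if isinstance(exc, exc_type):
            return message
    # The atrophied fallback: shown for any unanticipated failure.
    return "Something went wrong."
```

An application that catches everything at the top of the stack and skips the mapping table collapses straight to the fallback for every failure, which is one organic, non-malicious route to error message atrophy.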
Also, perhaps it is not entirely fair to treat the phenomenon as an entirely malicious top-down conspiracy. After all, the popularity of Apple devices shows that a genuine demand exists for computing devices which hide the scary technical details from the end user. Some UIs should be literally childproof.