“Shell shock,” the term that would come to define the phenomenon, first appeared in the British medical journal The Lancet in February 1915, only six months after the commencement of the war. In a landmark article, Capt. Charles Myers of the Royal Army Medical Corps noted “the remarkably close similarity” of symptoms in three soldiers who had each been exposed to exploding shells:
Case 1 had endured six or seven shells exploding around him;
Case 2 had been buried under earth for 18 hours after a shell collapsed his trench;
Case 3 had been blown off a pile of bricks 15 feet high.
All three men exhibited symptoms of “reduced visual fields,” loss of smell and taste, and some loss of memory. “Comment on these cases seems superfluous,” Myers concluded, after documenting in detail the symptoms of each. “They appear to constitute a definite class among others arising from the effects of shell-shock.”
Early medical opinion took the common-sense view that the damage was “commotional,” or related to the severe concussive motion of the shaken brain in the soldier’s skull. Shell shock, then, was initially deemed to be a physical injury, and the shell-shocked soldier was thus entitled to a distinguishing “wound stripe” for his uniform, and to possible discharge and a war pension. But by 1916, military and medical authorities were convinced that many soldiers exhibiting the characteristic symptoms—trembling “rather like a jelly shaking”; headache; tinnitus, or ringing in the ears; dizziness; poor concentration; confusion; loss of memory; and disorders of sleep—had been nowhere near exploding shells. Rather, their condition was one of “neurasthenia,” or weakness of the nerves—in layman’s terms, a nervous breakdown precipitated by the dreadful stress of war.