The History of Fire Alarm Systems

The first fire departments were formed more than 200 years ago. Unlike the highly trained, well-equipped community heroes of today, the first firefighters were town and community members organized into roving brigades. They walked through their hometowns on scheduled shifts, carrying bells and whistles they could ring or blow to call for assistance when they spotted a fire. While undoubtedly helpful for town safety, this system had a serious limitation: the source of a manually sounded alarm could not be located quickly.

Then, as with many aspects of home security, the invention and proliferation of the telegraph in the 1840s led to advances in fire detection. The first telegraph-based fire alarm system was installed in New York City in 1847, and hundreds of similar fire detection networks sprang up around the country in the 1850s and 1860s, mostly under the direction of the newly formed Gamewell Company. These early systems consisted of telegraph boxes placed around different sections of town, all wired to one or more central fire stations. A resident spotting a fire would crank the dial of the nearest box to dispatch a team of firefighters to deal with the blaze.

Inventor William B. Watkins further improved this system in the 1870s with his development of the heat-sensing automatic alarm. Watkins's systems automatically sent telegraph signals to the fire department when the hardware detected temperatures above a specified threshold, reducing the need for a human witness. Heat-detecting alarms remained the standard for fire detection until the 1960s, when smoke detectors became commercially available. Studies conducted by the Fire Detection Institute comparing smoke and heat sensors have found that smoke detectors sense threats faster and measurably increase home and business safety; the vast majority of fire-related deaths and injuries today occur in locations without smoke detectors.