In ancient times, the manner of death was simply inferred from where and how the victim had been found. A man found in a body of water, for example, was assumed to have drowned, while a man found lying broken and bloodied along the side of a road was assumed to have fallen, and perhaps been dragged by a horse.
Suspicion of motive and the word of others against a possible murderer took precedence over any other facts, and when all else failed, torture was readily available to procure a confession.
In the middle of the 13th century, the Chinese were credited with the first attempt to distinguish natural death from criminal intent. In a book called The Washing Away of Wrong, the author, Sung Tz'u, observed that water collected in the lungs of drowning victims and that strangulation could be inferred from damaged cartilage in the neck. As he so wisely said, so many hundreds of years ago, "The difference of a hair is the difference of a thousand li." (A li is a Chinese unit of distance, roughly a third of a mile.) The book became an official text for coroners.
In 1775, Karl Scheele realized he could transform arsenious oxide into arsenious acid, which, when combined with zinc, produced arsine. This discovery led to the eventual ability to detect arsenic poisoning.
By the early 1800s, fingerprint patterns were being studied, but decades would pass before that observation was applied to criminal and personal identification.
In 1835, Henry Goddard, a Bow Street Runner, carried out the first documented comparison of bullets by law enforcement to catch his man: he noticed a flaw in a recovered bullet and traced it back to the original bullet mold.
A few years later, a doctor experimenting with the corpses of soldiers in Malta discovered that body temperature dropped at regular intervals following death, and could therefore be used to determine time of death.
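That cooling observation still underlies rough time-of-death estimates today. As an illustrative sketch only, using the textbook rules of thumb of a 98.6 °F starting temperature and cooling of about 1.5 °F per hour (these figures are modern approximations, not values from the historical account above):

```python
def hours_since_death(body_temp_f, start_temp_f=98.6, cooling_rate_f_per_hr=1.5):
    """Rule-of-thumb algor mortis estimate.

    Assumes a normal living body temperature of 98.6 degrees F and an
    average post-mortem cooling rate of roughly 1.5 degrees F per hour.
    Valid only as a rough guide, and only until the body nears the
    ambient temperature of its surroundings.
    """
    return (start_temp_f - body_temp_f) / cooling_rate_f_per_hr

# A body measured at 92.6 degrees F suggests roughly 4 hours since death.
print(hours_since_death(92.6))  # → 4.0
```

In practice investigators adjust such estimates for ambient temperature, clothing and body mass, which is why the figure is only ever an approximation.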
The discovery that fingerprints were unique to each individual, and could therefore identify a particular person, pushed forensic crime investigation to the forefront in 1788, when the German anatomist J.C.A. Mayer published an illustrated anatomy book in which he claimed that "the arrangement of skin ridges is never duplicated in two persons."
Decades later, William Herschel, a Briton living and working in British India, demanded that his contracts be "signed" with fingerprints so that they would be "impossible to deny or forge." "The impression of a man's finger on paper cannot be denied by him," he stated. Naturally, he was scoffed at.
Across the miles, another Briton, this one living in Japan, had come to the same conclusion. Henry Faulds was curious whether fingerprints remained the same despite efforts to erase them. Experimenting with volunteers, he applied pumice stone, sandpaper and even acids to see whether fingerprints would appear different after new skin growth. They didn't.
[Illustrations of fingerprint pattern types: the central pocket loop and the lateral pocket loop.]
In his book Life on the Mississippi, Mark Twain writes of a murderer identified by his fingerprints. The technique had seen its first official use in the United States in 1882, when fingerprints were employed to guard against document forgery in New Mexico.
In 1863, the German scientist Christian Schönbein inadvertently discovered the first presumptive test for the presence of blood when he observed that hemoglobin oxidizes hydrogen peroxide, causing it to foam.
By 1879, another German, Rudolph Virchow, had become one of the first to study the distinguishing characteristics of hair in the pursuit of individual identification.
In 1888, during the reign of England's most notorious serial killer, Jack the Ripper, crime scene photographs were studied extensively in an effort to detect clues to the vicious murderer, and Scotland Yard made what is considered the first attempt at criminal profiling, based on the Ripper's savage modus operandi.
By the early 1900s, the field of forensic investigation had achieved major advances through the design and use of modern forensic methods and discoveries such as benzidine, a chemical compound used in a universal presumptive test for blood.
Perhaps the most famous of forensic developments, at least on a psychological level, was Edmond Locard's statement that "every contact leaves a trace." The principle, published in Locard's L'enquête criminelle et les méthodes scientifiques and popularly known as Locard's Exchange Principle, remains the backbone of forensic evidence collection and recovery to this day.
By the beginning of the 20th century, the study of hairs, fingerprints and blood had thrust the development of forensic investigation to new heights. Locard, a professor at the University of Lyon, France, created the first crime laboratory for use by police and other law enforcement personnel.
In 1924, the first American police crime lab was created in Los Angeles, California, and the Sacco and Vanzetti case popularized the microscopic comparison of bullets. Following the St. Valentine's Day Massacre in 1929, Calvin Goddard founded the Scientific Crime Detection Laboratory at Northwestern University in Evanston, Illinois.
By 1930, an American criminalist named Luke May had developed tool mark striation analysis and published, in the American Journal of Police Science, an article on identifying and distinguishing knives, tools and other instruments by the marks they leave.
Just prior to the Second World War, a German named Walter Specht developed a chemical reagent called luminol, still used to this day as a presumptive test for the presence of blood.
The years following the war exploded with developments, including the tape-lift method for recovering fingerprints, voiceprint identification and perhaps the most famous discovery in the history of forensic science: the unique structure of DNA, identified by Watson and Crick in 1953.
By the mid-1960s, forensic developments included the identification of firearm residue left on skin and clothing, Breathalyzer tests to determine sobriety and refined methods for estimating time of death from post-mortem cooling.
In 1975, the U.S. Supreme Court promulgated the Federal Rules of Evidence, which Congress enacted by statute. These rules require that scientific evidence be relevant, and not unfairly prejudicial, to be presented in a criminal case. A mere two years later, the FBI began computerized scanning of fingerprint cards from thousands of individuals for its Automated Fingerprint Identification System, known to law enforcement personnel today as AFIS.
Advances in DNA profiling and blood analysis produced methods such as RFLP (restriction fragment length polymorphism) and PCR (polymerase chain reaction) testing, making it possible to identify victims as well as suspects through a process commonly known as DNA fingerprinting, the most famous forensic discovery of the 20th century.
In 1987, the first case to go to trial using DNA evidence became a global event. A seventeen-year-old British man accused in two local rape-murders was cleared only after DNA samples from some 5,000 local men identified the true perpetrator, Colin Pitchfork, the first man convicted on DNA evidence. That same year, fierce debate over whether to admit DNA evidence in an American case spurred efforts to certify and standardize forensic quality-control guidelines throughout the United States and the world.
By 1989, American courts had accepted DNA evidence as sound and valid, and the first American convicted on the basis of DNA evidence was sentenced to 25 to 50 years for rape.
The National DNA Index System, created by the FBI in 1998 for law enforcement agencies throughout the United States, allows agencies both large and small to access and compare DNA profiles from around the country.
Today, a wealth of technological advances has made forensic investigation far easier than it once was. Despite such advances, however, crime scene investigation still takes a human brain to rationalize and conceptualize what has happened at any crime scene. While forensic investigators rely on these medical and scientific advances, one must never forget that the human factor comes into play in every crime.
Such basics have not changed in thousands of years, and while forensic science can explain the how of a crime, it can never solve the why. It falls to the crime scene investigator and law enforcement personnel to establish motive prior to trial.
It is up to the trained crime scene investigator to take advantage of every scientific and technological development in forensic investigation in order to correctly analyze, retrieve and collect evidence from the scene of any crime.
Crime scene analysis combines the human factor with scientific procedures and methods to interpret what has occurred. The scientific evidence may speak for itself, but it takes human understanding, and human voices, to translate that evidence for a court of law and ensure that justice is always served.