Since 2000, the U.S. Food and Drug Administration has reported a general decrease in the number of new molecular entities – drugs with new chemical structures – submitted for approval each year, indicating a decline in the discovery of new drugs. Thirty-six years ago, pharmaceutical professionals also worried about drug discovery and lamented a decline in the rate at which new drugs were reaching the U.S. market. In December 1970, Technology Review published “Drugs: Has the Age of Miracles Passed?…and Will That of Science Ever Dawn?” It was a two-part review of a symposium on drug discovery at the American Chemical Society’s national meeting that year. Barry M. Bloom, a representative from Pfizer, presented data, summarized in the bar graph shown here, demonstrating a decline in the rate of drug discovery.
In 1962 the Food, Drug, and Cosmetic Act was amended to require that any new drug be not only safe but efficacious. Since 1962, Mr. Bloom finds, there has been a marked tendency for new drugs to be antibiotics, cancer treatments, or neural agents, to the neglect of the general run of diseases. In his view, discovery goes on very much as it used to, but the only drugs that reach the U.S. market are those for which an efficacy demonstration is relatively easy.
The section concludes that it now costs about six times as much to discover a new drug, saleable in the U.S., as it did ten years ago. (Francis J. Blee, of Drexel Harriman Ripley, Philadelphia, produced the following figures: research and development investment per “new entity” between 1956 and 1962, $4.1 million; between 1963 and 1969, $23.1 million.) There is therefore a keen interest in the question of which methods of finding drugs are likely to work.
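A quick back-of-the-envelope check – ours, not the article’s – using only the two figures just quoted bears out the “about six times” estimate:

```python
# Rough check of the cost ratio implied by Blee's figures
# (dollar amounts in millions, not adjusted for inflation).
cost_1956_1962 = 4.1    # R&D investment per "new entity", 1956-1962
cost_1963_1969 = 23.1   # R&D investment per "new entity", 1963-1969

print(cost_1963_1969 / cost_1956_1962)   # ~5.6, i.e. roughly six times
```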
Of equal concern is how these drugs function within the human body. In this month’s Q&A, Harvard University computational biologist George Church discusses the impact of scientific breakthroughs such as high-throughput gene sequencing on our understanding of biological systems. Church believes the day is near when drugs will be engineered to recognize tumors and when individuals will be able to have their genomes sequenced affordably, revolutionizing personal medical treatment. In the second part of the 1970 article, scientists dreamed of such increased knowledge of biology and its potential impact on drug discovery while recognizing the gaps in their era’s science.
Finding new drugs is a form of professional gambling: some people think they have a system. William P. Purcell of the University of Tennessee holds that “from a philosophical point of view, one can reason that if we have the resources, such as manpower, knowledge, sophisticated instrumentation, computers, etc., to bring a successful ‘moon walk’ to fruition, one would anticipate that a molecule could be tailor made to be effective against a specific disease.”
Dr. Purcell favors designing drugs for specific purposes, applying computers to the correlation of biological activity with chemical structure. He admitted, though, that “one knows more about the molecular structure of an isolated molecule from instrumental analyses than about the specific interaction of this molecule with a complicated biological system…. The level of sophistication of handling simultaneous equations is greater than the understanding of a parameter from pharmacological testing.”
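To give a modern flavor of the kind of structure-activity correlation Purcell describes, the sketch below fits biological activity to physicochemical descriptors by solving a small set of least-squares equations – the “simultaneous equations” of his remark. All descriptor names and numbers are invented for illustration; the article does not describe Purcell’s own models.

```python
# A hypothetical structure-activity correlation: fit measured activity
# to molecular descriptors with an ordinary least-squares model.
import numpy as np

# Hypothetical series of six analog compounds.
# Columns: hydrophobicity (logP), an electronic parameter (sigma), intercept term.
descriptors = np.array([
    [1.2,  0.10, 1.0],
    [1.8,  0.23, 1.0],
    [2.1, -0.05, 1.0],
    [2.6,  0.31, 1.0],
    [3.0,  0.12, 1.0],
    [3.4, -0.15, 1.0],
])

# Hypothetical measured activities, e.g. log(1/C) from a pharmacological assay.
activity = np.array([2.1, 2.9, 3.3, 3.8, 4.2, 4.4])

# Least-squares fit: activity ~ a*logP + b*sigma + c
coeffs, residuals, rank, _ = np.linalg.lstsq(descriptors, activity, rcond=None)
a, b, c = coeffs
print(f"activity ~ {a:.2f}*logP + {b:.2f}*sigma + {c:.2f}")

# Use the fitted model to predict activity for an untested analog
# with logP = 2.8 and sigma = 0.05.
print("predicted activity:", np.array([2.8, 0.05, 1.0]) @ coeffs)
```

The weak link Purcell concedes is visible even in this toy example: the algebra is trivial, but the activity numbers on the right-hand side – the pharmacological measurements – are the part of the problem that was (and is) hard to obtain and interpret.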
John J. Burns, of Hoffmann-La Roche, Nutley, N.J., spoke of a related shortcoming, which he called the “biological knowledge gap”: a lack of basic understanding of disease processes and of what drugs actually do. For finding new drugs, the random synthesis of compounds, followed by screening for biological activity, is still worthwhile, as long as the biologists are good ones and as long as they talk to the chemists.