Moving Into the Third Millennium After a Century of Screening

Prabhavathi B. Fernandes

Ricerca, LLC, Concord, Ohio

For many decades, the discovery of drugs has depended on screening. From the previous century until the 1960s, chemical compounds and natural products were tested in whole animal assays, and in the case of antibacterial and antitumor testing, the samples were tested in whole cell assays. Although screening was labor intensive and expensive, it was fruitful. A large number of successful drugs, such as the cephalosporins, aminoglycosides, tetracyclines, macrolide antibiotics, doxorubicin, etoposide and other anticancer agents, steroids, and drugs for use in the central nervous system and cardiovascular areas, were discovered by screening [1,2]. In many cases, identification of a lead molecule, such as penicillin, cephalosporin, the macrolides, captopril, and Mevacor, spawned a large number of analogs through chemistry programs [3-7]. A single lead molecule identified in one company has, in many cases, kept several hundred industrial chemists and biologists busy for several years.

Screening natural products and a limited number of compound libraries fell out of favor in the early 1980s because very few useful new chemical entities were being identified. Advances in molecular biology in the 1980s contributed to a growing reliance on structural biology, and it was thought that new drugs could be designed [9-11]. At this time, analytical methods were being refined for structural biology. The concept of screening chemicals made in academic and industrial laboratories began to complement the screening of natural products. A large pool of compounds became available through the opening of Russia's borders. Academic chemists, who worked to establish new synthetic methods, became sources of chemical diversity for screening, as the compounds on their shelves, in many cases, had never been tested for biological activity. This chemical diversity was increased tremendously by chemical stores of large numbers of compounds that had been made for past programs at pharmaceutical and chemical companies. At the same time, progress in biotechnology allowed isolation of large amounts of pure proteins that were drug targets. Thus, testing chemical and natural product libraries against isolated enzymes and proteins became the preferred method for screening [12]. Combinatorial synthesis of thousands of molecules became possible in the 1990s, and many companies coupled combinatorial chemistry with high throughput screening [13-20]. The use of microtiter plates made it possible to screen large numbers of samples while keeping reagent costs down. The companies that were the first to use these methods had products in their portfolios in the 1990s.

Did the use of isolated enzymes and cells speed drug discovery, or did it simply delay the testing of drug leads in whole cell assays or animals? Many drug leads did not show activity in whole cells, which kept many chemists employed improving the cell permeability of leads identified in soluble protein screens. Time gained in identifying a lead molecule could often be lost trying to obtain whole cell activity [21]. Of course, if the target was at the cell surface, as with receptors, screening with isolated membrane systems led to a fair degree of success.

Structural chemistry in the mid-1990s progressed to identify chemical types that could recognize motifs within certain protein classes, such as G-protein coupled receptors and tyrosine kinases [22,23]. Sensitive reporter systems, such as luciferase and fluorescence, which could be used to detect intracellular changes, made whole cell based screening feasible again [24]. Whole cell based screening provided the flexibility of screening against proteins in interacting pathways as well as identifying leads that were already active in cellular systems. Today, both isolated proteins and whole cell systems are used in high throughput screening.

Biotechnology has made it possible to identify new drug targets and develop many assays in a short time frame. The identification of many new potential drug targets derived from genomic sequences has increased the drive to win the discovery race by testing hundreds of thousands of samples per day in a large number of screens [24]. Miniaturization of screens has allowed throughput to increase, and microplates ranging from 384 to 9600 wells have been developed [24]. Databases have been built to accommodate millions of data points. During the late 1990s, high throughput screening groups became ultra high throughput screening groups.

The engineering and automation industries have become major players in the high throughput screening arena. The instrumentation for sample delivery and plate handling, and the millions of repetitive motions required for running assays, have made robotics and automation essential components of high throughput screening. Small sample sizes have necessitated the development of very sensitive readers [24-26].

Because high throughput screening requires careful coordination among biologists, chemists, and computer and database specialists, as well as automation, high throughput screening departments now consist of a mixture of biologists, chemists, engineers, and computer and networking specialists. High throughput screening departments have become high technology departments. The promise of high throughput screening is that more leads will be identified and that these screening leads will be developed into new chemical entities and eventually new drugs. During the last few years, the number of new chemical entities entering the marketplace has not increased much in spite of the increased throughput. The art of screening needs to be coupled to the science of screening again in order to turn high throughput and ultra high throughput screening into effective throughput screening.

The 1980s held the promise of structural drug design; the 1990s, the promise of combinatorial chemistry. Future successes may be derived from genomics and new gene sequences that can be identified as drug targets. Identifying the biological role of proteins coded by these new genes, as well as their interacting proteins, has opened the field of functional proteomics [27]. As analytical and computational tools meet higher resolution needs, structural biology will increase in importance for designing effective blockers or activators of functional domains [28]. Functional proteomics is expected to give hundreds of new targets to drug discovery in the next decade [29]. As the next millennium unfolds, these new protein targets will be tested in high throughput screening. The challenge is to keep high throughput screening an effective means of drug discovery and to deliver on the promise of finding new drugs from screening.
