Researchers have been very creative when it comes to developing novel assays, but the trick is often making them high throughput.

Patrick McGee, Senior Editor

The majority of assay development, the process of looking at the kinetics and pharmacology of an assay and ensuring it is properly configured, has been done in the same manner for a number of years. But scientists are constantly looking for ways to tweak existing assays and develop new ones that are more precise and sensitive while being less expensive. Drug Discovery & Development spoke with leading researchers at pharmaceutical and biotechnology companies, academic labs, and vendor companies to get their perspective on some of the more recent trends in assay development, some of the common mistakes people make when developing new assays, and how to get more data from them.

Roche scientist Ralph Garippa, shown using automated cellular imaging to screen new cancer drug candidates, says his labs now develop assays in parallel when they are thinking of beginning a high-throughput screening program to better determine the optimal assay. (Source: Roche Discovery Technologies)

Carol Ann Homon, PhD,
Boehringer Ingelheim Pharmaceuticals Inc.

Homon says that while it is clear to most that cellular assays are always more difficult than in vitro target assays, there are still some who imply that cellular assays are no more difficult. "Certainly the complexity of the cell would logically say that it would have to be more complex," especially since researchers working with cellular assays are typically dealing with transfected cells.

Homon notes that, given the industry setting in which she works, when she talks about assay development she is thinking of automated assays. "One thing we have seen is taking the hand-performed assay and converting it over to an automation format. There's always some step in that assay that is difficult to convert for one reason or another. . . . As you go into the miniaturized formats, if you really want to maintain the quality of the assay, which is critical in our thinking, it doesn't take much to fall off the edge of the earth."

Protein instability is another problem commonly encountered, especially when working with purified proteins, in which case researchers may get two to four hours of stability before the protein seriously starts to degrade. "In the automated, high-throughput world, that's not a lot of time. That means if you have a 10-hour day, you're going to have to replenish that reagent at least three times during the course of the day." Some groups will use proteins even if they've lost 50% of their activity, but Homon's group does not because at 50% activity, it is unclear what is wrong with the protein.
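Homon's replenishment arithmetic falls out of the stability window directly. A minimal sketch, using illustrative numbers (a 10-hour run and a protein stable for roughly 2.5 hours; neither figure is from her lab):

```python
import math

def replenishments(day_hours: float, stable_hours: float) -> int:
    """Number of times a reagent must be replaced after the initial load,
    assuming it degrades seriously once `stable_hours` have elapsed."""
    return math.ceil(day_hours / stable_hours) - 1

# A 10-hour automated run with a protein stable for ~2.5 hours:
print(replenishments(10, 2.5))  # → 3 additional batches beyond the first
```

At the short end of the two-to-four-hour window the count only grows, which is why Homon budgets for at least three replenishments per day.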

Still another challenge is the variability in reagents, which Homon says can vary "dramatically" from vendor to vendor. "It's just amazing sometimes. It makes no sense, and it's very, very difficult to track down." In her labs it is a rule that when a new assay is being developed, the researchers must start with all new reagents.

James Inglese, PhD, and
Douglas Auld, PhD, the National Institutes of Health Chemical Genomics Center (NCGC)

Inglese believes that ion channel technologies are one key area that researchers need to be familiar with, especially as ion channels have been one of the most important target classes in the pharmaceutical industry. But ion channels have been difficult to develop assays for, especially high-throughput assays. "A lot of this relates to the fact that the gold standard for ion channels are the patch-clamp or the electrophysiology-based outputs. For the longest time, those have not been amenable to the microwell format that we are used to," says Inglese.

"Over the past few years, there has been development of several ion-channel technologies that move closer to making actual electrophysiology outputs become a reality. There are still issues that are present but I think like all of these improvements in assay technology, a lot of them tend to be incremental and based on an old, robust method, but which are essentially being updated so that they could work with a more high-throughput capacity."

Inglese says such newer technologies could be a boon for assay development, which he calls one of the most rate-limiting steps in screening/early lead discovery. Because the data scientists are using to gain biological insights is increasingly being generated by automated technologies, it is crucial that the assays being developed meet the specific requirements of these systems. He adds that one of the biggest challenges they encounter at NCGC, which performs large-scale screening for academia, is that many investigators don't appreciate the simplicity required for an assay protocol to run robustly on the robotics platforms that they employ. "If we are lucky, they are working with 96-well plates. Most of the time they have Eppendorf tubes and are doing all kinds of manipulations and centrifugation steps, filtering, things like that. These, of course, don't translate very well to automation."

A functional GeneBlazer assay for the GPCR target VPAC-1. Fluorescence cell images were obtained by loading with a LiveBlazer substrate. (Source: Invitrogen Corp.)

Auld says a recent major advance in assay development has been the emergence of generic kits that can be used to measure entire gene families like kinases. A number of vendors have come out with the kits, which allow for phosphorylation detection by using either a metal chelate or some sort of coupled reaction that allows for measurement of ATP depletion or ADP formation, all common products of kinase reactions. "It greatly reduces the assay development time because you're using the same readout for every kinase," Auld says.

Keith Wood, PhD,
Promega Corp.

Wood says the main trend in assay development today is the same as it has been historically. "People would like to be able to get better precision, better sensitivity, at a lower cost. A lot of people also talk about getting more data per assay. That works in some areas. In other areas there are logistical problems in trying to coordinate several assay targets into the same physical format, and it ends up being logistically more difficult than they would like." The mainstay in the market has been fluorescence-based technologies, which are very sensitive, easy to use, and probably the most common assay format. But one problem is that researchers using them get a good deal of interference from the intrinsic fluorescence of individual compounds in their collections, something that is resulting in a number of false readings.

Alternative technologies have emerged, including homogeneous time-resolved fluorescence (HTRF). Because of its time-resolved component, HTRF largely minimizes interference from fluorescent compounds; with luminescence, such interference is avoided completely because there is no excitation light to begin with. "They have a process where they do rapid on-off gating of the light source and they measure it when it's off. By doing that process, they can get a cumulative signal."

One of the most common mistakes researchers typically make, Wood says, is to focus too much on the biology of their targets—how the receptors work, how the kinases work, among other things. Quite often, they will modify the chemistries used for measuring targets. "Sometimes they think they can overcome particular problems in their system by modifying the chemistries. But because they don't know the chemistries as well as the vendors, and also because they're busy and rushed people—they don't have time to go through the extensive validation programs that the supplying vendors typically do—they end up reengineering these technologies in a way that makes them less reliable and more prone to false readings."

Michael Bleavins, PhD,
Pfizer Global Research and Development

When he started in the industry 17 or 18 years ago, Bleavins says there was much less pressure to come to a decision about a compound following a phase I trial. But much has obviously changed over the years and companies are pushing to make decisions on efficacy and safety as early as possible due to financial pressures, something that requires more and more assays.

"I think some of it is just the nature of the diseases and things we're working on. Having a single assay that answers your question is proving to be less and less common. You're going to need maybe three parameters or five or more to actually zero in on it. And it's showing us that many diseases that they used to just group together are really multiple diseases with similar symptoms or manifestations."

The difference in performance between neutravidin-coated plates and streptavidin-coated plates can be significant, so standardizing plates is critical. (Source: Carol Ann Homon, PhD)

As a result, there is less emphasis on developing a single definitive assay and more on developing an assay that can test several compounds and be at least suggestive of how they would perform in terms of efficacy and safety so they can be prioritized. "With a single assay, because you weren't asking a specific question, frequently the answer was adequate," he says. Multi-factorial diseases such as rheumatoid arthritis, osteoarthritis and long-term progressive diseases like Alzheimer's, "are proving much harder. . . . The idea of getting a single assay that will probably be all you need hasn't been particularly fruitful anymore."

Peter Hodder, PhD,
Scripps Florida

Hodder's department collaborates extensively with academic researchers, and for many of them it is their first experience with high-throughput screening, so there are often kinks that need to be worked out. "One of the most common mistakes that we'll see is that people, for example, have a batch of protein and it's just enough to do their experiments. They're able to complete the experiments showing it's reproducible, but then they don't have enough to actually run a high-throughput screen on it. We try to catch that before it happens."

But the benefit of this is that the assays the academics are bringing in are very different from those being developed by industry, Hodder says. "Although they may use compound libraries to do a screen, they're not necessarily interested in developing a drug just yet. They're much more interested in finding probes to help them validate the biology that they're looking at. So we'll get a lot more creative and different types of assays and things that we've never done before in a high-throughput screening format." While these researchers are thinking of an end product and have very clear research plans, those plans may not include the endpoint of developing a clinical drug, but rather developing a probe they can use to go back and develop yet another assay. After several iterations, that assay may ultimately come back as a drug discovery assay.

That kind of exploratory research is something that is not seen much in industry. "It's hard to deal with those types of assays when you have a bunch of traditional targets that are begging for attention that are potential multi-billion dollar drug targets."

Brian Pollock, PhD,
Invitrogen Corp.

Ensuring that assay technologies work at lower volumes, not only in a 384-well format but in a 1,536-well format, has been a key trend since the late 1990s. Like most trends in the industry, it is driven by the imperative to increase throughput and cut costs by minimizing the amount of compound and assay reagent used. Another key trend has been multiplexing, the ability to use a single test for multiple readouts, Pollock says. "It's not so much from a cost savings perspective as from the perspective of understanding different biological events that are ongoing. If you're treating cells with compound, you'd like to know for that actual sample whether different phosphoprotein levels are changing. So you need to have separate readouts for that type of assessment."

Pollock says the greatest challenge comes with novel or difficult drug targets and putting them into an assay with the robustness required in the screening environment. Another challenge is ensuring that the assay can actually be built. Others include orphan GPCRs, because some GPCR receptors are toxic to cells when they are overexpressed. Yet another challenge is fast ion channels, though technology has been developed to build better fluorescent assay readouts for fast channels. "More recent work has been in the area of using nanocrystals to try to develop more rapidly responding membrane voltage sensors."

Berta Strulovici, PhD,
Merck & Co.

When they are developing assays, researchers in Strulovici's labs gravitate more and more toward complex assays such as cellular imaging, ion sensor-based detection, and working with stem cells or other cells that are more difficult to work with. "So despite the fact that we are extraordinarily automated and we have done all our assays this year in miniaturized formats, including ELISA [enzyme-linked immunosorbent assay] and others, the focus is to try to work on targets as much as possible in their natural environment."

Strulovici believes there will be a need to be able to do complicated assays in a more automated, high-throughput manner. For example, they use multicolor imaging a good deal now, and they would like for it to be more high-throughput. "For us, this is where the future is going to be, and we depend on outside vendors in terms of analysis of complex data, multiparameter data, and the ability to integrate them altogether. This is where the challenges really are, not in the assay itself."

The most common mistake she sees researchers making in assay development is not having a broad enough range of assays for each type of target. Too often, researchers seem to be limited based on robotics or what the history of the lab is, so they try to fit every type of target into one or two types of assays. "That is a very big mistake because what's most important is to develop the kind of assay that best recapitulates the function of the target in the cell. That's really the bottom line."

Ralph Garippa, PhD,
Roche Discovery Technologies

The pharmaceutical and biotechnology industries have dedicated a great deal of time and effort to rolling out high-throughput screening (HTS) technologies for biochemical and cell-based assays. The effort has been a success, so successful in fact that many companies have had to restructure some of their operations, with many of the personnel formerly dedicated to HTS now contributing to assay development for secondary and tertiary assays. "This can help move the project forward through its later stages, through lead optimization, and finally to the choice of clinical candidacy," Garippa says.

His labs are also doing parallel developments of assays when they are thinking of beginning an HTS program. Formerly, they would decide on one assay format for a particular target, develop it, optimize it, and go. "Now our paradigm is to develop two or three assays in parallel and really look at the metrics: which assay is giving us the most robust screening metrics, the best Z prime, which one is the most cost effective, gives us the highest throughput, the lowest variance from day to day? All those things are factored in when we finally get to the decision on which screening format we'll launch on for the formal HTS campaign," says Garippa.
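The "Z prime" Garippa cites is the standard Z′-factor for assay quality: it compares the separation between positive- and negative-control means with the spread of each. A minimal sketch of that calculation, using made-up plate-control readings purely for illustration:

```python
from statistics import mean, stdev

def z_prime(pos_controls, neg_controls):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are generally taken to indicate an assay
    robust enough for an HTS campaign."""
    mu_p, mu_n = mean(pos_controls), mean(neg_controls)
    sd_p, sd_n = stdev(pos_controls), stdev(neg_controls)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Illustrative control wells (arbitrary fluorescence units)
pos = [100, 98, 102, 101, 99]
neg = [10, 12, 9, 11, 10]
print(round(z_prime(pos, neg), 3))  # → 0.909
```

Running the same calculation on each candidate format's pilot plates is one concrete way to make the parallel-development comparison Garippa describes.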

One factor playing a role in the development of assays is more advanced screening technologies, particularly high-content screening. This approach uses automation and image analysis to create visual cell-based screens, mostly with fluorescent technology, to look at activation and translocation events in a cell. "Historically these kinds of events were noted in the mechanism of action of the proteins involved," says Garippa. "However, now we can quantify hundreds, if not thousands, of cells per well in 384-well plates—very large numbers and powerful statistical means to be able to move compounds forward."