Unraveling the Proteome
By Julia Boguslavsky
Wary of the hoopla surrounding the 50th anniversary of the double helix, protein scientists are prone to remark disdainfully: "Genes were easy."
And for good reason! First, the body contains at least an order of magnitude more proteins than genes. Second, knowing the primary amino-acid sequence is not enough to elucidate a protein's function and role in disease. Myriad factors, such as the protein's dynamic 3-D conformation, post-translational modifications, interactions with other proteins and small molecules, and precise subcellular localization, also play critical roles. Finally, protein scientists do not have the luxury of a polymerase chain reaction equivalent to amplify proteins, so they must often work with minuscule quantities obtained from tissue samples. As a result, proteomics is adding new levels of complexity to instrumentation design and integration, sample preparation, and data analysis.
Despite these challenges, proteomics is crucial to understanding biological function and designing better drugs and diagnostics. As a result, proteomics will drive much of the growth in the life science instrumentation market during the next 5 to 10 years. "People are just starting to recognize the complexity of studying proteins and their functions," says Sandra Rasmussen, biopharmaceutical marketing manager for PerkinElmer Inc. "From the instrumentation perspective, the challenge is that the proteomics information is dynamic. The instrumentation needs to be integrated and automated and allow data correlation between experiments."
Several trends will drive the design of proteomic instrumentation. First, increased sensitivity and wide dynamic range are particularly important in proteomic applications, where high-abundance proteins can play havoc with the detection of low-abundance proteins. Second, as the research focus shifts from protein identification to functional studies in the context of disease, there is an increasing need for differential analysis across multiple samples. This need will affect the aftermarket, since differential protein expression will be enabled more by improved reagents and labeling techniques than by changes in instrumentation. Notable product launches last year include Amersham Biosciences' 2-D Fluorescence Difference Gel Electrophoresis and Applied Biosystems' Cleavable ICAT Reagents.
The third trend in proteomic instrumentation is automation, integration, and industrialization, particularly as researchers grapple with profiling entire proteomes of different tissues. Integrating complex, multistep sample preparation protocols and downstream analysis into a seamlessly automated instrument will be a priority for the high-end market. Consider, for example, Amersham's Ettan 2D-MS Spot Handling Workstation, which automatically picks spots from 2-D gels for digestion, then transfers the peptides to MALDI-ToF MS sample trays for identification.
What's Hot
Despite complaints about 2-D gels' relatively low throughput, insufficient automation, difficult data analysis, and inability to handle membrane proteins, they remain "absolutely critical to the proteomics efforts," Rasmussen says. The shortcomings of 2-D gels create a market opportunity for alternative protein separation technologies, particularly liquid-based systems. Multidimensional liquid chromatography, especially as the front end for MS, is generating a lot of interest, although the technology and its interface with instrumentation need to be improved.
"Bench-scale liquid chromatography will be expanding as the focus shifts from identifying proteins to protein structure and activity," predicts Stevan Jovanovich, vice president of global research at Amersham Biosciences. "Within five years we'll identify enough proteins that [identification] won't be an emphasis anymore. There will be a shift from identification to functional and eventually activity-based proteomics. Chromatography's role will be to provide the substrate, purify, and feed into MS. Eventually we'll be collecting [fractions] in 1,536-well plates and doing miniaturized enzyme assays."
Over the next five years, MS is expected to drive the "identification" phase of proteomics. Considerable effort is under way to integrate MS with upstream instrumentation (such as chromatography, 2-D gels, or other sample preparation equipment), simplify instrument operation, improve sensitivity, and increase throughput.
"Mass spectrometry will be very hot over the next five years. It will eventually be in every lab instead of a departmental facility," Jovanovich says. "MS will be enabling, but not revolutionary. Protein arrays, on the other hand, I see growing over a much longer time. In five years, they will mature from mom-and-pop solutions to well-integrated kit-based systems."
Scientists still need to work out a few technological challenges before protein arrays become a robust and reliable platform for research and diagnostics. These challenges include designing capture agents with the necessary affinity and specificity, optimizing surface chemistry, and working out assay formats and detection. Eventually, for researchers who need to monitor a few hundred proteins, protein arrays will provide a high-throughput, quantitative alternative to most other methods.
What do these improvements in throughput, sensitivity, and differential analysis translate into? Prepare to crunch more data!