Matrix-assisted laser desorption/ionization (MALDI) mass spectrometric imaging (MSI), which generates analyte-specific distribution maps of compounds in a tissue sample, has become a useful tool in numerous areas across the biological sciences. Direct analysis of the tissue sample provides MS images of an analyte’s distribution with minimal sample pretreatment. The technique, however, cannot by itself account for tissue-specific variations in ion signal. The differing makeup of tissue types can produce significant differences in analyte extraction, cocrystallization, and ionization across a sample. In this study, a deuterated internal standard was used to account for these signal variations. Initial experiments were performed using pure standards and optimal cutting temperature compound (OCT) to generate known areas of ion suppression. Monitoring the analyte-to-internal-standard ratio compensated for these differences in ion signal, yielding images that better represented the analyte concentration. These experiments were then replicated using multiple tissue types in which the analyte’s MS signal was monitored. In certain tissues, including liver and kidney, the analyte signal was attenuated by up to 90%; when the analyte-to-internal-standard ratio was monitored, however, this attenuation was corrected. These experiments further demonstrate the need for an internal standard in the MSI workflow.

Pirman, D. A., Kiss, A., Heeren, R., & Yost, R. A. (2013). Identifying tissue-specific signal variation in MALDI Mass Spectrometric Imaging by use of an internal standard. Anal. Chem., 85(2), 1090–1096. doi:10.1021/ac3029618
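
The normalization described in the abstract amounts to dividing the analyte ion image by the co-applied deuterated internal-standard ion image, pixel by pixel, so that tissue-specific suppression affecting both species cancels out. The sketch below illustrates that ratio calculation only; it is not the authors' code, and the array names, shapes, and masking threshold are illustrative assumptions.

```python
# Minimal sketch (not from the paper): pixel-wise analyte-to-internal-standard
# ratio image for MALDI-MSI normalization. Inputs are assumed to be 2D intensity
# arrays (one value per raster spot) for the analyte and the deuterated standard.
import numpy as np

def normalize_by_internal_standard(analyte_img, istd_img, min_istd=1e-9):
    """Return the pixel-wise analyte/internal-standard ratio image.

    Pixels whose internal-standard signal is at or below min_istd are set to NaN
    rather than divided, to avoid spurious hot spots from near-zero denominators.
    """
    analyte = np.asarray(analyte_img, dtype=float)
    istd = np.asarray(istd_img, dtype=float)
    if analyte.shape != istd.shape:
        raise ValueError("analyte and internal-standard images must have the same shape")

    ratio = np.full_like(analyte, np.nan)
    valid = istd > min_istd
    ratio[valid] = analyte[valid] / istd[valid]
    return ratio

# Mock example: the center pixel sits on a suppressing region (e.g., OCT or liver),
# so both channels are attenuated there, but the ratio image stays uniform.
analyte = np.array([[100, 100, 100],
                    [100,  10, 100],
                    [100, 100, 100]], dtype=float)
istd = np.array([[50, 50, 50],
                 [50,  5, 50],
                 [50, 50, 50]], dtype=float)
print(normalize_by_internal_standard(analyte, istd))  # ~2.0 at every pixel
```

In this toy case the raw analyte image would show a 90% drop at the suppressed pixel, while the ratio image remains flat, mirroring the liver/kidney result summarized in the abstract.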