Background: Anti-class I IgG can be detected by complement-dependent cytotoxicity (CDC) and by ELISA. We compared ELISA and CDC for both class I and class II antibodies with respect to method agreement and their relation to rejection-free and graft survival.
Methods: Peak, current, and posttransplant sera (n=429) of 143 renal allograft patients were tested by the National Institutes of Health technique (NIHT), two-color fluorescence (TCF), and ELISA. Method agreement was assessed by the intraclass correlation coefficient (ICC). Rejection and graft survival were analyzed by uni- and multivariate techniques. The screening results for each serum were compared, as was the change in result from the current to the posttransplant serum.
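For reference, a common single-rater form of the ICC is the one-way random-effects estimate; the abstract does not state which ICC model was used, so this formula is shown only as an illustrative assumption:

ICC = \frac{\mathrm{MS}_{B} - \mathrm{MS}_{W}}{\mathrm{MS}_{B} + (k-1)\,\mathrm{MS}_{W}}

where \mathrm{MS}_{B} and \mathrm{MS}_{W} are the between- and within-subject mean squares from a one-way ANOVA, and k is the number of measurements (here, assays) per serum.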
Results: The ICC between ELISA and NIHT was insufficient, and agreement was lower for TCF than for NIHT. Graft survival was not related to the result of any assay. Rejection-free survival was related to ELISA and NIHT results in current and posttransplant sera. The change in percent panel-reactive antibody (%PRA) correlated better with rejection for the NIHT than for ELISA. The combined antibody status of current and posttransplant serum was a risk factor for rejection in all assays, and for TCF also in multivariate analysis. The rejection rate was higher when the posttransplant serum was ELISA-negative/CDC-positive than when it was ELISA-positive/CDC-negative. For ELISA, class I specificities (and not %PRA) in peak and current sera were related to rejection, even when the antibodies were not donor-directed. For the NIHT, %PRA and not specificity was related to rejection. Class II antibodies were never related to rejection.
Conclusions: ELISA and NIHT are complementary screening techniques in this patient population and are of equal predictive value for rejection. The optimal strategy for combining these techniques remains to be determined.