Overcoming Poor Quality Data for ECG Collection

Timothy Callahan, Ph.D., Chief Scientific Officer

The ICH E14 guidance has become the touchstone for designing and implementing Thorough QT (TQT) trials. Even protocols that are not designed primarily as QT trials often draw on recommendations from ICH E14. In particular, some sponsors design their trials with multiple ECG collections at a single time point to reduce the standard deviation associated with the QT interval, thereby reducing the sample size and, ultimately, the cost of the study. Most trials collect three ECGs at each time point, which appears to be the most efficient number in terms of both standard deviation and cost.

Data Quality

Beyond the natural beat-to-beat variance inherent in the QT interval, variance in the measurement of the QT interval can be introduced by poor data quality. Data quality problems can arise from natural factors, such as heart rate changes and arrhythmias, or from mechanical factors, such as poor hook-up technique or equipment malfunction.

A major source of naturally occurring data quality problems is change in heart rate. The QT interval shortens as the heart rate increases and prolongs as it decreases, so the QT interval is almost always "corrected" for heart rate. ECGs collected while the heart rate is changing can wreak havoc with the corrected QT interval. For analysis of the QT interval in TQT trials, ECG data should be collected only during periods of stable heart rate. This is generally accomplished by having the subject rest for upwards of 10 minutes before the ECG is collected.

Some subjects have a resting sinus arrhythmia, in which the heart rate rises and falls over a few seconds with respiration. These subjects present a particular challenge because the condition occurs in young, healthy people: exactly the population recruited for TQT trials. The problem is how the heart rate should be calculated when two or more different heart rates are present in a single ECG. If these subjects must be included in the TQT trial, it is best to use a median beat rather than the raw beats, with the heart rate calculated over at least 10 seconds.

Poor data quality caused by mechanical problems can most often be traced back to the patient preparation technique. These mechanical problems can lead to high-frequency artifact (jagged movement in the otherwise smooth ECG signal), low-frequency artifact (also known as a wandering baseline), changes in or flattening of the T-wave, and axis changes (which can result from poor positioning of the electrodes).

ECG core labs measure intra- and inter-reader variability. Measuring within a triplicate ECG can also give a good indication of data quality. For example, if the heart rate range within a given triplicate exceeds a predefined threshold, the triplicate should be reexamined to confirm that the data are valid.

Data Submission and the FDA ECG Data Warehouse

Data from TQT trials should be uploaded to the FDA's ECG Data Warehouse. Using a series of quality tools, the FDA then "grades" the ECGs for data quality. ECGs of questionable quality may have their annotations reviewed. Other ECGs may be reviewed as well, but the sponsor should expect ECGs with poor quality scores to be reviewed.
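The article notes that the QT interval is almost always corrected for heart rate, and that subjects with resting sinus arrhythmia are best handled with a median beat and a heart rate calculated over at least 10 seconds. The sketch below illustrates those two steps in Python. It is not taken from the article: the function names are invented for illustration, and Bazett and Fridericia are simply two widely used corrections; the article does not prescribe a particular formula.

```python
"""Illustrative sketch (not from the article): heart rate from an RR-interval
window of at least 10 seconds, plus two common QT corrections (Bazett and
Fridericia). Function names and example values are hypothetical."""

from statistics import mean


def heart_rate_bpm(rr_intervals_sec):
    """Average heart rate (beats/min) from RR intervals spanning >= 10 s.

    Averaging over at least 10 seconds smooths the respiration-related
    swings seen with resting sinus arrhythmia.
    """
    if sum(rr_intervals_sec) < 10.0:
        raise ValueError("Need at least 10 seconds of RR data")
    return 60.0 / mean(rr_intervals_sec)


def qtc_bazett(qt_sec, rr_sec):
    """Bazett correction: QTc = QT / sqrt(RR), with RR in seconds."""
    return qt_sec / (rr_sec ** 0.5)


def qtc_fridericia(qt_sec, rr_sec):
    """Fridericia correction: QTc = QT / RR^(1/3), with RR in seconds."""
    return qt_sec / (rr_sec ** (1.0 / 3.0))


if __name__ == "__main__":
    # Hypothetical example: RR intervals from a 12-second strip and a QT
    # interval measured on the median beat of the same strip.
    rr = [0.98, 1.02, 1.00, 0.97, 1.03, 1.01, 0.99, 1.00, 1.02, 0.98, 1.00, 1.00]
    qt = 0.402  # seconds
    mean_rr = mean(rr)
    print(f"HR   = {heart_rate_bpm(rr):.1f} bpm")
    print(f"QTcB = {qtc_bazett(qt, mean_rr) * 1000:.0f} ms")
    print(f"QTcF = {qtc_fridericia(qt, mean_rr) * 1000:.0f} ms")
```

As a design note, Fridericia's cube-root correction is often preferred over Bazett's square-root correction because Bazett tends to over-correct at high heart rates and under-correct at low ones; many programs report both.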
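The article also recommends a within-triplicate check: if the heart-rate range in a triplicate exceeds a predefined threshold, the triplicate should be reexamined. A minimal sketch of such a check follows; the 10 bpm threshold and the function name are assumptions for illustration, not values from the article. The comment in the example also spells out the statistical rationale, mentioned in the introduction, for collecting three ECGs per time point.

```python
"""Illustrative sketch (not from the article): flag triplicates whose
within-triplicate heart-rate range exceeds a review threshold.
The threshold value and function name are assumed for illustration."""

HR_RANGE_THRESHOLD_BPM = 10.0  # assumed; the article says only "predefined threshold"


def triplicate_needs_review(heart_rates_bpm, threshold=HR_RANGE_THRESHOLD_BPM):
    """Return True if the HR range within one triplicate exceeds the threshold."""
    if len(heart_rates_bpm) != 3:
        raise ValueError("Expected exactly three heart rates (one triplicate)")
    return max(heart_rates_bpm) - min(heart_rates_bpm) > threshold


if __name__ == "__main__":
    # Averaging a triplicate reduces the standard error of the time-point
    # estimate by roughly a factor of sqrt(3), since SE = SD / sqrt(n); this
    # is the statistical reason for collecting three ECGs per time point.
    stable = [61.0, 63.0, 62.0]    # range 2 bpm  -> acceptable
    unstable = [58.0, 74.0, 66.0]  # range 16 bpm -> reexamine
    for trip in (stable, unstable):
        flag = "reexamine" if triplicate_needs_review(trip) else "ok"
        print(trip, "->", flag)
```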
Improving the Process

The problem with poor data is that the annotated readings become unreliable and inaccurate. The window for detecting QT-interval prolongation in TQT trials is very narrow. In most, but not all, cases poor-quality data results in artificially prolonged QT-interval readings as well as increased variability. Because increased variability adds to the cost of TQT trials, it is important that the ECG data collected at the site be of high quality, free of artifact and excess variability.

Because variability can arise both naturally and from mechanical sources, the investigative site must be trained in how to collect high-quality ECGs. The ECG core lab should be allowed to train all personnel who collect ECGs at the investigative site. The training should cover use of the equipment and correct subject hook-up procedures, including electrode placement at the correct anatomical sites. The ECG core lab should be a source of information and training on data quality, and the sponsor should rely on the core lab for these issues.

It is incumbent upon the investigational site, the sponsor, and the ECG core lab to monitor the quality of the ECGs as the trial is ongoing. The best time to ensure data quality is at the bedside, when something can still be done to improve it.

Biomedical Systems

Biomedical Systems, St. Louis, is a global provider of diagnostic data management services to the healthcare industry. For more information, visit biomedsys.com.

VIEW on Clinical Services, June 2007