The Electroencephalogram (EEG) Response
David B. Durham, M.D.


The EEG, recorded by positioning 21 or more electrodes on the intact scalp, represents the changes of the electrical field within the brain. Modern systems can display up to 128 or more EEG channels simultaneously, each corresponding to a standard electrode position on the scalp. The EEG signals, registered as voltage differences between pairs of electrodes (bipolar derivations) or between an active electrode and a suitably constructed reference electrode (referential derivations), are measured, amplified and then displayed on paper or on a monitor. The EEG itself is recorded during different behavioral conditions such as eyes closed, eyes open, hyperventilation and photic stimulation to provoke abnormalities. However, EEGs can also be recorded during sleep or during operative procedures.
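
As a minimal illustration of the two derivation schemes, the following Python sketch forms bipolar derivations from a referential recording. The sampling rate, the electrode names and the synthetic data are assumptions made purely for the example; numpy is the only dependency.

import numpy as np

fs = 250                                      # sampling rate in Hz (assumed)
channels = ["Fp1", "F3", "C3", "P3", "O1"]    # illustrative 10-20 electrode positions

# Referential recording: each row is one electrode measured against a common reference.
rng = np.random.default_rng(0)
referential = rng.normal(scale=50e-6, size=(len(channels), 10 * fs))

# A bipolar derivation is the difference between two referential channels,
# here chained front to back: Fp1-F3, F3-C3, C3-P3, P3-O1.
pairs = list(zip(channels[:-1], channels[1:]))
index = {name: i for i, name in enumerate(channels)}
bipolar = np.array([referential[index[a]] - referential[index[b]] for a, b in pairs])

for (a, b), trace in zip(pairs, bipolar):
    print(f"{a}-{b}: {trace.std() * 1e6:.1f} microvolt RMS")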

The exact origin of the EEG is still not completely understood, but it is generally assumed that the measured responses are generated by neurons in the cortex. To be more specific, changes caused by depolarization of the membranes of these neurons (excitatory and inhibitory postsynaptic potentials) cause extracellular currents perpendicular to the surface of the cortex. These currents, measured at the crown of the skull, give rise to the EEG signal through volume conduction; their amplitudes vary from 10 to 100 µV and their frequency components from 0.5 to 30 Hz. Recordings taken directly from the cerebral cortex contain higher frequencies, which are filtered out in the scalp recordings. The EEG signals recorded as a function of time are considered stochastic rather than deterministic. Historically, phase information has been considered unimportant, and the power spectrum has been assumed to fully describe the EEG characteristics. However, this view has recently been challenged by the results of non-linear EEG analysis.

The information consists of different types of periodicities, characterized by their frequency, their amplitude, their topography and the conditions under which they occur. The alpha rhythm, for example, registered as a periodic activity with a frequency range of 8 to 13 Hz and amplitudes ranging from 20 to 100 µV, occurs across the visual areas of the brain during attentive wakefulness with the eyes closed. What is still unclear is how these various EEG rhythms are generated in populations of coupled neurons.
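
To make the alpha-band figures concrete, the sketch below isolates the 8 to 13 Hz band from a synthetic eyes-closed trace with a standard band-pass filter. The sampling rate, filter order and test signal are illustrative assumptions; scipy is assumed to be available.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 250                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)               # ten seconds of synthetic signal

# Synthetic "eyes-closed" trace: a 10 Hz alpha component buried in broadband noise.
rng = np.random.default_rng(1)
eeg = 40e-6 * np.sin(2 * np.pi * 10 * t) + 20e-6 * rng.standard_normal(t.size)

# Band-pass filter for the alpha band (8-13 Hz).
b, a = butter(4, [8, 13], btype="bandpass", fs=fs)
alpha = filtfilt(b, a, eeg)

print(f"alpha-band RMS: {alpha.std() * 1e6:.1f} microvolt")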

Clinical application of the EEG

Although the origin of EEG responses has not been completely clarified, the signal itself has proven to be a valuable diagnostic tool in clinical medicine, in particular in neurology, neurosurgery and psychiatry. In addition, EEG recording is still a standard investigation in the study of epilepsy. Classically, epileptiform abnormalities show characteristic changes in EEG waveforms, such as spikes, sharp waves and spike-wave discharges. Not only can specific types of epilepsy (absence epilepsy, hypsarrhythmia and benign focal epilepsy of childhood) be found, but the EEG signal also reflects non-epileptic focal brain dysfunction, possibly caused by cerebrovascular disorders, tumors, infections or trauma, and generalized brain dysfunction in cases of metabolic encephalopathy, intoxication, encephalitis or degenerative dementia. Such defects can be classified as either occurring periodically or in a more continuous fashion.

In most cases, the EEG is considered a sensitive rather than a specific diagnostic instrument, making it suitable for monitoring the course of a disorder on the one hand and for determining the prognosis of an abnormality on the other. That is, the EEG can pick up very mild degrees of brain dysfunction, but it seldom gives much information about the exact cause of the abnormalities. In general, one should not try to derive etiologic diagnoses from the EEG.

Current methods for EEG analysis and interpretation

Analyzing and subsequently interpreting EEG records is hampered by incomplete knowledge of the origin of the various rhythms and by the lack of specificity of the abnormalities. Two methods are available today: visual analysis of the raw data and quantitative examination of the time series.

In the first approach, the complete EEG record, involving 16 to 32 channels of information and 20 to 30 minutes of recording, is examined visually by the clinical neurophysiologist. The analysis is performed systematically.

The record is first described in terms of the EEG periodicities present: the waveforms themselves, their amplitudes, their topographical distribution and the changes in regularity due to opening of the eyes, hyperventilation and photic stimulation. The main characteristics of the EEG are then described in a short text of about 200 to 400 words. The next phase is to decide whether the data are normal or whether they indicate potential anomalies. If so, the type of abnormality is determined. The last step is to consider the relevance of the EEG findings with respect to the clinical problem that formed the rationale for the investigation. An answer must be given to a well-formulated clinical question. This requires neurological expertise.

Two points should be emphasized here. First, visual analysis of EEG data is very much an empirical science. Secondly, interpreting the EEG findings in terms of their clinical relevance requires a considerable amount of clinical, in particular neurological, knowledge. Therefore trained physicians have to be involved for the indispensable interpretations, while trained EEG technicians perform the description of the recordings.

Applying time series analysis techniques is expected to facilitate a more objective description of the defects. This is the field of quantitative analysis.

As stated before, EEG responses are considered stochastic. This explains why power spectra and coherence estimates are the main tools for their analysis. Applications of more advanced time series analysis techniques have so far proved to be of limited use, because quantitative techniques are rather sensitive to artifacts, whereas an experienced electroencephalographer can recognize these more easily. A second argument is that power spectra are not very suitable for detecting transients, such as spikes, whereas the human eye is very sensitive to deviations from the background activity. On the other hand, a computerized analysis of the EEG data is preferable for long-term monitoring.
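
The two workhorse tools mentioned here, the power spectrum and the coherence, can be estimated along the following lines. The sampling rate and the synthetic two-channel data are assumptions made for the sake of the example; scipy's welch and coherence routines are used.

import numpy as np
from scipy.signal import welch, coherence

fs = 250                                    # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)                # one minute of synthetic two-channel data
rng = np.random.default_rng(2)

common = 30e-6 * np.sin(2 * np.pi * 10 * t)            # shared 10 Hz alpha component
ch1 = common + 20e-6 * rng.standard_normal(t.size)
ch2 = common + 20e-6 * rng.standard_normal(t.size)

# Power spectral density of one channel (Welch's averaged periodogram).
f, pxx = welch(ch1, fs=fs, nperseg=2 * fs)

# Coherence between the two channels as a function of frequency.
f_c, cxy = coherence(ch1, ch2, fs=fs, nperseg=2 * fs)

alpha = (f >= 8) & (f <= 13)
print(f"mean alpha power:     {pxx[alpha].mean():.2e} V^2/Hz")
print(f"mean alpha coherence: {cxy[(f_c >= 8) & (f_c <= 13)].mean():.2f}")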

To summarize, the analysis of EEG records still depends almost completely on the visual analysis of the raw registrations by a trained technician or physician. Quantitative analysis can assist in this analysis, but has so far proved unable to replace visual assessment. Even more difficult than describing the phenomenon is interpreting it. Consequently, clinical knowledge must be merged with the description of the EEG record to decide whether the abnormalities found have any relevance. Visual analysis is an empirical science. It not only requires a relatively long training (at least 6 to 12 months) to obtain even moderate levels of performance, but is also a time-consuming process: at least 10 minutes per EEG.

Performing through processing

A major task in data acquisition is defining the conditions under which the fingerprint data are processed and adapted into a suitable indicator for diagnostic purposes. The input space, representing the responses from twenty-one electrodes positioned at predetermined locations on the scalp, is composed of the information obtained from the electroencephalogram monitoring the brain activity.

Signal pre-processing, leading to data reduction, makes the data more tractable for other purposes. It can also amount to selective emphasis, including procedures for trend detection and pattern recognition. The requirements for some of these tasks are quite demanding. It is therefore necessary to develop methods that permit recognition of a consistency of adequate strength in the derived signal, one that could reasonably be considered a developing, or yet to be developed, pattern. Determining and recognizing the characteristics of a pattern would be the most important step.

Another stringent demand on signal pre-processing is that various purposes require techniques for discovering interrelations between the signal of interest and other responses. This is essential because specific features may only become evident when compared with other signals.

The way input-space data are preprocessed strongly determines the overall efficiency of the classifier. Smoothing, spectral analysis, statistical analysis, time-frequency and time-scale analysis, correlation and convolution operations, and matched filtering all belong to the category of relevant linear signal operations on continuous or sampled signals. In addition, trend detection and certain aspects of curve fitting and regression analysis belong to the processing methods still to be investigated. The various procedures selectively accentuate features of interest; some are emphasized explicitly, others implicitly.

Special attention will be paid to investigating the advantages and disadvantages of time-frequency and time-scale analysis methods. Time-frequency analysis, also known as the short-time Fourier Transform or Gabor Transform, is best suited to investigating quasi-stationary responses. Time-scale analysis, or the Wavelet Transform, on the other hand offers a pre-processing scheme in which a high frequency resolution is obtained at low frequencies and a high time resolution at high frequencies. Rendering signals, with all their information, in a two-dimensional domain (the scalogram) is a feature offered only by Wavelet Transform operations. The inverse Wavelet Transform yields precise time localization, making this operation suitable for composing the characteristic properties of a healthy person.
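
A rough comparison of the two analyses might look as follows. The sketch computes a short-time Fourier (Gabor) transform and a wavelet scalogram of the same synthetic trace; the sampling rate, window length, wavelet choice and frequencies of interest are illustrative assumptions, and scipy plus the PyWavelets package are assumed to be installed.

import numpy as np
from scipy.signal import stft
import pywt

fs = 250                                    # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)

# Synthetic trace: ongoing 10 Hz activity plus a brief spike-like transient at t = 2 s.
eeg = 40e-6 * np.sin(2 * np.pi * 10 * t)
eeg[int(2 * fs):int(2 * fs) + 10] += 150e-6

# Time-frequency analysis: short-time Fourier (Gabor) transform with a fixed window.
f, tt, Zxx = stft(eeg, fs=fs, nperseg=fs)   # 1 s windows: 1 Hz resolution, coarse in time

# Time-scale analysis: continuous wavelet transform (scalogram),
# fine time resolution at high frequencies, fine frequency resolution at low ones.
freqs_of_interest = np.array([2, 5, 10, 20, 40])          # Hz
scales = pywt.central_frequency("morl") * fs / freqs_of_interest
coeffs, freqs = pywt.cwt(eeg, scales, "morl", sampling_period=1 / fs)

print("STFT grid:     ", Zxx.shape, "(frequencies x time windows)")
print("Scalogram grid:", coeffs.shape, "(scales x samples)")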

The role of the feature generator is first to extract characteristic properties from the preprocessed signal and then to assemble them into a feature space for further classification.

The role of the classifier is to categorize the elements of the feature space into defined classes with reference to the actual state of monitoring and diagnosis.
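
A minimal sketch of this feature-generator/classifier pairing, assuming scikit-learn is available, could compute relative band powers per epoch as the feature space and feed them to a small network; the band definitions, epoch data and labels below are placeholders, not results from the article.

import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier

fs = 250  # sampling rate in Hz (assumed)

def band_power_features(epoch, fs):
    # Feature generator: relative power in the classical EEG bands for one epoch.
    f, pxx = welch(epoch, fs=fs, nperseg=min(len(epoch), 2 * fs))
    bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    total = pxx.sum()
    return np.array([pxx[(f >= lo) & (f < hi)].sum() / total for lo, hi in bands.values()])

# Illustrative training set: epochs labelled 0 (normal) or 1 (abnormal) by a supervisor.
rng = np.random.default_rng(3)
epochs = rng.standard_normal((40, 4 * fs)) * 20e-6
labels = rng.integers(0, 2, size=40)         # placeholder labels, purely for the sketch

X = np.vstack([band_power_features(e, fs) for e in epochs])   # the feature space
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, labels)                            # the classifier
print("predicted class of first epoch:", clf.predict(X[:1])[0])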

Diagnosing through neural networking

Neural networks are able to recognize differences in patterns based on automatic learning procedures. The application is attractive not only because it provides faster responses, but especially because of its capability to automatically discover irregularities in patterns not seen or detected before. Another important feature is that the learning process itself enables the discovery of regularities in the training signal. A training procedure can be performed in three different ways.

Supervised training, requiring the presence of one or more supervisors (in the case considered, the neurologists), uses labelled data to teach the network. The system, knowing the correct answers, inputs an error signal the moment the network produces an incorrect response. It continues to do so, feeding the difference in assessments back into the network, until the error has decreased to a predetermined minimum value. The error signal, as it were, teaches the network the correct response. Unsupervised or self-organized training, using unlabelled training sets, does not need a supervisor. Internal clusters, compressing the offered data into classification categories, are formed the moment data is presented to the network. The supervisor is also absent in self-supervised training. There, the error signals generated by the network itself are fed back into the network until a correct response is produced.
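
The supervised, error-feedback scheme can be illustrated with a single logistic unit trained on a labelled synthetic set; the learning rate, stopping threshold and data are assumptions of the sketch, not prescriptions.

import numpy as np

rng = np.random.default_rng(4)

# Labelled training set provided by the "supervisor": 2-dimensional feature vectors
# with target 0 or 1 (purely synthetic, for illustration only).
X = rng.standard_normal((200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.1                        # learning rate (assumed)

for epoch in range(100):
    out = 1 / (1 + np.exp(-(X @ w + b)))   # network output (logistic unit)
    error = y - out                        # the supervisor's error signal
    # Feed the error back to adjust the weights (gradient step).
    w += lr * X.T @ error / len(X)
    b += lr * error.mean()
    if np.abs(error).mean() < 0.05:        # predetermined minimum error
        break

print(f"stopped after {epoch + 1} passes, mean absolute error {np.abs(error).mean():.3f}")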

Neural networks for analyzing and interpreting EEG recordings

Many processes in medicine depend on pattern recognition, and analyzing and interpreting EEG records is in many respects an example of this. Over the years it became clear that conventional (artificial intelligence oriented) approaches, based on obeying a set of rules, are not very suitable for such a task. Artificial neural networks (ANNs), on the other hand, are effective in detecting and classifying patterns in all kinds of data sets. Recognizing this has led to a search for possible applications of ANNs in medicine, but so far only a few attempts have been made to apply these techniques to detecting one specific type of activity in the EEG record. ANNs showed excellent performance in detecting epileptic activity in EEG recordings (Gabor and Seyal, 1992; Jando et al., 1993; Gabor, Leach and Dowla, 1996).

A logical next step is expanding the scope of ANN applications to EEG analysis. The goal would be to develop a system that takes the EEG record and clinical data as its inputs, and produces as its outputs a judgment of whether the record is abnormal, together with a clinical interpretation.

Using raw EEG time series as input for an ANN is technically unfeasible. Therefore, some form of processing and data reduction is required. A logical choice would be to use the power content of a few frequency bands as input to the ANN. However, power spectra may not be very suitable for detecting transient epileptic activity. Alternatively, some form of Gabor transformation could be used. The whole record of 20 to 30 minutes will have to be split up into epochs of a few seconds' length to be processed further.
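
Splitting a long record into short epochs is a simple reshaping step; the sketch below assumes a 250 Hz sampling rate, 4-second epochs and a synthetic 21-channel record, all purely illustrative.

import numpy as np

fs = 250                                   # sampling rate in Hz (assumed)
epoch_len = 4 * fs                         # 4-second epochs (assumed)

# A synthetic 25-minute, 21-channel record standing in for a real recording.
rng = np.random.default_rng(5)
record = rng.standard_normal((21, 25 * 60 * fs)) * 20e-6

# Trim the record to a whole number of epochs, then split it along the time axis.
n_epochs = record.shape[1] // epoch_len
epochs = record[:, :n_epochs * epoch_len].reshape(21, n_epochs, epoch_len)
epochs = epochs.transpose(1, 0, 2)          # (epochs, channels, samples)

print(f"{n_epochs} epochs of {epoch_len / fs:.0f} s, shape {epochs.shape}")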

Another difficult problem to be addressed is the choice of the overall architecture. More specifically: which tasks should be solved by which networks? One option might be to use a collection of networks, each trained for a specific, well-defined task. This would make the system flexible, because new tasks could be added later on and individual networks could be changed without affecting the architecture of the whole system. For example, networks could be trained to detect the alpha rhythm, to anticipate possible artifacts, to demonstrate the appearance of delta waves, to detect the presence of spikes, and so on. An advantage of such an approach is that different types of pre-processing can be applied to the networks, tailored to their dedicated task.

Each EEG epoch is then scanned simultaneously by all the networks, so that each network can assign a code to that epoch. After all EEG epochs have been processed, the codes generated by all the networks are then used as inputs for a secondary level of networks, which decides whether the EEG recording was normal or abnormal. A similar type of strategy might be used for the clinical interpretation of the EEG recordings.
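
A structural sketch of this two-level arrangement is given below. The first-level "detectors" are trivial threshold rules standing in for trained networks, and the second-level decision is a placeholder rule; nothing here should be read as the actual system, only as an outline of how per-epoch codes could feed a higher-level decision.

import numpy as np

rng = np.random.default_rng(6)
fs = 250
epochs = rng.standard_normal((30, 4 * fs)) * 20e-6   # 30 synthetic single-channel epochs

# First level: one detector per task, each assigning a code (here simply 0/1) to an epoch.
# Real detectors would be trained networks; these placeholders threshold simple statistics.
detectors = {
    "alpha":    lambda e: int(np.abs(np.fft.rfft(e))[int(10 * len(e) / fs)] > 1e-3),
    "artifact": lambda e: int(np.ptp(e) > 200e-6),
    "delta":    lambda e: int(np.abs(np.fft.rfft(e))[int(2 * len(e) / fs)] > 1e-3),
    "spike":    lambda e: int(np.abs(e).max() > 5 * e.std()),
}

# Code matrix: one row per epoch, one column per first-level detector.
codes = np.array([[det(e) for det in detectors.values()] for e in epochs])

# Second level: a (here trivial) decision rule over the codes of all epochs.
# A real system would use another trained network in its place.
abnormal = codes[:, [1, 3]].sum() > 3        # many artifact/spike codes: flag as abnormal
print("codes of the first epochs:\n", codes[:5])
print("record classified as", "abnormal" if abnormal else "normal")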

Reliability analysis based on risk analysis

It cannot be expected that an electro-diagnostic system possesses complete reliability. There is always an intrinsic uncertainty, because there will never be an exact one-to-one mapping from the measured input space to the diagnostic output space. This may be caused by uncertainties fundamental to the physics of the problem, by measurement errors, or by the fact that only a subset of the relevant variables is measured. In addition, there is an unreliability originating from model errors, arising when the mapping function used to identify the faults from the input variables does not correspond to the correct mapping. A third uncertainty is due to statistical errors: because the mapping function is estimated from a finite set of learning samples, its coefficients cannot be estimated exactly.

For classification applications it therefore does not suffice to produce a diagnosis; it is also important to quantify its reliability (the probability that alternative diagnoses are correct).

The diagnosis system should therefore include a risk analysis option. A reliability module, in which the relative contributions of the different types of errors are investigated, determines the accuracy of the diagnosis system in question; its results could provide important clues on how to improve the performance of the ANN system. A second option is a decision module that assists in estimating the consequences of an inaccurate classification, applying rules to information supplied by the reliability module and by the users. Defining a measure for the reliability of a diagnosis system and finding a method to separate the different errors are problems that need investigation.
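
One simple way to quantify the probability that alternative diagnoses are correct is to report the full vector of predicted class probabilities rather than only the winning class. The sketch below does this with a logistic-regression classifier on synthetic three-class data; the classes, features and model are assumptions for illustration, not part of the proposed reliability module.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

# Synthetic feature vectors for three hypothetical diagnostic classes.
X = np.vstack([rng.normal(loc=c, scale=1.5, size=(60, 4)) for c in range(3)])
y = np.repeat([0, 1, 2], 60)                 # 0 = normal, 1/2 = two abnormal classes (assumed)

clf = LogisticRegression(max_iter=1000).fit(X, y)

# For a new case, report not only the diagnosis but the probability of the alternatives.
case = rng.normal(loc=1, scale=1.5, size=(1, 4))
probs = clf.predict_proba(case)[0]
for label, p in enumerate(probs):
    print(f"class {label}: probability {p:.2f}")
print("diagnosis:", probs.argmax(),
      "| probability that an alternative is correct:", f"{1 - probs.max():.2f}")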

Dr. David Durham is Director of Neuropsychiatry for the Mosaic Neuroscience Group in Santa Fe, New Mexico, and a Clinical Assistant Professor of Psychiatry at the University of New Mexico, School of Medicine.
