The need for automated EEG analysis is widely recognised: it offers objectivity and efficiency, and it can improve the contribution the EEG makes to diagnosis and to the evaluation of treatment options where these are available. However, automated analysis of the EEG is hampered by the lack of a reliable means of dealing with artefacts such as those due to blinks, eye movements, and patient movements. This paper presents a new approach for detecting and classifying artefacts. The resulting system is intended to serve as a front end for an automated EEG interpretation system; it can also serve as an input to an artefact removal or rejection system. An important concept in the new approach is to keep the three fundamental stages of artefact processing (detection, classification, and removal/rejection) separate. This makes it possible to optimise each stage independently and to cater for different requirements in routine EEG. In the new method, a set of feedforward multilayer neural networks, together with a knowledge-based system, is used to process frequency, time, and spatial features to detect, classify, and mark sections of the EEG. The output of the system takes the form of an EEG artefact report. Tests of the system on EEG records from volunteers indicate a success rate of over 90%. At present, the system operates offline, but it is being combined with an automated analysis system for routine clinical practice.
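As a rough illustration of the separated-stage design described above, the sketch below wires a small feedforward network for detection, a second for classification, and a marking step that stands in for removal/rejection. The feature dimensions, artefact categories, thresholds, and random weights are assumptions for the sake of a runnable example, not the paper's trained networks or knowledge-based rules.

```python
# Minimal sketch of the three separated stages (detection, classification,
# removal/rejection marking). All names, sizes, and parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, w1, b1, w2, b2):
    """Single-hidden-layer feedforward network with sigmoid outputs."""
    h = np.tanh(x @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

# Stage 1: artefact detection (one binary decision per EEG epoch).
def detect_artefact(features, params, threshold=0.5):
    score = float(mlp_forward(features, *params)[0])
    return score > threshold, score

# Stage 2: classification into assumed artefact types.
ARTEFACT_TYPES = ["blink", "eye_movement", "patient_movement"]

def classify_artefact(features, params):
    scores = mlp_forward(features, *params)
    return ARTEFACT_TYPES[int(np.argmax(scores))]

# Stage 3 is kept separate: here the epoch is only marked in a report,
# leaving the raw EEG untouched for a later removal/rejection step.
def mark_epoch(report, epoch_index, artefact_type):
    report.append({"epoch": epoch_index, "artefact": artefact_type})

# Toy usage on random "frequency/time/spatial" feature vectors (10 per epoch).
det_params = (rng.normal(size=(10, 8)), np.zeros(8),
              rng.normal(size=(8, 1)), np.zeros(1))
cls_params = (rng.normal(size=(10, 8)), np.zeros(8),
              rng.normal(size=(8, 3)), np.zeros(3))
report = []
for i, feats in enumerate(rng.normal(size=(5, 10))):
    is_artefact, _ = detect_artefact(feats, det_params)
    if is_artefact:
        mark_epoch(report, i, classify_artefact(feats, cls_params))
print(report)  # the "EEG artefact report" stand-in
```

Because detection, classification, and marking are separate functions, any one stage could be retuned or replaced (for example, swapping the marking step for actual artefact removal) without retraining or altering the others, which is the practical benefit the paper attributes to keeping the stages distinct.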