DOI: 10.1055/s-0043-110862
Evaluation of the New DIN Standard for Quality Assurance of Diagnostic Displays – Technical Review DIN 6868-157
Publication History
04 November 2016
08 April 2017
Publication Date: 19 December 2017 (online)
- Introduction
- Essential Changes to the previous Testing Procedures
- Acceptance and Constancy Testing
- Quality Assurance Software
- Discussion
- Summary
- References
Abstract
Acceptance and regular constancy tests are necessary to ensure the quality of diagnostic displays. In November 2014, a new standard (DIN 6868-157) was published which defines the test procedures and limiting values. There are several substantial changes compared with the previous standard DIN V 6868-57, e. g. consideration of the complete image display system including workstation and application software instead of only the displays. Since its publication, the new standard has raised many questions. This technical review aims to show the strengths and weaknesses of the new standard. Positive aspects are the introduction of a limiting value for the illuminance and the extension of the interval for constancy tests from 3 to 6 months. The daily constancy test, on the other hand, raises several problems and should be replaced by a randomized test. Additionally, the medical relevance of some test items is critically questioned, and an overview of quality assurance software is given.
Key points
- Acceptance and constancy tests for diagnostic displays are defined in DIN 6868-157.
- The new standard has positive and negative aspects.
- Randomized tests should be introduced.
Citation Format
- Entz K, Sommer A, Lenzen H. DIN 6868-157: Image Quality Assurance in Diagnostic X-ray Departments – X-ray Ordinance Acceptance and Constancy Test of Image Display Systems in their Environment – Technical Review. Fortschr Röntgenstr 2018; 190: 51–60
Key words
diagnostic displays - DIN 6868-157 - quality assurance - constancy tests - randomized tests

Introduction
The interpretation of digital X-ray images depends to a considerable extent on the quality of the monitors and the ambient conditions in the reading room. This dependence is only minor for clearly recognizable high-contrast lesions, since they can hardly be overlooked even under unfavorable conditions. For lesions in the low-contrast range, however, it is critical. Here the highest quality criteria are required, which are rarely met by standard PC components (monitor, display controller, etc.). Only acceptance and constancy tests can assure the physician that the monitors meet these requirements at all times.
Nowadays flat screens are used almost exclusively instead of cathode ray tube monitors (an overview of the abbreviations used is given in [Table 1]). This made it urgently necessary to adapt the DIN standards for acceptance and constancy testing of diagnostic monitors to the state of the art. Shortly after publication of the new standard, DIN 6868-157 [1], a comprehensive overview of these changes was published [2]. Nevertheless, since its publication the standard has raised numerous questions and has led to many uncertainties among users. This has been made clear, for example, by the many questions in the online forum “Forum Röntgenverordnung” (a forum dealing with questions regarding the X-ray Ordinance, standards, etc.) [3]. This technical review deals critically with the new standard, describes initial experiences with DIN 6868-157 and provides help with the application of the standard. In doing so, the medical relevance of some test items is also scrutinized.
Essential Changes to the previous Testing Procedures
A major change compared with the previous test situation is the consideration of the entire image display system (IDS) as a complete workstation with PC, software and monitors, in contrast to the image display device (IDD) alone. This has the advantage that changes to all components that can influence image quality, such as the display controller, must also be checked and documented, and not just the monitors themselves. [Fig. 1] shows, for example, the consequences of an incompatible display controller and the resulting difference in brightness between the upper and lower halves of the monitor.
In addition to the image display system, the standard now also covers the ambient lighting conditions. Previously these were only implicitly checked by measuring the minimum luminance and veiling glare, whereas the new standard explicitly requires measurement of the illuminance of the room. The lighting is regulated by the newly introduced room classes (RC). The room classes reflect the various operating conditions of the monitors ([Table 2]). The activities carried out in the room (diagnosis, examination, etc.) determine the room class and thus set the requirements for the maximum permitted illuminance and the monitor used. The earlier classification of projection radiography (thorax, skeleton, breast) into category A and of fluoroscopy, computed tomography and subtraction angiography into category B has been omitted in its previous form.
In addition to the requirements for the acceptance test, the new standard also contains instructions and limiting values for the constancy test, which was previously regulated by the quality assurance guideline (QS-RL, Germ. Qualitätssicherungs-Richtlinie). In addition, the DICOM Grayscale Standard Display Function (GSDF) that adapts the luminance values to the sensitivity of the human eye for contrast changes is now mandatory for medical displays. Furthermore, contact measurement using a near-field luminance meter is now approved as an addition to the usual distance measurement in the telescope method for the constancy test and certain parts of the acceptance test. For the first time, limits for pixel defects were introduced. The standard is valid exclusively for application areas within the framework of the X-ray Ordinance, that is, not applicable to ultrasound equipment or MRI units.
Modification of the quality assurance guideline resulted in the mandatory application of DIN 6868-157 dated December 15, 2014, thus replacing DIN V 6868-57 [4] [5]. However, old acceptance tests according to DIN V 6868-57 remain valid during a transitional period until 2025. This also includes an exchange of subcomponents, e. g. the PC; only after replacing the monitors does the new standard have to be applied. Likewise, constancy tests can continue to be performed according to QS-RL.
Acceptance and Constancy Testing
As before, acceptance tests are necessary at commissioning as well as after monitor replacement. According to the new standard, an acceptance test must also be carried out if the room class changes. Only the radiation protection supervisor or radiation protection officer may change the room class if the relevant activities are modified. A constancy test is sufficient for a change of location. Regular constancy tests need only be carried out semi-annually instead of quarterly as before.
Test Patterns
The patterns used for testing have been adapted to DIN EN 62563-1:2014-01; the test patterns used for mammography have been supplemented by additional images of the American Association of Physicists in Medicine (AAPM) [6] [7]. These test patterns can be obtained in the standard 1024 × 1024 pixel format as a bitmap via the Radiology Standards Committee (NAR, Germ. Normenausschuss Radiologie). Based on what we know so far, other storage formats such as DICOM, or other resolutions such as 1600 × 1200, which represents the minimum requirement for projection radiography, are unfortunately not available. The user therefore usually needs a software application containing the images required for acceptance and constancy testing. This can result in costs of several hundred euros if the testing software is not included in the purchase of the monitors. These additional costs may be acceptable for an acceptance tester, but they tend to lead to misunderstandings and reduced acceptance of the test by the operators (medical practices, etc.).
Visual Inspections
Overall Image Quality
The newly introduced test pattern TG18-OIQ (overall image quality) is used to assess the overall image quality and geometry during the acceptance test as well as the daily constancy test ([Fig. 2]). Various test image elements have to be assessed depending on the room class. The greatest challenge is test element 3, which shows the lettering “QUALITY CONTROL” in low contrast on white, gray and black fields. From left to right the individual characters are displayed with decreasing contrast. Depending on the room and application class, different characters of the lettering must be visible. The highest requirements apply to mammography, where the entire lettering must be recognizable. At 12 bits, the final letter “L” in the black field corresponds to a pixel value of 16, i. e. just 0.4 % of the maximum gray-scale value. Previously only the 5 % field had to be recognizable (corresponding roughly to the “U” in “QUALITY”). It is not clear why these requirements were raised so significantly. Thus, for example, EUREF (European Reference Organisation for Quality Assured Breast Screening and Diagnostic Services) does not require legibility of the entire lettering, but only that the number of observed characters remains constant [8]. The contrast of the lettering is so low that the other test image elements can influence the evaluation of this test item. This is easy to check by covering the areas around the black field so that the eye is not dazzled.
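The arithmetic behind these percentages is straightforward (a brief sketch; the only assumption is the 12-bit maximum gray-scale value of 4095):

```latex
% Last letter "L": pixel value 16 relative to the 12-bit maximum of 4095
\frac{16}{4095} \approx 0.0039 \approx 0.4\,\%
% Formerly required 5 % field: corresponding pixel value
0.05 \cdot 4095 \approx 205
```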
The value and acceptance of the currently valid procedure were examined in a blind study. Over a period of six weeks the study determined how often and with what care the daily constancy tests were performed. Four modified versions of the TG18-OIQ test image were developed in which either characters of the “QUALITY CONTROL” lettering or the line pair grids in the corners and center were removed. The modified test patterns were incorporated into the RadiCS quality assurance software made by EIZO and, unknown to the users, were distributed daily via a server, under remote control, to five to ten diagnostic workstations. A total of 616 tests were evaluated, of which 172 (28 %) used modified test patterns. The completion rate during the testing period was 88 %, meaning that 12 % of all pending tests were aborted or skipped by the examining physician. Of a total of 148 valid tests with modified test images, only 7 (5 %) were correctly recognized as faulty. In 141 cases (95 %), however, the test was incorrectly rated as passed. Subsequently, the users were informed about the study, and a second phase examined whether this would lead to a change in the performance of the constancy test. Modified test images were again distributed, this time over a shortened two-week test period. A total of 276 tests were evaluated, of which 29 used modified images. The completion rate was 85 % and thus could not be increased. Of a total of 25 tests with modified test images, 7 (28 %) were correctly classified as faulty, whereas 18 (72 %) were mistakenly considered passed. This shows, on the one hand, the lack of acceptance of the constancy tests and, on the other hand, the low significance of the test results.
Overall, the daily constancy test according to the new standard is considerably more extensive than before. Whereas previously only a check of gray-scale reproduction was prescribed, the overall image quality now has to be checked visually by means of several elements of the TG18-OIQ test image. The additional workload reduces the acceptance of the test by the user, while there is no evidence of any additional benefit.
Homogeneity, Color Impression and Uniformity, Defective Pixels
In both the acceptance and constancy tests, the monitors must be tested for homogeneity and color impression using the TG18-UN80 test pattern (uniformity at 80 % of the driving level). A check for defective pixels is strictly required only in the acceptance test, but should also be performed in the constancy test.
Previously it was required that the medically used area of the display device must not contain artifacts which influence the diagnosis. Artifacts can be caused by defective pixels, among other things. When such pixel defects actually affect the diagnosis is difficult to determine and also varies with the person carrying out the test. The new standard therefore defines several types of pixel defects and sets limiting values. A distinction is made among permanently lit pixels (defect type A), permanently dark pixels (defect type B), abnormal subpixels which do not correspond to defect types A or B (defect type C) and defect clusters (defect type D), see [Fig. 3]. The exact number of allowed pixel defects must be calculated from the limits specified for a resolution of 1024 × 1024 and the total number of pixels of the image display. If the software does not provide such a calculation, it may be helpful for the examiner to create a table with the limiting values for the most frequent resolutions.
A typical display device for mammography with a resolution of 2048 × 2560 may have 5 pixel errors of defect type A, 25 type B defects, 25 type C defects, and 5 type D pixel defects, thus a total of 180 defective (sub)pixels. Especially when combining several – or in the extreme case all four – defect types, these limiting values appear to be too high. In practice, working with so many pixel defects is hardly imaginable and would be rejected by most physicians.
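A minimal sketch of this scaling calculation is given below. It assumes reference limits of 1 (type A), 5 (type B), 5 (type C) and 1 (type D) defects per 1024 × 1024 pixel block, inferred from the mammography example above, and simple downward rounding; the exact reference values and rounding rule should be taken from the standard itself.

```python
# Sketch: scale pixel-defect limits with the display resolution.
# Assumption: reference limits of 1/5/5/1 for defect types A/B/C/D per
# 1024 x 1024 block, inferred from the mammography example in the text;
# verify against DIN 6868-157 before use.
from math import floor

REFERENCE_PIXELS = 1024 * 1024
REFERENCE_LIMITS = {"A": 1, "B": 5, "C": 5, "D": 1}

def pixel_defect_limits(width: int, height: int) -> dict:
    """Return the maximum allowed number of defects per type for a display."""
    scale = (width * height) / REFERENCE_PIXELS
    return {t: floor(limit * scale) for t, limit in REFERENCE_LIMITS.items()}

# Example: 5 MP mammography display -> {'A': 5, 'B': 25, 'C': 25, 'D': 5}
print(pixel_defect_limits(2048, 2560))

# A small table for common resolutions, e.g. for the test protocol:
for w, h in [(1280, 1024), (1600, 1200), (2048, 1536), (2048, 2560)]:
    print(f"{w} x {h}: {pixel_defect_limits(w, h)}")
```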
Measurement Testing
As in the past, the acceptance test is carried out as a distance measurement using the telescope method (measurement method A), but other methods can also be used in the constancy test ([Fig. 4]). The standard permits a calibrated luminance meter operated according to the telescope principle as in the acceptance test, a near-range luminance meter for contact measurement (method B), and meters integrated into the image display system (methods C and D). In test procedures B, C and D, the illuminance must also be determined in order to account for the ambient light. With suitable software, the constancy test can be carried out automatically and, depending on the software, in some cases even under remote control.
Illuminance
A new addition in DIN 6868-157 is the requirement that the illuminance be adapted to the use of the room. The requirements for the maximum permitted illuminance are determined by the room class.
Minimum/Maximum/Veiling Glare
The minimum and maximum luminance must be measured in the constancy tests. The ratio of maximum to minimum luminance (maximum luminance ratio, formerly maximum contrast) has to be determined only in the acceptance test. Absolute limits for the maximum luminance and the maximum luminance ratio are specified depending on the application. The minimum requirements for the maximum luminance ratio were raised from 100 to 250 for projection radiography and from 40 to 100 for other application areas. The transitional periods for legacy equipment will lead to a difference in quality between old and new monitors in the coming years.
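For illustration, the consequence of the raised limit can be written out directly (a sketch with an assumed maximum luminance of 500 cd/m²; this value is illustrative and not taken from the standard):

```latex
LR = \frac{L_{\max}}{L_{\min}} \geq 250 \quad \text{(projection radiography)}
\quad\Rightarrow\quad
L_{\min} \leq \frac{500\ \mathrm{cd/m^2}}{250} = 2\ \mathrm{cd/m^2}
```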
Luminance Response
In order to accommodate the nonlinear contrast sensitivity of the human eye, the new standard made the grayscale standard display function obligatory for display devices with diagnostic quality ([Fig. 5]). The human eye is more sensitive to minor relative changes in areas of higher luminance (white) compared with low-luminance areas (black). The introduction of a standard luminance response results in a comparable image impression on different monitors, even among different manufacturers.
The luminance response is determined by contact measurement at driving levels from 0 to 100 % (test images TG18-LN8-01 to 18). A suitable software tool is necessary for the evaluation of the measured values. If no quality assurance software is used, the tool offered by the European Reference Organisation for Quality Assured Breast Screening and Diagnostic Services (EUREF), for example, can be used [10]. Since contact measurement is prescribed, the veiling glare Lamb has to be included in the calculations.
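The following sketch illustrates how target luminances on the GSDF can be computed for the 18 measuring points, for instance when no quality assurance software is available. The GSDF coefficients are reproduced as given in DICOM PS3.14 and should be verified against that document; the addition of the veiling glare Lamb to the measured values is only indicated here, and the tolerance evaluation prescribed by the standard is not reproduced.

```python
import math
from bisect import bisect_left

# DICOM Grayscale Standard Display Function (GSDF), PS3.14.
# Coefficient values reproduced from the DICOM standard; verify before use.
_A, _B, _C, _D = -1.3011877, -2.5840191e-2, 8.0242636e-2, -1.0320229e-1
_E, _F, _G, _H = 1.3646699e-1, 2.8745620e-2, -2.5468404e-2, -3.1978977e-3
_K, _M = 1.2992634e-4, 1.3635334e-3

def gsdf_luminance(j: float) -> float:
    """Luminance in cd/m^2 for JND index j (1 <= j <= 1023)."""
    x = math.log(j)
    num = _A + _C * x + _E * x**2 + _G * x**3 + _M * x**4
    den = 1.0 + _B * x + _D * x**2 + _F * x**3 + _H * x**4 + _K * x**5
    return 10.0 ** (num / den)

# Precompute the curve once so measured luminances can be mapped to JND indices.
_JND = list(range(1, 1024))
_LUM = [gsdf_luminance(j) for j in _JND]

def jnd_index(luminance: float) -> float:
    """Approximate JND index for a luminance value (linear interpolation)."""
    i = min(max(bisect_left(_LUM, luminance), 1), len(_LUM) - 1)
    l0, l1 = _LUM[i - 1], _LUM[i]
    return _JND[i - 1] + (luminance - l0) / (l1 - l0)

def gsdf_targets(l_min: float, l_max: float, l_amb: float, steps: int = 18):
    """Target luminances for `steps` equidistant JND points between
    L'min and L'max, where L' includes the ambient contribution L_amb."""
    j_min, j_max = jnd_index(l_min + l_amb), jnd_index(l_max + l_amb)
    return [gsdf_luminance(j_min + k * (j_max - j_min) / (steps - 1))
            for k in range(steps)]

# Example with illustrative values: 0.5 / 450 cd/m^2 display, 0.2 cd/m^2 veiling glare.
for target in gsdf_targets(0.5, 450.0, 0.2):
    print(f"{target:8.3f} cd/m^2")
```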
An exception is made for dental applications (RC 5 and 6), for which the GSDF is not mandatory, since low contrasts play only a subordinate role in dentistry. In room class 3 the GSDF has to be measured only in the acceptance test.
Display Homogeneity
In order to guarantee that the image impression is uniform across the entire monitor, the homogeneity of the display system must be measured at fixed points at 10 % and 80 % of the maximum digital driving level during the acceptance test. The number of measuring points depends on the screen diagonal and thus also takes into account large monitors that replace two smaller individual monitors (e. g. one 6 MP monitor instead of two 3 MP monitors).
According to the old standard, different test patterns could be used for checking homogeneity, but it was prescribed that one measuring point be placed near each of the four corners. Following the new definition, the measuring points were moved further towards the center; the corners and edges are thus no longer checked. Although only the medically used area of an image display is supposed to be checked, and not the area which is concealed, for example, by the menu bar of the PACS, inhomogeneity tends to appear along the edges and corners rather than in the area to be checked according to the standard ([Fig. 6]). The larger the monitor, the further removed the measuring points are from the edge. It would therefore be advisable to carry out the measurements at a defined distance from the edge of the medically used area regardless of the monitor size.
A further modification concerns the definition of homogeneity. Previously, the deviation of the vertices (E1 – 4) from the center (M1) was considered, whereas now the deviation of the measuring point with the highest luminance from the point with the lowest luminance is evaluated. At the same time, the tolerances were adapted to this changed approach. According to the old standard, tolerances of ± 15 % (application category A) or ± 20 % (application category B) applied to the deviation of the luminance of the vertices from the center. Now, the limiting value for homogeneity within the entire display device is 25 % (RC 1 – 4) or 30 % (RC 5 + 6). Because of this revised approach, monitors which do not meet the requirements of the prior standard may be operated according to the new standard. According to the old standard, a monitor with 171 cd/m² in the center (M1, z) and 135 cd/m² in one of the corners (E1, k1) exhibits a deviation of approx. 21 % and thus exceeds the tolerances for application category A and B display devices. According to the new standard, the homogeneity is approx. 24 %, thus fulfilling the requirements for all applications. The monitor in [Fig. 6] would therefore pass the measurement tests of the new standard, on the one hand due to the position of the measuring points and on the other hand due to the changed limiting values. In the case of a visual complaint by the user, this may lead to discussions regarding warranty claims.
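The two definitions can be written out as follows (a sketch; the normalization of the new homogeneity value to the mean of the two measuring points is inferred from the example figures above and should be checked against the standard):

```latex
% Old standard: deviation of a corner point E_i from the center M_1
\Delta_{\mathrm{old}} = \frac{|L_{M_1} - L_{E_i}|}{L_{M_1}}
  = \frac{171 - 135}{171} \approx 21\,\% \quad (> 15\,\%\ /\ 20\,\%, \text{ i.e. failed})

% New standard: deviation between the brightest and darkest measuring point
\Delta_{\mathrm{new}} = \frac{L_{\max} - L_{\min}}{(L_{\max} + L_{\min})/2}
  = \frac{171 - 135}{153} \approx 24\,\% \quad (\leq 25\,\%, \text{ i.e. passed})
```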
Multi-display Image Homogeneity
Also newly introduced is the check of the homogeneity of adjacent monitors which are connected to the same image display system and are supposed to show an identical image. This is a positive step, since a workstation frequently consists of several monitors, and varying image impressions should be avoided. A similar rule already existed for mammography in PAS 1054 [11], according to which maximum contrast and maximum luminance were compared. In the new standard, the homogeneity of multiple display systems is measured at low luminance (10 % of the maximum driving level, test image TG18-UN10). A comparison of the whole luminance response would be more useful instead. It remains to be seen how these changes will affect mammography. For other acquisition methods, for which there were no requirements regarding the homogeneity of multiple display systems, the introduction of this test item represents a tightening of the rules.
Limiting Values and Tolerances for the Constancy Test
For constancy tests, DIN 6868-157 specifies both absolute limits and tolerances for deviations from the reference values. Since it is not sensible to apply both limiting values and tolerances to every test item, and since a general listing is missing from the standard, [Table 3] contains the limiting values and tolerances which, in the authors’ opinion, should be used according to the new standard.
Quality Assurance Software
In principle, it is possible to carry out acceptance and constancy tests without quality assurance software. Since the DIN test images are available only in the standard 1024 × 1024 format, a software solution is usually required to generate the test images at other resolutions. Furthermore, testing is simplified if the related test images are immediately called up and the results documented.
Quality assurance software is provided by manufacturers of diagnostic monitors (e. g. Barco, EIZO), measuring equipment (iba) or PACS (aycan), but also by quality assurance service providers (diraal, mdp dental). [Table 4] shows an overview of currently available quality assurance software focused on DIN 6868-157.
The scope of available functions differs distinctly. For example, the software by mdp dental is designed only for dentistry, while other programs cover all areas of application. Depending on the manufacturer, the software is often modular, so that the basic version, for example, contains only the constancy test, which is sufficient for most users and keeps costs down.
Some manufacturers also offer the option to save the results of the acceptance and constancy tests centrally on a server. This can be particularly useful for larger hospitals or multiple-site practices in order to quickly access protocols. A server solution is generally not required for single workstations.
Prices of quality assurance software vary substantially. Some manufacturers offer software at no cost together with the purchase of other products, such as diagnostic monitors; with other manufacturers, on the other hand, a few hundred euros have to be invested in addition to the cost of the workstation. Therefore, prior to purchasing software the requirements should be carefully considered to avoid unnecessary costs.
Discussion
An adaptation of the standard to the state of the art is in principle to be welcomed. In addition to some improvements, such as the extension of the test interval from quarterly to semi-annual tests, several items of DIN 6868-157 must be seen critically. Particular attention is to be paid to the daily constancy test.
Acceptance of this test among physicians is low. This is primarily because the test only requires the user to confirm the visibility of structures in a permanently identical test pattern, which calls the usefulness of the exercise into question. Randomized tests, in which the user has to recognize a structure at an arbitrary position on the screen and subsequently point to it with the mouse, are much closer to the actual diagnostic situation. These tests also directly provide a convincing result, assuring the physician that the monitor, together with the ambient lighting conditions, meets the quality requirements with high probability.
In addition, evaluating the test item generates a bias, since the user knows exactly what should be seen; deviations from the target state may therefore go unrecognized for an extended period of time. Here too, a randomized test, in which not only the objects to be recognized but also their positions vary from test to test and must be identified by the tester, would be advantageous. In this way a subjective test would become a semi-objective test. Possibly, these semi-objective tests could even replace the metrological checks, which would lead to a cost reduction. Similar approaches already exist, such as the MoniQA software program [9]; a simple sketch of the principle is shown below.
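The following minimal sketch only illustrates the principle of such a randomized test; it is not the MoniQA implementation, and all parameters (background value, contrast step, target size, click tolerance) are illustrative assumptions.

```python
# Sketch of a randomized visual test: a low-contrast target is placed at a
# random position; the user must locate it and click on it. All parameter
# values are illustrative assumptions, not taken from any standard or product.
import random
import numpy as np

def make_randomized_pattern(width=1024, height=1024, background=128,
                            contrast=4, size=32, seed=None):
    """Uniform background with one low-contrast square at a random position.
    Returns the 8-bit image and the square's center (kept secret from the user)."""
    rng = random.Random(seed)
    img = np.full((height, width), background, dtype=np.uint8)
    x = rng.randrange(0, width - size)
    y = rng.randrange(0, height - size)
    img[y:y + size, x:x + size] = background + contrast
    return img, (x + size // 2, y + size // 2)

def evaluate_click(target_center, click, tolerance=40):
    """Pass if the user's click lies close enough to the hidden target."""
    dx, dy = click[0] - target_center[0], click[1] - target_center[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance

# Example: generate a pattern, then check a (simulated) user click.
image, center = make_randomized_pattern(seed=42)
print("test passed:", evaluate_click(center, click=center))
```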
In addition to the daily visual constancy test, the metrological verification of homogeneity should be seen in a critical light. On the one hand, the limiting value has been raised; on the other, the newly defined measuring points are placed too far towards the center of the monitor. Inhomogeneity in the outer regions of the screen is thereby not detected. It would therefore be more advantageous to carry out the measurements at a strictly defined short distance from the edge of the medically used area.
The standard includes limits for pixel defects, providing a clear benchmark for manufacturers, examiners and users. Previously the number of allowable pixel defects was estimated subjectively, which could lead to differing opinions, especially between manufacturers and users. While the introduction of limiting values is therefore generally to be assessed positively, the thresholds themselves have been poorly chosen. The limits are clearly too high, particularly when several types of defects are combined.
The new standard also allows built-in sensors and automated measurements for the tests. The quality assurance software of one manufacturer supports the remote performance of the semi-annual test without a trained inspector on site. However, it is not advisable to allow a completely remote-controlled test in which the device checks itself. Numerous discussions in recent months have shown the dubiousness of device-internal testing software (e. g. the emissions scandal at VW).
Quality assurance of diagnostic monitors must not be aimed solely at checking compliance with physical parameters, especially since the correlation of these parameters with the needs of radiological work has not always been proven. Rather, quality assurance must demonstrate that the rendering of the digital image is optimally adapted to human visual physiology under the given ambient conditions. For this reason, the eye of the user must absolutely be included in the test.
Summary
The transition of image displays from cathode ray tubes to flat screen monitors made an adaptation of the standard to the state of the art urgently necessary. The new standard has led to some improvements, but its complexity has raised many questions among users.
Likewise, the integration of dental applications into the standard was only partially successful. In many aspects exceptions for dentistry resulted in a considerably reduced range of testing.
On the whole, the new standard contains numerous exceptions that make understanding and interpretation difficult. Compared with the earlier standard, requirements were increased for many test items, such as visibility of low contrast in the daily constancy test. Other items, such as homogeneity requirements, were lowered.
The clash of interests during the creation of the standard has been made clear by the numerous objections to the drafts. However, the resulting compromise has led to further discussions and criticism since the publication of DIN 6868-157. A revision of the standard to eliminate ambiguities is therefore to be welcomed; especially useful would be the introduction of randomized tests.
- The physician has to be confident that the diagnostic monitor can display all relevant lesions; therefore regular monitor testing is indispensable.
- Since November 2014, DIN 6868-157 has governed the acceptance and constancy testing of diagnostic monitors.
- Numerous users are having problems implementing the new standard.
- A revision of the standard to clarify misunderstandings appears necessary.
- Randomized tests should be used for daily constancy testing.
References
- 1 Normenausschuss Radiologie (NAR) DIN 6868-157. Sicherung der Bildqualität in röntgendiagnostischen Betrieben – Teil 157: Abnahme- und Konstanzprüfung nach RöV an Bildwiedergabegeräten in ihrer Umgebung. November 2014
- 2 Madsack B, Walz M, Weisser G. Abnahme und Konstanzprüfung an Bildwiedergabesystemen. Radiopraxis 2014; 7: 195-210
- 3 Forum-Röntgenverordnung. http://www.forum-roev.de/forum.php Stand: 04.11.2016
- 4 Qualitätssicherungs-Richtlinie (QS-RL) zur Durchführung der Qualitätssicherung bei Röntgeneinrichtungen zur Untersuchung oder Behandlung von Menschen nach den §§ 16 und 17 der Röntgenverordnung.
- 5 Normenausschuss Radiologie (NAR) DIN 6868-57. Sicherung der Bildqualität in röntgendiagnostischen Betrieben Teil 57: Abnahmeprüfung an Bildwiedergabegeräten. Februar 2001
- 6 Normenausschuss Radiologie (NAR) DIN EN 62563-1. Medizinische elektrische Geräte – Medizinische Bildwiedergabesysteme – Teil 1: Bewertungsmethoden. Januar 2014
- 7 American Association of Physicists in Medicine (AAPM) Task Group 18. Report No. 03: Assessment of Medical Display Performance for Medical Imaging Systems. April 2005
- 8 Perry N, Broeders M, de Wolf C. et al. European guidelines for quality assurance in breast cancer screening and diagnosis. 2006 4th Edition.
- 9 Jacobs J, Rogge F, Kotre J. et al. Preliminary validation of a new variable pattern for daily quality assurance of medical image display devices. Med Phys 2007; 34: 2744-2758
- 10 LRCB. Monitor Check. http://www.euref.org/downloads Stand: 04.11.2016
- 11 Blendl C, Hermann KP, Mertelmeier T. Anforderungen und Prüfverfahren für digitale Mammographie-Einrichtungen PAS 1054. Beuth Verlag; 2005