Development of Real-Time MTF Measurement System
We developed a system that can measure the modulation transfer function (MTF) of a broadcasting camera in real time. The MTF indicates the spatial resolution characteristics of the camera. This system measures the MTF in multiple directions simultaneously by analyzing the edge responses while controlling the focus, iris, and zoom of the lens. It is an ideal tool for evaluating the spatial resolution characteristics of the overall imaging system, from the lens to the signal processing in the camera. The system is widely used for the development, testing, and maintenance of 4K and 8K cameras and lenses.
1. Introduction
To reproduce lifelike visual perception, or visual realness1), in 4K/8K Super Hi-Vision (hereinafter 4K/8K), it is essential to understand a camera's ability to reproduce picture details in ultra-high-definition (UHD) image formats. While developing 4K/8K camera equipment and introducing it to NHK's studios, we devoted ourselves to the research and development of a system for measuring the modulation transfer function (MTF) of a broadcast camera in real time. This paper reports on the real-time MTF measurement system that is now in practical use after several years of development.
2. Real-Time MTF Measurement System Concept
Spatial resolution characteristics refer to the ability of an imaging system to reproduce fine detail and are not the same as pixel resolution. The MTF describes the response magnitude of an optical system to sinusoids of different spatial frequencies and facilitates evaluation of the spatial performance of an imaging system and/or its components. An overview of the MTF concept*1 is shown in Fig. 1.
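In essence, the MTF at a given spatial frequency is the ratio of the output modulation to the input modulation of a sinusoidal pattern. The following is a minimal sketch of that definition; the `modulation` helper and the synthetic signals are purely illustrative and not part of the actual system:

```python
import numpy as np

def modulation(signal):
    """Modulation (Michelson contrast) of a sinusoidal intensity pattern."""
    return (signal.max() - signal.min()) / (signal.max() + signal.min())

# Hypothetical imaging system that attenuates a sinusoid's amplitude to 60%
# of its input value around a mean level of 0.5.
x = np.linspace(0.0, 1.0, 1000, endpoint=False)
inp = 0.5 + 0.4 * np.sin(2 * np.pi * 10 * x)   # input pattern, modulation 0.8
out = 0.5 + 0.24 * np.sin(2 * np.pi * 10 * x)  # attenuated output, modulation 0.48

mtf_at_10_cycles = modulation(out) / modulation(inp)
print(round(mtf_at_10_cycles, 2))  # ~0.6
```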
The real-time MTF measurement system analyzes the edge responses of a camera. Figure 2 shows a schematic of our real-time MTF measurement system, which consists of a standalone application with an intuitive graphical user interface (GUI) installed on a personal computer (PC) connected to a video capture device with a serial digital interface (SDI) input. The system analyzes the Y (luminance) component of the SDI input signal. If the optical axis of a zoom lens is aligned to directly face a large knife-edge chart, with the edge positioned at the image center, the MTF can be continuously observed from the wide end to the telephoto end while operating the zoom. This makes it very easy to adjust the flange focal distance*2, to find the aperture values least affected by aberration*3 and the diffraction limit*4 (i.e., the sweet spot), and to obtain the spatial resolution characteristics over the zoom range. Measurements using a slanted starburst chart with multidirectional edges yield a contour plot of multidirectional MTFs that enables direct observation of the anisotropy due to lens and image sensor misalignments, as well as the effects of the pixel arrangement of the image sensor and the image processing (e.g., Bayer color filter array*5, demosaicing*6, and detail enhancement*7).
3. Conventional Spatial Resolution Measurement for HDTV Cameras
Currently, the spatial resolution of a broadcast HDTV camera is usually measured based on its bar pattern responses at 800 TV lines per picture height (TVL/PH), which is twice the number of black and white line pairs or cycles across a distance equal to the picture height, to determine whether or not the modulation satisfies a minimum threshold criterion. This is typically a contrast transfer function (CTF)*8 of 45%2). Figure 3(a) shows an HDTV in-mega cycle test chart *9, while the waveform of a horizontally magnified portion of a line in the central area is shown in Fig. 3(b). The chart was captured by framing it to fit the central HD area of a 4K camera with a 2/3-in. three-chip complementary metal–oxide–semiconductor (CMOS) sensor.
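The CTF figure quoted above can be estimated from a captured bar-pattern waveform as the ratio of the observed peak-to-peak amplitude to the full black-to-white amplitude. Below is a simplified, hypothetical estimator (the actual measurement must also cope with noise and moiré fluctuations in the waveform):

```python
import numpy as np

def ctf_percent(waveform, white_level, black_level):
    """CTF as a percentage: peak-to-peak amplitude of the captured
    bar-pattern waveform relative to the full black-to-white range."""
    amplitude = waveform.max() - waveform.min()
    return 100.0 * amplitude / (white_level - black_level)

# Synthetic example: a square-wave bar pattern captured with reduced contrast.
wave = np.where(np.sin(np.linspace(0.0, 8 * np.pi, 400)) >= 0, 0.60, 0.15)
print(ctf_percent(wave, white_level=0.9, black_level=0.0))  # ~50%
```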
Bar charts provide an intuitive assessment of the CTF by allowing visual observation of the amplitude displayed on a waveform monitor. However, the measurement is ambiguous when the waveform amplitude fluctuates because of camera noise and moiré patterns*10. In addition, the CTFs measured with the other bar patterns arranged across the chart are not usually used because they are generally affected by the degradation of the lens's spatial resolution characteristics away from the image center. Furthermore, framing the chart takes a significant amount of effort, and both the focal length and shooting distance are fixed.
4. Algorithm for Real-Time MTF Measurement
The new measurement system can measure the MTF of a broadcast camera based on a relatively small region of interest (ROI) that encloses an edge. In Fig. 4, the red and orange shapes represent multiple selected ROIs. The edge-based method incorporated into our real-time MTF measurement system accepts any edge direction (not just near-horizontal and near-vertical) in arbitrarily shaped ROIs (not just rectangular)*11. No precise framing is needed, unlike with the in-mega cycle test chart.
Figure 5 shows a schematic flow diagram of the MTF calculation algorithm. First, the ROI image is rotated by the estimated angle such that the edge is oriented approximately upright. The ROI pixels are then dropped vertically into subpixel-wide bins aligned along the horizontal axis. An 8× oversampled one-dimensional (1D) edge spread function (ESF) is generated from the average value of the pixels collected in each bin. The 1D ESF derivative yields the line spread function (LSF), and a discrete Fourier transform is performed to calculate the MTF.
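The ESF-to-MTF steps above can be sketched as follows. This is a minimal illustration of the pipeline, assuming the rotation and binning have already produced an 8× oversampled one-dimensional ESF; windowing and noise handling are omitted:

```python
import numpy as np

OVERSAMPLE = 8  # the system generates an 8x oversampled ESF

def mtf_from_esf(esf):
    """Differentiate the ESF to get the LSF, then take the one-sided DFT
    magnitude, normalized so that MTF(0) = 1."""
    lsf = np.diff(esf)                    # ESF derivative -> line spread function
    spectrum = np.abs(np.fft.rfft(lsf))   # one-sided DFT magnitude
    mtf = spectrum / spectrum[0]          # normalize the zero-frequency response
    # With 1/8-pixel sample spacing, frequencies come out in cycles/pixel.
    freqs = np.fft.rfftfreq(lsf.size, d=1.0 / OVERSAMPLE)
    return freqs, mtf

# Synthetic 8x-oversampled ESF: a smooth, error-function-like edge profile.
x = np.arange(-64, 64)
freqs, mtf = mtf_from_esf(0.5 * (1.0 + np.tanh(x / 6.0)))
print(mtf[0])  # 1.0 by construction
```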
The system averages the first several input video frames to reduce camera noise and improve the precision of the edge angle estimation3). The edge is assumed to be stationary during the measurement. Therefore, the bin locations corresponding to the ROI pixel positions are first recorded in a lookup table (LUT), and the ROI pixels are mapped to the bins via the LUT in the subsequent input video frames.
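The LUT step might look like the sketch below (the helper names are hypothetical): the projection of each ROI pixel onto the axis perpendicular to the edge is computed once, and only the cheap bin-averaging runs on every subsequent frame.

```python
import numpy as np

def build_bin_lut(ys, xs, edge_angle_deg, oversample=8):
    """For each ROI pixel, precompute the index of its (1/oversample)-pixel-
    wide bin along the axis perpendicular to the edge."""
    theta = np.deg2rad(edge_angle_deg)
    # Signed distance of each pixel from a reference line at the edge angle.
    dist = xs * np.cos(theta) + ys * np.sin(theta)
    return np.floor((dist - dist.min()) * oversample).astype(int)

def binned_esf(frame, ys, xs, lut):
    """Average the ROI pixels falling into each bin to form the oversampled ESF."""
    values = frame[ys, xs]
    counts = np.bincount(lut)
    sums = np.bincount(lut, weights=values)
    return sums / np.maximum(counts, 1)  # empty bins stay at zero

# Toy usage: a slightly slanted step edge in a 32x32 frame.
yy, xx = np.indices((32, 32))
frame = (xx + 0.1 * yy > 16).astype(float)  # near-vertical edge, slope 0.1
ys, xs = yy.ravel(), xx.ravel()
lut = build_bin_lut(ys, xs, edge_angle_deg=5.7, oversample=8)
esf = binned_esf(frame, ys, xs, lut)
```

Because the LUT depends only on geometry, not on pixel values, `binned_esf` is the only work repeated per frame.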
5. Measurement Examples
5.1 MTF and CTF for 4K Cameras
Figure 6 plots the MTFs of the same 4K camera used for the CTF measurement shown in Fig. 3. The 52 light gray curves show the MTF results for the last 52 frames, and the blue curve shows the averaged MTF. In addition, the plot shows the best MTF (green), which is defined in this application as the curve with the maximum sum at spatial frequencies from zero to the Nyquist frequency. Since the best MTF is held in the plot until the measurement process is reset by the user, it is useful for accurate focusing.
This system can display the MTF and CTF values at 800 TVL/PH for HDTV, 1600 TVL/PH for 4K, and 3200 TVL/PH for 8K from the resulting MTF curves. All of the above-mentioned spatial frequencies in TVL/PH correspond to 0.37 cycles/pixel. The CTF value of 49.4% at 1600 TVL/PH in Fig. 6 agrees approximately with the value obtained in Fig. 3.
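The correspondence between TVL/PH and normalized frequency follows directly from the definitions: TVL/PH counts individual black and white lines (i.e., half-cycles) per picture height, so dividing by twice the picture height in pixels gives cycles per pixel. A quick check of the figures quoted above:

```python
def tvl_ph_to_cycles_per_pixel(tvl_ph, picture_height_px):
    """TVL/PH counts half-cycles: divide by 2 to get cycles per picture
    height, then by the picture height in pixels."""
    return tvl_ph / (2.0 * picture_height_px)

# 800 TVL/PH at 1080 lines (HDTV), 1600 at 2160 (4K), 3200 at 4320 (8K)
for tvl, height in [(800, 1080), (1600, 2160), (3200, 4320)]:
    print(round(tvl_ph_to_cycles_per_pixel(tvl, height), 2))  # 0.37 each time
```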
5.2 Comparison of MTF for HDTV and 4K Class Lenses
Figure 7 compares the MTF results obtained by measuring the edges of the central and peripheral areas of the images obtained via two B4 mount lenses*12: one for the HDTV class and the other for the 4K class. Both lenses were mounted on the same 2/3-inch 4K camera used for the measurements shown in Fig. 3. Although the differences in the results are small in the central area, the MTF for the 4K class lens is better than that for the HDTV class lens in the peripheral area.
5.3 Detail Enhancement and MTF
Figure 8 shows the MTF and ESF for a 4K camera with a Super 35 single-chip CMOS sensor, with and without detail enhancement, using the default parameter values designed by the camera manufacturer. Here, we can see that detail enhancement boosted the MTF characteristics at low spatial frequencies. However, without detail enhancement, nearly the same MTF was attained near the Nyquist frequency. With detail enhancement, the ESF produces noticeable overshoot and undershoot.
5.4 Multidirectional MTF for Single-Chip 4K Camera
Figure 9 shows a contour plot of the multidirectional MTF, presented in polar coordinates, for the same single-chip 4K camera used for the measurements in Fig. 8. The radial coordinate axis represents the spatial frequency, where the center represents direct current (0 cycles/pixel) and the outer diameter represents the Nyquist frequency (0.5 cycles/pixel). Meanwhile, the angular coordinate axis represents the edge direction of the test chart, with the numbers (90, 80, etc.) representing the MTF value in percentages. The contour shown in Fig. 9 appears square-shaped at low MTF levels because the green pixels in the Bayer color filter array of the single-chip 4K camera image sensor are sparse in the diagonal directions. Figure 10 shows the multidirectional MTF with the lens defocused. Here, we can see that as the lens becomes more defocused, the MTF of the lens becomes more dominant, and the contours become more rounded and smaller.
5.5 Lens Aperture and MTF
Figure 11 shows the MTF results for an 8K camera with a 1.7-in. three-chip CMOS sensor and a single-focus lens while controlling the iris from F1.7 (fully open) to F8. This system makes it easy to find the sweet spot, which falls at F3.5 for this lens. The MTF curve at F8 (solid red) is noisy because of inadequate exposure, but the noise can be reduced if the lighting and shutter speed are appropriately adjusted during measurement.
6. Conclusion
The essence of 4K and 8K video is its high spatial resolution characteristics, which highlights the importance of MTF measurements. The real-time MTF measurement system developed at NHK is also used in the development of underwater (Fig. 12) and other special cameras, including non-broadcasting applications. In future studies, we will examine how to apply this method to measuring display resolution, with the aim of using MTF to determine and manage the resolution characteristics of entire imaging systems.
This article was written and edited based on Ref. 3.