A hardware-oriented filtering algorithm for noise reduction using a mean filter is proposed in this paper. The authors review the effects of classical and contemporary filters, showing that these filters reduce noise at the cost of blurring image details such as edges. The paper proposes an adaptive filtering algorithm that detects noise pixels and applies the mean filter only to them. Detection rests on the assumption that noise pixels follow a Gaussian distribution while edge pixels do not. The well-known chi-square goodness-of-fit test is applied to the data in a sliding window centered on each pixel to decide whether that pixel is noisy. If the window's data fit a Gaussian distribution, the pixel is filtered with the mean filter; otherwise, it is left unchanged. The algorithm performs better than the classical filters and comparably to the more complex contemporary filters, with simpler computation.
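The detection-then-filter idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: the window size (5×5), the use of four equal-probability bins, and the acceptance threshold (3.84, the 5% chi-square critical value at one degree of freedom, since four bins minus one minus two estimated parameters leaves one) are all assumed choices.

```python
import numpy as np

def chi2_fits_gaussian(window, threshold=3.84):
    """Chi-square goodness-of-fit of window samples to a Gaussian.

    Bins the samples into four equal-probability bins under a Gaussian
    fitted to the window (quartile edges at mu, mu +/- 0.6745*sigma) and
    compares observed counts against the uniform expected count.
    """
    data = window.ravel()
    n = data.size
    mu, sigma = data.mean(), data.std()
    if sigma == 0:
        return True  # flat window: trivially consistent with Gaussian noise
    # Quartile edges of N(mu, sigma); searchsorted assigns each sample a bin
    edges = np.array([mu - 0.6745 * sigma, mu, mu + 0.6745 * sigma])
    observed = np.bincount(np.searchsorted(edges, data), minlength=4)
    expected = n / 4.0
    chi2 = ((observed - expected) ** 2 / expected).sum()
    return chi2 < threshold

def adaptive_mean_filter(img, win=5):
    """Mean-filter only the pixels whose window passes the Gaussian test."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            window = padded[i:i + win, j:j + win]
            if chi2_fits_gaussian(window):
                out[i, j] = window.mean()  # noise pixel: smooth it
            # else: likely an edge; keep the original value
    return out
```

On a pure-noise region most windows pass the test and get averaged, while windows straddling a step edge produce a bimodal histogram, fail the test, and are left untouched, which is how the scheme avoids blurring edges.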
A detailed hardware implementation of this algorithm on field-programmable gate arrays (FPGAs) is also presented. Implementation issues such as module partitioning, handling of division, look-up table generation, data precision, memory control, and timing are all discussed. By following the detailed description, readers can get a good picture of how to implement the algorithm in hardware.
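The mention of division handling and look-up table generation points at a standard FPGA idiom: a hardware divider is expensive, so division by a known constant is replaced by a table lookup or a fixed-point reciprocal multiply. The sketch below assumes a 3×3 mean filter over 8-bit pixels (the paper's exact parameters and table contents may differ) and shows both variants.

```python
# Direct LUT: divide-by-9 precomputed for every possible 3x3 window sum
# of 8-bit pixels (0..9*255). In hardware this is one block-RAM read
# instead of a divider.
MEAN_LUT = [s // 9 for s in range(9 * 255 + 1)]

# Alternative: multiply by a fixed-point reciprocal, then shift.
FRAC_BITS = 16
RECIP_9 = round((1 << FRAC_BITS) / 9)  # 7282, i.e. ~1/9 in Q0.16

def mean9(window_sum):
    """Approximate window_sum / 9 with one integer multiply and a shift."""
    return (window_sum * RECIP_9) >> FRAC_BITS
```

With 16 fractional bits the reciprocal multiply agrees exactly with integer division over the whole 8-bit sum range, so the choice between the two is a LUT-memory-versus-multiplier trade-off; data precision questions like the one the paper discusses come down to picking such bit widths.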
This paper is recommended for engineers and students interested in adaptive digital filter design for noise reduction in image processing. It can also serve as a case study in hardware design.