IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, vol. 21, no. 1, pp. 105-114, 2017 (SCI-Expanded)
This study proposes a new entropy bound with low computational complexity for differential Shannon entropy estimation with the kernel density approach, based on defining a bound for the Kullback-Leibler divergence between two Gaussian mixture models. The proposed entropy bound is derived to provide computational efficiency without reducing accuracy in detecting heart sound segments within respiratory sounds. It is shown both theoretically and experimentally that using the proposed bound in an adaptive threshold-based detection method yields performance very similar to that of a nonparametric kernel-based approach, at a much lower computational cost. The performance of the proposed method is evaluated and compared with three methods from the literature in experiments on a database of 20 subjects. The results show that the false negative rate values for the proposed method are 1.45 +/- 1.50% and 1.98 +/- 1.81% for low and medium flow rates, respectively. These average values are similar to the results obtained by the alternative methods. Moreover, the average elapsed time of the proposed method for a 20 s data segment is 0.05 s, which is significantly lower than that of the other methods.
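The sketch below is not the paper's derivation; it is a minimal illustration, under assumed names and example data, of the two generic ingredients the abstract refers to: a kernel-density (resubstitution) estimate of differential Shannon entropy, and a standard convexity-based upper bound on the Kullback-Leibler divergence between two Gaussian mixture models with paired components. The functions `kde_entropy`, `kl_gauss`, and `kl_gmm_upper_bound`, and all parameter values, are hypothetical.

```python
import numpy as np

def kde_entropy(x, bandwidth):
    """Resubstitution entropy estimate H ~ -(1/N) * sum_i log p_hat(x_i),
    where p_hat is a Gaussian kernel density estimate built from the samples x."""
    x = np.asarray(x, dtype=float)
    diffs = x[:, None] - x[None, :]                       # pairwise sample differences
    kernels = np.exp(-0.5 * (diffs / bandwidth) ** 2) / (
        bandwidth * np.sqrt(2.0 * np.pi))
    p_hat = kernels.mean(axis=1)                          # KDE evaluated at each sample
    return -np.mean(np.log(p_hat))

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL divergence between univariate Gaussians N(m1, s1^2) and N(m2, s2^2)."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2.0 * s2**2) - 0.5

def kl_gmm_upper_bound(w_f, params_f, w_g, params_g):
    """Generic convexity-based (matched-pair) upper bound for KL(f || g) between two
    GMMs with the same number of components:
        KL(f || g) <= KL(w_f || w_g) + sum_a w_f[a] * KL(f_a || g_a).
    Used here only as an illustration of bounding KL between mixture models."""
    w_f, w_g = np.asarray(w_f, float), np.asarray(w_g, float)
    bound = np.sum(w_f * np.log(w_f / w_g))               # KL between the mixture weights
    for wa, (mf, sf), (mg, sg) in zip(w_f, params_f, params_g):
        bound += wa * kl_gauss(mf, sf, mg, sg)            # weighted paired-component KLs
    return bound

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.normal(0.0, 1.0, size=500)              # hypothetical signal frame
    print("KDE entropy estimate:", kde_entropy(samples, bandwidth=0.3))
    print("GMM KL upper bound:",
          kl_gmm_upper_bound([0.6, 0.4], [(0.0, 1.0), (2.0, 0.5)],
                             [0.5, 0.5], [(0.1, 1.1), (1.8, 0.6)]))
```

In a threshold-based detector of the kind described in the abstract, such an entropy estimate (or a cheaper bound on it) would be computed frame by frame and compared against an adaptively chosen threshold; the bound above merely shows how a closed-form expression can replace the more expensive kernel-based computation.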