Sound Visualization Method using Joint Time-Frequency Analysis for Visual Machine Condition Monitoring
 Title & Authors
Sound Visualization Method using Joint Time-Frequency Analysis for Visual Machine Condition Monitoring
Seo, Jung-Hee; Park, Hung-Bog
 Abstract
Noise from the surrounding environment, building structures, and machinery has a significant effect on daily life. Many solutions to this problem have been proposed that analyze the causes of noise generated at particular locations in buildings or machinery and detect defects in the building or equipment. This paper therefore proposes a sound visualization technique that uses a microphone array to measure sound sources from machines and performs visual machine condition monitoring (VMCM). By analyzing the sound signals and presenting effective sound visualization methods, the technique can be applied to identify a machine's condition and correct faults through real-time monitoring and visualization of the noise generated by plant machinery.
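As an illustrative sketch only (not the authors' implementation, whose details are not given in this abstract), the joint time-frequency analysis (JTFA) underlying such a visualization can be computed as a short-time FFT magnitude spectrogram of one measured channel; the 440 Hz test tone and all parameter choices below are assumptions for demonstration:

```python
import numpy as np

def jtfa_spectrogram(x, fs, win=256, hop=128):
    """Magnitude spectrogram via short-time FFT, a basic form of JTFA.

    Returns (f, S): frequency bins in Hz and a (freq, time) magnitude
    array that can be rendered as a time-frequency "sound image".
    """
    w = np.hanning(win)  # taper each frame to reduce spectral leakage
    frames = np.array([x[i:i + win] * w
                       for i in range(0, len(x) - win + 1, hop)])
    S = np.abs(np.fft.rfft(frames, axis=1)).T  # rows = freq, cols = time
    f = np.fft.rfftfreq(win, d=1.0 / fs)
    return f, S

# Synthetic "machine hum" at 440 Hz, standing in for one microphone
# channel; the array geometry and source localization are not modeled.
fs = 8000
t = np.arange(0, 1.0, 1.0 / fs)
f, S = jtfa_spectrogram(np.sin(2 * np.pi * 440 * t), fs)
peak_hz = f[np.argmax(S[:, 5])]  # dominant frequency of one frame, near 440 Hz
```

Rendering `S` (e.g., on a log scale) as an image over time and frequency gives the kind of sound visualization that a monitoring display would update in real time, with fault transients appearing as localized energy away from the machine's normal spectral lines.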
 Keywords
Machine Condition Monitoring; Sound Visualization; JTFA
 Language
Korean
 References
1.
Seiichi Shin, "Human Machine Communication via Sound with Wavelet Transformation," Systems, Man and Cybernetics, 2005 IEEE International Conference on., Vol. 2, pp. 1984-1988, Oct. 2005.

2.
Goseki M., Ding M., Takemura H., Mizoguchi H., "Combination of Microphone Array Processing and Camera Image Processing for Visualizing Sound Pressure Distribution," Systems, Man, and Cybernetics (SMC), 2011 IEEE International Conference on., pp. 139-143, Oct. 2011.

3.
Jingyu Wang, Ke Zhang, Madani, K., Sabourin, C., "A Visualized Acoustic Saliency Feature Extraction Method for Environment Sound Signal Processing," TENCON 2013-2013 IEEE Region 10 Conference, pp. 1-4, Oct. 2013.

4.
Ervin L., Marcel B., Monika B., Zuzana F., "Application of modern technical tools for sound visualization in the teaching process," ICETA 2012. 10th IEEE International Conference on Emerging eLearning Technologies and Applications, pp. 247-251, Nov. 2012.

5.
Adiloglu K., Annies R., Wahlen E., Purwins H., Obermayer K., "A Graphical Representation and Dissimilarity Measure for Basic Everyday Sound Events," Audio, Speech, and Language Processing, IEEE Transactions on., Vol. 20, No. 5, pp. 1542-1552, Jan. 2012.

6.
Nakamura K., Sugimoto T., "A Visualization Tool for High Intensity Focused Ultrasonic Field Using LEDs and Piezo-Elements," Ultrasonics Symposium, IEEE, pp. 733-736, Oct. 2007.

7.
Yatabe K., Oikawa Y., "PDE-based interpolation method for optically visualized sound field," Acoustics, Speech and Signal Processing (ICASSP), 2014 IEEE International Conference on., pp. 4738-4742, May 2014.

8.
Francesco Martellotta, "On the use of microphone arrays to visualize spatial sound field information," Applied Acoustics, Vol. 74, No. 8, pp. 987-1000, Aug. 2013.

9.
Zimmermann B., Studer C., "FPGA-based Real-Time Acoustic Camera Prototype," Circuits and Systems (ISCAS), Proceedings of 2010 IEEE International Symposium on., pp. 1419-1422, June 2010.

10.
Shin Hur, Hongsoo Choi, Joonsik Park, and Tang-Han Kim, "Recent Trends in MEMS Microphone and Application for Hearing Aid System," Journal of the Korean Society for Precision Engineering, Vol. 26, No. 11, pp. 20-28, Nov. 2009.

11.
Zhuang Li and Malcolm J. Crocker, "A Study of Joint Time-Frequency Analysis-Based Modal Analysis," IEEE Transactions on., Vol. 55, No. 6, pp. 2335-2342, Dec. 2006.