Residual Neural Network Model for Detecting Waste Disposing Action in Images
Abstract
Waste has become a major problem worldwide. International evidence shows that nearly everyone admits to littering at some point, and the majority of people litter at least occasionally. This research aims to address this problem by applying computer vision and deep learning approaches. It was conducted to detect the actions humans perform when disposing of waste in an image, which is useful for providing better information for research on waste disposal behavior. We use a Convolutional Neural Network model with a Residual Neural Network architecture to detect the type of activity that a subject performs in an image. The result is an artificial neural network model that can label the activity occurring in the input image (scene recognition). The model achieves an accuracy of 88% with an F1-score of 0.87.
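The defining feature of the Residual Neural Network architecture mentioned above is the skip connection, which computes y = F(x) + x so each block only has to learn a residual mapping F(x) on top of the identity. The following NumPy sketch illustrates that idea with a minimal fully connected residual block; the weights, dimensions, and function names are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

def relu(x):
    # Elementwise rectified linear unit
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Minimal residual block: y = ReLU(F(x) + x),
    where F(x) = W2 @ ReLU(W1 @ x).

    Input and output dimensions are kept equal so the identity
    shortcut can be added directly, as in a standard ResNet block.
    """
    fx = w2 @ relu(w1 @ x)   # the learned residual mapping F(x)
    return relu(fx + x)      # skip connection adds the input back

# Illustrative toy data (not the paper's dataset)
rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)
w1 = rng.standard_normal((d, d)) * 0.1
w2 = rng.standard_normal((d, d)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)
```

In the full model, such blocks are convolutional and stacked many layers deep; the shortcut keeps gradients flowing, which is what lets very deep networks like the one used here train effectively.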
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.