KEYWORDS: Visual process modeling, Control systems, Neural networks, Machine vision, Data modeling, Convolutional neural networks, Telecommunications, Scene classification, Process modeling
This article designs a smart trash can based on machine vision. Its primary function is to automatically identify, classify, and bin the typical garbage types found in a community. The system uses a Raspberry Pi running a convolutional neural network: an embedded deep learning model identifies the garbage, and online communication controls the mechanical system that sorts the waste into buckets. In this paper, the top-1 accuracies of the AlexNet, ZFNet, and Inception V1 models are 62.5%, 64%, and 69.8%, respectively. The top-1 accuracy of the VGG model reaches 74%, and by enhancing the VGG model the training accuracy can be improved to 94%.
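The abstract compares models by top-1 accuracy, i.e. the fraction of images whose highest-scoring predicted class matches the true label. A minimal sketch of that metric (function name and the toy scores are illustrative, not from the paper):

```python
def top1_accuracy(scores, labels):
    """Fraction of samples whose highest-scoring class equals the true label.

    scores: list of per-class score lists (one per sample)
    labels: list of true class indices
    """
    correct = sum(
        max(range(len(s)), key=s.__getitem__) == y  # argmax of the score vector
        for s, y in zip(scores, labels)
    )
    return correct / len(labels)

# Toy example: two samples, two classes
print(top1_accuracy([[0.1, 0.9], [0.8, 0.2]], [1, 1]))  # first prediction right, second wrong
```

The reported 62.5%-94% figures are this metric evaluated over the test set of garbage images.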
Waste recycling is very important for the world's economy and climate balance. For this reason, intelligently classifying recyclable garbage is an important goal, and deep learning models can be used for this purpose. In this paper, a deep learning framework with different architectures, such as DenseNet, Inception-ResNet-V2, MobileNet, and Xception, is tested on the TrashNet dataset to find the most efficient approach. Adam is selected for optimizing the neural network models. Experimental results validate that deep learning models trained with the Adam optimizer achieve a better test accuracy than those trained with the Adadelta optimizer. Comparing the quantitative results obtained by these architectures within the deep learning framework, the fine-tuned DenseNet gives the best result (a test accuracy of 95%) and the fine-tuned Inception-ResNet-V2 is second best (a test accuracy of 94%).
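The Adam optimizer the abstract selects maintains exponential moving averages of the gradient and its square, with bias correction. A minimal single-parameter sketch of one Adam update (the function and the toy quadratic objective are illustrative, not the paper's training setup):

```python
def adam_step(theta, grad, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter theta at step t (t starts at 1)."""
    m = b1 * m + (1 - b1) * grad            # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad * grad     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)               # bias correction
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (v_hat ** 0.5 + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta**2, whose gradient is 2*theta
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 3001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
print(theta)  # driven close to the minimum at 0
```

In the paper's setting the same update is applied per weight of the DenseNet / Inception-ResNet-V2 / MobileNet / Xception networks; its adaptive per-parameter step size is a common reason Adam outperforms Adadelta in practice.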