Deep learning in flower quantification of Catharanthus roseus (L.) G. Don
DOI: https://doi.org/10.4025/actascitechnol.v47i1.66787
Keywords: image analysis; floriculture; landscaping; Mask R-CNN; convolutional neural networks.
Abstract
Deep learning techniques are increasingly automating tasks once performed manually, thanks to the robustness and precision of their results, which encourages their use as a tool in the floriculture and landscaping sector. Floristic species are numerous and diverse in shape, texture, and color. The ornamental species Catharanthus roseus, a tropical-climate plant, has cultivars whose flower color is one of their attractive features, ranging from white to various shades of pink. The objective of this work was therefore to use deep learning techniques to evaluate the potential of the two-stage convolutional approach known as Mask R-CNN to quantify C. roseus flowers and classify them by color for application in the floriculture and landscaping sectors. A total of 700 images were collected with smartphone cameras in gardens in the north of Minas Gerais, of which 500 contained both pink and white flowering and 200 contained only leaves, to compose the background. For the synthetic image bank, 100 white flowers and 100 pink flowers were processed in PNG format and formed the foreground, the two colors being separated as two subclasses. Training with the transfer learning technique and the Mask R-CNN algorithm was carried out in Google Colaboratory, with commands in the Python language and libraries from the GitHub platform. According to classification quality metrics, the Mask R-CNN convolutional neural network showed overall accuracy above 90% and accuracy above 80%. The network proved efficient at estimating the number of flowers, as well as detecting, segmenting, and classifying them by color. The methodology can therefore be used in the floriculture and landscaping sector to estimate and quantify flowers from images.
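The synthetic image bank described above amounts to pasting transparent PNG flower crops onto leaf-only backgrounds, with the PNG alpha channel doubling as the ground-truth instance mask for Mask R-CNN training. A minimal NumPy sketch of that compositing step is given below; the function name, array conventions, and toy sizes are illustrative assumptions, not the authors' actual code.

```python
import numpy as np

def composite_flower(background, flower_rgba, top, left):
    """Paste one RGBA flower crop onto an RGB leaf background.
    Returns the composite image and the binary instance mask
    (the mask is simply where the PNG is opaque).
    Illustrative sketch only, not the authors' implementation."""
    out = background.copy()
    fh, fw = flower_rgba.shape[:2]
    alpha = flower_rgba[..., 3:4].astype(np.float32) / 255.0   # opacity in [0, 1]
    region = out[top:top + fh, left:left + fw]
    # Alpha-blend the flower pixels over the leaves.
    region[...] = (alpha * flower_rgba[..., :3]
                   + (1.0 - alpha) * region).astype(np.uint8)
    mask = np.zeros(background.shape[:2], dtype=bool)
    mask[top:top + fh, left:left + fw] = flower_rgba[..., 3] > 0
    return out, mask

# Tiny demo: 8x8 green "leaf" background, 3x3 fully opaque white "flower".
bg = np.zeros((8, 8, 3), dtype=np.uint8)
bg[..., 1] = 120                       # green channel only
flower = np.zeros((3, 3, 4), dtype=np.uint8)
flower[..., :3] = 255                  # white petals
flower[..., 3] = 255                   # fully opaque
img, mask = composite_flower(bg, flower, top=2, left=2)
print(mask.sum())                      # 9 opaque pixels -> one flower instance
```

Repeating this for the 100 white and 100 pink crops, with randomized positions and backgrounds, yields image/mask pairs in which each flower carries its color subclass label, ready for transfer learning with pretrained Mask R-CNN weights.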
License
DECLARATION OF ORIGINALITY AND COPYRIGHT
I declare that this article is original and has not been submitted for publication, in part or in full, to any other national or international journal.
Copyright belongs exclusively to the authors. The licensing rights used by the journal are those of the Creative Commons Attribution 4.0 license (CC BY 4.0): sharing (copying and distributing the material in any medium or format) and adaptation (remixing, transforming, and building upon the licensed material for any purpose, including commercial) are permitted.
Reading this link is recommended for further information on the subject: correct attribution and referencing, among other details essential to the proper use of the licensed material.
