Comparison of Cloud Image Classification for a Trash Collecting LEGO Mindstorms EV3 Robot

Z. Othman, N.A. Abdullah, K.Y. Chin, F.F.W. Shahrin, S.S. Syed Ahmad, F. Kasmin


The world today faces its biggest waste management crisis, driven by rapid economic growth, congestion, urban planning problems, environmental damage, and political factors. Many approaches to solving this waste management problem have not worked as planned. In this era of high technology, humanoid robots have proven helpful in supporting everyday human life, and industry is moving towards automation to increase productivity while improving quality of life for local communities. This paper therefore proposes a Trash Collecting Robot (TCR) to provide automatic control of trash collection. The TCR, built on the LEGO Mindstorms EV3 platform, can distinguish between static and dynamic obstacles and moves according to the program it has been given. A TCR is essentially composed of sensors, chosen according to the requirements for detecting dynamic obstacles. The TCR is a type of cloud robot that applies image processing techniques to identify the type of waste collected. Image processing in the TCR is implemented through a cloud Representational State Transfer API (REST API), realized with the Google Cloud Vision API and Sighthound. These cloud services use machine vision techniques to identify and classify trash images as plastic, metal, or paper. Experimental results show that Sighthound classifies trash types more accurately than Google Cloud.
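The cloud classification flow described above can be sketched in Python: an image is sent to a REST endpoint for label detection, and the returned labels are mapped onto the three trash categories. The `images:annotate` endpoint is Google Cloud Vision's public REST interface, but the keyword-to-category mapping below is an illustrative assumption for this sketch, not the authors' actual rule set.

```python
import json
import urllib.request

# Illustrative mapping from vision labels to the paper's three trash
# categories (plastic, metal, paper); the keyword sets are assumptions.
CATEGORY_KEYWORDS = {
    "plastic": {"plastic", "bottle", "plastic bag", "packaging"},
    "metal": {"metal", "tin", "aluminium", "can"},
    "paper": {"paper", "cardboard", "carton", "newspaper"},
}


def classify_trash(labels):
    """Return the trash category whose keywords match the most labels,
    or 'unknown' when nothing matches."""
    scores = {cat: 0 for cat in CATEGORY_KEYWORDS}
    for label in labels:
        for cat, keywords in CATEGORY_KEYWORDS.items():
            if label.lower() in keywords:
                scores[cat] += 1
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"


def annotate_request(image_b64, api_key):
    """Build a Google Cloud Vision REST `images:annotate` request asking
    for label detection; sending it requires a valid API key."""
    body = json.dumps({
        "requests": [{
            "image": {"content": image_b64},  # base64-encoded image bytes
            "features": [{"type": "LABEL_DETECTION", "maxResults": 10}],
        }]
    }).encode()
    return urllib.request.Request(
        f"https://vision.googleapis.com/v1/images:annotate?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
```

In the robot, the labels returned by the cloud service would be fed into `classify_trash`; for example, labels such as "Bottle" and "Plastic" resolve to the plastic category. A Sighthound client would follow the same pattern against its own REST endpoint.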



ISSN : 2590-3551, eISSN : 2600-8122     
