Published: 31 Oct 2023
Volume: IJtech Vol 14, No 6 (2023)
DOI: https://doi.org/10.14716/ijtech.v14i6.6767
Muhamad Asvial | Department of Electrical Engineering, Faculty of Engineering, Universitas Indonesia, Kampus UI, Depok 16424, Indonesia
Teuku Yuri M. Zagloel | Department of Industrial Engineering, Faculty of Engineering, Universitas Indonesia, Kampus UI, Depok 16424, Indonesia
Ismi Rosyiana Fitri | Department of Electrical Engineering, Faculty of Engineering, Universitas Indonesia, Kampus UI, Depok 16424, Indonesia
Eny Kusrini | Department of Chemical Engineering, Faculty of Engineering, Universitas Indonesia, Kampus UI, Depok 16424, Indonesia
Yudan Whulanza | Department of Mechanical Engineering, Faculty of Engineering, Universitas Indonesia, Kampus UI, Depok 16424, Indonesia
Recent technological advances have proven successful in facilitating various strenuous activities and improving performance in daily life. The public has also been captivated by the presence of Artificial Intelligence (AI), a branch of computer science concerned with building systems capable of solving problems. With intelligence said to rival human cognitive abilities, AI technology is indeed able to assist with a wide range of human tasks, from simple to complex.
The first work now recognized as AI was carried out by Warren McCulloch and Walter Pitts in 1943, when they proposed a model of artificial neurons. From that point on, research in machine learning flourished. Alan Turing, an English mathematician, later proposed a test to assess a machine's ability to exhibit intelligent behavior equivalent to that of a human. The term artificial intelligence was first adopted by the American computer scientist John McCarthy at the Dartmouth Conference. The emergence of early programming languages such as Fortran, LISP, and COBOL marked the enthusiasm for AI at that time.
The development of AI stalled at several points along the way, during periods known as AI winters (1974-1980 and 1987-1993), when computer scientists faced a severe shortage of funding from governments and companies. Interest in AI nevertheless never disappeared, and in 1997 IBM's Deep Blue became the first computer to defeat a reigning world chess champion. Since around 2006, companies such as Facebook, Twitter, and Netflix have also begun applying deep learning and big data in their services.
The applications of AI are vast, spanning industrial automation, healthcare, transportation, finance, entertainment, and more. AI continues to develop alongside advances in technology and research, with the ultimate goal of creating systems whose intelligence and capabilities increasingly approach those of humans. At the same time, artificial intelligence faces numerous debates regarding its potential impact on individuals. Although it carries risks, it also offers a remarkable opportunity: the global artificial intelligence market is estimated to reach 267 billion US dollars by 2027.