Artificial Intelligence in Education- A Literature Review


 Written by David M. Schmittou, Ed.D.

Abstract: 

The History and Evolution of AI: What AI Is and Is Not

Intelligence, derived from the Latin intelligere, meaning to comprehend, is, like all man-made words, an abstraction: a tool used to make meaning of the world in which we exist. Language, like mathematics (Leung, 2007), is a human-created medium that provides greater comprehension of the world around us and allows for the transfer of knowledge, and of intelligence, from one individual to another. As such, all present-day learning and understanding is derived from prior learning. Through language and mathematics we share our common understandings and grow our own perceptions out of the understanding of others.

The beginnings of human textual language, drawings in cave dwellings, allowed for communal discourse. The divergence of language across continents and regions created dialects and associated cohorts. As these evolved into the written languages used across the globe today, language, and more specifically the transfer of intelligence, has continued to evolve. Within common languages such as English are a variety of patterns, accents, and regional phrases. These unique subtexts allow for the establishment of identity and the perpetuation of common understandings and beliefs.

Woodblock printing dates back at least to the ninth century in China; Gutenberg’s printing press of 1436 (history.com), with its moveable metal type and large-scale printing, later provided opportunities for people to exchange ideas far beyond their own communities and neighborhoods. Martin Luther advanced the cause further in the 1500s with the belief that the common man should be able to access the knowledge and understandings of others: knowledge should no longer be for the elite, but for the masses. For more than five hundred years, the quest to bring knowledge and understanding to all and to eliminate barriers to information has continued to evolve. From the creation of the public library, the corporate bookstore, the innovation of Amazon, digital media, and online open sourcing, to podcasts, humans continue to create opportunities to bring the ideas of the individual into the knowledge base of the collective.

Using tools of man-made origin, the thoughts of the few have become the thoughts of the many. The term artificial, defined by the Oxford dictionary as “made or produced by human beings rather than occurring naturally,” could describe any number of historical tools used to perpetuate knowledge, from books to the airwaves. Taken in this context, the term artificial intelligence, or AI, is not a new development in the course of human history. Instead, it has proven to be critical to the growth of civilization. How, though, has AI evolved in recent years, and why, as in the days of Martin Luther, does the emergence of new technology continue to cause fear among those who used to be the gatekeepers of information and knowledge?

As Sir Francis Bacon stated in Meditationes Sacrae in 1597, “knowledge is power,” and as such, the diffusion of knowledge may also create a perception of the diffusion of power. If this is true, how might we continue to work toward a system that removes barriers while also providing for systematic and inclusive leadership, empowerment, and cohesion?

What is AI today?

“The concept of artificial intelligence (AI) was first proposed at the Dartmouth Conference in 1956” (Qu et al., 2022). Today, AI has taken on a narrower definition, with far more expansive capabilities than framings that would also include the invention of papyrus or metal-plate printing. To many, AI is now “a driver integral to the fourth industrial revolution” (Horakova, 2017). It has evolved beyond the transfer of man-made words and language and is now capable of interacting with humans at higher levels. “AI should not seek to merely solve problems, but should rather seek to solve the problem of how to solve problems” (Fogel, 2022). In 2012, Kumar and Thakur reported on the emergence of artificial neural networks attempting to replicate the cognitive complexity of the human brain. Although that work is more than a decade old, these technologies helped usher in new applications focused on efficiency. Today, such applications are paving the way for networks capable of generating innovative and novel output, not simply transferring existing thought. So what exactly are they capable of doing?

Artificial intelligence and its many elements continue to be debated. While some see cause for concern in its economic and social implications, one of the most powerful elements of artificial intelligence is a concept known as swarm intelligence. Swarm intelligence (Sa’idi et al., 2012), in its simplest sense, is the ability to decentralize, or eliminate the need for, a self-governed system. It is collective behavior, predicted through algorithms and mathematical reasoning, that allows for mutual learning and sharing. Studied extensively over the last decade, the approach was explored by computer engineers and scholars in the oft-cited article “Application of bee colony algorithm for optimization of CCR reforming process” (Sa’idi et al.). What this equates to is machine learning derived from machine learning. Due in large part to the connectivity of devices, seemingly greater connectivity than humans themselves can generate, platforms, programs, and applications are able to learn from the experiences of other systems connected to them. Machines no longer require the governance of a keystroke or mouse click to initiate a task; drawing on the collective experience of other devices, they can build predictive algorithms that allow actions with similarities to human thought.
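To make the idea of decentralized, mutually shared learning concrete, the short sketch below implements a textbook particle swarm optimization loop, a close cousin of the bee colony algorithm cited above. It is a minimal illustration only; the objective function and every parameter are invented for the example and are not drawn from the Sa’idi et al. study.

```python
import random

# Minimal particle swarm optimization (PSO) sketch: a "swarm" of candidate
# solutions shares its best discoveries so that each member learns from the
# collective, with no central controller. Objective and parameters are
# illustrative only.
def objective(x):
    return (x - 3.0) ** 2  # toy problem: find the x that minimizes (x - 3)^2

def pso(num_particles=20, iterations=100, w=0.7, c1=1.5, c2=1.5):
    positions = [random.uniform(-10, 10) for _ in range(num_particles)]
    velocities = [0.0] * num_particles
    personal_best = positions[:]                  # each particle's own best so far
    global_best = min(positions, key=objective)   # the swarm's shared best so far

    for _ in range(iterations):
        for i in range(num_particles):
            r1, r2 = random.random(), random.random()
            # Each particle blends its own experience with the swarm's experience.
            velocities[i] = (w * velocities[i]
                             + c1 * r1 * (personal_best[i] - positions[i])
                             + c2 * r2 * (global_best - positions[i]))
            positions[i] += velocities[i]
            if objective(positions[i]) < objective(personal_best[i]):
                personal_best[i] = positions[i]
        global_best = min(personal_best, key=objective)
    return global_best

print(round(pso(), 2))  # converges near 3.0 without any central director
```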

Recently, emerging technologies such as the much-debated Chat Generative Pre-trained Transformer (ChatGPT), created by OpenAI, have demonstrated on a large scale some of the capability to generate content, moving beyond the historical use of merely sharing content. As reported in a January 6, 2023 article by the Associated Press,

“It’s (Chat GPT) part of a new generation of AI systems that can converse, generate readable text on demand and even produce novel images and video based on what they’ve learned from a vast database of digital books, online writings and other media. But unlike previous iterations of so-called “large language models,” such as OpenAI’s GPT-3, launched in 2020, the ChatGPT tool is available for free to anyone with an internet connection and designed to be more user-friendly. It works like a written dialogue between the AI system and the person asking it questions. Millions have played with it over the past month, using it to write silly poems or songs, to try to trick it into making mistakes, or for more practical purposes such as helping compose an email. All of those queries are helping it get smarter.”

As described here, the more we use it, even when we try to trick it into making mistakes, the more we help this technology learn, simply because of its connectivity to the greater community, much as humans learned once confronted with civilization, community, and the need for collective learning. But the similarities do not end there.

The human brain has been described as the world’s greatest supercomputer. Its fundamental unit is the neuron, which the National Institutes of Health describes as “an information messenger that uses electrical impulses to transmit information.” The brain is able to grow and develop as new neural connections form in response to new stimuli and external experiences. In essence, when confronted with novelty, the brain of one human develops because of what it has learned from others. In 2005, Quiroga and colleagues described how single neurons in the brain encode the visual images we receive, a step on the path toward deductive thought. This same mechanism has been approximated in artificial (non-human) programming, setting the stage for early machine learning and the evolution of what we now call artificial intelligence. What we call artificial is, in part, a replication of what the electrochemical pathways inside our own brains do. Artificial, then, does not mean “fake,” but rather a copy of what is natural and human.
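As a rough illustration of the parallel being drawn here, the sketch below implements a single textbook artificial neuron: weighted inputs are summed and squashed into a firing strength, loosely echoing how a biological neuron fires once incoming impulses cross a threshold. The inputs, weights, and bias are invented for the example.

```python
import math

# A single artificial "neuron": weighted inputs are summed and passed through
# an activation function, loosely mirroring how a biological neuron fires
# once incoming signals cross a threshold. Values are illustrative only.
def artificial_neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias  # aggregate "impulses"
    return 1.0 / (1.0 + math.exp(-total))                       # sigmoid "firing" strength

# Example: three input signals with hand-picked weights.
signal = [0.5, 0.9, 0.1]
weights = [0.8, -0.4, 0.3]
print(artificial_neuron(signal, weights, bias=0.1))  # prints ~0.54; closer to 1 means a stronger "fire"
```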

The replication of this “human process” continues from there. Language, in its human form, is the transfer of images into text, symbols, and transferable structures. One may not be able to draw a picture of the sunset descending upon the beach, but using language one can generate words, crafted from symbols, that convey the image to others, whether orally or in text. Technologies such as ChatGPT have embraced the same idea. Building on those foundations, the platform takes the imagery conveyed through text and translates it into new text. This creation of text, and its seemingly novel approach to creating it, is a large part of what has triggered fear and apprehension in so many.

Much as the emergence of the World Wide Web and search engines such as Ask, Yahoo, and Google made the “human” capability of maintaining and transferring known information available to the masses via machines, the machine creation of text and information is now causing many to recognize that what was once believed to describe only the human experience has grown to include even that which is not alive. But with any obstacle comes opportunity.

Business Cases for Use of AI in Other Industries

Efficiency

According to a 2018 PwC study, 40% of human resource functions at small and large companies around the world were already using AI-augmented applications. One can only assume that percentage is even greater today. As sports fans can attest, the use of analytics has forever changed the games they love: managers and coaches increasingly rely on statistical probabilities to decide when to bring in a new pitcher, when to kick a field goal, and when to ask for instant replay. Similarly, businesses are increasingly depending on analytics, presented through AI applications, to help management identify optimal candidates to hire, assess occupational risks, and analyze employee performance (Agrawal et al., 2018).
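As a purely hypothetical sketch of what such analytics-driven screening can look like, the example below fits a simple logistic regression to invented historical hiring records and scores a new applicant. The feature names, data, and model choice are illustrative assumptions, not a description of any system cited here.

```python
# Hypothetical sketch of an "AI-augmented" hiring screen: a model fit to past
# hiring decisions scores new applicants. Features and data are invented.
from sklearn.linear_model import LogisticRegression

# Historical records: [years_experience, interview_score]; 1 = hired, 0 = not hired
X_history = [[1, 55], [2, 60], [4, 70], [5, 65], [7, 80], [9, 85]]
y_history = [0, 0, 1, 0, 1, 1]

model = LogisticRegression(max_iter=1000)
model.fit(X_history, y_history)

# Score a new applicant: probability of "hire" given the historical pattern.
new_applicant = [[6, 75]]
print(model.predict_proba(new_applicant)[0][1])
# Note: the model can only echo whatever patterns (or biases) sit in the history.
```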

While some believe the use of AI will help businesses create more equitable environments free from qualitative bias, Cathy O’Neil, author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, reminds us that it is qualitative observation that brings about the humanness of empathy and compassion. An algorithm may tell an application what the odds of success are, but a human may be able to look into a candidate’s eyes and glimpse their soul.

Since 1927, when the League of Nations described the need for collaboration and efficiency in the workplace (Taylor, 2017), employers have sought technologies to increase efficiency: from the time clock to the surveillance camera, from the typewriter to the Dell laptop. The understanding that time is money has driven businesses of every type to seek solutions that optimize time on task and maximize profit by focusing human energies on their required tasks. Throughout the past century, those same quests for efficiency have been challenged by the very employees they target. Whether because they feel micromanaged, over-supervised, or as though their capabilities are being downplayed, some workers have always seen automation as a threat to the human experience, even as it tries to maximize returns on investment and allows the ultimate man-made tool, money, to be gathered at ever-increasing rates.

Automation Outside of HR

Time is money. This oft-used cliché reminds us that the ability to maximize efficiency can maximize bank accounts as well. Since ancient times, humans have tracked the rotation and revolution of the Earth through space. Tracking the orbit of our planet around the sun gave rise to the twelve-month calendar used in the United States, which helps us identify key dates, holidays, days of rest, days of work, and days of celebration. Tracking the spin of the Earth and its position relative to the sun and moon allows us to track the tides, determine sunrise and sunset, establish start and end times, and craft smaller units of measure, so that we can claim that Usain Bolt of Jamaica is the world’s fastest man, having run 100 meters in 9.58 seconds. The notion of time, a man-made creation that helps us make sense of a natural phenomenon, our giant block of stone spinning through space, has improved not only our physical capabilities but our workplace efficiency as well.

From water wheels to sundials to pendulums, increasing the collective accuracy of our timekeeping has been pursued to increase our collective collaboration and communication. Today, thanks in large part to digital technologies, our clocks do more than passively mark our position relative to the sun: they sound and vibrate to wake us in the morning, chime to remind us of upcoming events, and send notifications reminding us that those who mean the most to us are celebrating another trip around the sun. Our clocks, phones, and watches have become necessities in our daily grind, helping us arrive on time, prepared for action, and ready to offer a hearty “happy birthday” when necessary. These artificial tools let our human brains focus on other essential tasks and information, bringing us to a place where we can devote our attention to areas that require our creativity. Because we can now capture elements of time, we are better able to understand our own human efficiencies. We know that the fastest runners, at nearly all distances, run a cadence of approximately 180 steps per minute. We can measure our heart rates during aerobic exercise, predict our caloric needs during prolonged endurance events, and compare our fitness against that of our peers and competitors, as well as against our own future performance goals.

The artificial intelligence of our devices, reminding us to devote our time and attention to people and products, has elevated our efficiency over the last century and may continue to do so in the future. A 2018 report by the Pew Research Center quoted the following from an anonymous source:

“AI is a prerequisite to achieving a post-scarcity world, in which people can devote their lives to intellectual pursuits and leisure rather than to labor. The first step will be to reduce the amount of labor required for the production of human necessities. Reducing tedium will require changes to the social fabric and economic relationships between people as the demand for labor shrinks below the supply, but if these challenges can be met then everyone will be better off.”

In that same study, researcher Tom Hood asks us to “Imagine a personal bot powered by artificial intelligence working by your side (in your laptop or smartphone) making recommendations on key topics by providing up-to-the-minute research or key pattern recognition and analysis of your organization’s data?” (Pew Research Center, 2018).

We do not all have to envision a world of robot secretaries guiding our movements to appreciate the many efficiencies brought forth by algorithm-assisted technologies. The subway and airline schedules posted at your nearest terminal are crafted and monitored with the help of AI. Predicting arrival and departure times lets travelers minimize the time spent in subway tunnels or airport security lines, increasing both time on task and transportation safety. In the last decade, ride-sharing companies such as Uber and Lyft, along with delivery services such as DoorDash and Postmates (List, 2022), have benefited greatly from predictive technologies such as Google Maps and Waze, which use real-time data not only to pinpoint your current location but to provide up-to-the-minute updates on the most efficient routes through city streets and highways.
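Underneath such routing services sits a shortest-path computation of the kind sketched below: a textbook Dijkstra search over a small, invented road network whose edge weights stand in for travel times. Real navigation apps layer live traffic data and far larger maps on top of this basic idea; the graph here is purely illustrative.

```python
import heapq

# Textbook Dijkstra's algorithm: find the quickest route through a small,
# invented road network where edge weights are travel times in minutes.
def shortest_time(graph, start, goal):
    queue = [(0, start)]                 # (elapsed minutes, location)
    best = {start: 0}
    while queue:
        elapsed, node = heapq.heappop(queue)
        if node == goal:
            return elapsed
        if elapsed > best.get(node, float("inf")):
            continue                     # stale queue entry
        for neighbor, minutes in graph[node].items():
            candidate = elapsed + minutes
            if candidate < best.get(neighbor, float("inf")):
                best[neighbor] = candidate
                heapq.heappush(queue, (candidate, neighbor))
    return None

city = {
    "home":     {"highway": 10, "downtown": 7},
    "highway":  {"airport": 15},
    "downtown": {"airport": 20, "highway": 4},
    "airport":  {},
}
print(shortest_time(city, "home", "airport"))  # 25 minutes, via the highway
```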

Entrainment, the process whereby organizations create and benefit from synchronous time structures, has been linked to organizational productivity, as described by Sandra et al. (2021). The more we create synchronous systems of time and couple them with real-time data collection, the stronger the case for increased efficiency and for financial and time-bound success becomes.

It seems the next decade will bring technological advances that allow humans to lean not only on artificial intelligence but on augmented intelligence as well. The ability to delegate tasks to machines so that our lives can focus on creative work is intriguing, but it may also call for an overhaul of the educational model used in the majority of American public schools, where the focus for the last one hundred years has been on producing a workforce ready for careers that may soon be replaced by technology. Consider Flint, MI, a town once central to the assembly lines of General Motors and the industrialization of America, home to workers who built the nation’s transportation fleet before becoming a relic of days gone by as humans were replaced by automation in the name of efficiency and economic opportunity. Schools must learn from such lessons, looking beyond current realities to forecast what is to come and what the needs of the future will be. Like predictive text on a cell phone, educators can begin typing out their current realities and use the algorithm of progress described here to anticipate what comes next. We may think we are typing our own sentences, but the future, like artificial intelligence, is often crafted from patterns of the past. What we repeatedly type into our phones today becomes what is predicted for us tomorrow. The technology of today is simply a prediction of what is to come. Schools, as the community centers and lifeblood of economic development in America, now bear the burden of gaining greater clarity about what is, to help guide what will be.
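The predictive-text analogy can be made literal with a toy bigram model: the word typed most often after a given word becomes tomorrow’s suggestion. The training text below is invented purely for illustration.

```python
from collections import Counter, defaultdict

# Toy bigram "predictive text": what we repeatedly type today becomes what is
# predicted for us tomorrow. Training text is invented for illustration.
history = (
    "see you at school today "
    "see you at practice today "
    "see you at school tomorrow"
).split()

following = defaultdict(Counter)
for current_word, next_word in zip(history, history[1:]):
    following[current_word][next_word] += 1   # count which word follows which

def suggest(word):
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(suggest("at"))   # 'school' -- the pattern typed most often in the past
print(suggest("you"))  # 'at'
```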

The Role of AI in Teaching & Learning and Where it Can/ Cannot Be Used

The use of AI has continued to expand in recent years. Accelerated in part by the global pandemic triggered by the emergence of COVID-19, more than 55% of global businesses reported an acceleration of their AI strategy in 2020 (McKendrick, 2021). “Startups are targeting established industries by employing the latest data-driven technologies to enter new markets with new solutions” (McKendrick, 2021). From assessment and grading to communication and lesson planning, one must now ask: does AI have a place in academia? As an industry established to support all other industries, academic institutions and educational systems must begin to reflect on their place in the emerging technology landscape. Will educators become part of the transformative market, or remain apart from it for as long as they can, which by many accounts is not much longer?

A 2023 article by Thomas Davenport and Nitin Mittal highlights the belief of some that AI is not just a new tool to sprinkle onto a few services, but something that could lead to an overhaul of every service:

“To get substantial value from AI, your organization must fundamentally rethink the way that humans and machines interact in work environments. You should focus on applications that will change how employees perform and how customers interact with your company. You should consider systematically deploying AI across every key function and operation to support new processes and data-driven decision-making. Likewise, AI should drive new product and service offerings and business models. In other words, the technology should eventually transform every aspect of your business.”

AI is not a single tool created to fill a void in one designated business sector. It has the potential to serve as an enhancement, with accompanying and apparent risks, to every sector and component of modern-day civilization. What would this look like in a school setting, a place historically reluctant to embrace change even when confronted with advances and opportunities? As AI advances at a rapid rate, schools and institutes of organized learning continue to be slow to evolve (Pham, 2022). Using OpenAI’s platform ChatGPT, the AI-generated text below illuminates not only the possibilities of AI but also the dangers that current educational practices face in light of this novelty.

Perhaps there is no better place to look for these answers than AI itself. After accessing the free site openai.com/chat and entering the prompt “write a paper on the benefits of AI in education,” the following text was crafted:

“Artificial Intelligence (AI) has been growing rapidly in recent years and has found applications in a wide range of fields. One of the fields that have benefited greatly from AI is education. AI has enabled the development of various educational tools that have revolutionized the learning process. This paper will discuss the benefits of AI in education.

Personalized Learning

One of the most significant benefits of AI in education is personalized learning. AI tools can analyze a student’s learning patterns and provide personalized learning experiences that cater to their unique learning style. This individualized approach to learning helps students learn at their own pace and level, which improves their understanding of the subject matter. Personalized learning is particularly beneficial for students with learning disabilities, as it enables them to learn in an environment that caters to their specific needs.

Automated Grading

Another benefit of AI in education is automated grading. With the help of AI, teachers can automate the grading process for multiple-choice questions, essays, and other assignments. This not only saves teachers a lot of time but also ensures objective and fair grading. Students also benefit from automated grading, as they receive immediate feedback on their work, which helps them identify areas they need to improve.

Adaptive Testing

AI can also be used to create adaptive testing, where the difficulty level of a test is adjusted to match the student’s ability level. Adaptive testing ensures that students are not overwhelmed by questions that are too difficult for them or bored by questions that are too easy. This approach to testing ensures that students are evaluated based on their true knowledge and skills.

Virtual Assistants

AI-powered virtual assistants can be used to provide students with immediate help when they need it. Virtual assistants can answer questions, provide feedback on assignments, and even offer personalized learning recommendations. This kind of assistance helps students stay engaged and motivated, which can improve their academic performance.

Enhanced Accessibility

AI can also enhance accessibility in education. For example, AI-powered speech recognition technology can be used to help students with hearing impairments. Similarly, AI-powered text-to-speech technology can help students with visual impairments. These technologies make education more inclusive and ensure that students with disabilities have equal opportunities to learn.

Conclusion

AI has transformed the field of education, making it more personalized, efficient, and accessible. The benefits of AI in education include personalized learning, automated grading, adaptive testing, virtual assistants, and enhanced accessibility. As AI technology continues to evolve, it is expected to play an even more significant role in education in the future.”

Similarly, to test the application’s ability to present a counterargument, the following prompt was entered: “Write a paper on the cautions of using AI in education.” The following is the text ChatGPT generated:

“Introduction

Artificial Intelligence (AI) has become an increasingly popular tool in education. AI offers many potential benefits, including personalized learning, automated grading, and adaptive testing. However, there are also several cautions to be aware of when it comes to the use of AI in education. This paper will discuss some of the potential drawbacks and cautions associated with AI in education.

Privacy and Data Security

One of the most significant cautions with AI in education is the issue of privacy and data security. As AI technology collects and analyzes data on student performance, it raises concerns about how that data is being used and who has access to it. There is a risk that data collected by AI systems could be used for commercial or other purposes without the consent of students and their families. It is crucial that schools and educational institutions are transparent about how they collect and use data, and take steps to ensure that student privacy is protected.

Bias and Discrimination

Another caution with AI in education is the potential for bias and discrimination. AI systems are only as unbiased as the data they are trained on. If the data used to train AI models is biased or discriminatory, then the output will be as well. There is a risk that AI systems could perpetuate existing inequalities and discriminatory practices in education. It is essential that AI systems are developed and trained with a diverse range of data to minimize the risk of bias and discrimination.

Overreliance on Technology

There is also a caution about overreliance on technology in education. While AI can be an incredibly useful tool, it should not replace human teachers and the essential skills they bring, such as creativity, empathy, and critical thinking. Overreliance on AI systems may lead to a reduction in these essential skills and may result in a less well-rounded education for students. It is essential to strike a balance between the use of technology and the involvement of human teachers in the learning process.

Lack of Human Interaction

Another potential caution with AI in education is the lack of human interaction. While AI systems can offer personalized learning experiences, they lack the human touch that can be so important in education. Human teachers can offer support, guidance, and mentorship that AI systems cannot replace. It is essential to maintain a balance between the use of AI systems and the involvement of human teachers to ensure that students receive a well-rounded education.

Conclusion

AI has the potential to transform education and provide many benefits to students, including personalized learning, automated grading, and adaptive testing. However, there are also several cautions to be aware of, including privacy and data security, bias and discrimination, overreliance on technology, and lack of human interaction. It is crucial that educators and policymakers are aware of these cautions and take steps to ensure that AI systems are used in a way that benefits students while minimizing any potential negative consequences.”
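The two passages above were generated through ChatGPT’s free web interface. For readers who prefer to experiment programmatically, the sketch below shows how the same prompts could be issued through OpenAI’s API, assuming the v0.x `openai` Python package that was current at the time of writing and a valid API key; interface details may differ in later versions of the library. This is an illustrative assumption, not a step taken in the exercise described above.

```python
# Minimal sketch of issuing the same two prompts through OpenAI's API rather
# than the web interface. Requires an API key; the v0.x `openai` interface
# shown here may differ in later versions of the library.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompts = [
    "write a paper on the benefits of AI in education",
    "Write a paper on the cautions of using AI in education.",
]

for prompt in prompts:
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    # Print the opening of each generated essay.
    print(response["choices"][0]["message"]["content"][:200], "...\n")
```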

The ability of AI both to present its own selling points and to generate novel text cautioning against its own use is fascinating. It recalls the introduction of openly editable resources such as Wikipedia, where “anyone and everyone can write whatever they want,” which forced educators to move beyond the low-level standard of “cite textual evidence” and instead guide students in analyzing texts and evaluating the relevance of sources.

The emergence of information assistants such as Siri and Amazon Alexa ushered in another wave of concern. Students no longer need to learn the art of interpreting a book’s table of contents or index, let alone a library’s card catalog and the nostalgia of the Dewey Decimal System; the world and its collective knowledge is now at the user’s fingertips, or more accurately, on the tip of their tongue. Today, educators are no longer the primary purveyors of information but are being asked to rethink their purpose in order to better help students learn the complexities of analysis, synthesis, reflection, and refinement.

During the last century, students were asked to focus on the 3 R’s, Reading, wRiting, and aRithmetic, to help share and disseminate the known knowledge of humanity; today, students are tasked with learning the 4 C’s of Communication, Collaboration, Critical Thinking, and Creativity to move from isolated learning to interconnected knowledge. It is as though in the last century we taught the machines how to think, and now we are learning from them. Whether you are a foreign language teacher confronted with students downloading Duolingo or a geography teacher competing with students learning map skills by entering the Metaverse, it is now incumbent on educators to learn the 4 C’s, the same skills we are asking our students to learn, to leverage the tools at our students’ disposal and further enhance the truly human experience of civilization.

The University of Nebraska (2023) considers the 4C’s “vital to success in school and beyond,” describing each as follows:

“Critical thinking is a focused, careful analysis of something to better understand it…”

“Creative thinking is expansive, open-ended invention and discovery of possibilities…”

“(Communicating) is listening…speaking…reading…turn taking…writing…and using technology to engage in ideas”

And “(Collaborating) is allocating resources…brainstorming…decision-making…delegating…evaluating…managing…resolving conflict…and team building…”

David Ross (2018) lays a foundation for the future of schooling around the 4C’s, illustrating the need for educators to prepare students for more than human-to-human interactions and to “prepare students for a future in which the 4Cs include our ability to engage with artificial intelligence in order to be effective learners, workers, and citizens.” Much as schools once emerged to teach students the man-made mechanisms of reading, writing, and mathematics, schools must now adapt to equip students with the capacity to work with newer tools of learning and insight, tools that have the potential to impact our world and our minds.

Neurological Impacts

The last twenty years have afforded scientists more knowledge of the functioning of the human brain than the rest of recorded history combined. Among those leading the study of neurodevelopment in childhood and adolescence is Naama Barnea-Goraly. In an article published in 2005, Dr. Barnea-Goraly and several colleagues outlined the patterns by which most healthy individuals can expect brain maturation in the prefrontal cortex and other areas responsible for attention, motor skills, cognitive ability, and memory. The development of the brain is more than a natural phenomenon; it is enhanced by exposure to media and stimuli. In 1991, Ceci argued that, due to formalized schooling and required mathematics education, average IQ scores, the most generalized measure of cognitive ability, have continued to rise for the past one hundred years. Further studies have examined whether this pattern holds despite the rise of technological innovation. Lorenzo Cecutti, an author of a much-cited paper, believes that despite popular opinion, technology, including AI, may not harm human intelligence but complement it. He and his co-authors describe five predominant ways technology may elevate cognition: easing complexity, reliance on skill development, freed mental capacity, increased flexibility, and greater self-control.

A 2016 World Economic Forum report titled The Future of Jobs found that the three most sought-after job skills were understanding complex problems, coordinating with others, and managing teams. In 2018, the same report added two more: critical thinking and creativity. Perhaps these findings offer hope that machines are not taking away human jobs but are elevating humans’ ability to meet the needs of the market by changing the way our brains function, organize information, and engage in critical thought.

Conclusion: Recommendations, Cautions, To Be Studied

Education continues to evolve as knowledge of the human brain grows in tandem with rapid advances in technology. Although pedagogy has evolved continuously throughout the history of schooling, the challenges and opportunities presented by artificial intelligence are novel for many.

Educators must be prepared to grapple with finding the balance between efficiency and effectiveness, as well as the harmony between equity and accessibility. Recognizing that the newest iteration of technological advancement is not an ending point, but merely another defining point, can challenge us all to continue to grow, learn, and debate the future we hope to create for ourselves and our students. 

The following questions have emerged as areas for further study:

  1. What skills might students need to navigate, interact with, and leverage AI with success? 
  2. What is the impact on students’ developing brains and social skills? How will students maintain creator skills?
  3. What do teachers need to navigate, interact with, integrate, and leverage AI with success and in service of learning and learners?
  4. How might instruction shift to include AI? 
  5. How can AI make teachers’ jobs more efficient and more effective? 
  6. How can AI help solve problems such as shortages of resources (human, capital, time)?

Bibliography

Agrawal, A., Gans, J., & Goldfarb, A. (2018). Prediction Machines: The Simple Economics of Artificial Intelligence. Boston, MA: Harvard Business Review Press.

Bacon, F., (1597). Meditationes sacrae. 1st ed. Londini.: Excusum impensis Humfredi Hooper.

Barnea-Goraly, N., et al. (2005). White matter development during childhood and adolescence: A cross-sectional diffusion tensor imaging study. Cerebral Cortex, 15(12), 1848-1854.

Ceci, S. (1991). How much does schooling influence general intelligence and its cognitive components? A reassessment of the evidence. Developmental Psychology, 27(5), 703 – 722.

Cecutti, L., Chemero, A., & Lee, S. W. S. (2021). Technology may change cognition without necessarily harming it. Nature Human Behaviour. https://doi.org/10.1038/s41562-021-01162-0

Davenport, T. and Mittal, N. (January-February, 2023). Stop Tinkering with AI. Harvard Business Review. 

Fogel, D. (May 21, 2022). Machine Learning and the City: Applications in Architecture and Urban Design. John Wiley and Sons Ltd.

Horakova, T., Houska, M., & Domeova, L. (2017). Classification of the educational texts styles with the methods of artificial intelligence. Journal of Baltic Science Education, 16(3), 324-336.

Kumar, K., & Thakur, G. S. M. (2012). Advanced applications of neural networks and artificial 

intelligence: A review. International journal of information technology and computer 

science, 4(6), 57.

Leung, C. (2007). Kay L. O’Halloran: Mathematical Discourse: Language, Symbolism, and Visual Images. Applied Linguistics, 28(4), 630-634. doi:10.1093/applin/amm040

List, J. (2022). The Voltage Effect: How to Make Good Ideas Great and Great Ideas Scale. New York, NY: Penguin Random House.

McKendrick, J. (September 27, 2021).  AI Adoption Skyrockets Over the Last 18 Months. Harvard Business Review.

National Institutes of Health. (n.d.). Brain basics: The life and death of a neuron. https://www.ninds.nih.gov/health-information/public-education/brain-basics/brain-basics-life-and-death-neuron. Retrieved February 14, 2023.

Oxford University Press. (n.d.). Artificial. In Oxford English Dictionary. Retrieved February 12, 2023.

O’Brien, M. (2023). What is ChatGPT and why are schools blocking it?. The Associated Press.

O’Neil, C. (2017). Weapons of math destruction. Penguin Books.

Pew Research Center.  (2018). Improvements ahead: How humans and AI might evolve together in the next decade. Retrieved from: https://www.pewresearch.org/internet/2018/12/10/improvements-ahead-how-humans-and-ai-might-evolve-together-in-the-next-decade/

Pham, S. (June 8, 2022). The development of artificial intelligence in education: A review in context. Journal of Computer Assisted Learning.

Printing press – history. (n.d.). Retrieved February 26, 2023, from https://www.history.com/topics/inventions/printing-press 

PwC (PricewaterhouseCoopers). (2018). “Artificial Intelligence in HR: A No-Brainer.” https://www.pwc.com/gx/en/issues/data-and-analytics/publications/artificial-intelligence-study.html. Retrieved February 14, 2023.

Quiroga, R. Q., Reddy, L., Kreiman, G., Koch, C., & Fried, I. (2005). Invariant visual representation by single neurons in the human brain. Nature, 435(7045), 1102-1107.

Ross, D. (2018). Why The Four Cs Will Become the Foundation of Human-AI Interface. Getting Smart. https://www.gettingsmart.com/2018/03/04/why-the-4cs-will-become-the-foundation-of-human-ai-interface/. Retrieved February 28, 2023.

Sa’idi, M., Mostoufi, N., & Sotudeh-Gharebagh, R. (2012). Application of bee colony algorithm for optimization of CCR reforming process. Computer Aided Chemical Engineering, 31, 620-624.

Sandra, D., Segers, J., & Giacalone, R. (2021). A review of entrainment in organizations. Academy of Management. Published online July 26, 2021.

Taylor, M. (2017). Good Work: The Taylor Review of Modern Working Practices. London: Department for Business, Energy and Industrial Strategy. https://www.gov.uk/government/publications/good-work-the-taylor-review-of-modern-working-practices

University of Nebraska.  (2023). Thoughtful Learning: What are the 4C’s of learning skills? https://newsroom.unl.edu/announce/csmce/5344/29195#:~:text=The%2021st%20century%20learning%20skills,success%20in%20school%20and%20beyond.&text=Critical%20thinking%20is%20focused%2C%20careful,something%20to%20better%20understand%20it. Retrieved February 28, 2023.

World Economic Forum. (January 2016). The Future of Jobs. https://www3.weforum.org/docs/WEF_Future_of_Jobs.pdf
