Adam Murga; 3rd Year Philosophy and Ethics student, University of Leeds

 

Over one billion people are currently using AI tools. How is this affecting our minds?

 

Research and surveys into the ease and accessibility of auto-generated AI responses to problems, whether emotional or academic, have produced grave statistics on fears around declining critical thinking skills.

 

A study carried out by Oxford University Press found that of 2,000 students aged 13-18, 62% believed AI negatively affects their cognitive development in learning and skills. The American Association of Colleges discovered that of the 1,057 faculty members surveyed, 95% believe students are becoming overly reliant on artificial intelligence, and 90% said AI would decrease students’ critical thinking abilities.

 

 

So what did a philosopher have to say about this? 

I spoke to Adam Murga, a 3rd year philosophy student at the University of Leeds who has openly spoken out on rejecting AI, to gain a better understanding of what young academics may be feeling about such vast changes while on courses that demand conceptual and subjective thinking.

 

Adam rejects AI on both an individual and a societal level. He believes ‘AI hinders critical thinking skills, especially those of young people enrolled in institutions of education’. ‘I study philosophy,’ he continued, ‘a practice which, despite some attempts, cannot escape the bounds of subjectivity’. Adam told us how he was shocked to learn that his course mates use large language models for assistance with coursework and readings. To Adam it seemed ‘entirely counterproductive’, because these systems regurgitate common readings found online, ‘leaving no room for the human conviction required to effectively build an argument’.

 

How may this affect society on a broader scale? 

 

AI dictates, using data sets and user interactions, what is and is not important when answering queries. Adam believes this leaves students’ comprehension skills significantly stunted, and their ability to reason and argue in accordance with their beliefs could soon be nil. He argues that if we live in a world where even those studying humanities at an institution of higher education are not able to engage in analysis or criticism without guidance, ‘it would likely follow that we are left with a majority of society who are susceptible to propaganda and brainwashing’.

 

South Korea on AI regulation 

January of this year saw South Korea’s introduction of the ‘AI Basic Act’, which requires labelling of AI-generated content, regulations on AI tools for study, and risk assessments for high-impact AI, according to an article by The Guardian. However, public responses were mixed: 33.3% of respondents said that AI and platform regulations limit developers’ creativity and make business operations more difficult, and 45% of IT professionals felt ‘depressed’ by these excessive platform regulations. It raises the question of whether those in power are engaging with public opinion on either end of the spectrum, and whether universal measures to regulate such rapidly expanding intelligence will take place before fears become reality. The future is uncertain, but the statistics are not: ease is outweighed by fear, and order and reassurance are demanded of those in power.