Using AI to think together, not to think for us
As a university lecturer, I cannot deny the widespread use of AI, which has already become an essential tool in education today.
However, I am deeply disappointed to see that the majority of our students use AI to complete their assignments for them, not to sharpen their thinking skills. Instead, AI is treated as a robot that finishes a given task.
I observe that they are telling AI to think for them, rather than thinking with AI. I believe this trend will, over time, blunt the minds of our youth and erode the value of their human capital.
Living with AI is a reality we cannot reject. Projections suggest that by 2027, worldwide spending on AI in the education sector will reach USD 20 billion, growing at an annual rate of 45 percent. What I hope for is that we learn to use AI more responsibly.
First, AI can help us personalize learning. Based on our interactions with it, AI can gauge our level of understanding and suggest learning materials and self-assessments suited to our abilities. This is how we should use AI: as a fast way to help us grow.
In the past, we had to go to bookstores to find reference books and workbooks to test our understanding. Even then, some workbooks were too difficult, while others were too easy and did not challenge us. Based on our conversations with it, AI can gauge our skill level and alleviate this problem by tailoring exercises to our capabilities.
Second, AI can be our debate opponent. This aligns with what the philosopher Martha Nussbaum emphasized in her book "Not for Profit," where she highlights the importance of the Socratic method in learning, especially in higher education. Debate not only sharpens our argumentation skills, but also enhances our ability to think critically. Many reports rank critical thinking among the most important skills for every industry in the 21st century.
To sharpen our critical thinking skills, AI can be used as a debate opponent to test how well we understand an issue and how capable we are of defending our stance.
Third, AI can provide feedback, which complements the first two points. After we do exercises and debate with AI, it can summarize our progress with constructive feedback. Traditionally, we receive feedback only from teachers; however, the quality of that feedback depends on a person's mood, since human feedback changes with the situation and atmosphere. Feedback from AI is more objective, because AI does not have moods as humans do.
Nevertheless, we must be cautious when using AI, lest we become overconfident about our own intelligence. "The more that people use AI," writer Drew Turney said, "the more likely they are to overestimate their own abilities."
When ordinary people rely heavily on AI to complete specific tasks, it can inflate their confidence in their own abilities, a phenomenon known in psychology as the Dunning-Kruger effect.
A study published in Computers in Human Behavior (February 2026), titled "AI makes you smarter but none the wiser: The disconnect between performance and metacognition," reported that when humans place excessive trust in AI-generated work without monitoring or editing it, their overconfidence in their own abilities grows, and their basic skills as workers ultimately dull.
What do I mean by "basic skills"? For example, if writing tasks are handed over entirely to AI, humans risk one day losing the ability to compose well. So, do not delegate all work to AI.
Finally, AI can be an excellent tool to accelerate our learning. Sadly, however, irresponsible users today more often employ AI simply to complete tasks. If a university assignment is ultimately completed entirely with AI, then what is the point of going to university?
Therefore, we must use our sound minds to set clear boundaries on the use of AI, because AI will not "think wisely" for the benefit of our lives. We are the ones responsible for ourselves.
Sadiq Salihoddin is a lecturer from Malaysia teaching Malay language studies at Guangxi Minzu University, China.
The views don't necessarily reflect those of China Daily.
If you have specific expertise, or would like to share your thoughts about our stories, then send us your writings at opinion@chinadaily.com.cn and comment@chinadaily.com.cn.