Alex Kam

I'm Not Drinking the Kool-Aid (Yet)

Published on 4/3/2026

Let me get this out of the way first. Yes, LLMs have had a tremendous impact on software development, and the way we write code has changed significantly. I acknowledge that AI coding tools such as Claude Code, Codex, and others have saved countless hours and greatly lowered the barrier to entry for programming. This is not an article about why I think LLMs are terrible. Unfortunately, they have proven themselves to be great at what they do.

Instead, this is an article about why I am choosing to abstain from using these tools, given my personal situation and opinions. The opinions below are not those of any organisation I represent, but my own perspective as a developer. The facts and studies linked and presented, however, are yours to critically analyse.

I’m not drinking the Kool-Aid not because I think LLMs are unproductive, but because our abuse of the technology, driven by corporate greed, has deteriorated our critical thinking while simultaneously flooding the internet with piles upon piles of garbage software and meaningless noise. Not to mention the plethora of ethical implications, the dire security risks, and the potentially irrevocable environmental consequences.

I’m not drinking the Kool-Aid until I can confidently say that the AI models used in these LLMs are trained on ethically sourced data and the general public are equipped with the tools to navigate the several dangers that the technology poses to our daily lives.

LLMs Are Impairing Our Ability to Think Critically

This is not a revolutionary take. I feel like there are thousands of articles out there complaining about how the people around their authors have suddenly lost the ability to explain fundamental topics, or struggle to understand their own work (code or otherwise). However, this is merely anecdotal evidence, and any relationship needs to be analysed further.

So let’s analyse it further.

Critical Thinking and Cognitive Offloading

Critical thinking is defined as the ability to analyse, evaluate, and synthesise information to make reasoned decisions. It involves various cognitive processes, including problem-solving, decision-making, and reflective thinking. Critical thinking is what separates you from acting on every single primal instinct you have. It is the deeper layer of thought that feels difficult to activate as it forces you to engage with the subject matter on a more intense level. Critical thinking is a skill. And like any other skill, it needs to be regularly used to avoid degradation and it needs to be actively practiced to be improved.

Cognitive offloading refers to the delegation of mental tasks, our cognitive load, to external tools. Cognitive load is essentially the effort used in our working memory to retain information. Cognitive offloading is extremely common. When you save someone’s number in your contact list, you no longer have to remember it, freeing up mental space for other things. When you use navigation software to get to a destination, it saves you from having to remember the route and the general area.

Cognitive offloading is generally a good thing. Cognitive Load Theory posits that the human cognitive system has limited capacity and reducing cognitive load can enhance learning and performance.

Brilliant! We should pass all our cognitive load to these AI tools then!

AI Brain and Overreliance

A study investigating the relationship between critical thinking, cognitive offloading, and the use of AI tools found that increased trust in and usage of AI tools result in greater cognitive offloading, which in turn results in reduced engagement in critical thinking. Participants were surveyed on their AI tool usage, cognitive offloading (use of digital devices for memory and problem solving), and critical thinking. The study found that individuals who depend too heavily on AI to perform analytical tasks may become less proficient at engaging in deep, independent analysis. This reliance can lead to a superficial understanding of information and reduce the capacity for critical analysis.

Of course, this is one study out of thousands that I happened to cherry-pick, but you might enjoy reading Your Brain on ChatGPT, which found that essay writers using ChatGPT displayed weaker brain activity than those who didn’t. Surveys also show a split perception: students report that AI helps them develop skills related to schoolwork, while some express concerns about it impairing those skills.

Like anything in life, things are not as straightforward as they seem. AI does play a hand in skill acquisition, thanks to the aforementioned cognitive load theory. However, overreliance has resulted in what I like to call AI Brain. When you’ve let Dario take the wheel and entrusted all your critical thinking to LLMs, it is evident that your skills will deteriorate without deeper thought.

My Personal Takeaway

As I write this, I am at a critical, foundational point in my career. I’ve had some experience as a Junior / Mid Developer in one field, but I’m aspiring to delve deeper into Game Development. There are so many new frameworks, engines, languages, and tools that I want to learn each and every day. Having used AI extensively for most of 2025, I found that it helped me learn new languages and syntax and saved me hours of pain trawling through documentation and fixing bugs. However, I also found that while my output had increased, my understanding had not. Sure, I was making more things and moving faster than I would have without these tools, but I was not retaining the information I was producing!

It got so bad that on multiple occasions I had to look up the syntax for iterating over a dictionary in Python! This is a language I had learned over years and had been using daily at work. Not only was I embarrassed with myself, I was shocked at how far my memory had regressed.
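For the record (and for future me), the idioms in question are about as basic as Python gets. A quick sketch, using a made-up example dictionary:

```python
# A hypothetical example dictionary, just for illustration.
inventory = {"sword": 1, "potion": 3}

# Iterating a dict directly yields its keys.
for item in inventory:
    print(item)

# .items() yields (key, value) pairs.
for item, count in inventory.items():
    print(f"{item}: {count}")

# .values() yields the values only.
total = sum(inventory.values())
```

Forgetting this after years of daily use was exactly the wake-up call I needed.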

Let me give you a thought experiment. Say you are on a 13-hour flight with your laptop, with no access to wifi, local LLM models, or prior codebases, but with offline documentation for the language(s) of your choosing and the man pages. Could you make something from scratch and keep yourself continuously occupied?

When I presented myself with that thought experiment, I realised I was not the same developer I once was. I had become so reliant on these tools that I had lost the ability to code by myself. I’m a self-proclaimed developer, for crying out loud! I should be able to build the things that come into my mind by myself!

Until I’ve acquired the foundational knowledge to tell right from wrong and can use these tools to empower my learning instead of hindering it, I will not drink the Kool-Aid.

Oh, and by the way, the world is burning too.

The World is Burning

I’ve never considered myself to be an environmentalist. I still kind of don’t. However, I can’t help but feel miserable hearing about the construction of yet another mega-giga-terawatt data centre with a greater demand for electricity than the entire African continent. The full scale of these data centres’ impact on the environment has yet to be studied (most aren’t built or fully operational at the time of writing), but the forecasts and predictions do not look good.

The carbon emissions from the data centres alone will put the USA’s AI industry’s net-zero emissions targets out of reach. Not to mention the sheer amount of water needed, with studies estimating that AI demand alone would require 4.2-6.6 billion cubic metres of water annually by 2027, roughly half of the UK’s yearly usage.

But electricity grids are being decarbonised! Water usage is being made more efficient!

I hear you cry.

Well, let’s also talk about manufacturing the GPUs run at these data centres! A study investigated the environmental impact of Nvidia’s A100, one of the most widely deployed GPUs for AI training during 2023-24. It analysed the environmental implications from the resource extraction required to manufacture the GPUs, through the manufacturing itself and the electricity consumed during use, up until end of life. It found that the carbon footprint per card (based on proxy and secondary data) was approximately 150 kg of CO2, with various health and environmental implications such as particulate matter, ionising radiation, the depletion of fossil resources, the depletion of abiotic resources, and increases in eco-toxicity.

Corporate greed and the never-ending quest for productivity have generated demand for a technology we haven’t had the time to properly assess and prepare for. Yet another thing to add to the bottomless pit of contributors to climate change.

Can this get any worse?

Oh yes. Yes it can.

We Live in a Society

The societal impacts of AI are vast, and it is impossible to cover all of them without going into crazy amounts of depth. Instead, let’s use my favourite digestible medium: a lightning round of dialogue-like exposition!

Did you know you can jailbreak LLMs to extract copyright protected books verbatim?

Hey, are they allowed to be trained on copyrighted materials in the first place?

Surely these companies didn’t pirate millions of books to train their models, right?

Wow these companies are sloppy. Imagine if they just accidentally released all of their closed-source code!

But at least we’re using these tools correctly, right? They tell us what we wanna hear, even if that means self-harm or taking one’s own life.

So they can be biased, but at least they don’t hallucinate and misinform us, right?

Yes, the social implications are broad. And this is a tricky topic, because different people have different values and consider different things important. As a game developer, artistic integrity and creative property are things I value. I also have views on how modern technology has become enshittified, with constant barrages of ads and subscription models limiting our access to the Web.

However, I understand not everyone shares the same opinions or values as me. You decide on your own values. You decide what’s important to you. Then analyse news and facts from different angles and build your own unique perspective. I’m not telling you what to feel or do.

That’s a Wrap

Without a doubt, the general public’s perception and usage of LLMs need to be better educated and controlled. We are giving these technologies more personal information and access than anything else in human history. In an era starved of real human connection, where our attention span is the currency of the digital economy, we are losing our ability to think while the world burns around us.

At the same time, the technology is real. The productivity gains are there, and the way to achieve them, with education and critical analysis, is real too. Unfortunately, when I think about the kind of developer I want to be, the things I want to learn, and the values I want to respect, I’ve made my decision, for now.

I’m not drinking the Kool-Aid.
