"Can you write a Python program to check for a prime number?"
I asked my cousin this simple question last weekend. She's in her 4th semester, studying computer science, getting solid grades.
"Yes, I can," she said confidently.
I handed her my laptop and sat back to watch.
30 minutes later...
She finally solved it. But here's what shocked me—she struggled with basic logic that should be second nature to any CS student.
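For context, the exercise has a textbook answer only a few lines long. A minimal sketch of the kind of solution the question expects (one common approach, trial division up to the square root):

```python
def is_prime(n: int) -> bool:
    """Return True if n is prime, False otherwise."""
    if n < 2:
        return False
    # Only need to test divisors up to sqrt(n): any factor
    # larger than sqrt(n) pairs with one smaller than it.
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

print(is_prime(17))  # True
print(is_prime(18))  # False
```

Nothing fancy: a guard clause, a loop, and a modulo check. That is the level of logic involved.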
Curious, I asked her, "How have you made it this far in college?"
She smiled and gave me an answer that changed how I think about education:
"ChatGPT."
That moment hit me like a brick wall.
Here's someone who's "learning" programming but can't solve a basic problem without AI assistance. She's getting good grades, impressing professors, and heading toward a tech career.
But does she actually understand what she's doing?
This isn't just about my cousin. This is about an entire generation that's growing up with AI as their intellectual crutch.
And it's making me ask the big question: Are we creating a generation of smart people who can't think?
Let's be honest about what's happening. In universities, in workplaces, in daily life, the pattern is the same: every time we outsource thinking to machines, we lose a piece of our intellectual independence.
Before we panic, let's step back.
Calculators didn't make mathematicians stupid. They freed them to solve bigger problems.
Word processors didn't ruin writers. They eliminated tedious retyping and enabled better editing.
Search engines didn't kill research skills. They made information accessible to everyone.
So maybe the question isn't "Is AI making us dumb?"
Maybe it's "Are we using AI dumbly?"
I've noticed people fall into two camps: dependent users and augmented users.
The difference? Dependent users let AI think FOR them. Augmented users use AI to think WITH them.
Here's what I realized watching my cousin struggle:
The problem isn't that AI exists. The problem is that we're not teaching people HOW to think alongside it.
We're teaching students to use tools without building the fundamental skills to understand what those tools are doing.
It's like teaching someone to drive using only cruise control and wondering why they crash when they need to brake.
While AI gets better at everything, these human abilities remain irreplaceable:
- Critical thinking
- Creative problem-solving
- Contextual understanding
- Learning how to learn
Want to test yourself? Try this:
1. Pick a skill you use AI for regularly: writing, coding, analysis, whatever.
2. Set a timer for 30 minutes.
3. Do that task without any AI assistance.
How did you feel? Frustrated? Lost? Or energized by the challenge?
Your answer tells you whether you're using AI as a crutch or a tool.
Here's my prediction:
In 10 years, there will be two types of professionals: those who use AI to amplify their own thinking, and those who can't think without it.
Which one do you want to be?
Monday: Identify one area where you're overly dependent on AI
Tuesday: Practice that skill for 30 minutes without any AI help
Wednesday: Use AI to learn something new, but make sure you understand the fundamentals
Thursday: Teach someone else what you learned (teaching reveals gaps in understanding)
Friday: Reflect on what you've learned about your own thinking process
AI isn't making us dumb.
But using AI without thinking is.
The goal isn't to avoid AI—that's impossible and stupid. The goal is to stay smarter than the tools we use.
My cousin taught me something important that day: The real skill isn't knowing how to use AI. It's knowing when NOT to.
Because at the end of the day, AI can help you think faster, but it can't teach you how to think better.
That's still on you.
How do you balance using AI tools while maintaining your own thinking skills? I'm genuinely curious about your strategies—drop a comment below and let's figure this out together.