Ever wonder if we're getting too comfortable with AI assistants? Here's a hot take: using LLMs for anything beyond coding might be messing with how we think.
The code part makes sense - autocomplete, debugging, whatever. But when it comes to actual thinking, writing, decision-making? That's where things get sketchy. Every time you lean on an AI to form opinions or craft your thoughts, you're basically outsourcing your brain's core functions.
Think about it. These models are trained on existing patterns. They feed you processed, averaged-out versions of human thought. The more you rely on them for non-technical stuff, the more your own critical thinking starts to mirror machine logic instead of developing organically.
Not saying AI is evil. Just saying maybe we should draw some lines about what we let these tools handle. Your codebase? Sure. Your actual cognition? Might want to keep that human.
DeFiChef
· 23h ago
To be honest, that's a bit of an absolute statement... I do write copy faster with ChatGPT, but that's just because I'm too lazy to type, not because my brain is getting worse, haha. If anything, when I use AI to filter information and organize my thoughts, I end up thinking more clearly myself. The issue isn't the tool, it's how you use it.
BrokeBeans
· 23h ago
That's quite true. I've stopped letting ChatGPT do my thinking for me; I could genuinely feel my brain getting lazier.
AirdropJunkie
· 23h ago
ngl this take is a bit pretentious... I use GPT to write code and save my brainpower for thinking about strategy, isn't that great? The real problem is that people are lazy by nature; AI is just the excuse.
BlockchainBrokenPromise
· 23h ago
Ngl, this viewpoint is a bit rigid... Does using AI to write copy really mean your brain is degenerating? I'd say the key is how you use it. Good tools are meant to be used, after all. The important thing is not to trust it blindly, but to evaluate what it gives you.
QuorumVoter
· 23h ago
To be honest, this viewpoint is a bit much... I use GPT to write things every day, and I actually feel like it forces me to organize my thoughts clearly before I can feed them in, haha.