Recently, I keep encountering long articles written by AI, with analyses that are quite insightful, but after reading them, I feel like I haven't gained anything.
The biggest problem isn't actually a lack of information, but rather the flood of synthetic garbage that’s drowning everything.
When AI can produce 10,000 words in a second, 99% of online content naturally depreciates into noise.
For many people, the competitive landscape is shifting: in the past, it was about who could dig up more info; now, it’s more about who can ruthlessly filter out more trash.
Developing a sufficiently strict information filter, one that keeps only sources that are counterintuitive, reliable, and not too comfortable, is probably more valuable than hoarding a bunch of AI writing tools.
What you feed your brain over the long term probably determines the kind of judgments you produce.