Do I have to complete the quote? It would be additional evidence. [And the moderators rated that FP "Informative"? Really?]
Also methinks thou dost not followeth the link and readeth the story. The original is on the clickbait-ish side, but the Slashdot summary version is much more so. The original starts with videos, and I think the complicated answer is more along those lines. So here's a short summary of my latest thinking on the topic of human intelligence, such as it is.
A long time ago some general-purpose computing units evolved. They were built out of neurons, sometime before the mammals showed up, though I'm not sure where to draw the line. Some of them got incorporated into specialized systems for processing certain types of data such as smells and sounds and images. There was no real design process involved in those times, just lots of trials and lots of errors, but Ma Nature is kind of efficient, and the sooner the errors died, the sooner their amino acids were recycled back into food and fertilizer. (Going at that level because I'm currently reading "Amino-San No Himitsu", Volume 222 in the series of "good" manga secrets.)
But recently humans came along and subverted the hardware with strange new software. We couldn't do much with the smell stuff, but we added language programming on the sound system, wound up with stories, and then implemented concepts like self-awareness and consciousness on top of that "new OS". The complicated part was integrating some of the visual stuff into it, but I think the example of cats can help clarify things. A cat can hear and see a lot of stuff and can learn all sorts of complicated behaviors when it comes to catching birds and mice, but there is no language, and therefore no internal stories of the cat itself and no external stories to teach to other cats. Working on the level of the sound system, it's basically a linear thing, like the tape in Turing's machine. From an evolutionary perspective, strings of sounds made sense, but at the level of good and bad patterns: the steady sound of the wind in the grasses versus the changing sound of an approaching prey or predator.
Then we invented written language and things became really perverse as we subverted the visual system. The evolved hardware did stuff like recognizing lines and colors at the low level and more complicated constructs at higher levels. Originally we could recognize the foods and predators and take appropriate actions, but when we started recognizing words we compressed concepts to a whole new level--and the general-purpose neural hardware didn't care that it was now solving much more complicated problems at much higher levels: processing of complex words, not simple corners. Very flexible programming indeed.
Now things have gotten really perverse in bad ways. The kids are now saturating their video channels with garbage like cute cat videos instead of complicated books. And the AIs are evolving by design to handle many kinds of problems that really strained our human hardware. A "simple" example from the LLMs: humans basically evolved to handle two to four dimensions at a time, but the AI chips are designed to handle thousands or millions of "dimensions" at the same time. We humans have to shift our levels of abstraction and reference frames to see the relevant aspects of various problems, but the AIs just swallow the entire problem whole, without even needing "meaning" for most of the dimensions...
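To make the dimensions point concrete, here's a toy sketch (not from any real model -- the 4096 width and all the names are made up for illustration): two random 4096-dimensional vectors get compared across every coordinate at once, something no human can visualize, and no single coordinate carries any "meaning" on its own.

```python
import math
import random

# Hypothetical embedding width, chosen only for illustration;
# real LLMs use widths in this general range, but this number
# is not taken from any specific model.
DIMS = 4096

random.seed(0)  # deterministic for the example

def random_vector(n):
    """A toy stand-in for a learned embedding: n independent coordinates."""
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def cosine_similarity(a, b):
    """Compare two vectors across every dimension simultaneously."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

a = random_vector(DIMS)
b = random_vector(DIMS)
sim = cosine_similarity(a, b)

# In high dimensions, two random vectors are almost always nearly
# orthogonal, so the similarity lands very close to zero -- the machine
# happily works in a space where no individual dimension means anything.
print(f"cosine similarity across {DIMS} dimensions: {sim:.3f}")
```

The point of the sketch: the comparison is a single mechanical sweep over thousands of coordinates, with no shifting of abstraction levels or reference frames required.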