<aside>

This page explains the psychological and cognitive consequences of both AI avoidance and AI overuse — for educators, researchers, and leaders designing healthy digital habits. In short: abstinence and addiction both switch off the mind; COMINDING keeps it active, balanced, and ethically awake. It matters because offloading and avoidance each erode reasoning, maturity, and trust — the foundations of governance and growth. Use it when shaping AI literacy programmes, workplace policies, or education reforms that build mental resilience.

</aside>

Abstaining from AI or outsourcing your brain to it are two ways of losing the plot. Younger generations often split along that fault line. The better path is governed co-thinking, which is what COMINDING offers: FRAMING to set intent, CLARITY to interrogate outputs, SANITY to steady the emotions and ethics. Today's science supports concerns about over-reliance and confirms the reality of generational polarisation, and it tentatively suggests that overall mental functioning peaks in the late fifties before declining. What we do not yet have is proof that COMINDING will make people reach maturity earlier or delay decline. It is plausible. It is not proven. Keep your curiosity on and your claims honest.

1) The problem with purity and the problem with faith

There are two fashionable mistakes about AI. One treats AI like contaminated water: do not drink, do not touch, tell your mates to back away slowly. The other treats AI like a priest: consult, obey, cut out the messy middle where your mind earns its keep. Both instincts show up strongly among younger users, often in the same cohort. Surveys and fieldwork find Gen Z more likely than older workers to fold AI into daily routines and sometimes trust it over humans in work contexts. At the same time, large public surveys record persistent anxiety about accuracy, bias, privacy and the human cost, which feeds principled avoidance. In short, a split personality.

The risks of blind faith are not just moral. They are cognitive. Multiple studies now flag cognitive offloading as a mediator between frequent AI use and lower scores on critical thinking tasks. The relationship is correlational rather than causal, but the pattern is consistent: the more you delegate the heavy lifting, the less your reasoning circuits get exercised. Early longitudinal work from MIT adds neural measures to the picture and reports reduced engagement on EEG among heavy LLM users across several months, with weaker linguistic and behavioural performance to match. Caveats abound. The samples are small. The designs are early. The direction of causality is not fully pinned down. Still, the warning light is on.

The abstinence camp is not off the hook either. Avoiding AI entirely does not insulate you from AI. It is entering enterprise systems, search, workplace tools and everyday services. Opt-out is a luxury and rarely a strategy. The public wants more control and better governance, not magical thinking.

2) Younger generations and the trust-purity split

Zoom in on youth culture and you see two poles. At one pole sit the adopters, who fold AI into daily routines and sometimes trust it over human colleagues. At the other sit the abstainers, who avoid it on principle, citing accuracy, bias, privacy and the human cost.

Both positions are understandable. Both are incomplete. The first risks passivity. The second risks irrelevance.

3) What the brain science actually says about maturity and decline

The popular myth says your mind peaks early and everything after thirty is tidy decay. Recent integrative work tells a different story. When researchers combine cognitive measures with personality and decision traits into an overall functioning index, the curve rises longer than many expect and appears to peak in late midlife. One large analysis reports the high point between age 55 and 60, followed by decline from around 65 that steepens after 75. News summaries get the headline right, but the paper is the anchor. Importantly, not all components peak together. Fluid reasoning and processing speed crest earlier. Knowledge, conscientiousness and emotional stability keep climbing later. Heterogeneity is the rule.
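The composite described above can be illustrated with a toy calculation: each measure is standardised into z-scores across people, then the z-scores are averaged into one equal-weight index per person. This is a minimal sketch with invented measure names and values; real analyses use validated instruments, larger samples, and principled weighting.

```python
from statistics import mean, pstdev

# Toy data: three hypothetical measures per person (all values invented).
people = {
    "A": {"reasoning": 28, "knowledge": 61, "emotional_stability": 55},
    "B": {"reasoning": 35, "knowledge": 48, "emotional_stability": 40},
    "C": {"reasoning": 22, "knowledge": 70, "emotional_stability": 62},
}
measures = ["reasoning", "knowledge", "emotional_stability"]

def z_scores(values):
    """Standardise a column: subtract its mean, divide by its std dev."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

# Standardise each measure across people, then average the z-scores
# into a single equal-weight "overall functioning" index per person.
columns = {m: z_scores([people[p][m] for p in people]) for m in measures}
index = {p: mean(columns[m][i] for m in measures)
         for i, p in enumerate(people)}
```

The point of the sketch is the heterogeneity the text describes: a person can score low on one component (say, fluid reasoning) and still land high on the composite because knowledge and emotional stability pull the average up.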

We also know that certain technologies can shape cognition in both directions. GPS reliance, for instance, has been associated with weaker formation of cognitive maps, while alternative designs can preserve navigation skills. Translation: tools matter, but how you use them matters more.

4) Where COMINDING fits

COMINDING is a human-led discipline for working with AI that keeps cognition switched on. It has three parts:

- FRAMING to set intent before you prompt.
- CLARITY to interrogate outputs rather than accept them.
- SANITY to steady the emotions and ethics while you work.

Those three steps align with long-established principles in cognitive science: goal setting aids control, metacognition improves reasoning accuracy, and emotion regulation protects working memory under pressure. The mapping is theoretical, not yet tested as a single triad in AI contexts, but the ingredients are reputable.