As a long-term project, I've been slowly and consistently introducing Jana's younger brother to basic programming concepts. He's fourteen, he loves videogames, and he has no idea how the actual development cycle works. Coming from a "tablets and smartphones" family instead of a "laptops and desktops" family, his resources (and opportunities for experimentation) are scarce and his interest is still largely theoretical. But, as everyone tells me, he's from the "right generation" for learning all of this technical stuff and he'll take to it like a fish to water. No on-ramp needed.
...And I have a few problems with that.
What It Means When You're "Good with Computers"
Lots of people are "good with computers," but being "good with computers" doesn't mean what it used to.
86% of US households have some kind of desktop, laptop, tablet, or smartphone. The consumer technology market is humongous and, while it might have started on the fringe of the business market as a look-what-I-can-do-with-my-home-lab niche, it's grown into something that doesn't fit the old generalizations.
And do you know what's been central to that growth? The integration of UIs and workflows built for casual users, workflows that deliberately leave out the more technical and fiddly parts that people actually learn from.
Digital Natives? More Like UI Natives
Modern computers are, essentially, Memory Palaces.
When you look at a graphical user interface (especially modern ones that incorporate motion and velocity), what do you have? You have a series of abstract spaces and spatial relationships where information can be stored, retrieved, and navigated, based on a series of "spatial signposts" that reinforce the navigation process.
The "paperwork at a desk" metaphor doesn't just give us easy touch points for what files and folders are; it anchors the notion of a graphical user interface in physical dimensions by giving users a way to visualize where digital items belong in relation to each other. This notion of where things "should" be allows users to work through memorized functions and features sequentially, combining spatial data and direct recall in a way directly comparable to Memory Palaces.
But what does this have to do with "generational skills"?
As much as I love talking about UI design and UX theory, I promise that we're not caught up in a digression here.
Since graphical user interfaces rely on spatial abstractions, there's a skill barrier that keeps GUIs from feeling as natural as Memory Palaces. GUIs provide only a limited view, and it's easy for inexperienced users to mistake that window into a spatial environment for a changing (non-spatial) image they have to memorize.
And if you're wondering what factors influence a user's ability to translate spatial abstractions, age does play a non-trivial role. Kids are developmentally primed for spatial skill acquisition and there's a fairly steady fall-off as individuals age.
Kids really are better at navigating abstract digital space.
The problem with all of this, of course, is that the skills needed to navigate digital environments have diverged from the skills needed to fix, modify, or create digital environments. Children are developmentally primed to adapt to spatial weirdness, but knowing how to navigate an interface won't help this next generation of "digital natives" with the work that actually goes into these kinds of things.
Will "digital natives" grow up with an inherent ability to navigate abstract spatial maps through their early immersion in touch interfaces?
Will we see similar spatial adaptations for generations that grow up with Voice Assistants, AR, and VR?
Will instinctual GUI skills help them learn programming languages or server administration?
Hell no. That stuff hinges on the other common domain of early skill acquisition: language.
Here Be Dragons: Falling Off the Edge of the UI Map
The core difference between most desktop operating systems (for the end-user) is where the user hits the limits and rough patches of the UI, and at what point they need language skills in order to do whatever it is that they need to do.
But as we well know, the expansion of the consumer tech market has pushed UI development away from language-driven interfaces and towards touch-driven spatial interfaces; there's no real route for users to learn command languages or programming languages through need-driven exposure. The command prompt is foreign land, and the edges of the map are getting smaller.
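To make "need-driven exposure" concrete, here's a hypothetical example of the kind of task that used to push users past the GUI's edge: renaming a folder full of files. In a file manager you click through each file by hand; at the prompt it's a short loop. (The file names here are invented for illustration.)

```shell
# Set up a folder of sample files (stand-ins for a camera dump).
mkdir -p photos
touch photos/IMG_001.jpeg photos/IMG_002.jpeg photos/IMG_003.jpeg

# Rename every .jpeg to .jpg in one pass.
for f in photos/*.jpeg; do
  mv "$f" "${f%.jpeg}.jpg"   # strip the .jpeg suffix, reattach as .jpg
done

ls photos
```

Nothing here is advanced, but a user only learns this vocabulary (loops, quoting, suffix stripping) when a GUI leaves them stranded with a hundred files and no batch-rename button; a touch-first OS never even offers the stranding.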
Worse yet, since Digital Natives are native to mobile operating systems, they're stuck with a hardware gap in addition to the interface gap. Windows 10 and OS X might have some pretty steep cliffs and some shitty help documentation, but they're positively indulgent compared to mobile operating systems.
All you get is the experience you pay for; opportunity costs extra.
Fighting Assumed Competence
This less-than-organized rant started with me expressing my frustrations over the idea that kids are "naturally good with computers" when the consumer market has actually insulated them from the nuts and bolts of the technology itself. To put a proper label on what I have a problem with, let's call it "assumed competence."
I have a problem with assumed competence, as it typically rears its head when kids want to learn or want help with something and someone ignores them. Assumed competence is a dismissal, and it represents a decision-maker's failure to properly conceptualize a skillset or a series of obstacles. Assumed competence undermines the platforms-and-resources model that's key to self-education, and it makes shit a lot harder than it needs to be for kids who are already a step or two behind.
Assumed technical competence is especially frustrating for me, however, as it hand-waves the hardest parts of programming and backend work and encourages educational complacency during a time when early skill acquisition is doubly useful. Kids are already forced to pack a staggering amount of educational content into a limited time window in a remarkably inefficient environment; failing to respect how difficult programming can be while simultaneously over-prioritizing material that doesn't contribute to overarching educational goals is inefficient and ineffectual.
A kid who grows up with tablets and smartphones and happens to really like software development won't be stuck behind the curve, but they won't have a leg up, either. Generational assumptions of all sorts rustle my jimmies, but I'm particularly passionate about assumed competence. It isn't just a missed chance to give a kid a head start; it's an ignored opportunity to help a kid find his passion.