Definitely. We have to find ways to replicate this.
One thing I've noticed is that I've actually learned a lot more about code I didn't understand before, just because I built guardrails to make sure things get built exactly the way I want them built. And then I've watched my AI build them that way dozens of times now, start to finish. I've seen all the steps so many times that I understand a lot more than I did before.
This sort of thing is definitely possible, but we have to do it on purpose.
> There is no "separation of moving things on the job vs. moving things at the gym" when it comes to creative craft...
- Coming up with names for cities in a role-playing game you're making
- Summarizing an idea that you're writing about
- Doing research for an article
- Brainstorming character names
- Creating an aesthetic for a new website for a customer
- Etc, etc.
I could go on for days with these examples. And so could any AI.
Pre-2022, ALL of these were done 100% by humans.
Now they're not. Now creative people are using AI to help them massively with tons of these. So, yes, the separation needs to happen there as well.
For example, maybe you say, "I'll never use AI to help me name characters. Or to come up with plot lines. Or whatever."
For me, there isn't the slightest difference between before 2022 and after 2022, since I continually choose to boycott genAI services, and as an activist in the Pro-Craft movement, I encourage others to do the same.
How many senior developers do you need to tell you that they've been vastly surpassed (in coding) by AI before you believe them? The creator of Claude Code just said he hasn't opened an IDE in a while. He was a principal engineer.
And why would you think this would be the only place that'll happen?
I agree there are things they can still do better than AI, but coding isn't one of them.
You do realize that the entire point of the post is to be cautious of AI, right?
And to make sure that you have your own personal goals separate from it, and that if you're getting help from it you need to make sure it's in line with those goals.
I disagree with your insinuation that being creative with your mind as a human being is a form of recreation or personal growth or whatever (working out at the gym) whereas "creativity" on behalf of work projects gets done by Big Tech's generative algorithms. It becomes even more confusing because then later on you say "I inevitably will use AI to do many of these Gym tasks" …huh? So now even the creative "gym tasks" are done by AI too? Apparently our personal goals will shrink to a freakishly tiny little list by the end zone of your thought experiment!
But maybe both of those are in the category of undesirable things.
And the things we end up with are like art and baking and walking and talking and drinking coffee and such.
Professional Chess is a nice pattern here. A chess engine can beat Magnus Carlsen at this point, but Chess is more popular than ever. So it should be ok if AI/Robots are better than us at all the stuff we still decide to do.
Except Professional Chess, taken to mean players earning a living solely from paid tournament play, numbers in the low hundreds? Thousands? Meanwhile there are over 20 million 'professional' software developers. There's plenty I could argue with about that single demographic number, but even so: I'm not sure there's ever been much of a market for any kind of 'professional chess player', yet there clearly is one for 'professional software developer' (for some definitions of 'professional' and 'software').
I think that comes down to documenting the mindset as a goal, and then using all the AI, scaffolding, and tools available to you to help nurture that mindset.
OP here, yeah, I think that's a really good point.
I feel like the way I'm building this in is a violent maintenance of two extremes.
On one hand, fully merged with AI and acting like we are one being, having it do tons of work for me.
And then on the other hand is like this analog gym where I'm stripped of all my augmentations and tools and connectivity, and I am being quizzed on how good I could do just by myself.
And how well I can perform in the non-augmented (NAUG) scenario determines what tweaks need to be made to my regular augmented (AUG) workflows to improve that NAUG performance.
Especially for those core identity things that I really care about. Like critical thinking, creating and countering arguments, identifying my own bias, etc.
I think as the tech gets better and better, we'll eventually have an assistant whose job is to make sure that our un-augmented performance is improving, vs. deteriorating. But until then, we have to find a way to work this into the system ourselves.
There could also be an almost chaos-monkey-like approach: cutting off the assistance at indeterminate intervals, so you've got to maintain a baseline of skill / muscle memory to be able to deal with it.
I'm not sure if people would subject themselves to this, but perhaps the market will just serve it to us as it currently does with internet and services sometimes going down :-)
I know for me, when this happens, and also when I sometimes do a bit of offline coding in various situations, it feels good to exercise the skill of just writing code from scratch (erm, well, with IntelliSense) and kind of re-assert that I can still do it, now that we're in tab-autocomplete land most of the time.
But I guess opting into such a scheme requires the same kind of self-determined discipline it takes to learn anything in the first place, so I could see it happening for people who are at least as motivated to learn X as learners are today.
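The chaos-monkey cutoff described above could be sketched in a few lines. This is a minimal illustration, not a real tool: `ai_complete` and `do_it_yourself` are hypothetical stand-ins I've invented for whatever the assisted and unassisted workflows actually are.

```python
import random

# Hypothetical stand-ins, purely for illustration.
def ai_complete(prompt):
    """Pretend AI-assisted path (assumption, not a real API)."""
    return f"[AI draft for: {prompt}]"

def do_it_yourself(prompt):
    """Fallback path: the human works unaided."""
    return f"[manual work on: {prompt}]"

def chaos_assist(prompt, outage_rate=0.2, rng=random):
    """Chaos-monkey switch: with probability `outage_rate`,
    the assistant is 'down' and you get a forced practice rep."""
    if rng.random() < outage_rate:
        return do_it_yourself(prompt)
    return ai_complete(prompt)
```

The `outage_rate` knob is the interesting part: it could even adapt over time, rising when your un-augmented performance slips, which would connect this back to the AUG/NAUG feedback loop described earlier in the thread.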