Optimistically, the idea could be to push prerequisites to an always-on, ever-available resource. Depending on the major, skills could include organizing papers into outlines, using Excel, or building a computer.
Professors can tailor lectures to narrower topics, or to more advanced, current, and specialized subjects. There may be less need for a series of beginning or introductory courses--it's assumed learners will avail themselves of the resource.
Pessimistically, AI literacy contributes to the further erosion of critical thinking, to lazy auto-grading, and to an inability to construct book-length arguments.
Upfront computer literacy may never have been convincing enough on its own; AI could be the ubiquitous and timely leverage that opens the way for general machine thinking.
The replaced arm has to graft itself on, or bootstrap everything the body already had. It seems insurmountable, yet the same thinking that brought about this cybernetic substitution seems to be missing.
Data centers house the hardware, and there is a land grab for compute. What actually runs post-AI depends on its owners. A glut of processing might be spent reverse-engineering efficient heuristics rather than relying on the "magic box."
To account for something is to track it, and to track it there need to be controls. Ownership of execution spans a distinct temporal boundary.
There is a difference, too, between the letter recipient and the owner: for the latter, all labor exerted equates to all rewards reaped, while for the former the return is at most fractional. Total productivity capture disadvantages nearly everyone.
Regarding demographics, I have yet to see an economic model that accounts for imperfect rationalists, or for idealists, optimists, and romantics.