I love this idea, but I'd encourage people not to get hung up on the UI--i.e., how you interact with the game world--and instead think about what's going on in the game world.
What if the world included a number of fully independent agents, each with its own motivations, values, and policies, and equipped with pattern matching, goal solving, optimization, and even a hint of common sense[1]? The game state would never be the same twice, and it would depend on how you interacted with the agents and how they interacted with each other.
TL;DR: If you open a door, you might let the Grue out of its maze.
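A minimal sketch of the idea, with entirely made-up names and a trivially simple "policy" standing in for real goal solving: each agent mutates a shared world state according to its own goal, so the outcome of a run depends on the interaction history rather than on scripted dialogue.

```python
import random

class Agent:
    """A toy independent agent: it has a goal and acts on the world."""

    def __init__(self, name, goal, rng):
        self.name = name
        self.goal = goal  # a world attribute the agent wants true, e.g. "door_open"
        self.rng = rng

    def act(self, world):
        # Trivial stand-in for a real policy: work toward the goal if
        # it is unmet, otherwise wander somewhere at random.
        if not world.get(self.goal, False):
            world[self.goal] = True
            return f"{self.name} achieves {self.goal}"
        return f"{self.name} wanders to room {self.rng.randint(1, 9)}"

def tick(world, agents):
    # One simulation step: every agent observes and mutates shared state.
    return [agent.act(world) for agent in agents]

world = {"door_open": False}
rng = random.Random()  # unseeded: no two playthroughs need unfold the same way
agents = [Agent("grue", "door_open", rng), Agent("player", "lamp_lit", rng)]
events = tick(world, agents)
```

Even this toy version shows the shape of the problem: once agents act on their own goals, the state space explodes, and making their behavior feel intentional rather than random is exactly the hard part.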
I suspect that making it not feel like you're pushing buttons to get the same lines of dialogue out of wandering bots will require solving some pretty hard AI problems.
You could probably make it work now with, say, a game world full of broken robots or something.
[1] https://github.com/AndrewSmart/opencyc