AI or VEB?

Geoff Staneff
3 min read · May 30, 2024


From British Heritage Travel: Household servants at Wrexham in 1912, including gardeners, housemaids, footmen, butler, cook, laundress, housekeeper, and the estate foreman. — National Trust / Public Library

VEB — How P.L. Travers got to AGI before ChatGPT.

Earlier this month, Sam Altman got some press for explaining what he’s going for with all this running around making AIs. This piece in MIT Technology Review has an interesting quote that lays bare the promise of AI in Altman’s estimation. It is, of course, the super-intelligent colleague quote, which is problematic for a variety of reasons.

This got press as a clear elucidation of an amazing vision of a possible future enabled by AI… but really this is a crude repackaging of an old model, at great societal cost. All too often we’re offered a vision of the future that is just a crappy implementation of yesterday’s news, but rarely does it come across so clearly.

In the piece, though he is unaware of it, Altman is describing a quintessential Victorian English Butler (VEB). He says he wants a

“super-competent colleague that knows absolutely everything about my whole life, every email, every conversation I’ve ever had, but doesn’t feel like an extension.”

He’s not looking for a colleague or a peer, except in capability, because he’s implicitly the important one in the relationship. Sam isn’t learning everything there is to know about the AI here; the AI is meant to serve him. He wants that Victorian-era butler. He wants to be important like a Victorian-era man: the head of a household with staff who unobtrusively serve to keep the spotlight on the important master of the house. The most important thing on earth. There’s a song from Mary Poppins about that, which Mr. Banks sings early in the film adaptation to set the expectations.

This is also where Microsoft got its Recall feature, which no one wants and which everyone can see will be used by corporations to exert control. This is how the VEB gets the context to be your proxy; after all, it needs this information to help you be a better employee. The AI needs to replay everything that ever happened to you in order to respond on your behalf, as a high-class household or personal servant would.

At the end of the 19th century, the upper class would have a butler who did all the things AGI is meant to do for us. In exchange for tremendous wealth and resources (power, water, data centers), those who can afford it (a monthly subscription, naturally) can have a less good version of that experience any day now, thanks to the coming AI products and services. It is good that we’re not filling this want with a class structure and tasking humans with this labor, but this is hardly the promise of earth-shattering capability when shackled to the aspirations and limits of garden-variety men.

This isn’t about privacy; this is about power and importance. Your personal servant has access to all your details, and the allure is that another human’s sum total of expectation is to take care of you rather than to pursue their own interests. The boss doesn’t want to wear your shirt, but they do want you to give it to them, and that is what the current wave of AI hype is all about. It is simultaneously why it is so underwhelming and so hard to put down. It was never about the merits.


Written by Geoff Staneff

Former thermoelectrics and fuel cell scientist; current software product manager. He/Him.
