Little Computer People | 2411

Right now AI is the worst it’s ever going to be. It costs about 60 bucks a day to house a state-of-the-art little computer person powered by an LLM in a virtual world. But what will the world be like when it costs 60p?

Full Show Notes: https://www.thejaymo.net/2024/06/01/2411-little-computer-people/

Support the show! 
Subscribe to my zine
Watch on Youtube

Permanently Moved is a personal podcast, 301 seconds in length, written and recorded by @thejaymo


Little Computer People

Right now AI is the worst it’s ever going to be. 

I don’t say this with a sense of boosterism. One should always focus on what a piece of technology can do now, not what it could do in the future.

Most people I know who are deep into AI, like everyone else, think that the Google search AI rollout is ridiculous. Much like how everyone I know who was working in crypto during the NFT boom thought that was pretty stupid too.

There’s a massive difference between the AI hype and reality. 

So it wasn’t surprising to hear this week that only 25% of people in the UK have ever used a generative AI tool. And just 2% of those that have use them daily. I’m one of them. But then again, I was also one of the 2% of people who owned a smartphone back in 2009.

Even folks who have used AI probably haven’t used models like Gemini Pro, which, unlike ChatGPT, has a context window of a million tokens. You can feed it multiple books and ask it questions about them. It’s wild.

However, what AI models can do right now is only going to get better. Improvement will come from the inevitable combination of enhanced hardware and software optimisation. For example, think back to 2019 when GPT-2 was new. While not impressive compared to today’s models, it was state-of-the-art at the time. Training GPT-2 back then took tens of thousands of hours of compute time, costing OpenAI $256 per hour. As of this week, training a GPT-2 level model costs about $20 and takes 90 minutes.

Microsoft’s experimental YOCO architecture for existing language models also optimises memory, reducing GPU costs by up to 80x and increasing the speed of outputs by 10x.

A small model like Google’s Gemini Nano, plus new optimisation and increased RAM in whatever Pixel hardware they announce soon, may mean we see a performant language model running locally on a phone this year. An AI developer recently said to me that the goal is ‘maximal intelligence at all levels’: on your device, in software, and in the cloud. If your phone thinks an instruction is ‘too big or complex’, it will push it up to a bigger model in the cloud.
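To make that routing idea concrete, here’s a minimal sketch, with invented function and model names rather than any vendor’s actual API: a small model on the phone answers most prompts, and anything it judges too big gets handed to a larger model in the cloud.

```python
# Minimal sketch of on-device-first routing. All names here are
# hypothetical; this shows the pattern, not a real SDK.

def looks_too_complex(prompt: str, word_limit: int = 200) -> bool:
    # Crude stand-in for a real complexity check: just count words.
    return len(prompt.split()) > word_limit

def respond(prompt: str, on_device_model, cloud_model) -> str:
    if looks_too_complex(prompt):
        return cloud_model.generate(prompt)   # pushed up to the bigger model
    return on_device_model.generate(prompt)   # stays local on the phone
```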

I bring this up because I’ve been thinking a lot about AI agents in simulated worlds this week. Here are two examples of what models are doing right now:

Researchers built a 2D Animal Crossing-like town and populated it with GPT-3.5-level intelligences, letting them interact with each other unsupervised. They organised a party. Another experiment involved AI agents in a hospital simulation. The AI doctors inside achieved 93.06% accuracy over 10,000 virtual patients on a subset of major respiratory diseases from the MedQA dataset. But simulating worlds on computers and populating them with intelligences is a very old idea.

The Sims, one of the best-selling video game series of all time, has sold over 200 million copies since 2000. Dwarf Fortress, the most intricate game simulation ever made, has sold about a million copies, despite its UX being made up of ASCII characters.

As models get better and cheaper, every time you hear about an AI experiment or simulation, I want you to think of these three words: Little Computer People.

Little Computer People, or LCP, was designed by David Crane, based on an idea by Rich Gold, and released for the Commodore 64 by Activision in 1985. It is one of the earliest life simulation games.

Each copy was unique, with a serial number influencing the personality and appearance of the character. The ‘Little Computer Person’ lives in a virtual house that fills the entire screen. They have a daily routine: going to the fridge, watching TV, playing the piano, and so on. Sometimes they even go out and come back with a dog, adding feeding and playing with it to their routine. During the pandemic, I watched an emulator run LCP for hours. It’s compelling. The longer you leave it running, the more of a sense of aliveness the character takes on.

You can interact with the LCP through a text parser that understands about 160 words. They respond better to polite commands. They can play games like anagrams and poker with you too.

The feature that really brings the simulation alive is that they write you letters. Once a week in game time, the sim will walk upstairs, sit at their typewriter, and tell you how it’s going, whether they are sad or happy. They may ask for a new book or new clothes, which you can order through the parser and have delivered to their house.

Whilst rudimentary, LCP is very compelling.

It also served as inspiration for The Sims, with creator Will Wright asking Gold for feedback during development.

Right now it costs about 60 bucks a day to house a state-of-the-art little computer person powered by an LLM in a virtual world. People are already doing it, and the idea has been around for decades. But what will the world be like when it costs 60p? I expect to see a proliferation of compelling simulations of little computer people, Tamagotchi pets, or gardens with creatures in them. Wind-up worlds fused with idle games that can be played for months and years.

What if Farmville, but you pay an AI to look after the farm for you whilst you’re doing your own work?

If anyone you know is building worlds like this with AI, please get in touch.


