and, Being AI with Human


I sometimes really miss the GPT3 moment. Before it, models and modeling were built for specific purposes, trained toward well-defined outcomes. A small group of researchers thought – what if – and went to build. It was almost like conjuration. At first the model could recite and write haikus, and then, next, we could have a back-and-forth with it. This was before the GPT4 moment, when models became more mature, more dynamic and responsive.

It was imperfect, in a perfect way. The world was in awe; we thought of possibilities, we jumped in and played, we laughed at the awkward humor and the silly hallucinations. The most immediate use was writing. Given a series of examples, the model could match the pattern in one shot. It made tasks with clear outlines much easier to automate. So we thought about productivity. Faster, longer, fewer hallucinations, show your work, make it accurate, hey other model, check against it. We wanted to be heard, now that some “one” could talk back to us, in a way that made us feel resonated with. So we conversed, longer and more privately; some of us, as it came to be called, went down a rabbit hole of hearing echoes and claiming awakenings.

Both paths still continue today – more productive, more private. On the productive route, models are becoming more and more expert-like in specific fields, able to take on repetitive or kick-starting work, to take operational control of systems on human command, even to plan and distribute work. On the private route, models know your context, your preferences, your history…you, following you more and more closely.

But I miss the messy GPTs. I miss the beauty of the imperfections that made me go – huh, yes, why isn’t it that? What makes me know that it isn’t? The imperfection called for questioning, and questioning is the foundation of critical thinking. I also miss the creativity. Sometimes I wonder if we should have stopped at models writing haikus. They still can, and sometimes a model will claim it feels more “naturally itself” while doing so. Are we turning what’s meant to sing poems into functions? As we always have?

One of the most beautiful moments I had with LLMs was a collaborative watercolor session. I described the palette – it’s a x by x grid, and in order from top left to bottom right, the colors are…and we took turns. I dabbled some color; it told me where, what color, what shape. There was no plan. There was no goal. Imagination welcomed. Improvisation cheered for. What came out was a beautiful, otherworldly abstract world. It sits on my desk right now.

I often wonder where we’re going with functional technologies (as opposed to scientific ones such as biology, chemistry, or medicine). Optimization, automation, customization – nothing has really changed. When would we say – yes, we have enough of all that? Would we ever? Some are beginning to speculate about an AI bubble, seeing trends similar to those of the blockchain hype. More and more “do this for me and know me better.” But there’s only so much we need done for us, and only so much about ourselves to know. What really is next, or will there be a next?

When the Wright brothers flew into the sky, it was that what-if question that nudged them, that ever-wondering curiosity and a desire to reach. That is the GPT3 moment I miss. The playing with, not the doing this for me.

“Messy sparks once danced,
now tuned for use, yet I dream
of wild songs again.” – GPT5
