When I was a child, my dream as a software enthusiast was to build a compiler that could understand my own programming language. For me, raised in a small town in the middle of the Argentine countryside, this task was, to say the least, challenging. Now, I can put in a few hours over a weekend and complete it, or at least end up with an MVP.

A lot has changed since I started (I did it with the infamous C++03 back in the old days), and by now I can see how little I understood back then.

Nevertheless, something I remember is being happy. Not only because it gave me a new (at the time) way of seeing the world, but also because it was an essential leap in understanding for me.

My thinking capabilities weren’t static but dynamic. My understanding of specific topics used to impose limits; now, my whole sense of what I could do took a turn. I was surprised and happy when I finished this emulator + compiler exercise. I finally had the tool I always wanted, but what’s the practical value of all of this?

Boredom

In 2013, Bench & Lench (I assure you, this is not a joke) released “On the Function of Boredom”.

Throughout the paper, they argue that boredom serves a useful function for humans in terms of emotional responses and physiology in general. It is almost something we should use in our favor, not something we should fight against.

Talking about boredom is nothing new for us as humans; even the ancient Greeks discussed it:

“Boredom was a subject which concerned the ancient Greeks, indeed, Socrates suffered the indignity of being criticized by some for repetition and monotony.” (The phenomenon of boredom, 2006)

The introduction of The Demon of Noontide (the book that inspired the 2006 paper) begins:

“Kierkegaard claimed that the gods created man because they were bored, and Baudelaire predicted that the “delicate monster” of boredom would one day swallow up the whole world in an immense yawn.” (The Demon of Noontide, 1976)

Now, turning our eyes to the present, it’s striking how, as a society, we keep boredom as far away from our daily lives as possible. We should embrace boredom, since it’s the only fundamental difference between us and them (them being technology, which in a few years will mean agents).

It seems typical now to claim that boredom is counterproductive, even dangerous, for your career. The message spread across the net is about being productive: no matter for what or for whom, keep grinding. It does not matter whether you are an artist looking for inspiration, a designer looking for new things to build, or an architect who still wants to draw (Why Architects Still Draw, 2014). The only thing that matters now is staying relevant to the state of the art of the latest trend in technology.

As Adam Curtis exposed in Happiness Machines, by now we are driven by engineered desires, which is becoming more and more of a problem for us as a society.

Reality

For the past three weeks, it has been impossible to read Twitter without stepping into a GPT thread, or a “be the best version of yourself” thread powered by AI. We are losing our focus. People will stop building things that matter to them (and potentially to a group as well) and will instead create whatever aligns with the latest trends.

Graham made a similar point in one of his tweets last week.


This could create substantial problems for our modern society: if we don’t pay enough attention, we will lose sight of what’s essential. We need to allow ourselves to learn something just for fun, to do something for the sake of doing it, not looking for revenue or social connections, simply because we want to do it.

There is a famous (but fun and very real) story about the programmer who became a carpenter because he was too tired.

LOL

But from now on, I believe, this story will be seen more and more often among programmers.

Computation

Last week, Microsoft released Sparks of AGI, a 154-page paper. Right after its release, a horde of developers on Twitter started to panic and claim we had reached AGI, when it was virtually impossible for them to have read 154 pages in one hour. This is essentially what I don’t like about this hype: right now it may look like an innocent leap, but in the future it can become harmful and even “attack” the roots of what science is (since nobody actually pays attention to what’s written, only to the wow factor or the impact of it).

Technology will change forever because the conversation is no longer about what is possible to do, or the amount of complexity you can grok, but about the amount of data you can process per second. The computation behind GPT-3 & GPT-4 was massive.
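To give a sense of scale, here is a rough back-of-envelope sketch (my own, not taken from any paper cited here) using the widely used approximation that training compute is about 6 × parameters × tokens. GPT-3’s published figures are roughly 175B parameters trained on ~300B tokens; GPT-4’s numbers are not public, so they are left out.

```python
# Back-of-envelope training-compute estimate using the common
# approximation FLOPs ~= 6 * N * D (N = parameters, D = training tokens).
# GPT-3's published figures: ~175B parameters, ~300B training tokens.
# GPT-4's figures are not public, so no estimate is attempted for it.

def training_flops(params: float, tokens: float) -> float:
    """Rough total training compute in floating-point operations."""
    return 6 * params * tokens

gpt3_flops = training_flops(params=175e9, tokens=300e9)
print(f"GPT-3 training compute: ~{gpt3_flops:.2e} FLOPs")  # ~3.15e+23 FLOPs
```

That is on the order of 10^23 floating-point operations for GPT-3 alone, a budget only a handful of players can afford.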

This topic was also discussed by DeepMind’s team (Training Compute-Optimal Large Language Models, 2022), where they claimed:

“Though there has been significant recent work allowing larger and larger models to be trained, our analysis suggests an increased focus on dataset scaling is needed. Speculatively, we expect that scaling to larger and larger datasets is only beneficial when the data is high-quality. This calls for responsibly collecting larger datasets with a high focus on dataset quality.”
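As an illustration of what “dataset scaling” means in practice, the rule of thumb most people took from this paper is roughly 20 training tokens per parameter (Chinchilla itself: ~70B parameters trained on ~1.4T tokens). The exact ratio comes from the paper’s fitted scaling laws; the constant in this sketch is the popular shorthand, not the precise result.

```python
# A minimal sketch of the "Chinchilla" rule of thumb: compute-optimal
# training wants roughly ~20 tokens per parameter (Chinchilla: 70B
# parameters, ~1.4T tokens). 20x is a shorthand, not an exact law.

TOKENS_PER_PARAM = 20  # rule-of-thumb constant

def compute_optimal_tokens(params: float) -> float:
    """Approximate compute-optimal training-set size for a given model size."""
    return TOKENS_PER_PARAM * params

for params in (1e9, 70e9, 175e9):
    tokens = compute_optimal_tokens(params)
    print(f"{params / 1e9:.0f}B params -> ~{tokens / 1e12:.2f}T tokens")
```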

More than ever, computing power will be a crucial differentiator and a mandatory check if you want to compete against the top players. Before, it was the speed of the dev cycle, your team’s performance, 10x engineers, etc. The models provided by big companies make us more attached to them, increasing their power and their data daily, but they also pose a threat to everyone working in the same field. What’s the point of working on a summarising tool right now if you can buy the same capability for roughly USD 0.002 per 1,000 tokens?
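A quick sketch of that arithmetic, assuming the roughly USD 0.002 per 1,000 tokens API pricing mentioned above and illustrative document sizes:

```python
# Rough cost of summarising one document through a hosted model API,
# assuming ~USD 0.002 per 1,000 tokens (flat rate for input + output).
# Document and summary sizes below are illustrative assumptions.

PRICE_PER_1K_TOKENS = 0.002  # USD, assumed

def summarisation_cost(input_tokens: int, output_tokens: int) -> float:
    """Approximate API cost for one summarisation request."""
    total_tokens = input_tokens + output_tokens
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

cost = summarisation_cost(input_tokens=3000, output_tokens=300)
print(f"~${cost:.4f} per document")                  # ~$0.0066
print(f"~${cost * 10000:.2f} per 10,000 documents")  # ~$66.00
```

At that price, the expensive part of a home-grown summariser is building and maintaining it, not the tokens themselves.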

Conclusion

The focus now has to be, more than ever, on the product you are building: attention to detail, keeping the human touch, and the boredom of thinking about things nobody else has the time to think about.

Switch technologies if needed, switch the tools you have been using at your company, and do not put your ego in the middle, because models are better than you at almost everything except thinking as a human.

As Nick Bostrom said, the solution to superintelligence will most likely be “simple”, or at least more straightforward than the giants of the past thought.


References