RGB gamerfeels

clear as liquid crystal dept.

15 Apr 2026

Blaming the screens for all of society's ills is currently in vogue. The pundits engaged in this are, however, usually talking about the new fondleslabs, tablets and smartphones, rather than the desktop monitors they themselves were weaned on. The blind spot is rather telling.

In truth, LCD screens have a lot to answer for. Thanks to their slender profile they caught on pretty much the moment they became viable, which was however a decade before the tech matured properly: office workers in particular tended to be saddled with monitors that had very narrow viewing angles and were by and large nowhere near colour-accurate.

So while the tech for well-presented HD webcomics had been there since the Noughties, both bandwidth and endpoint limitations meant there was little point in going beyond simple shapes and primary colours, which led to the internet basically reinventing the Yellow Kid phase of daily comix. In a way this may have been liberating for writer–artists who never figured out pen nibs, but it did stem from a decidedly cramped communications channel with little room for subtlety, host to metric gigapixels of largely superfluous sprite-stencil comics.

Eventually, the medium became the message. Rather like how 78 RPM gramophones were best suited to doo-wop and opera singers, the 2Knet was a stew of readily identifiable colours, anime with uniform line-weights, and text-only drama desperate to wring some kind of reaction out of people, any kind (because there was no real rapport with the audience).

As it happens, a lack of rapport is also the bane of computer games: how often do you find yourself laughing at an achievement? The fundamental problem is that the machine has no Goddamn sense of humour, which makes it difficult to tell whether or not you get it. And so the goals tend to be defined in terms of whatever happens to be easy to measure; speedrunners are really just doubling down on what computers are (usually) good at, keeping accurate time. Notably, all except glitchless runs tend to step outside the bounds of what you'd call the regular action of proceeding through the game, to indulge in walking through walls and so on. It suggests a funny corollary to Caillois' observation that the athlete is not playing but working: speedrunners are effectively ranked competitive beta-testers who complain about rugpulls when the game gets patched out from under them.

In the same vein, skill in the traditional sense isn't even what's being rewarded in action games. Where normally we consider the ability to push the envelope to be a natural extension of extraordinary skill, action games are unmistakably paint-by-numbers; there's always a snap-to grid, and the only options are fulfilling the requirements faster or demolishing the barriers altogether. (In truth, the only real difficulty in a computer game is cognitive difficulty, which, quite tragically, is what most punters are desperate to avoid.)

A better word might be production, since players are always trying to maximize their output of damage-per-second or simoleons or whatever. The important thing is to produce win states, to optimize production towards six-sigma efficiency, and of course the pro gamer is always producing content.

To ensure players are instantly made aware when production has been successful, games have to provide clear feedback: beeping, flashing something in the centre of the screen, some signal that can't be missed. This eventually becomes invisible to players, much as pedestrians usually aren't aware of the sound of their footsteps; however, it has always been one of the easier ways to mock computer games as an immature art form. Not entirely without reason: a sign of maturity is often the ability to take something in and then sit with it, and the action game does not sit; it jumps, sometimes even doubly so.

In artistic terms, the action game is a broken record that wants you to hit the same note over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over, and over, to a degree that would be laughable in any other field.

Superhero costumes maximized recognizability across different artists and made it easy to pick out the heroes from the frantic action, which incidentally is the same problem computer games are often trying to solve (tellingly, Green Lantern may have been the odd one out in The DC Universe™ lineup on account of sporting the least lurid colour palette). The blaring crisp signals are like military bugle calls, valued primarily for their clarity rather than their æsthetic impact. Superhero comics, it bears keeping in mind, were once so dominant that some people came to think of bande dessinée in general as an evolutionary dead-end, good for nothing but Sunday funnies and power fantasies. And with games, the situation is even more dire; they can barely manage to pull off comedy.

The era of zero subtlety has also resulted in a slew of shows that fail to stick the landing because they're created by the bathroom singers of deviantArt who devoted their formative years to marinating in the primary colours of emotion (cute/creepy/cringe) to the detriment of all else. Piling accessories onto your OC is an offshoot of this as well: it was the Web 1.0 equivalent of the virtual fashion show, looksmaxing fictional characters being otherwise trivial.

Æsthetes previously excused spending time on art by claiming that it trains our capacity for emotional response (and presumably, the ability to recognize complex emotions in ourselves). When the only possible æsthetics now come down to playground toys and realistic plus yellow paint, what room does that leave for advancing towards novel sensations?

Which is not to say that there can be no place for primary colours. But when everyone's trying to be Superman, there's bound to be a spandex shortage.


Purple patches from the silver screen

overdue for the rubbish heap dept.

21 Mar 2026

Back when the ill-considered field of ludology was trying to contrive an excuse for its practitioners never to put down their Nintendo controllers, it came up with something called process intensity. Its raison d'être seems to have been to act as an easy way to cordon off all that effeminate storytelling from Real Gaming™ so that the Serious Ludo-ologists could get back to the all-important business of watching Superb Mario bounce around in their idiot boxes.

The conceit was based on the vague intuition that cutscenes aren't very game-like: when a system is streaming video files off a CD-ROM it's not doing much in the way of novel computing, just shovelling prepackaged bytes into the furnace of a playback plugin. This may have sounded sensible in the age of rotating optical platters, when only a few titles like Interstate '76 rendered everything in-engine. The renegade interrupts of Mass Effect and the elements bursting free from flashbacks in Cryostasis had yet to blur the lines between interactive running-around and passive consumption of canned storylines; that could only happen once the tech moved beyond pre-rendered cutscenes and slow storage.

In fact there was never much that could be done to salvage the whole edifice: showing that the concept is absurd is a fairly simple task. Consider a convoluted function that returns four times its input.

function quaddmg(sum) {
	// Two additions standing in for one multiplication: double, then double again.
	let result = sum + sum;
	result = result + result;
	return result;
}

This is more process-intensive than an optimized version:

function quadruple(sum) {
	return sum * 4;
}

However, the additional processing just amounts to the computer spinning its wheels; it does not speak to a more game-like experience. The metric is, if anything, even more useless than body-mass index or measuring programmers' productivity by how many lines of code they push to production.

Real codebases often run to several million lines, which makes it an open question exactly how much a given program may be optimized without breaking anything (hence the shift to automated testing in the 21st century; if nothing else, it provides a basic safety net against unexpected side-effects).
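
To make that safety net concrete, here is a minimal sketch, reusing the toy functions above and assuming a Node.js environment: before swapping the wheel-spinning version for the optimized one, check that the two agree.

const assert = require("assert");

// The refactoring is only safe if the two versions are interchangeable.
for (const n of [0, 1, -3, 0.5, 1e9]) {
	assert.strictEqual(quaddmg(n), quadruple(n));
}
console.log("quaddmg and quadruple agree; optimize away");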

Process intensity properly belongs to the age before unit testing, when games had to be implemented as hand-tuned C code because it was the only way to squeeze performance out of the hardware. The term was obsolete practically the moment it saw print, as there has been nary a mainstream game created on bare metal since RollerCoaster Tycoon and Descent.

The real problem with cutscenes, especially pre-rendered ones, is that they do not achieve a graceful integration with the rest of a game, and so players simply shrug them off as something separate, basically an intrusive advertisement for the game designer's wish to be just like a real film director.


Education by arbitration

the mis-edutainment of the youth dept.

15 Mar 2026

Puzzle games on 8-bit platforms were necessarily abstract due to the low resolutions and cramped palettes. While console games usually centred around jumping, shooting, or driving to cater to broad audiences, games for home computers could be considerably less constrained owing to the level of technical knowledge that had to be assimilated to get off the ground at all: even on the Commodore 64, rattling off a smattering of BASIC was required just to load games from cassette. This selected for players who possessed considerable reserves of patience.
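
For the record, the whole incantation at the C64's READY prompt was two keywords (the machine answers the first with PRESS PLAY ON TAPE):

LOAD
RUN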

Given that much of the software was produced entirely by one or two programmers, interaction tended to be simple: press a button to change one square at a time, certainly no side-effects reaching beyond the visible frame (not that it was feasible to hold more than one scene in memory at a time; this was the era of flick-screen scrolling for a reason).

In consequence, 8-bit puzzle games feature some of the most illegible gameplay ever committed to magnetic tape, where you would be hard-pressed to predict what was about to happen before you started mashing buttons to see what they did. Small wonder then that there was a trend towards Boulder Dash spinoffs; the format was a good fit, boulders slotting into the cubbyholes of array data structures, and the mechanics almost made sense (a sketch of the falling-rock rule follows below). This particular style of arbitrary interaction did stick around in so-called educational software, a sector where it didn't matter whether the content was at all comprehensible, because the procurers were clueless parents and Boomer teachers who didn't look twice at the actual flow of the gameplay.

Pattern recognition: find the fatso
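
The falling-rock rule promised above, as a minimal sketch in JavaScript; the grid-of-characters representation and the function name are illustrative assumptions, not anything lifted from an actual Boulder Dash clone.

// One gravity pass over a mutable grid (an array of rows, each an array
// of characters), where 'O' marks a boulder and '.' an empty square.
// Scanning bottom-up means each boulder falls at most one cell per frame.
function dropBoulders(grid) {
	for (let y = grid.length - 2; y >= 0; y--) {
		for (let x = 0; x < grid[y].length; x++) {
			if (grid[y][x] === 'O' && grid[y + 1][x] === '.') {
				grid[y + 1][x] = 'O'; // the boulder slots into the cubbyhole below
				grid[y][x] = '.';
			}
		}
	}
	return grid;
}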

Funnily enough, while the simple rules governing the cellular automaton in Conway's Game of Life made it especially suited to 8-bit computers, it took a 16-bit game, Populous, to place it in a broader context in a way that felt natural.
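
Those rules really are simple: the whole generation step fits in one short function. A minimal sketch in the same JavaScript, assuming a grid of 0/1 cells with everything beyond the border counted as dead (the names are mine, not Populous's):

function lifeStep(grid) {
	const h = grid.length, w = grid[0].length;
	const next = grid.map(row => row.slice());
	for (let y = 0; y < h; y++) {
		for (let x = 0; x < w; x++) {
			// Count the live cells among the eight neighbours.
			let n = 0;
			for (let dy = -1; dy <= 1; dy++) {
				for (let dx = -1; dx <= 1; dx++) {
					if (dy === 0 && dx === 0) continue;
					const yy = y + dy, xx = x + dx;
					if (yy >= 0 && yy < h && xx >= 0 && xx < w) n += grid[yy][xx];
				}
			}
			// The entire ruleset: live cells survive with 2 or 3 neighbours,
			// dead cells spring to life with exactly 3.
			next[y][x] = grid[y][x] ? ((n === 2 || n === 3) ? 1 : 0) : (n === 3 ? 1 : 0);
		}
	}
	return next;
}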


New Year's Predix: Peak QWOP

finger gymnastics & pocket tennis dept.

26 Feb 2026

Around the time that shoulder triggers became curved, game interfaces were increasingly standardized: hardly anyone used the right thumbstick for anything other than nudging the camera, the red button always cancelled, and so on.

Meanwhile, the experience of wrestling with unintuitive controls was shunted off into a separate genre, the QWOP Flash game and the awkward-physics simulator; as it happens, this was a good match for the particular brand of millennial humour best summarized as isn't it cute how much we suck? And match it did.

The genre slowly gathered steam, until eventually punters had become sufficiently inured to it that Totally Reliable Delivery Service was included in Game Pass. Now, this should have been the moment of triumph for qwoppers, the point at which they finally broke into the mainstream to become just another genre. That's not what happened, however.

Uninitiated normies looked at TRDS and wondered aloud why it didn't have better controls. I mean, why would you release a game with deliberately poor ergonomics? And here adherents found themselves suddenly under floodlights with their backs to the wall, because they had to explain what the point was.

And really, what was the point?


In Soviet modernism...

po-mo and slo-mo dept.

18 Dec 2025

B-movies used to be filler shown before the "serious" main feature. Then Lucas and Spielberg happened, and the B-movie became the main feature, special-effects budgets competing with the star power of actors. As it happens, this was around the time that the modernist Hitchcock gave up on directing films, and postmodernism became a thing.

Postmodernism vs. modernism was a lot like the USA vs. Soviet Russia. Pomo won because it could get everything on credit, but in the process it turned into a parody of itself. (The observant reader may have noticed that Derrida is Reagan in this metaphor, just rattling off whatever mental fluff crossed the little teleprompter the post-homunculus in his head was holding up.)

Pomo was opposed to "grand narratives", but it turns out it's perfectly possible to follow the little narratives right off a cliff, as well. For example: tariffs on international trade have surprisingly broad grassroots support, presumably because most people haven't the faintest idea how a microchip is made. Just like Derrida, in fact.