Olafur Eliasson

Visible Disappearance: Artificial Intelligence, Climate, and the Exhausted Body of Perception

We live in an era where artificial intelligence no longer merely analyzes or predicts; through its pervasive algorithmic mediation of information flows and interactions, it has begun to actively reshape and filter human perception itself. Visual flows, textual responses, and acoustic environments are now increasingly generated, refined, and adjusted by systems underpinned by predictive models. Consequently, we find ourselves existing within the very logic of digital interfaces, where reality is not so much represented as it is filtered, calibrated, and subtly tailored to the user’s presumed needs or desires.

This pervasive sensitivity, however, did not emerge suddenly. Its fundamental architecture was laid down as early as the late 19th century, when Herman Hollerith invented the tabulating machine, a device conceived for sorting the vast volumes of data produced by the 1890 US census. Throughout the 20th century, this concept evolved, notably into IBM punch cards and computational systems built on a clear logic of exclusion: either data matched predefined criteria, or it was discarded. Today’s artificial intelligence operates with far greater subtlety; it no longer strictly rejects, but anticipates; no longer excludes, but adapts. Yet at its core the principle remains constant: perception is processed. This represents a shift from explicit, disciplinary control to a more pervasive, adaptive form of algorithmic governance, one that subtly guides attention and shapes understanding.

It is precisely within this context that Olafur Eliasson’s work acquires a particular sharpness. In his new exhibition, slated to open at neugerriemschneider as part of Berlin Gallery Weekend 2025, he offers not just an immersion into space, light, and color, but a profound structural demonstration of perception as a living, embodied, and fundamentally non-algorithmic process. His installations, in this light, can be understood as reimagined computing machines, not built on exclusion, but on oscillation, on flux. If IBM once relied on punch cards, Eliasson now employs polarized filters. If the machine was historically driven by an inflexible code, his work is now centered entirely on the moving, experiencing body of the viewer.

Everything the viewer encounters within these spaces is indeed governed by the precise laws of physics: interference, depolarization, reflection, wave modulation. But this inherent precision does not, crucially, serve explanation; it serves experience. Eliasson does not seek to offer a definitive decoding of the world. Rather, he constructs situations in which perception itself becomes the primary operation. Projections that vanish with a mere shift in perspective, for instance, are not simply metaphors; they are compelling physical demonstrations of how fundamentally perception changes depending on context, on angle, on the specific conditions of engagement.

In one of the central installations, a gelatinous substance begins to tremble, subtly excited by sound waves. A carefully directed beam of light passes through it, creating an image that constantly, almost organically, transforms. This, perhaps, serves as a model of a new sensitivity: an embodiment of the invisible that nonetheless remains powerfully present. Physics here transcends its conventional role; the scene recalls an alchemical laboratory, with the artist appearing less as a scientist and more as a magician, transmuting nature into meaning.

In an era of automated attention and algorithmically managed images, Eliasson’s works demand not passive contemplation, but rather physical engagement – an involvement where perception itself becomes a tangible form of labor. His installations meticulously craft the conditions under which thinking and seeing are actively re-activated – not by guessing or fulfilling desires, but by gently returning the viewer to a slow, embodied, and distinctly non-automated gaze.

The Olafur Eliasson exhibition, then, functions as a spatial anti-system. It neither substitutes reality, nor rigidly structures behavior, nor optimizes emotions. Instead, it opens a dynamic field wherein perception evolves into a form of computation, but critically, one that requires the participation of a body: vulnerable, unstable, and undeniably alive. In an age where so many decisions are made without visible human involvement, these installations serve as a vital reminder: everything depends on your point of view – and crucially, on your own experience.

Everything that becomes transparent disappears.

In an age where algorithms meticulously adjust visual flows to fit preconceived expectations, disappearance has paradoxically become a primary mode of display. The climate crisis, rather than remaining an anomaly, has been aesthetically adapted; it has been seamlessly integrated into interfaces, into perception, and ultimately, into a collective fatigue. Anxiety is systematically dampened. The body, with all its messy unpredictability, is effectively removed from the frame. Everything seems to remain in its proper place, yet nothing truly touches us anymore.

Artificial intelligence, in this context, is not merely a metaphor. It has become an integral part of the very architecture of the real – not because it possesses an intrinsic reality, but because it profoundly shapes the surface of reality: what is visible, what is audible, what is deemed desirable. Anything that cannot be predicted is systematically excluded. Anything that fails to reinforce the prevailing model drifts into a vast, unperceived blind zone. In this crucial sense, AI and the climate crisis are not merely thematically connected; I argue they exhibit a profound structural homology in their impact on human perception and agency. Their underlying logic is shared: prediction instead of understanding, adaptation instead of resistance, optimization instead of difference. While acknowledging the potential of AI to offer new insights or tools for climate science, my focus here is on its parallel role in shaping a perceptual framework that subtly erodes our capacity for deep engagement with environmental crises. AI does not, in fact, generate vision – it meticulously formats it. And the climate, similarly, vanishes in precisely the same way – not as a dramatic event, but as a gradual smoothing, a slow, almost imperceptible erosion of our collective sensibility.

Training large-scale models requires an astronomical amount of energy. By published estimates, the full training of a single system like GPT-3 can emit hundreds of thousands of kilograms of CO2, a figure comparable to the annual emissions of over a hundred gasoline-powered cars. A single text is already computation, cooling, infrastructure. Behind every image lies a data center; behind every word, an industrial circuit. Dust, lithium, servers, freon, water: AI is far from a cloud-based abstraction; it is, unequivocally, a material industry. It cannot be separated from the very destructive processes it ostensibly attempts to model – indeed, it actively participates in the degradation it describes.
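As a back-of-envelope check on that comparison (both inputs are rough published estimates, not measurements: roughly 552 tonnes of CO2e for one GPT-3 training run, per one widely cited estimate, and about 4.6 tonnes of CO2 per year for an average gasoline car, per the US EPA), the arithmetic works out as follows:

```python
# Back-of-envelope check of the CO2 comparison above.
# Both inputs are rough published estimates, not exact measurements.

gpt3_training_co2_kg = 552_000  # ~552 tonnes CO2e for one training run (estimate)
car_annual_co2_kg = 4_600       # average gasoline car per year (US EPA figure)

cars_per_year = gpt3_training_co2_kg / car_annual_co2_kg
print(f"One GPT-3 training run ≈ annual emissions of {cars_per_year:.0f} cars")
# Prints a figure of roughly 120 cars.
```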

But the problem extends beyond the purely physical; it is fundamentally perceptual. Contemporary algorithms are trained on fleeting attention, on what is easily clickable, on what is visually “light.” Visuality, consequently, no longer represents; it filters. The climate, framed through relentlessly upbeat infographics and superficial “green” branding, ceases to register as genuinely destructive. It has transformed into a mere product, a transient mood: too well-lit, too polished to truly disturb. And precisely here lies the most dangerous shift: a disappearance that elicits no response, an erasure that is not even perceived as a loss.

Data centers may not outwardly resemble the world’s factories, yet they are actively manufacturing the very boundaries of visibility. Their architecture is deliberately neutral; their light, an artificial daylight. Their function remains largely hidden. Yet, these unseen structures determine precisely what is shown and what is excluded. AI does not censor; it optimizes what can be seen. It functions much like a climate system itself: cooling perceived heat, stabilizing variation, flattening peaks of difference.

The deliberate refusal of unpredictability constitutes a political silence. The elimination of difference is framed as a form of energetic efficiency. Anything that resists easy recognition is immediately marked as a fault. The human body (often deemed too slow, too reactive, too inherently unpredictable by optimizing systems) is thereby displaced from the sphere of automated perception. What remains might tremble, much like a vibrating surface in an installation; only by slowing down can one momentarily sense its lingering response. But even this sensation is temporary. Even this, in the logic of optimization, is deemed inefficient. Yet it is precisely this raw, unmediated capacity of the human body – when re-engaged and de-synchronized from algorithmic flows – that holds the potential to perceive the underlying difference: that something is profoundly wrong.

Modern perception has, regrettably, become complicit in an ecological problem – not because it lies, but because it predicts too well. And embedded within that very prediction lies a profound disappearance: you simply cannot see what you are not prepared to see. You cannot feel what has already been meticulously smoothed out. AI and climate catastrophe, therefore, operate as a single, intertwined protocol: one produces reality, the other collapses it. Neither, crucially, registers as a tangible presence. This ongoing process, if left unexamined, risks diminishing our collective ability to respond to critical systemic failures. Only the body – untrained, unadapted, unsynchronized – can, for a fleeting moment, perceive the underlying difference: that something is profoundly wrong. Everything is visible, yet nothing truly resonates.

And perhaps this is all that truly remains: the right to unpredictability within perception. The right to error. To trembling. To a light too sharp, too insistent to be smoothed away.

Liza Kin
Berlin, May 3, 2025

Olafur Eliasson Exhibition at neugerriemschneider
Christinenstraße 18–19, Haus 9
10119 Berlin, Mitte
May 3 – August 9, 2025