The Ghost of Labor
Scraping the calcified residue off a 1947 neon advertisement for a defunct radiator shop requires more than just patience; it requires a willingness to touch the ghost of another man’s labor. My finger, currently wrapped in a piece of blue painter’s tape, throbs where a fresh paper cut from a morning invoice reminder sliced through the skin. It’s a tiny, sharp irritation that keeps me tethered to the physical world while my mind drifts into the digital absurdity I encountered last night. I was sitting at this same workbench, the smell of ozone and old metal thick in the air, when my phone buzzed with an advertisement for a specific brand of argon-mercury gas mixture.
I had only ever mentioned that specific mixture inside a supposedly ‘secure’ communication app three hours prior. The app prides itself on its 256-bit encryption, its military-grade protocols, and its insistence that not even the developers can see my messages. And yet, there it was. A glossy, targeted invitation to spend $497 on a canister I hadn’t even searched for in a browser. It felt like someone had walked into my shop, looked over my shoulder while I was writing a private letter, and then whispered a sales pitch in my ear. This is the cognitive dissonance of the modern age: we are sold the illusion of a fortified castle while the king is busy selling the floor plans to the highest bidder.
Honest Materials vs. Flimsy Intentions
I’m Jackson F.T., and I spend my days restoring the physical manifestations of trust. When a sign was built in 1957, the porcelain enamel was meant to last for 87 years. There was an inherent honesty in the materials. If the sign was bright, it was because the gas was pure. If the structure held, it was because the steel was thick. Today, our digital structures are built with the flimsiest of intentions, wrapped in the most complex of vocabularies. We talk about ‘security’ as if it were a singular thing, but we have reached a point where technical security, the stuff of mathematics and algorithms, has been completely decoupled from ethical security.
Encryption is a magnificent tool. It ensures that if a third-party hacker intercepts my data packet while it’s bouncing across 17 different servers, they see nothing but gibberish. That’s the lock on the door. But what good is a lock if the house is built of glass? The platforms we use are designed to be technically secure from outside threats precisely so they can maintain a monopoly on the data themselves. They protect us from the ‘bad guys’ so they can be the only ones who get to harvest our behavioral profiles. They aren’t selling our messages; they are selling the fact that we sent them, to whom we sent them, for how long, and the emotional sentiment derived from the metadata.
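A minimal sketch of that distinction: even when the payload is opaque ciphertext, whoever relays the message can still profile the traffic from its routing metadata alone. The envelope fields and names below are hypothetical illustrations, not any real app’s schema.

```python
from datetime import datetime, timezone

# Hypothetical message envelope as a relay server might see it:
# the body is unreadable ciphertext, but the metadata is not.
envelope = {
    "sender": "jackson@example.net",
    "recipient": "supplier@example.com",
    "sent_at": datetime(2024, 3, 14, 21, 5, tzinfo=timezone.utc),
    "size_bytes": 1184,
    "ciphertext": b"\x9f\x03...",  # opaque without the key
}

def profile_from_metadata(env: dict) -> dict:
    # No decryption required: who talked to whom, when, and how much
    # is already enough to start building a behavioral profile.
    return {
        "pair": (env["sender"], env["recipient"]),
        "hour_utc": env["sent_at"].hour,
        "size_bytes": env["size_bytes"],
    }

print(profile_from_metadata(envelope))
```

The lock on the content holds; the record of the conversation does not need a key.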
The Vault of Betrayal
It’s a peculiar form of betrayal. It reminds me of a sign I once worked on for a local bank that went bust in the late 70s. The vault was a masterpiece of 27-ton steel, yet the bank failed because the directors were quietly siphoning funds from the inside. The vault stayed closed. The security was technically perfect. The ethics, however, were nonexistent. We find ourselves in that same vault today. We trust the ‘bank-level encryption’ of our social tools, our work platforms, and our organizers, forgetting that the bank itself is the entity we should be worried about.
There are approximately 897 different data points that can be extracted from a single afternoon of ‘secure’ usage. These points are fed into a machine learning model that knows I am looking for vintage neon components before I’ve even fully committed to the purchase. This isn’t a conspiracy; it’s a business model. The frustration lies in the marketing. If a company tells me they are ‘secure,’ I expect a sanctuary. Instead, I get a surveillance suite with a very shiny padlock on the front gate. I’ve grown tired of being a product disguised as a user. I want tools that behave like my 1947 radiator sign: heavy, honest, and entirely indifferent to who is looking at it as long as it does its job.
Pockets of Resistance
In my search for something that didn’t feel like a trap, I started looking into ecosystems that actually prioritize the person over the profile. It’s a rare thing to find. Most companies want to scale until they are large enough to be bought by a conglomerate that will inevitably strip-mine the user data. But there are pockets of resistance. There are systems designed with a different architecture, one where privacy isn’t a feature you toggle on, but the foundation the whole thing is built upon. This is where ems89 enters the conversation for me. It represents a shift away from the ‘secure-but-selling’ model toward something that actually respects the boundaries of the digital individual. It’s the difference between a shop that lets anyone walk in and touch the tools, and a private studio where the work remains between the craftsman and the client.
When I’m working on a sign, I have to be careful with the solvents. One wrong move and I can strip away forty years of history. Our digital history is being stripped away every second, but instead of being discarded, it’s being archived in 127 different databases across the globe. We have accepted this because the convenience is high and the technical jargon is intimidating. We hear ‘end-to-end’ and we feel safe. We shouldn’t. We should be asking where the ends are, and who owns the air between them.
The Screen as a Barrier and Vacuum
I remember a specific incident involving a restoration for a client in 2007. He wanted his father’s old pharmacy sign lit up again. We spent 47 hours just cleaning the internal glass. While we were working, he told me how he hated the new digital medical records because his pharmacist now spent more time looking at a screen than at his face. That screen is a barrier, but it’s also a vacuum. It sucks up the nuance of human interaction and turns it into a series of predictable triggers for pharmaceutical ads. That pharmacy sign, once restored, just sat there and glowed a warm, steady red. It didn’t ask for his location. It didn’t track how long he looked at it. It was secure in its purpose.
The Digital Paper Cut
This paper cut on my finger is still stinging. It’s a sharp, localized pain that demands my attention. I think the digital world lacks this kind of feedback. If our data being sold caused a physical sting, we’d all be covered in bandages. Because the theft is silent and the betrayal is wrapped in 2047-page terms of service agreements, we don’t feel the wound until the ad appears on our screen, mocking our belief in our own privacy. It’s a psychological paper cut, a tiny tear in the fabric of trust that eventually leads to a total detachment from the tools we use.
We need to stop praising encryption as the be-all and end-all of privacy. It is the bare minimum. It is the lug nuts on a tire. You need them to drive, but they don’t tell you where the car is going or who is recording your route. The real revolution won’t be a better algorithm; it will be a return to ethical architecture. It will be the platforms that choose to be blind to our data not because they can’t see it, but because they have designed themselves to be incapable of wanting to.
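To make ‘incapable by design’ concrete, here is a toy sketch of the end-to-end shape: the key is generated and shared only between the two endpoints, so the relay server never holds anything it could read. The XOR one-time pad below is purely illustrative; real messengers use vetted cryptographic protocols, not this.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy one-time pad: XOR each byte with a random key of equal
    # length. Illustrative only; do not use for real secrets.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# The key exists only at the two endpoints, never on the server.
message = b"argon-mercury mixture, 25mm tubes"
key = secrets.token_bytes(len(message))

# All the relay ever stores is ciphertext it cannot interpret.
server_copy = encrypt(key, message)
assert server_copy != message

# Only the recipient, holding the key, recovers the content.
assert decrypt(key, server_copy) == message
```

The design choice is the point: a server built this way cannot be compelled, or tempted, to read what it carries.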
Transparency Over Opaque Security
I look at the 107 neon tubes I have lined up on the drying rack. They are fragile, beautiful, and completely transparent. You can see the gas through the glass. You can see the electrodes. There are no hidden compartments. In a world of opaque ‘secure’ apps, I find myself craving that transparency. I want to know exactly what is happening to my information. If a platform is scanning my messages to ‘improve my experience,’ it’s not a secure platform; it’s a meddling neighbor. If it’s tracking my location to ‘provide local results,’ it’s a stalker with a brochure.
The dissonance is reaching a breaking point for many of us. We are tired of the bait-and-switch. We are tired of being told that our privacy is a priority by the very people who profit from its destruction. It’s time to look for the signs that are built to last, the platforms that don’t have a hidden agenda behind the encryption. I’ll keep my blue tape on this paper cut for another 7 hours, and I’ll keep my skepticism about ‘secure’ apps for a lot longer than that. We deserve a digital world that is as honest as a piece of 1947 steel. We deserve to be more than just a behavioral profile in a database.
The Cost of Free
I’m going to finish this sign. It’s going to glow with a steady, unblinking light, and it won’t report back to a server in some far-flung data center. It’s a small victory for reality over the simulation. But as soon as I wash the grease off my hands and pick up my phone, I’ll be back in the glass house. The only way out is to change the house entirely, to move toward ecosystems that don’t see our lives as a resource to be mined. We need tools that are actually on our side, not just technically compliant while they pick our pockets. It’s a long road back to digital dignity, but I think I can see the neon flickering at the end of the tunnel. It’s not an ad; it’s a signal.
And for once, I think I’ll follow it.