
When ‘digital’ came to town

I still remember the Christmas my sister was given one of the very first computer games as a gift from our parents. Plugged into our television, it let us play a virtual game of tennis, batting what I now know to be a collection of pixels against the sides of the screen so that it bounced around at random.

I must have been about 11 at the time, and I still recall the recognition that something fundamental had changed about the world I lived in, though it would be several decades before I could articulate it in a way that made sense.

That Christmas morning was a turning point, not just for me but for all humanity. We didn’t recognise it as such; it was, after all, just a child’s game. But those who saw it for what it was recognised that change was on the horizon, and that we would need a way of living in a new era defined by technology so ubiquitous it would literally take over our lives, and perhaps even us.

That change is upon us, and if we aren’t to be swept away entirely, we must find a way to reassert what it means to be human in an age of advancing and encroaching technical development. It would be easy to assume from this statement that I see the world of technology in a poor light, yet nothing could be further from the truth. After all, I have spent the best part of three decades working in this world and recognise the value it offers daily to billions of people worldwide. However, I have also come to recognise that it has the capacity to challenge what it means to be human as we race ever closer to an era where implanted chips dictate not only our current lives but our future lives as well.

But I’m getting ahead of myself. First, we need to cycle back to the late 1950s, when our current ‘digital age’ began in earnest.

The internet originated in the military intelligence complex of the post-war years as a way of ensuring that communication between strategic locations could continue in the event of a nuclear strike. It was later used to monitor what was considered at the time to be anti-American sentiment and activism, most notably as part of the era’s anti-communist agenda. ARPANET (the Advanced Research Projects Agency Network), as it was called, consisted of a complex of interactive computing sites (nodes) located at American universities, and additional nodes have been added steadily over the decades.

The web, in contrast to its underlying enabling infrastructure, the internet, was founded on the principle of open access to the world’s information. When Tim Berners-Lee invented the World Wide Web in 1989, he thought he’d created an egalitarian tool that would share information for the greater good. As he recalls, “It (the web) consisted of one website and one browser, which happened to be on the same computer”. This simple setup demonstrated a profound concept: that anyone could share information with anyone else, anywhere, regardless of who they were.

Over the coming months, I want to consider how technology changes our relationship with the world. Do we see the world as more relevant or increasingly irrelevant? Are we more likely to endanger the environment and our habitat because we feel increasingly alienated from nature? Or are we more likely to save it because we can see more readily the destruction we’ve created?

None of these questions is a purely digital issue; they have always been around, if only we’d had the time and inclination to look. They were less obvious then, and often geographically limited, so only those directly affected knew or cared what was happening. Now, with access to reports, research, information, videos, news and opinions within seconds of their being posted, we’re in danger of being so overwhelmed that we switch off again, not through lack of awareness, but simply because it is too much and we cannot take it all in.
