Emperors’ New Virtual Clothes

Many predictions are being made about the potential impact of ChatGPT. But, what if it’s not predictions we should be focused on, but questions? What if those questions aren’t specific to ChatGPT itself, but extend across many and diverse aspects of our present and possible future?

Melissa Sterry. Published January 25th 2023

ChatGPT is a genre of innovation that raises questions of what it means to be human, or not, and what rights ought to be allocated accordingly. It’s not a new question. For all recorded history, and the signs are for long before that, humanity has lived between places real and imagined. In that sense, the notion of the ‘Metaverse’ is nothing new. There’s always been a virtual world; the only difference today is that some of the tools used in the construction of that imaginary world are different to those of the past. Likewise, and again for at least the sum of recorded history in its text, image, and oral forms, there have always been technologies that augmented human capacity. Some imagine that earlier forms of augmentation (e.g. flints) were purely practical; they weren’t. Anthropological studies of both ancient extinct and ancient living cultures show that humans seek experiences both real and imagined. Those worlds can become manifest in stories fabricated using all manner of tools – words, images, masks, performances, use of hallucinogenic substances, and more. Entering those worlds invariably involves a form of ritual, and one that’s usually affiliated with a particular place or places.

Records of our earliest known forms of imaginary worlds show that our ancestors were every bit as creative, if not more so than we are today, at creating virtual spaces. For example, at the advent of the invention known as the ‘city’, together with the built city, virtual twin cities also emerged. The purpose of the latter, of the avatar cities, is not fully understood. However, we know that these imaginary cities served a partly spiritual purpose. We might also posit that their role was akin to a form of blueprint – the embodiment of qualities that were desired in the built city. Yet our ancestors’ imaginations were richer still. We see this expressed in the many early histories of numerous cultures of the Classical world, and indeed earlier. What these histories show us is how intimate the relationship between the real and the imagined was in ancient times. The stories that tell these histories speak not just to avatar cities, but to avatars in human, animal, and hybrid form. Their purpose was not mere entertainment and folly; they served, among other things, as educational tools – a lens through which complex constructs could be explained.

Over time, the tools we have at our disposal to create imaginary worlds have greatly expanded. One of the most sought-after tools of the early Medieval period was stained glass – then a technology at the very cusp of innovation. Walking into the synthetic rainbows cast by the light through church and cathedral windows would, to the peoples of that time, have felt an extraordinary experience – one that would have helped transport their minds to the places in the psalms and prayers they would hear during sermons. Roll forward a few centuries, and one finds that the theatre arts were booming and, consequently, abundant in new technologies and methods for transporting audiences to imagined places both near and far. Roll forward again, and one arrives at the advent of photography, animation, and film, and to an age when, ultimately, the stories we told and still tell became so many as to be innumerable.

The line between the imagined and the real has always been a thin one, partly because even imaginary worlds can inspire individuals in the real one. We see this expressed in fashion. All but those that have no choice in their attire are, whether wittingly or otherwise, expressing their values, their ideals, and their sense of self through their clothes and wider appearance in public. What people wear and how they wear it shows the tribe with which they affiliate, and how they wish to be seen. Those that study fashion and psychology can tell, in an instant, much about an individual by how they dress. But, of course, fashion famously changes. Our ideas of what we want to wear, and how, typically evolve over time, and this process is reflective of how we, as individuals, develop. Typically, an individual’s internal world – who they are and what they want to be and why – is greatly influenced by their external world, and all that world contains. The curious and creative tend to be attracted to the novel. Few places provide greater novelty than imagined cities and other spaces.

In the 1980s and early 1990s, one of the mediums in which we saw some of the most influential imagined spaces, with respect to the fashion industry, was magazines. More specifically, British Vogue became a stable for the world’s most promising fashion photographers, who, working in collaboration with a generation of models – of ‘super’ models – and with them pioneering stylists, make-up and hair artists, created dreamscapes which, though every bit as fictitious as the avatar cities of ancient Mesopotamia, catalysed the aspirations of readers, driving sales of apparel, accessories, perfume, and more. Would the purchase of a bottle of fragranced water really transport an individual to the paradise that was printed on paper, or simply dent their bank balance?

Thin though the line between the real and imagined has always been, in some ways the challenge this presents is arguably greater today than ever before. In the first instance, we have the matter of sustainability – of the fact that we’re all but out of ‘Get Out of Jail’ cards in this global game of environmental Monopoly. Today it matters more than ever before that we understand – and understand clearly – the real-world impacts of our actions (i.e. understand the carbon and wider environmental footprints that are being created by our now many, and still emerging, digital technologies). Arguably, it also matters more that we’re able to distinguish between the real, the imagined, and the fake. I’ve delineated ‘imagined’ and ‘fake’ because, though the line between that which is real and imagined is, at times, a blurred one, the line between what is ‘real’ and ‘fake’ is a distinct one.

Yet another factor that complicates this already complex issue is the fact that today stories real, imagined, and fake are being created at astronomical speed. Howsoever fast the authors of the early civilisations may have created stories, the spread of those stories was relatively slow. In the first instance, the spread of the stories, and of the many imaginary worlds in which they took place, was limited to the size of the audiences to whom the tellers spoke directly. The invention of the printing press saw the speed of the spread of stories expand near exponentially. But today stories are being authored in abundance every moment of the day and, what’s more, many are being read, heard, and seen with immediacy. The more stories that populate this now highly human-centric world, the more confusion and contention emerge between the real, the imagined, and the fake. Is this the ‘Information Age’ or the ‘Mis-Information Age’? It’s becoming hard to tell.

When we think to times past, one of the lenses through which we might consider the scale of the challenge before us is that of religion. When we look to many conquests of the past, we find that peoples that had practiced one religion were often forced – upon their integration into a new empire – to practice another, at least in public. Sometimes, when this happened, old religious stories were rebranded as new ones, which is one reason why, when we look to religious iconography, we see gods and mortals alike are subject to the whims of fashion. For example, we often see paintings of a religious scene in which the subjects are wearing attire from many centuries after the scene is set. I think of this as the equivalent of the movie remake… of updating a script, of harnessing new developments in filmmaking, and knocking out a new version of an old classic. We do it with movies. We do it with songs. We do it with paintings. We’ve always done it. We always will. It seems we humans like repetition.

So, what can we learn from the past? Firstly, we can learn that in some ways everything stays the same, yet is different: we will continue to tell many of the same old stories, but with new tools. What else? We can learn that some stories are told with good intent, and some bad, and that sometimes some of the best storytellers are the ones with the worst intent. Of old, emperors burnt books and many other things, temples included, in an effort to rewrite history. Thankfully, even the most destructive of them typically failed to erase realities that are today read in fragments of information that weren’t torched, and with them archaeological artefacts, among many other things. Those seeking ultimate power seek to dominate public narratives. They don’t want the story to be that someone else, somewhere else, has the answers – any answers. This is why the dictator’s playbook invariably includes destruction of the arts, of the humanities, and of those individuals and institutions that educate and otherwise illuminate the richness and variety of the lenses through which we might see, hear, and sense the world.

Should we be afraid of the tools that enable new ways for stories – imagined and otherwise – to be told? Given that, fundamentally, the problems these tools present are not new, possibly not. However, we do need to think about what we can do to try to ensure that these tools don’t exacerbate these problems further. For example, one of the key questions around AI-generated content is that of its value, and more specifically as relates to content generated by other means – by humans. Should artificial intelligence systems be attributed as authors? Should these systems have rights? How should the content they generate be remunerated? Is such content ‘art’, ‘science’, or other? What might be the unintended consequences of attributing artistic and other works to AI systems, not the humans that programmed those systems? How might assumptions in the programming prove problematic?

Reflecting on just one of those questions – that of whether a work that has been generated by an AI system is art – I would argue no, it’s not art. What is it? I think it’s an aesthetic construct that’s being confused with art because some people don’t understand that ‘art’ isn’t just about what you ‘see’ or ‘hear’ – it’s not about mere optics or sound, it’s about what you think, what you feel, and the impact those thoughts and feelings have on your personhood. Can an AI system – a system whose ‘thinking’ is essentially binary and sensory-simplistic, and that is devoid of the capacity for emotion – possibly comprehend what philosophical and other questions it would be pertinent for a human to reflect upon?

These and other questions are important. One reason they are important is the fact that, though inequality has always existed, today, in Britain, in the United States, and in India, among other places, the scale of that inequality is immense. Today, the wealth gap is so huge, and with it the social mobility gap, that, if we’re to be frank, very many people have very limited life chances. Some suggest that life expectancy and living standards will go up because, in some places, that happened for a while. But it’s not happening now, this being a time when that trend has lurched into reverse. More generally, what is happening is a very small minority of people getting very, very rich and, with it, very, very powerful. Many of those people work in corporations – tech corporations. In some instances, the net worth of the founders of these corporations is greater than that of some nations. Today, some of these individuals are lauded like gods. An overstatement? Perhaps. But, arguably, the faces of those to whom I refer are as familiar to many people today as the faces of deities of old were to the peoples of ancient cultures. As are their names. When they speak, many listen. However, they don’t just listen; they publish what was spoken or written in newspapers, in magazines, in blogs, in social media – and in copious kind. Some such figures even have followings akin to those of cults – followers that will defend their actions, and hail them as the great leaders we all need.

How might the already extremely rich get even richer in the years ahead? And no, that’s not an invitation to hire my services to map said trajectory, unless of course your intent is that of challenging said possibility. One way they might is to further co-opt many artistic and other original works to make money in a way that is deeply inequitable to the authors of those works. We’ve already seen umpteen creatives – creatives working across the entire creative industries – confronted with a situation that sees their works copied and plagiarised at a pace and a scale that is near impossible for many to keep up with. A couple of years ago I asked my former business partner and friend, the late music producer Steve Brown, his thoughts on the Metaverse and music. His primary concern was that which is echoed across the music industry – that of platforms not remunerating artists accordingly, and the impact that was having on talent both new and old. Studies have shown that, commercially speaking, pop music has become more samey, and at a time when there is potentially greater access to original talent than ever before. Why is that? Why are songs that chart becoming more homogeneous? There are arguably several reasons. Yet foremost is the fact that tech – tech platforms – are a product of a commercial culture that values quantity over quality, and that grossly undervalues original ideas – ideas with something novel, interesting, and relevant to say. Or, put another way, ideas with artistic integrity. Hence, one might also argue that, despite all the potentialities that digital tech might offer singer-songwriters and musicians, many simply aren’t reaping the financial benefits they might. Who are? You’ll get no prizes for guessing.

Equally, the matter that many tech platforms benefit from the intellectual property of many other types of artists – of painters, photographers, performers, filmmakers, etc. – in a financially inequitable way is doing damage. What kind of damage? At this point I’ll draw on anecdotal evidence, more specifically the fact that, for the greater part, the creative freelancers I know are struggling financially. They are struggling because, no matter that some of them are among the most lauded talents in their respective fields, their incomes are typically highly stochastic and now sizeably lower than many might expect. The implications of that for their work are significant. Time that could and should be spent advancing their portfolios and projects is spent fretting over their finances. The longer the situation persists, the greater the emotional burden. The situation is now so acute that some are considering leaving their professions.

At the other extreme, today’s pay-per-click culture sees many of those that don’t originate new ideas, but that extract value from the ideas of others – influencers – earning far more than their collective contribution arguably merits. Essentially, influencers amount to advertising agents whose occupation it is to flog products and services. Barely any influencers contribute original ideas and works to the creative industries, instead typically adopting a stereotypical style of imagery and narrative. Though, arguably, their advertising services merit remuneration, they amount to just another cog in the wheel of an economy of extraction over origination. The online culture of which they are a part has reduced the web to a place dominated by replicants and replications – to a place consumed by tired visual narratives and tropes that are being dished up time and again. So very consumed are some brands by the influencer advertising model that they now hire established creatives from the performing arts to create selfie-style content, which is invariably devoid of any of the artistry that seasoned creative directors and production crews can create. True, many a household name has sold their image to sell products and services in the past, and true, some of those advertisements were cringe-inducingly awful. But at least ad content that’s been crafted by creative directors more typically shows some – as their title suggests – creativity. What, more usually, performers, such as actors, reveal when they take to their smartphones to promote a hotel or a beauty product or a clothing range or more is just how much they rely on film and other directors, and the teams of creatives that make movie and other magic happen. There’s a reason why David Bowie, and with him creatives of his calibre, never sold their souls to sell something other than their creative works, and that’s the fact that they understood that monies earned from the cult of celebrity diminish the value of their personal brand over time.

We’re witnessing the industrialisation of the arts. In the first instance, this industrialisation enabled the democratisation of the pursuit of many creative endeavours. Be they in the form of software, of networks, or otherwise, the first generation of digital tools provided creatives with new ways to create works and, having created those works, to promote them. For many, myself included, new professional horizons were opened, and in novel, equitable, and exciting ways. However, but for some exceptions, the dominant digital players today increasingly bite the hands that feed them. Consequently, some of the most notable contemporary creatives now avoid engagement with many platforms, and even when they do engage online, they do so in context-specific and limited ways. These individuals recognise that, though some digital platforms, namely social networks, facilitate the sharing of their creations en masse, they do so in a way that sees their intellectual property infringed, and with it their earnings – earnings that could go into developing new works, both their own and those of others they wish to nurture and support. Today’s dominant social media platforms are akin to factories – factories that relentlessly create digital artefacts from which they draw remuneration while largely not paying those that researched and developed those works, and that brought them into being. The executives of those networks might argue that the creators get something in return – they get exposure. What they neglect to consider is that the exposure is typically outside of the creators’ capacity to control, and sometimes that has serious adverse consequences – consequences it appears many in the tech industry take no responsibility for. Prince was just one artist who, it seems, very much understood this issue.

Undesirable though the situation of present is, we might ask – will things get even worse? Quite possibly. How? Why? Artificial intelligence systems don’t magically manifest content, be that content branded as ‘art’ or otherwise. No, what happens is that AI systems ‘learn’ by looking at other, pre-existing content – content made by humans, and often content that is the result of not weeks, not months, not years, but decades of trial and creative error. Ground-breaking concepts don’t typically come in a ‘light bulb’ moment. Instead, they emerge in consequence of an individual or a group of individuals undertaking a period of typically systematic research and development: questions are asked, answers are considered, and in a process that engages both the conscious and subconscious mind, both of which are shaped from birth. Most creatives have particular preoccupations from the outset – applying their talents to very particular genres, in very particular ways. The works they create are not merely informed by their various studies and the schools of thought to which they affiliate, but by their emotions, their values, and their beliefs. Not all, but many creatives – and certainly all those that make significant contributions to their fields of professional activity – don’t create art for art’s sake. They create their art with intent to make a meaningful impact in addressing issues they care about. This is why, when we look to the lives of the most notable creatives of all time, most, if not all, have to a lesser or greater degree been martyrs, in the sense that they were prepared to make sacrifices for their art.

An irony of the term ‘machine learning’ is that, in the wider sense, AI systems don’t ‘learn’ as such. Instead, they identify patterns. They do so for various purposes. But, in the context of AI content generators, they identify those patterns so that they can replicate them. Consequently, when an AI system generates an image or a sound or a text, even if that image or sound or text is ‘different’ to what came before, it is only superficially different. Historically, artistically, a ‘copy’ meant something that was the same as another work. In this context, it’s not one other work that’s being copied, but elements of several. How will intellectual property lawyers deal with this issue? How will copyright law, among others, need to evolve to accommodate the infringements AI can potentially make on the IP rights of artists, designers, and other creatives?
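To make that pattern-replication point concrete, here is a deliberately toy sketch – a character-level Markov model, far simpler than any modern generative system, and offered only as an analogy. Every character it outputs is drawn from patterns found in its source texts; the result can look ‘new’, yet is, by construction, a recombination of elements of several works. The ‘source works’ below are invented for illustration:

```python
import random
from collections import defaultdict

ORDER = 3  # length of the pattern the model conditions on

def build_model(texts, order=ORDER):
    """Identify patterns: record which character follows each n-gram."""
    model = defaultdict(list)
    for text in texts:
        for i in range(len(text) - order):
            model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, seed, length=80, order=ORDER):
    """Replicate patterns: every character emitted once followed the
    current pattern somewhere in the source works."""
    out = seed
    for _ in range(length):
        followers = model.get(out[-order:])
        if not followers:
            break
        out += random.choice(followers)
    return out

# Two invented 'source works'; the output recombines elements of both.
sources = [
    "the artist paints what the artist feels about the world",
    "the machine paints what the machine finds in the data",
]
print(generate(build_model(sources), seed="the"))
```

Large generative models are vastly more sophisticated than this, of course, but the underlying point stands: nothing appears in the output that was not, in some statistical sense, already present in the inputs.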

We can gain a sense of the extent to which artificial intelligence could further concentrate wealth if we consider the following example. Flip through many a late 20th-century magazine and you would see countless advertisements featuring models either on location or in a studio. Creating these images took teams which would typically include not just the model/s and the photographer, but one or more photographic assistants, a make-up artist, a hairdresser, a stylist, and sometimes an art director. They weren’t the only individuals that enabled the shoot to happen. Behind them would be the agents of the various parties present, and with them other suppliers, like prop hire firms. Behind all these companies were yet more, such as the lawyers hired by the various agents to draw up the contracts for the various parties involved in the shoot, the ground and air travel companies that got the crews to the shoot and, when on location, the hotels and others that supplied accommodation. The list goes on. Put succinctly, creating photographic shoots required an ecosystem of skills and services. Recently, in China among other places, some companies have been creating advertisements for which they’ve hijacked the imagery of others. It’s a process that involves modifying an advertisement or other image owned by another organisation or individual. Those modifications include editing the models, both in an attempt to hide the theft of intellectual property and to make the advertisement more appealing to the demographic at which the products or services it features are being targeted. Yet it’s not just IP that’s being stolen, it’s identities too – those being the identities of people that often don’t know the theft has taken place, that don’t know their image is being used by parties without their permission. Technically, as has become all too evident with deepfake media, it’s possible to steal someone’s identity using digital tools alone.

However, recent investigations have exposed that some have created fake auditions, wherein models undertake ‘test shoots’, are told they didn’t get the assignment, but find the images that were taken are used without their consent, let alone remuneration. In a word, theft. This is part of a much bigger pattern of activity – wherein those without integrity, without morals, and without decency steal other people’s works and identities.

Arguably, there are many good things that can come from the still-emerging advances in artificial intelligence. Both in the sciences, and indeed in the arts, there are many ways that AI can be helpful to the many, not the few. There are, for example, scenarios in which it is beneficial to largely digitise the process of image creation, be that for advertising or otherwise. This process can, in some cases, bring down the likes of carbon footprints. Equally, as the above illustrates, there are many ways in which AI can do more harm than good. It’s simply not fair when those that originate ideas are not remunerated proportionately to the use and impact of their work. It’s not fair when tech firms make it easy for their users to steal the intellectual property and identities of others without their permission, let alone remuneration. It’s especially not fair when those that make it possible to take the IP and ID of others without their permission or remuneration make eye-wateringly vast fortunes. The matter that it isn’t fair is something that artists and other creatives across many fields are now working to highlight, and to find means of redress for. And, doubtless, it won’t just be their voices they use, but their art – art in all its forms. These artists and other creatives span the creative industries, and at all levels – from those just starting out to some of the most established and respected worldwide.

When we think to the possible unintended consequences of AI for the arts and the many other domains that generate intellectual property, we need to think beyond money. We need to think, too, about how inequity in the arts, in the sciences, and in the humanities harms the outputs of these fields.

Fundamentally, although for all the discoveries we make and innovations we create across the creative industries there is re-invention and repurposing of old ideas, none of them would birth genuinely new ideas if all creatives did was regurgitate old material time, time, and time again. If that was the sum of what it meant to work in these industries, they wouldn’t be called ‘the creative industries’, but the ‘recreation industries’. Birth requires death – in the case of innovation, the death of old ideas. We don’t have epiphanies if we aren’t prepared to release our assumptions of the past. The great masters don’t birth new schools and movements by simply serving up pastiches of the masters that came before. They do observe. They do study. And they do something no artificial intelligence is currently able to do – nor, we might posit, may ever be able to do: they feel. More specifically, artists feel the moment, and that is highly pertinent to their work. Those that think that you can ‘recreate’ a Leonardo da Vinci or a Pablo Picasso using AI are profoundly wrong. What AI can do is digitally recreate something that looks like it may, to the untrained eye, and within a very specific context, have been authored by those or other artists. What AI can’t do is recreate the context of the works it imitates – context that is temporally, spatially, philosophically, and historically specific.

The Leonardos and the Pablos of the world are not especially concerned with the breakthroughs their creative forebears made. They are not concerned with ideas that someone, somewhere, has already realised. They know that things have moved on and that what matters now is not what has been done, but that which is yet to be done. They are interested in the possibilities that others haven’t yet explored. Indeed, the ideas that consume them are so novel and original that they themselves aren’t yet able to fully articulate them. Art is a process of discovery, as is pursuing creative endeavours more generally. Creators don’t have maps and blueprints and other things that show them the way. They find the way, intuitively and otherwise. A pattern, a pattern as identified by an AI system, is, in effect, a way. A pre-defined way.

Most of the Leonardos and the Pablos of these times are little, if at all, known to the wider public. In many cases, they have ideas at least as interesting and potentially useful as the men that dominate the Rich List. Yet few of these ideas have visibility. Any visibility. Which is ironic, because though today there is more content than was conceivable in the past, that content has by and large become homogenous. Case in point: go to your browser of choice and type in the words ‘sustainable city’ or ‘green city’ and see what imagery comes up. Then, having looked at that imagery, ask yourself this: ‘Do you think, do you seriously think, that what you see is representative of the diversity of ideas that have been researched and developed in this domain over the last several decades? Is that the best we can do? Or might you think that what you see is a consequence – a direct consequence – of the combination of not merely quantity over quality of content [of far too many works being authored by those with little if any expertise in the field, and even then, largely not bothering to do much research], in combination with flawed search algorithms that are too unsophisticated to distinguish between expert insight and original ideas on the one hand, and tired assumptions and outdated ideas on the other?’ Might that just be possible?
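As a caricature of that algorithmic failure mode – and it is only a caricature, with names and figures invented for illustration – consider a ranking function that scores content purely on past engagement. Whatever already resembles what ranks well accrues clicks and rises further; original work, starved of clicks, never surfaces, however expert its authors:

```python
# A hypothetical, deliberately crude search ranking. All titles and
# click counts below are invented for illustration only.
catalogue = [
    {"title": "Glass towers with rooftop turbines (stock render)", "clicks": 90_000},
    {"title": "Glass towers with rooftop turbines (another render)", "clicks": 85_000},
    {"title": "Peer-reviewed bioregional city study", "clicks": 1_200},
    {"title": "Original urban ecology proposal, decades in the making", "clicks": 800},
]

def rank_by_engagement(items):
    """Popularity-only ranking: no measure of novelty, expertise, or rigour."""
    return sorted(items, key=lambda item: item["clicks"], reverse=True)

for item in rank_by_engagement(catalogue)[:2]:
    print(item["title"])
# The first page fills with near-duplicates; the expert and original
# works, lacking clicks, never appear - and so never gain clicks.
```

Real search engines weigh far more signals than this, of course; the point is only that any ranking dominated by engagement will compound homogeneity.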

The old saying goes, ‘less is more’, and certainly a browser search of 15 years ago yielded more interesting results than one today with respect to not one, but many subjects, including that mentioned above.

Many things have become mass-produced since the advent of the industrial age, and not just products. What we’re looking at today is mass-produced content: content churned out in truly vast volumes, but typically at low quality. I state ‘low’ because much of this content is a consequence of briefs that were too short, budgets that were too low, and editorial ambitions that had been quashed by a business model that made quantity, not quality, content ‘king’. Sadly, all too many digital publications today publish the contents of press releases verbatim – without so much as questioning even obvious flaws in many concepts and proposals, both in and beyond the domain of sustainability. Arguably, in most cases it’s not a case of intentional ‘see no evil, hear no evil’ editorial, but of editors working within such tight constraints that they can’t deliver to reasonable standards.

Technically, there are all manner of ways that we could create a veritable content revolution – an inspiring new dawn in online content of many and varied kinds. But doing that would require ditching a highly problematic, yet dominant, content revenue model – online ad content – and experimenting with something else. It would no longer be good enough for online publishers to use catchy but misleading headlines to get more site traffic. It would no longer be good enough for them to churn out content that didn’t bother to question assumptions, let alone fact-check. It would no longer be good enough to rely on stock photography and all the clichés it invariably contains. It would no longer be good enough to publish trivial tittle-tattle. It would take making a conscious decision to raise standards, to push boundaries, and to pioneer the next generation in the genre. To create, not recreate.

The matter that compelling, beautiful, thought-provoking content experiences and curation are largely conspicuous by their absence is not just a missed opportunity for online publishers, be those publishers as defined legally or otherwise, but for society. How can we possibly expect to help society at large to imagine new ways of living if the same old tired content is churned out time and again? Admittedly, as a woman that’s worked with experimental imagery and ideas at the interface of the real and imagined worlds for decades now, I’m rather more critical than some. That said, given the data on shifts in public interest in press and media, TV viewing included, it seems to me that I’m not alone. On the contrary, it seems that very many are bored, if not very bored, of what many networks, both social and TV, are serving up, and are thus looking for something new. However, as all those that have tried to buck the trend know, though a decent creative team can generate compelling and original new ideas in content creation, getting funding for such concepts is like trying to find the Holy Grail.


Interestingly, though the domain of digital content has been largely disappointing for years now, the print domain is quite the opposite. These past several years we’ve seen not a few, but many, inspired new print magazines and zines, and with them books, as independent publishers experiment with new approaches. All these titles are niche, because this community of publishing practice is not trying to please all of the people all of the time. They don’t throw out content for its own sake. They carefully craft thoughtful publications that reflect original and emerging ideas in the various fields of their interest. Collectively, these works are the industrialised-content counterculture. This community is to the publishing sector what the ‘Real’ movement is to filters of the kind many influencers use to modify their online appearance. They show that, much as some like to suggest that tech’s influence is exponential, it isn’t. Trends come and go. Every major trend has a counter-trend. Some trends co-exist, and sometimes indefinitely. However, what more usually happens is that a major trend will ultimately be displaced by its counter-trend. Then, at some future time, that trend will, in turn, be displaced by its counter-trend. Time may move in one direction, but culture doesn’t. Culture is far more complicated, and that’s because culture isn’t beholden to the laws of science. Culture is instead beholden to forces that are far more complex, the foremost of which is, arguably, emotion, at both the individual and the collective level.

Humans tend to like novelty. Place something new in our environment, real or imagined, and we often engage with it. Sometimes that thing intrigues and attracts us – we like what we see. Other times the opposite and, in that instance, we tend to seek its removal. The same is true of imaginary things. We know this thanks to studies of human response to, among other things, sequences of images. MRI scans reveal our responses, as do other forms of data collection. One way or another, we pay a lot of attention to novelty. We’re also prepared to pay a lot for it. How much? Consider how much money people across the ages have spent on fashion. Or on refurbishing their homes. Or on getting new modes of transport when their existing vehicle, be it a chariot or a car, was still perfectly fit for use.

Will AI-generated content deliver on novelty, genuine novelty? Currently, we’re seeing some interesting AI works, but we’re largely seeing a lot of crude pastiches of things that were done before, as has been highlighted by, among others, Nick Cave. Some of the more interesting AI works may provoke some new thinking about what we do offline as well as online, both at and beyond the interface of the real and imagined. But, even then, the role of the AI in the creation of those works amounts to no more than that of a tool – a tool that is programmed by a human or humans, and that is thus innately biased in its design. The matter that some can’t grasp that construct serves to illuminate just how deeply ingrained that bias has become.

If our digital histories are indicative of our digital futures, the more content we create – and the more content AI creates – the more of that content will simply pile up in the ether. Most of it will pile up accruing potentially exponential carbon footprints. Good or bad, content’s very existence will require energy – and with it CO2 emissions, lots of them – for as long as it remains in existence. Those that imagine that an ever-growing demand for digital can be reconciled with environmental issues, climate and otherwise, always pivot their arguments on assumptions – namely the assumption that tech can fix it. We have yet to read a single study that evidences that the growth in digital can be mitigated in a way and in a time frame that is both sufficiently effective and expedient. Had we all the time in the world to reduce our emissions, biodiversity loss, and pollution, perhaps, just perhaps, we could reconcile tech’s monstrous environmental footprint, but we don’t. Time is now very much of the essence. Some posit it’s already run out.
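To see why mere existence carries a footprint, here is a back-of-envelope sketch. Every figure in it is an illustrative assumption, not a measurement – real values vary widely by datacentre, grid, and redundancy level – but the compounding logic is the point:

```python
# Hypothetical back-of-envelope estimate; all constants are assumptions.
STORAGE_KWH_PER_TB_YEAR = 10.0  # assumed energy to keep 1 TB stored for a year
GRID_KG_CO2_PER_KWH = 0.4       # assumed grid carbon intensity

def annual_storage_co2_kg(terabytes: float) -> float:
    """CO2 attributable to simply keeping content in existence for a year."""
    return terabytes * STORAGE_KWH_PER_TB_YEAR * GRID_KG_CO2_PER_KWH

# If the volume of generated content doubles each year from an assumed
# 1,000 TB baseline, the footprint of storage alone compounds - before a
# single view, stream, or model-training run is counted.
volume_tb = 1_000.0
for year in range(1, 6):
    print(f"year {year}: {annual_storage_co2_kg(volume_tb):,.0f} kg CO2")
    volume_tb *= 2  # assumed annual growth
```

Change the constants as you wish; so long as the volume of stored content grows faster than the grid decarbonises, the curve bends the wrong way.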

Will the world be a better place if AI software enables the creation of a seemingly exponential volume of images, texts, and other works, and all at the click of a button? Will much, or indeed any, of that work solve any of the foremost critical problems of people and planet? Will any of that work have any significance in the history of creation – of the arts, of design, of architecture – aside from it marking the advent of just another tool in the long, long history of tool innovation?

Some have suggested that academics fear that ChatGPT will be used by students that are too lazy to write essays and other written works. In response, others still have suggested that education will have to revert to hand-written essays in tech-free rooms. Maybe it will? However, we’ve seen many academics cheat in the past. Plagiarism is rife in academia. Umpteen papers plagiarise others. More papers and essays still fail to duly cite the authors of original works, scientific and otherwise. While some academic publishers have bothered to keep abreast of the issue, and have measures in place to remove plagiarised works, others don’t. I think of an online academic publisher that, promoting itself as being partnered with one of the world’s top tech companies, failed to respond to not one but two requests on my part that they remove a paper that didn’t just plagiarise one of my published works, but that was authored by someone with the audacity to cut and paste three whole paragraphs of text from one of my papers into theirs. Only at the point when I stated the next request would come from my lawyer did the academic publisher remove the paper. Doubtless, those seeking to copy other people’s works now will be employing ChatGPT and similar AI apps to rewrite others’ works. I suspect the original authors will know when this happens, because it’s not just word-for-word plagiarism that academics deal with, but their concepts being dressed up as new ideas more generally. In other words, we’ve been here before. So, what’s big tech going to do about it? Anything? Because it could. As with every other unintended consequence of tech innovation, if the right questions are asked at the start, and if responsibility is shown, it is possible to avert some problems – to design them out. For that to happen, those building the tech need to feel accountable to others, those others being the original content creators. With that, those building tech have to understand the damage done when the lineage of ideas is obscured – it matters that it’s clear how, when, and where new concepts and discoveries came from. This is an issue fundamental to science, to the arts, and to the humanities.

There are so many questions. Important questions. Given the preference that many an investor has historically shown for ‘back of a beer mat’ ideas, and for ‘bullet points’, we might reasonably expect many of those questions to go unspoken in the very places they need be asked most.

Having brain-dumped these thoughts at a speed of around 120 words a minute, this isn’t an essay so much as an exercise. Its purpose is not to serve up answers. It’s a case of publicly airing concerns. If my tone is somewhat sombre, consider that a consequence of the matter that this is authored against a backdrop of economic and political turmoil of a kind that is already causing many to suffer, with more likely to suffer soon. True, this isn’t the first time the creative sectors have squared up to hard times. We have been here before, many times over. Nonetheless, this doesn’t feel like the time to be making light of how serious our predicament is. We have difficult choices – choices made more difficult still by the fact that, though walking the thin line between real and imagined worlds is not new, the sheer scale at which we’re having to do this is. Likewise, so it is for all humanity that’s hooked up to wi-fi and a smartphone.

We have a lot to process and, though AI can help us do that, it can’t do it for us. We must decide what kind of world we are going to build, and the role of the imaginary in that world: our composite physical-meets-virtual future world. At the very least, as we do that, might we consider the very real-world consequences of the new tools in our box, and not just at the level of statistics. I hear many casually comment that this or that technology will take this or that person’s job. Mind-boggling though some are, even our most advanced technologies are not human. They are tools, simply that. Tools change. They evolve. Not always in a linear direction.

The signs are there, clearly there, that the next paradigm in tool making and use is already on the horizon. Though there are some theoretical parallels with e-tech – with digital technologies – these next-gen tools are of a qualitatively different kind. A different species. Literally, because I speak of biocomputing and living technologies. These tools aren’t robots. They are creatures and assemblies of creatures. Creatures are very different to robots, to AI. You don’t create using these bio-tools, you co-create with them. They have their own agency. Their own autonomy. They have an intelligence that isn’t human-made, but that can, at most, be edited. Robots and AI are used in developing these tools. But they won’t ‘take’ the jobs of the humans that work with living technology, because how we work with that technology involves making moral choices – and not merely of the kind that determine the future of humanity, but of life on Earth, and possibly beyond.

The idea that robots will take ‘people’s jobs’ is as contrived as the idea that AI will become ‘smarter’ than humans. Humans develop tools to automate some activities. We’ve done this for as long as, if not longer than, we’ve had analogue computers – for millennia. Automating a process doesn’t ‘replace’ humans. What it does do is enable the delegation of some tasks to a machine of some kind. When we look to history, we find that though automation changed the way we did some things, it didn’t erase the need for humans in the process. AI is essentially a fancy calculator, in that AI calculates information in response to tasks we set it. Like the invention of the calculator, it has increased the speed at which we can attain some types of information – more specifically, information that can be generated using binary and linear processes, and within the limits of the specific information types it can read. Put another way, data is being mistaken for ‘intelligence’.

If ‘robots’, AI-powered and otherwise, ‘took’ people’s jobs, that’s not progress – human progress. That’s something else, and it’s something that, if it happens, will be a failure of both the human imagination and of ethics. When used responsibly, AI can empower people, whether in the context of work or otherwise, to make more informed decisions. When doing so, if designed well, it can produce useful information quickly, in some cases very quickly. When it does this, it can create more time to do other things, like consider other facets of a problem, or a new problem altogether. And vice versa: if used incorrectly, whether in a way that rests on false assumptions, or on an approach that is inequitable and unfair, it can create more problems than it solves. Those problems can come in many and varied kinds, and some of them have been articulated, and clearly, in more science fiction novels, films, and other works than I can count.

Now isn’t the easiest time to buck trends. For many, it’s hard enough to just get by. Not one, but many events have taken their toll, and it shows. Many people, and certainly the planet – or rather many of the other life forms we share it with – need a break. Multiple data sets strongly suggest that humanity is in the Last Chance Saloon as, outside, a ‘perfect storm’ gathers. That storm, one notes, comprises multiple elements – floods, hurricanes, wildfires, and more – that destroy electronic systems, and with great speed. If you doubt that statement, affix a tracker to your laptop and place it in a Category 5 storm system. Having located said device upon the storm’s passing, see how well it works. And then, when you’ve established the effects of the storm on your laptop, take a look at the regional infrastructure, powerlines included. If such storms remained ‘once in living memory’ events, mitigating their effects would be relatively straightforward. But not for nothing are some suggesting that one geological epoch – the Holocene – has ended, and a new one begun.

Here’s hoping that it doesn’t take an epoch for life to become tangibly better for many. Here’s hoping that the wealth gap starts to close, not further open. Ditto the social mobility gap. Technology has a pivotal role to play in helping us imagine better ways of doing. May more, many more, start to hold that technology to account, and to treat those that create those technologies like the mortals they are. They are not gods. Those that doubt that, and no less so those that imagine themselves to be gods, are welcome to try to disprove it… by walking on water, throwing a thunderbolt, flying unaided, or doing any other such things as our forebears imagined supernatural beings do. Granted, even if they, the tech ‘gods’, did do this, in a world of deepfake media barely a soul, if anyone, would believe it ever happened. Might, I muse, it be more apt to compare the founders of the Big Tech 5 not to ‘gods’ but to emperors? Certainly, if those individuals, and those that follow their instructions, their employees, seek to populate the world with entities made in their image – with AI that is programmed in a way that reflects their worldview, such that it can perform particular tasks that they deem to be important, when and how they think appropriate – they would be no different to every emperor that sought to rewrite history. While the process may not involve piling up books to burn in the streets, or tying scholars and other knowledge-keepers and creators to pyres, it would potentially destroy copious information, or all but, as search engines became consumed by yet more, ever more, content of the unhelpful and potentially harmful kind. All the while, though the creative industries and those that populate them may not burn, it’s not beyond possibility that many would struggle to simply survive. The buck lies somewhere, and that somewhere isn’t with the artificial intelligence and other tech systems now being bankrolled by every Tom, Dick, and Harry venture capital firm, investment bank, and other investor that buys into hype faster than ChatGPT can knock out a comically bad, not ‘scary good’, interpretation of a Nick Cave song.

 

Disclaimer: This was drafted at speed, without planning, and has not been proofed. You can consider its doubtless many typos and other errors as evidence that it was authored by a human and not an artificial intelligence system.