Actualauthor :: authenticity in the age of artificial intelligence

Melissa Sterry. Published January 3rd 2025

I didn't sign up to Coauthor and I have no plans to. While I have been approached to engage in an AI experiment in the general domain of making some of my past works more accessible to others, a digital replicant of me is not on the cards. That said, I have loved Jason Silva’s chats with AI Carl Sagan, among other notable scientific figures.

Why is no digital replicant of me on the cards? Because, having experimented with AI for some considerable time, I've concluded it's not so much an expert as a student. With a lot of training, it can learn some subjects well, but its understanding is too patchy to constitute expertise. When using it, I think of it as an assistant in a studio: it undertakes some rudimentary tasks, which frees up some of my time to work on more complex ones. When it's done a job, I check the work and usually end up making revisions. Put another way, I find it handy but limited in its application.

It doesn't take a professional futurist to see that this is the year AI mainstreams, and in the process makes further impacts on the employment market. In past predictions of the impacts of AI, I highlighted how, in and of itself, it wouldn't take people's jobs as such. Instead, AI would augment the ability of those performing many tasks, such as when I have used it to assist on activities that don't require any specific expertise. I am going to revise that statement to, 'it won't take your job if you're particularly good at what you do, but it might if your performance is mediocre'. For example, if you're a mediocre copywriter, AI has probably already marked the end of your career. If you're an average graphic designer, likewise. Musician, translator, proofer: ditto. AI can do many things to a mediocre level, and it does those things at great speed and with great convenience – on demand, anytime and anywhere with a wi-fi connection. But AI is no master. AI doesn't author sublime sonnets, songs, and scripts. It can write you an essay on how human emotions work, but it can't feel them, nor understand their complexity. Let's face it, we humans are weird. We really are. I say this with certainty, not least having watched a television show roundup of the biggest trends on TikTok in 2024, which enlightened me to, among other things, the fact that there's a market for watching road signs being cleaned. Yes, really.

A good friend of mine is a professor of psychology. We often muse over the latest online trends, comparing notes both on the technicalities – the science of psychology and human behaviour – and on anecdotal experiences and observations. Occasionally we've shared our conversations through livestreams, which evidenced that we're not the only ones to conclude that we're living at a time when quantity trumps quality of content. Or, put more candidly, there's a truly vast amount of $^@% online. Road signs being washed included.

Communications, when done well, are crafted and curated with care. Great communications aren't created for their own sake, but with intent to make incisive impacts. The best communicators are highly creative and original, and with that, they understand the human condition intimately. They don't deliver the same old same old, because they know that people are usually more interested in that which is novel than in that which is common. Our brains are literally wired to ignore many mundane matters and focus on things that are unusual or out of place, because it's these things that are more likely to present a threat.

In 2009 I participated in a handful of seminars and other gatherings that explored the future of media, one of which was the InterACT Participatory Media Forum. It was largely populated by people that, like me, had worked with digital media for a long time, and we discussed how emerging technologies might shape the coming decades. Though my keynote centred on how digital media would shape environmental and social activism, as relates to 2025 my most pertinent contribution was in a to-camera interview in which I spoke of how advances in mobiles (the burner-to-smartphone shift) and software applications would change the face of news and media reporting, as everyone, everywhere with wi-fi could, at least in theory, become a reporter. I discussed how we'd see media produced en masse, and how this would be a force for both good and bad. On the upside, it would enable the democratisation of media production. On the downside, it would wash away quality benchmarks, both with respect to the quality of the sound and image production and, more importantly still, journalism standards. Back then, it was clear that as new media and information communications technologies came online, engagement in media production would boom. What wasn't clear was just how bizarre some of the content that citizen content producers would create would be – such as videos of… people cleaning road signs – let alone how popular some of that content would be.

What the TikTok hits of 2024 illustrated is that sometimes people do things just because they can, and regardless of what the collective implications of their actions might be. As it is, the combined energy footprint of citizen use of social media apps, particularly now that many are powered by AI, is high enough to bring into question whether the many efforts to reduce carbon emissions may be wiped out. Contrary to those that wax lyrical about the virtues of collective intelligence, both in humans and in many other animals, sometimes herd behaviour can be problematic and lead to unintended and potentially dire consequences. Studies have shown that if people see other people engaging in activities, they can be more likely to partake in them themselves, particularly if they perceive a potential reward, which in the case of TikTok and its ilk can come in the form of revenues from advertising.

Andy Warhol anticipated our age. He understood not only that everyone would have their 15 minutes of fame, but that replication – the reprint – had agency. His use of the term 'factory' was not incidental. He grasped the fact that he, and with him society at large, were living in times carved by the Industrial Revolution and honed by Modernism. First, we mass produced things. Then we mass produced images of things. Warhol's work was a mirror to the human world and all its eccentricities. He foresaw what many did not because he was the consummate observer. As others talked, he watched. Though present, he was typically observing, as opposed to participating in an active sense. There are discussions as to whether he was on the spectrum. I think yes, because both anecdotal experience and research findings assert that many that are on the spectrum have acute powers of observation and/or other above-average sensory abilities, and thus notice the small things, the details that others often miss. Warhol had the power of profound foresight because he didn't just understand the nature of the new technological developments – the industrialisation of media production – he understood people. Had he been around a bit longer, might he have foreseen that road signs being cleaned would be a big TikTok hit in 2024? Quite possibly.

Noreena Hertz has made many notable observations on the human condition today, not least about loneliness. Her observations resonate much with the contents of many a private conversation I have had with friends and peers on the great social challenges of our time. On one level, we are the most connected civilisation of all time. Billions of us are online, connected to umpteen people, many of whom live far from our places of home and work and with whom our relationship is so tenuous we'd likely not recognise one another if we passed on the street. Yet, as Hertz has highlighted, many feel lonelier than ever before. Mental health is in crisis in many nations, and so too are auto-immune diseases and other issues associated with mental anxiety, stress, and depression. Gauging the extent to which these problems have increased isn't easy, because, by and large, the means we use to monitor these matters didn't exist in the past. Sometimes, for all our efforts to be scientific, we're essentially stabbing in the dark, doing our best with the current methodologies, technologies, and approaches to answering complex questions. However, whatsoever the relative increase in certain problems, we can say, and say assuredly, that we have crisis levels of anxiety, depression, and other serious conditions. There's an app, so they say, to solve that. But any of us - and that'll be many of us - that have helped friends, family, and peers with mental health problems will know that apps don't cut it, at least not alone. Mental health is one of the most complex aspects of the human condition, and we're still so far from understanding our minds that we're not even sure how consciousness works. Are our minds in our heads, in our bodies more generally, or do they extend to the external world?

If Warhol's age was that of the mass replication of images, the last decade was that of the mass replication of ideas. First, industrialisation was concerned with materialism. Then with representations of it. Now it's all about the industrialisation of the ideas upon which we build our world. Those that have observed my comments on social media over this past couple of years will be aware that I have been critical of some efforts in the general domain of sustainability, and particularly of some statements on systems, complexity, and that most subjective of topics - 'nature'. Extractionism is rife. What do I mean by that? What I mean is highly complex topics being handled like a pick n' mix, wherein some make statements about things that they clearly haven't studied, and thus haven't grasped in any detail, and about which they hold a great many clearly flawed assumptions. Complexity is, as it says on the tin, complex. How complex systems are varies depending on factors such as whether they are closed or open, but, as a rule of thumb, they are more complex than things that are singular entities.

Nature as a construct has many and highly varied interpretations. Look to most of the notable historical writings, images, and other records of perspectives on the non-human world and that becomes abundantly evident. It's understandable. Nature is complex. So much so that we're still unravelling critical facets of its workings – from how Earth systems interact to how certain optical, electrical, and other phenomena form, and much more besides. On balance, we probably know less than more about the natural world, in the sense that, now that we have extraordinary tools that enable us to see that which could not be seen before, we're making discoveries by the day. As I type, somewhere, indeed some several 'wheres', a scientist or team thereof will be making a new finding with possibly significant implications. Some such findings will eventually reach the daily science news despatches that drop into the inboxes of people like me that are members of the various scientific institutions. Some such discoveries will have incremental implications, but others will shift our perspective on aspects of the world about us, perhaps even the nature of reality itself. Yet, if we were to take many social media posts at face value, it could be construed that we've got 'Nature' all figured out. So much so that we can round it up in bullet points and soundbites.

There is a deep irony to the reductionism of Nature, of systems, and of complexity. That being, reductionism is what caused at least some of the major issues of our time. Why did we end up with an environment which - from the tops of mountains to the deepest ocean trenches, from the bodies of some of the biggest animals on the planet to those of unborn babies, from the air in our homes and offices to the air in the skies far above our heads - is saturated with nano- and microplastic pollution? How did it come to pass that for these past 48 or so months Britain's news has been dominated by headlines about its waterways being full of, excuse my frankness, sh*t? What might, perchance, be the reason that across the highly (human) populated parts of the world there are landfills spewing out toxic substances of which the stench is not just offensive to the senses, but damaging to them, among other bodily functions? Because of reductionism. Because for too long too many people looked at too few issues, all the while many others were scaling to such proportions that, despite humanity's talent for invention, many of us now question, and deeply, whether we can really build a better future or whether we're just being delusional. The commercial world became dominated by those that knew the price of everything and the value of nothing, and we are all paying the price for that now, as are most, if not all, other macro life forms on Earth.

Bullet points and soundbites are part of the problem, in that, in order to understand some things, we need to afford some subjects greater attention than fits into the word count of a microblog. We need to understand the magnitude of the tasks in hand and have the humility to gauge how far we are from having many answers, let alone the policy, economic, social, and cultural frameworks to take those answers from paper to reality. Yet now many people are trying to 'save the world' by applying exactly the same mindset that has destroyed and continues to destroy it, or more specifically, a world whose Earth systems sit within the bounds that this current epoch of ecological entities has evolved to exist within. Earth will live on. Life will live on. But we might not, at least not for that long in the epochal scheme of things.

We may not have all the gear, but we sure as hell have a lot of it. Anyone, any place, with wi-fi can become their own micro media empire: a content factory in their own right. Certainly, resharing can play a beneficial role in tackling some issues, particularly those that are underrepresented in mainstream media. However, just as the resharing of content can help raise awareness of actual issues, it can and does propagate misinformation and make-believe. Misinformation and make-believe aren't new. Though we don't know when they first emerged in human culture, it was likely early on. After all, misinformation is commonplace in the natural world. Don't believe me? How do many animals survive? Camouflage, which is a form of misinformation, such as when an insect, bird, fish, mammal, or mollusc appears to be something it's not, with the Mimic octopus being just one of innumerable examples. Misinformation in human culture can be used in many ways, but typically it's used for defence, for offence, and to otherwise mislead in a way that prioritises the interests of the individual or individuals that present something as other than what it really is. The nature of misinformation, and with it make-believe in all its forms, hasn't fundamentally changed, but the tools used in its creation and distribution have, and 2025 is a year when I think humanity is going to confront that in a significant way.

In the first quarter of this century, we saw digital media production democratise communications and media, which went from being essentially gated communities to near ubiquitous in accessibility. Branding has always played a big part in discerning who and what becomes successful. Some of the most notable examples of this can be found in history. Take Elizabeth I: she was a master of personal branding. When we look to the portraits of her, we're seeing a woman that understood, and clearly, that to hold sway over her subjects she needed to craft her image not in that of a woman, but of a deity: a more than mortal being that rose above the sins of the populace to defend their interests and those of the realm with near-supernatural ability. Of course, blood not magic flowed through her veins, and of that she was aware. She knew she had enemies on many fronts and her image was part of her armour – her weaknesses weren't just masked, but masked in a way that garnered wonder and awe at the persona she presented. The pharaohs were largely formidable personal branding experts too. They too created more than mortal identities: they 'were gods' and they reminded their subjects of that in often less than subtle ways. Ways so impressive that tourists still visit them today. Though the everyday citizenry of times past had ways of communicating their ideas and sentiments, they could largely only do so through means limited in geographic reach, i.e. graffiti etched into a wall or a lament etched into a clay tablet. Today, like the pharaohs, many people can create communications - both as relates to their personal branding and to their commercial interests. Ordinary people can craft a more-than-human image with a few tweaks here and a few tweaks there to their appearance. They don't need a workforce of thousands like the pharaohs did. They just need a few moments, a smartphone, an app, and wi-fi. But many don't just present an image of themselves that is inauthentic, but of reality itself. Take, for example, the fact that Instagram hosts ads that state that certain products are akin to an elixir of Life, i.e. can reverse the greying of hair caused by DNA damage or eradicate wrinkles formed by the breakdown of collagen in the skin. Whatever your woes, there's an influencer on Instagram or TikTok that tells you they know how to troubleshoot them, be they a problem of the heart, wallet, mind, or more. Experientially, it feels like we're all living in the movie 'The Life of Brian', and more specifically, in the scene where all the 'prophets' are lined up proclaiming they have the power of foresight. Though not all those purporting to be able to remedy our every problem are snake oil merchants, it's probably fair to state that not a small few at the very least overegg the level of their understanding of some fields, whereas others are outright liars and tricksters.

To a soundtrack of the 1946 Broadway hit "Anything You Can Do (I Can Do Better)", AI can outpace even those that, like me, can knock out an essay like this at a pace of over 120 words a minute. In fact, it can not only churn out text at speeds far in excess of that, but simultaneously produce multiple texts. In theory, every social media account could now be populated with an endless stream of content on copious subjects. But in practice we've already reached the stage where many people are disengaging from mainstream content, be that newspapers and magazines, or social media. It's a shift that started a long while back yet was not understood by the major media outlets. Ultimately, greed and vested interests played a major hand in why we ended up in a situation where even many of the titles that once held high prestige fell in both readership and influence. Vogue, once the fashion bible, can now look more like a nepo baby fanzine. Mainstream media has lagged, not led, on content publishing trends, not least because even the biggest titles failed to invest sufficiently in R&D to find sustainable models going forward. Those that should have been taking risks and driving change fell for the idea that they were too big to fail. At the level of strategy, the most insightful people in the boardroom aren't the ones that always agree with you. They are the ones that are smart enough and principled enough not to.

Readers and audiences alike looked elsewhere, and at a time when several technology classes were coming together to underpin new media approaches. As someone that's been engaged in experimenting with those approaches since the early 00s, my take is that we're a long way off seeing the true scale of their potential. 2025 will, I think, be a continuation of the Mass Media Extinction Event that sees some major players pushed beyond their operating limits as many much smaller and newer entities fill old and new niches. AI looks set to play a pivotal role, for both intended and unintended reasons. AI can't craft a compelling content strategy and it can't create a reader connection, but it can handle some tasks in a way that's efficient enough to help micro and small businesses, like zines, get projects off the ground in a way that's a bit less protracted than it might otherwise have been.

A machine is eating itself, and AI is a modification to that machine that has rapidly increased the speed at which it is doing so. AI has created a scenario in which the media marketplace – including that which relates to personal branding – is vastly over-saturated. Supply already greatly outstrips demand, which is evidenced by the fact that for every social media hit there are a great many misses. Out on the frontier of experimental communications all sorts of ideas are being explored. As Chauncey Zalkin and I discussed in a podcast last year, analogue is back in a big way, proving right those of us that pushed back against predictions that print media was dying. Print media, and in multiple formats, is anything but dead. But most of the interesting stuff is happening at a micro-scale. So it was said that live events would largely die out too. The reality is that live gatherings are alive and well; they just aren't largely happening on the scale they once did, with many now happening under the radar – the secret societies of our time. The more sanitised, sterile, and polarising social media has become, the more people have started to seek out other places and communities to converse on the issues on their minds. The most creative communities have always had their undercover hangouts, from invisible societies to private members' clubs and more. Now, as then, these communities emerge out of shared interests and needs, and they write their own rules and codes, not only with respect to such things as their values, goals, and protocols, but with that the ways in which they convene and communicate those things. The influence of such societies varies depending on the agency of their founding and wider members. But, given the conditions we find in wider society – the matter that some are trying to force others to adopt ideas that are the antithesis of theirs – I for one anticipate that secret societies will exert far greater influence in the coming years than was typical in the last decade.

Just as we see extremes in other areas of communications, we see them in the fact that, on the one hand, we now have some intellectual ideas and constructs – philosophies, disciplines, and schools of thought and practice more generally – that are being subjected to such extremes of reductionism as to not merely remove their agency for change, but to do so in a way that, paradoxically, harms the very motives and values that underpinned their founding. While, on the other, we see experiments that, with varying levels of success, seek to counter the overall demise in the quality of content. Ideas of gravitas cannot always be conveyed in a 15-minute TED talk. Indeed, 15 minutes isn't long enough to duly contextualise some subjects, let alone present a provocation of the kind that can truly shift any societal, economic, environmental, or cultural needles. Much as some TED talks are useful, more generally they are like the fast food of culture – designed to be consumed en masse, devoid of any capacity for engagement and critique. Some sound like sales pitches; indeed, many probably are. Had the curation of the endeavour been a lot more discerning, it would have helped. But today, and for some several years now, the agenda has been so open that TED has become the stuff of satire, and even some of those that had been champions of the concept now distance themselves from its brand. Likewise, some popular science books can be compelling, but the limitations of their word counts often require difficult editorial choices. In some cases, editors manage to pull off the task with aplomb. In others, the texts become so skeletal as to do their authors a disservice, let alone the ideas being discussed. Then there's the matter of how books become best-sellers, which, as those that have worked in book buying – as I once briefly did – will know, is less than equitable. Some authors, some topics, some publishers get the positioning needed to ride the top of the book charts, whereas most don't. Sometimes truly significant discoveries and ideas do get into print, but sometimes not, and it's not whether those discoveries and ideas have significance scientifically, culturally, or otherwise that underpins the matter, but whether a publisher thinks a book will sell in sufficiently great volumes to yield a profit.

Tectonic is how I would describe the scale of the shifts I anticipate in 2025. Ultimately, it's not technology driving the changes, it's society – our behaviour. It's our very human foibles that shape which technologies flourish, for how long, and how. Some tout that we're all going to become consumed by technology. Some imagine that's a good thing, and others vice versa. But Moore's Law has limitations in that, though in theory we could continue to see leaps forward in computing of many and varied kinds, computing doesn't happen in a vacuum. It happens here, on planet Earth, this - from a distance - nondescript little planet in a tiny corner of the vast Universe. All manner of events, some distinctly human caused, others not, many hybrids of both, could change our technology landscape overnight. We become reliant on the idea that technology will save us at our peril, for though it can be extremely useful when applied to certain tasks, those tasks can be undertaken by those with intent to do wrong as well as right. Thus, as James Cameron advocated in his recent address on the possible perils of AI, we need to tread carefully when it comes to the assumptions we make about our technological future, and bring our understanding of human intelligence to the fore of any discussion about artificial intelligence.

Without trying to sound like a Poundshop Aristotle, as we venture into the coming year and the last half of this decade, it would be helpful if we invested our time and attention in how we can push back against misinformation in a way that does not undermine the need to bring all sides to a debate. Personally, I'm far from satisfied with the state of multiple publishing and media outlets. I dislike how some magazines have become glorified press distribution services that churn out content with little if any attempt to interrogate the credibility of some ideas. I'm not impressed with how over-represented some issues are, and how under-represented others. I consider the design, navigation, and general architecture of most social media platforms to be far poorer than possible, and I'd like to see some significant disruption in the sector. I think change is coming, but, as with predictions past, I don't think it's coming in the way many think it's coming. Many people in many places are dissatisfied and have lost confidence in leadership in many and diverse forms. It shows not only in the shift towards alternative media, but in voting patterns and in life choices. Having lost trust in traditional sources of reference, significant numbers of people have disengaged from the mainstream and have formed communities of interest of their own.

Some may imagine that tech moguls have remained among the most powerful influencers on the planet, and in some ways they are right. But if I were to sum up how many of those I have spoken with these past several years now feel, it's that they have lost respect and care for what many of the most prominent members of society think. They have seen too many mistakes of policy and leadership to imagine that those that gather at WEF, G7, and COP really know what they are talking about. They feel politics as usual has failed. They feel umpteen companies are out to fleece them. And I for one get it. Respect is earned. Trust is earned. All too many may think they've earned it, but even if they did in the first place, they've lost it now.

Is God in the machine or is God in the mind? Perhaps the answer to that question lies in the fact AI is just another tool in a very long line of tools. Tools that predate the emergence of the first member of our species, no less. A flint or a smartphone, the technology is by the by. We, like other mammals, use tools that we think to be useful. When we stop thinking they are useful we use something else. Once upon a long-ago time there was probably a member of an earlier human species that imagined that humans had entered an age of singularity with the flint. They presented their hypothesis not on a TED stage, but around a campfire in the entrance of a cave using a now lost language and some sketches they’d drawn in ochre on some rocks.

Some, notably an executive at Google, mused that AI will have a greater impact on humans than the advent of the control of fire. Given that Homo sapiens would not have evolved without the advent of the control of fire, that's saying something, something that I disputed in, among other places, my PhD thesis.

Its usefulness in undertaking tasks of the mind-numbing kind aside, AI can be used in other ways. Though I've not used it today – this, as you may be able to tell, is a mind dump of some of my thoughts from this past year and earlier – I have sometimes experimented with AI as a provocation tool. The fact that it gets a lot wrong can be helpful, in that it can highlight where we need to focus our attention when trying to address certain issues. AI magnifies biases already established in the digital domain, such as gender bias, as well as plagiarism and other forms of falsehood, such as the now seemingly common trend for rebranding old ideas as new. Now, as of old, those with depth of knowledge of given fields can see through such ploys, but in an age when so many stand on digitised orange boxes, their voices are often drowned out by those that imagine that Likes and Shares can be indicative of the quality of data and other information.

For now, at least, AI can be handy in expediently identifying ways in which key issues are being undermined by misinformation. In theory at least, that gives some glimmer of potential that it may be possible to push back against the veritable tsunami of content of the kind that makes conceptual needles get lost in content haystacks. If we use AI with care, if we subject our content to reflection, if we seek not to simply replicate ideas, but to reflect on them, then perhaps the future of content could be substantially more inspiring than the landscape we see about us now. There are many of us that crave new content formats that have the integrity so many today do not. The idea of Brain Rot may be so prevalent that Oxford University Press declared 'brain rot' its Word of the Year for 2024, but we – the members of a secret society possibly yet to be formed – still believe in the power of considered communications to inspire and inform action on issues that count.

Those that had not the bandwidth to read a long-form essay of this kind may well have turned to AI software of the kind that summarises it within a specified word count. For the benefit of those that didn't, but that might be interested to see how AI summarises the text, I'll copy and paste AI's interpretations below, in word counts of 1,000, 500, 100, and under 50 words. Consider it an experiment.

Disclaimer: This was drafted at speed, without planning, and has not been proofed. You can consider its doubtless many typos and other errors as evidence that it was authored by a human and not an artificial intelligence system.
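
For anyone curious how summaries at several target word counts might be generated programmatically, below is a minimal sketch, assuming the openai Python client and an API key in the environment; the model name, prompt wording, and file name are illustrative only, not those used for the summaries that follow. In practice, models tend to land near, rather than exactly on, the requested count.

# A minimal sketch: asking a chat model for summaries at several target word counts.
# Assumes the openai Python client (pip install openai) and OPENAI_API_KEY set in the environment.
# Model name, prompts, and file name are illustrative assumptions.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarise(text: str, word_limit: int) -> str:
    """Request a summary of roughly `word_limit` words."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; any capable chat model would do
        messages=[
            {"role": "system", "content": "You summarise essays faithfully and concisely."},
            {"role": "user", "content": f"Summarise the following essay in about {word_limit} words:\n\n{text}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    essay = open("essay.txt", encoding="utf-8").read()  # hypothetical source file
    for limit in (1000, 500, 100, 50):
        print(f"--- circa {limit} words ---")
        print(summarise(essay, limit))
        print()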

AI Edit #1 ‘AI Digital Replicants’ (1,000 words)

Dr. Melissa Sterry presents a deeply reflective critique of the evolving landscape of artificial intelligence (AI), human connection, and the increasing complexity of global challenges. Her thoughts resonate with the paradoxes of modernity: hyperconnectivity coupled with loneliness, technological advancements with environmental degradation, and an era of information deluge often lacking substance.

On AI, Dr. Sterry states her reluctance to endorse the creation of a digital replicant of herself. Her extensive experimentation with AI has revealed its limitations: while AI can learn and assist, it remains a tool, not an expert. Comparing it to a studio assistant, she highlights its utility in performing rudimentary tasks, freeing time for more complex endeavors. However, its outputs often require significant revisions, underlining its role as an augmentative force rather than a replacement for human expertise.

Sterry forecasts the mainstreaming of AI in 2025, further altering the employment landscape. Mediocrity in performance may render certain roles obsolete, as AI excels at tasks requiring average competence. From copywriting to graphic design, AI's speed and convenience outmatch the average human output. However, it lacks the creativity, emotional depth, and originality that characterize truly exceptional work. She remarks that while AI can write about human emotions, it cannot genuinely feel or comprehend them.

Illustrating this gap, Sterry humorously reflects on the peculiarities of human behavior, such as the trend of watching road signs being cleaned on TikTok. She attributes such phenomena to a broader societal shift where quantity trumps quality in content creation. This shift aligns with her observations at the InterACT Participatory Media Forum in 2009, where she foresaw the democratization of media and the erosion of quality benchmarks. The mass production of content has birthed a mix of innovation and mediocrity, exacerbated by the energy demands of AI-powered platforms. This raises concerns about the environmental footprint of digital media, questioning whether sustainability efforts are being undermined by such trends.

Dr. Sterry transitions to a discussion of loneliness and mental health, drawing from insights by Noreena Hertz. Despite unprecedented levels of connectivity, many people feel isolated, a paradox of modern life. Mental health issues such as anxiety, stress, and depression have reached crisis levels in many nations. While technology offers tools to address these challenges, such as mental health apps, Sterry emphasizes that these solutions are insufficient on their own. The complexity of the human mind, still not fully understood, demands deeper, more holistic approaches. She questions whether consciousness resides solely in the brain or extends beyond, reflecting the profound mysteries of human cognition.

In a broader commentary, Sterry explores the interplay between societal behavior and technological evolution. She describes the "tectonic" shifts she anticipates in 2025, emphasizing that technology is shaped by human behavior, not the other way around. While some fear or celebrate the idea of humanity being consumed by technology, Sterry stresses that tools like AI exist within the context of Earth’s intricate systems and the unpredictability of human influence. She cautions against relying on technology as a panacea, noting its dual potential for both beneficial and harmful applications. Echoing James Cameron’s warnings about AI, she advocates for careful, reflective discussions about technology’s role in society, urging an emphasis on human intelligence and ethical responsibility.

Sterry compares today’s era to Warhol’s age of replication, noting a shift from the industrialization of material goods to the mass reproduction of ideas. This transition, while enabling innovation, has also fostered reductionism. Complex topics, such as sustainability and nature, are often oversimplified into soundbites and bullet points. Sterry critiques this trend, arguing that reductionism has contributed to many of today’s environmental and societal crises. From microplastics polluting the deepest oceans to toxic landfills, these issues stem from a failure to consider the interconnectedness of systems.

She warns against the superficial approach to understanding nature, which remains an enigma despite scientific advancements. New discoveries constantly reshape our perspectives, yet social media often reduces complex topics to easily digestible snippets, undermining meaningful discourse. The rise of misinformation, facilitated by digital tools, poses additional challenges. Sterry likens misinformation to camouflage in the animal kingdom, a survival mechanism that misleads for defense or gain. However, the stakes are higher in human society, where misinformation can have far-reaching consequences.

Ordinary individuals today can craft a more-than-human image with just a few tweaks to their appearance. Unlike the pharaohs who needed vast workforces, people now need only a smartphone, an app, and wi-fi. Yet, this democratization has its pitfalls. Many present not just inauthentic images of themselves but of reality itself. Platforms like Instagram host ads claiming miracle cures for aging or DNA damage, perpetuating unrealistic ideals. Influencers on TikTok and Instagram claim solutions to problems of the heart, wallet, and mind. Sterry compares this phenomenon to the scene in “The Life of Brian” where prophets proclaim their foresight—many modern influencers, she suggests, are overconfident or outright deceptive.

AI further accelerates this dynamic. With its capacity to generate vast amounts of content at incredible speeds, AI enables a saturation of media. Sterry notes that while AI can't craft compelling strategies or create genuine reader connections, it can aid small businesses and zines in launching projects more efficiently. Yet, the media marketplace is oversaturated, evidenced by the rarity of viral hits compared to the multitude of misses. Many audiences are disengaging from mainstream outlets like magazines and social media, turning instead to niche, innovative platforms. This shift is emblematic of what Sterry terms a "Mass Media Extinction Event," where legacy outlets are being replaced by smaller, experimental entities.

Sterry observes that analog media, once thought obsolete, is resurging in the form of print and live events. These formats thrive on a micro-scale, often under the radar, forming what she calls "the secret societies of our time." These communities, built on shared interests and values, are rewriting the rules of communication. Sterry anticipates these groups will grow in influence as society grapples with increasing polarization and a desire for authentic connection.

However, Sterry criticizes the extremes in contemporary communication. On one hand, some disciplines are reduced to soundbites, stripping them of nuance. On the other, experiments in content often fail to counteract the overall decline in quality. She critiques TED talks as cultural fast food—too brief to provoke meaningful change—and laments the constraints of word counts in popular science books that often dilute significant ideas.

In addressing these multifaceted challenges, Dr. Sterry calls for a shift in mindset. Tackling complex issues requires moving beyond reductionist approaches and embracing the magnitude of the tasks at hand. Humility and a willingness to engage deeply with problems are essential. The world, as shaped by human activity, faces existential threats, but Sterry emphasizes that Earth will endure. The question remains whether humanity can adapt and thrive within the ecological limits of the planet.

Sterry concludes by highlighting the dual-edged nature of digital media. While it offers unprecedented opportunities for communication and creativity, it also demands responsibility. The proliferation of content must be balanced with efforts to uphold quality, accuracy, and sustainability. As we navigate this complex era, Sterry’s insights serve as a reminder of the importance of critical thinking, creativity, and a commitment to addressing the pressing issues of our time.

AI Edit #2 ‘AI Digital Replicants’ (500 words)

Dr. Melissa Sterry’s reflections on AI and societal challenges highlight profound tensions between technology and human behavior. She critiques the paradoxes of modernity: hyperconnectivity paired with loneliness, technological leaps alongside environmental harm, and an overload of information often lacking in substance.

On AI, Sterry rejects creating a digital replicant of herself, citing its limitations. While AI excels at basic tasks, it requires human oversight and lacks the depth to replace expertise. Sterry likens AI to a studio assistant—helpful but not autonomous. She anticipates AI will reshape employment by 2025, excelling in mediocrity while failing to replicate creativity or emotional intelligence.

Sterry humorously reflects on trends like watching road sign cleaning videos on TikTok, symbolizing a societal shift favoring quantity over quality. She recalls predicting this media evolution in 2009, noting the energy-intensive rise of AI-powered platforms. While democratizing media, these trends often degrade content quality and exacerbate environmental concerns.

Mental health is another focus, as Sterry discusses Noreena Hertz’s findings on loneliness in a hyperconnected world. Despite technology’s potential, Sterry argues that mental health solutions require more holistic approaches. The enigma of human consciousness, possibly extending beyond the brain, underscores the complexity of these challenges.

Sterry explores societal behavior’s influence on technological evolution, foreseeing "tectonic" shifts in 2025. She emphasizes that technology reflects human behavior, not vice versa, and cautions against overreliance on AI. Echoing James Cameron’s warnings, she advocates ethical responsibility in technology use.

Comparing today’s era to Warhol’s replication age, Sterry critiques the mass reproduction of ideas. She laments the oversimplification of complex topics, linking reductionism to environmental and societal crises. Issues like microplastic pollution and toxic landfills stem from neglecting system interconnectedness.

The rise of misinformation, likened to camouflage in nature, is another concern. Sterry notes how social media amplifies oversimplified narratives, drowning nuanced voices. Ordinary individuals can now craft inauthentic realities with minimal tools, fueling unrealistic ideals and misinformation.

AI accelerates these trends, flooding media with content. While helpful for small projects, it can’t replace genuine creativity. The oversaturated media landscape has driven audiences toward niche platforms, signaling what Sterry calls a "Mass Media Extinction Event." Analog formats like print and live events are resurging, forming "secret societies" built on shared values.

Criticizing extremes in communication, Sterry calls TED talks cultural fast food and laments the dilution of complex ideas in popular media. Addressing today’s challenges demands moving beyond reductionism and embracing the scale of problems. Earth will endure, she asserts, but humanity must adapt within ecological limits.

Sterry concludes by urging responsibility in digital media. Balancing content proliferation with quality and sustainability is vital. Her insights call for critical thinking and creativity to address pressing global issues.

AI Edit #3 ‘AI Digital Replicants’ (100 words)

Dr. Melissa Sterry highlights the paradoxes of modernity: technological advancements coexisting with environmental harm, hyperconnectivity fostering loneliness, and an overload of shallow information. While AI accelerates content creation, it lacks emotional depth and creativity, serving as a tool rather than a replacement for human expertise.

Sterry critiques mass media trends, oversimplified ideas, and misinformation, urging a return to nuanced thinking. She foresees societal shifts in 2025, with analog formats and niche communities resurging. Advocating ethical AI use and holistic mental health solutions, Sterry stresses the need to balance innovation with responsibility, ensuring quality and sustainability in addressing global challenges.

AI Edit #4 ‘AI Digital Replicants’ (Under 50 words)

Dr. Melissa Sterry critiques modernity's paradoxes: hyperconnectivity breeds loneliness, tech advances harm ecology, and AI excels at tasks but lacks creativity. She foresees shifts by 2025, urging ethical tech use, nuanced thinking, and sustainable solutions to counter shallow content and societal crises.