Is AI Holding the Future of Creative Jobs in Its Seven-Fingered Palm? An update.

Two turbulent years have passed since our first AI blog post appeared on our website. Now that it’s barely possible to open a notes app without the offer of AI salvation at our fingertips, we felt it was high time for an update—proving once and for all that humans can evolve without a changelog.
Dearest reader, I know this is shocking, but I have decided to write this blog the old-fashioned way. No, not with a typewriter, that’s old-old-fashioned, but rather with a blank page on my screen, one keystroke at a time.
This might not have been a feat in 2019, but just six years later it feels like digital warfare.
I am writing this article with one hand while angrily swatting away auto-completes and sentence suggestions with the other, all while trying to see past the passive-aggressive red lines insisting that “old-old-fashioned” is not a proper word.
How did we get here?

The horsemen of overblown marketing — crypto, NFTs, and now AI?
In the last few years, we have witnessed a mania around cryptocurrencies and NFTs, and while these were definitely interesting technologies, the hype has died down dramatically.
Generative AI was the next impressive breakthrough, and as large language models like ChatGPT continued to impress, companies were eager to slap these buzzwords onto their products to gain an edge and lure in more subscribers and investors. The term “AI washing” was coined to describe this practice, and it swiftly attracted the attention of regulators.
Even when products are correctly labeled as generative AI, the amazing demos that we get to see are often just that — cherry-picked demonstrations where the AI has managed to produce stunning results. However, as someone who likes to play around and test the limits of the technology, I’m increasingly sceptical about the feasibility of AI fully replacing creatives in the near future.
AI as a tool, not a replacement
The last few paragraphs may suggest the opposite, but I actually like AI and see great potential when it is used and thought of as a tool rather than a replacement. There are a lot of tasks in design that are quite tedious and take a surprising amount of time: removing the background behind a person, hiding logos in video footage, creating a staggering number of formats for various media channels, and more.
Currently, Adobe seems to be one of the few companies that have started integrating AI as a tool to help designers rather than attempting to replace them. While this can be interpreted as an act of goodwill toward their target audience, I can’t help but wonder if Adobe also believes that current tools simply can’t replace experienced designers, and won’t be able to in the near future.
Modern smartphones are one of the reasons I don't think AI will take over creative roles as quickly as some would have you believe. We all have access to one, but give one to a professional photographer and watch in awe as they casually shoot a masterpiece. In much the same way, the most impressive AI-generated content will still be created by artists who use the generated output as an asset to refine and rework into something new.

Can AI make it pop?
In contrast to designers, I have noticed that writers are a lot more positive about the potential of AI technology. Sure, one group uses an image generator and the other a text generator, but the main difference between these tools? The ability to easily revise the output. Words are a lot easier to change and restructure than an image.
Every creative knows that we go through very precise revisions every day, whether based on our own taste, feedback from our team, or the client’s needs. We often pride ourselves on delivering exactly what is required. Just look at corporate identity guidelines: the use of a logo and its spacing are defined down to the pixel. But the nature of Midjourney and the like is to deliver a finished product that fills in all the unknowns with guesswork. That output is at best difficult to change (and at worst completely unhinged).
That's why I'm convinced that improving the user experience is crucial for generative AI to achieve mass adoption. These tools are being built with little regard for the needs of professional creatives, like the need to have everything on a separate layer, built brick by brick from the foundation to the roof, and swappable at any time. The first AI tool to cater to this need is destined for success.
Overrelying on AI
This doesn’t change the fact that AI is and will continue to be used as a tool, even at its current stage. But it's important to remember that there are risks associated with using AI. Microsoft recently conducted a study showing that overreliance on AI tools can reduce critical-thinking abilities. Just as our body needs a workout, so do our brain and our creative abilities. How many weeks can we spend outsourcing our creative thinking before the well runs dry? The law seems to be “use it or lose it”, after all.

In summary
While generative AI is here to stay, only time will tell if it is here to rule. A growing wave of misleading advertisements is raising expectations, making AI tools seem more omnipotent than they truly are.
To make these tools vital for professionals, developers will need to focus on the user experience: specifically, on creating tools that let us easily and accurately tweak the details of the output, rather than blindly churning out piles of images with varying degrees of imperfection that will never see the light of day.
Properly designed tools will in turn empower creatives and give them greater control and confidence in their day-to-day work. But we can’t get lazy and rely too heavily on these tools either. Passionate and skilled individuals will always outshine the masses, so honing our craft will remain a top priority.
It is almost ironic that the rapid pace at which AI is improving is also pushing us to improve even faster. And while this may seem stressful, I firmly believe it will result in incredible work that would not have been possible before.