Loathe it or like it, machines will be taking over some important jobs in the newsroom. For humans, the job landscape will change radically. It’s time to get our robots in a row, reports Piet van Niekerk.
- AI will reshape the job landscape within media organisations.
- The diversity of new AI roles underscores the multifaceted nature of AI integration in media and the need for specialised skills.
- Journalism’s core values, particularly editorial judgment, remain pivotal amid technological advancements.
“XX media company is looking for an AI Content Coordinator who will be responsible for developing and implementing content strategies that leverage artificial intelligence and machine learning technologies to create, curate, and optimise digital content. This is a brand new role that combines expertise in content creation, data analysis, and AI to enhance the overall content quality and user experience.”
The above vacancy, advertised in the first week of February 2024, is one of many going up on job sites branded as “new roles” related to the use of AI. In this case, it is specific to media. The “ideal candidate” will have a Bachelor’s degree in Marketing, Communications, Computer Science, Journalism, or a related field. Skills in Python and SQL (both programming languages) are preferred, as well as “a strong interest in and understanding of AI and machine learning technologies and their applications in content creation and distribution”.
Should you land this job, you won’t be alone. Similar jobs related to managing Large Language Models (LLMs) at media companies are exploding online. Some are for “AI Tone-of-voice Editors”, who streamline the process of refining written communication, making it more effective, coherent, and in line with the desired tone or brand identity. Others include Head of AI and Media, Editorial AI Project Coordinator, and an AI-assisted Reporter post seeking an NCTJ-qualified journalist to expand AI technology across newsrooms in West Yorkshire, using it to create national, local, and hyper-local content.
The New York Times recently hired an editorial director of artificial intelligence initiatives to “establish protocols for the newsroom’s use of AI and examine ways to integrate the technology” into its journalism.
You get the picture. On the one hand, media companies are hiring in droves to manage AI; on the other, the debate continues over how many jobs will be lost because of it.
A white paper by the World Economic Forum (WEF), titled Jobs of Tomorrow: Large Language Models and Jobs and published towards the end of last year, is probably the most comprehensive analysis of this thorny subject. The report provides a structured analysis of the potential direct, near-term impacts of LLMs on jobs. Its base reference is a study concluding that 62% of total work time involves language-based tasks; on that basis, the report found that widespread adoption of LLMs, such as ChatGPT, could significantly affect a broad spectrum of job roles.
The paper is based on an analysis of over 19 000 individual tasks across 867 occupations, assessing each task’s potential exposure to LLM adoption and classifying it as having:
- high potential for automation;
- high potential for augmentation;
- low potential for either; or
- no exposure (non-language tasks).
The paper also provides an overview of new roles that are emerging due to the adoption of LLMs.
The two industries with the highest estimates of total potential exposure to automation and augmentation measures are both segments of financial services:
- Financial services and capital markets; and
- Insurance and pension management.
These sectors are followed by information technology and digital communications, and then media, entertainment, and sports. Specific to media and publishing, the paper predicts that 23% of functions can be automated, 33% augmented, 23% have low potential for either, and 21% consist of non-language tasks.
There is no attempt in the white paper to link these percentages to job losses. Instead, the paper concludes that as generative AI introduces a new paradigm of collaboration between humans and AI, it will redefine how work is done. This will reshape the nature of various job roles.
It is within this new paradigm of collaboration that there is room for job development in several key areas, which include:
- AI Model and Prompt Engineering to “fine-tune LLMs with evolving skill sets, covering algorithm design, custom chip development, server infrastructure, and power systems engineering”. Prompt Engineers will play a critical role in refining prompts for optimal results;
- Interface and Interaction Designers to craft user-friendly interfaces for LLMs, acting as user experience (UX) designers. These designers will need to adapt LLMs to different user inputs or specific tasks, such as developing personalised AI assistants, tutors, or coaches;
- AI Content Creators will build on the work of Interface Designers to use LLM knowledge to rapidly produce in-depth content across various fields, from articles and books to teaching material and media storylines;
- Data Curators and Trainers will ensure the continued quality of training datasets and of LLM performance. They will also monitor data integrity through rigorous quality checks, with a specialised workforce dedicated to curating internet text; and
- Ethics and Governance Specialists to address biases and ethical concerns in LLMs. AI Safety Officers, ethicists, and regulators at company and government levels are expected to play a crucial role in testing and ensuring ethical AI deployment.
While these five “key areas” presented in the WEF paper are general to the wider job market, the AI media jobs currently posted to job boards do not seem to map to any specific development area. The most popular post advertised, “Editorial AI project coordinator”, seemingly covers all of the above. Even The New York Times is vague about the specifics of its new editorial director of AI initiatives role, saying this individual will “work with newsroom leadership to establish principles for how we do and do not use generative AI”.
Drawing on 30 years of newsroom experience, I have done my own analysis to pinpoint a few roles, pairing them with the key development areas highlighted by the WEF. Here’s a starter for ten:
AI Model and Prompt Engineering
AI features assistant. An experienced journalist who collaborates with AI systems to generate creative storytelling ideas, combining human creativity with machine-generated content.
AI content editor. An experienced editor who curates content generated by AI systems, ensuring accuracy, quality, target audience relevance, and adherence to editorial standards.
AI narrative strategist. A prompt engineer who creates a cohesive prompt strategy for generating engaging AI-driven narratives.
Interface and Interaction Designers
UX designer for AI. An experienced graphic designer who helps design user interfaces, ensuring AI-driven content across all company platforms contributes to the user experience and enhances content consumption and interaction.
AI visual designer. A designer who combines graphic design expertise with AI capabilities to create visual and interactive storytelling.
AI interface designer. A designer who creates interfaces that let platform users interact seamlessly with AI-driven content.
AI Content Creators
AI-assisted reporter. A journalist tasked with harnessing the power of language models to generate relevant news articles, reports, analyses, and interview transcripts, and to create content variants for diverse distribution platforms. The role’s main focus is to boost efficiency and creativity.
AI language editor, or AI tone-of-voice editor. An AI subeditor who refines and enhances content produced by language models, ensuring it aligns with the newsroom’s editorial standards and style.
AI innovations editor. An experienced editor who collaborates on developing AI systems to find new ways of storytelling and new ways to engage with content.
Data curators and trainers
Data journalist. A journalist skilled in interpreting and analysing data generated by AI tools, providing insights and trends that inform editorial decisions and content creation.
AI tool trainer. An AI expert responsible for training and fine-tuning AI models specific to a publisher’s needs to ensure high performance and relevance to the target market.
AI content strategist. An editor responsible for using AI to develop content strategies, recommending topics, angles, or supplementary content based on audience preferences and trending themes, and optimising AI output for engagement.
Ethics and Governance Specialists
AI ethics editor. An editor who ensures responsible and ethical use of AI technologies in content creation and distribution.
AI compliance officer. An editor who monitors AI systems for compliance with ethical standards, addressing potential biases and concerns.
AI governance strategist. A specialist on the editorial team who develops strategies and protocols for the ethical use of AI, in line with worldwide industry standards.
If these roles are adopted and their attendant skill sets become part and parcel of the modern newsroom, the logical question is: where will the fresh talent for these jobs be found?
David Caswell, an application architect for AI in news products and workflows, predicts that the most valued skill in an AI-empowered news organisation will likely be the same as in traditionally configured news organisations. “Editorial judgement – the ability to maintain a keen awareness of the deep informational needs of an audience or society, identify stories that meet those deep needs, verify and contextualise those stories, and then communicate them to audiences in clear and engaging forms – will probably remain the foundation of journalism.”
What was that about the more things change…?