Artificial intelligence tools such as ChatGPT, Bing and Bard, to name a few, play an essential role in streamlining work processes across platforms and industries. How can they be used in newsrooms?
When OpenAI released ChatGPT to the public, many people believed it was the beginning of the end for the human workforce. But long before the public release of chatbots, artificial intelligence was already operating in many sectors.
Car manufacturers like Ford, General Motors, Mercedes-Benz and BMW have used collaborative robots on their factory floors for some time. Many cars also use AI for diagnostic testing and navigation.
Modern homes are also inundated with AI, such as Siri, Alexa, Google Chromecast and smart security systems. Banks use fraud-detection tools, businesses use AI for data collection and social media management, and travel agencies have introduced virtual travel bookings, to name a few examples.
AI has been virtually everywhere since the birth of the digital age.
AI tools can streamline work processes
Leveraging advanced natural language processing and machine learning techniques, AIs like ChatGPT, Bing and Bard can quickly analyse vast volumes of data and extract real-time insights and trends.
“These tools can be incredibly time-saving for mundane, routine tasks that take up lots of time, such as data entry, sorting and analysis. What’s the payoff? People will have more time to focus on other facets of their work, like strategising and creative thinking,” says Briefly News chief editor and managing director Rianette Cluley.
Language processing and machine learning artificial intelligence tools can also search and retrieve information from extensive knowledge bases. This can improve decision-making by providing accurate and up-to-date information at a moment’s notice.
The tools can also bridge communication gaps, connecting team members who work remotely and across different time zones.
Undoubtedly, AI will continue to evolve, and its potential to optimise workflows is boundless.
AI has become an integral part of the newsroom. It can speed up lengthy processes such as copy-editing and help with idea generation, sourcing information and experts on a topic, and team communication.
The Briefly News team uses several tools for effective communication. The fully remote newsroom uses Slack for internal communication, and Slack’s add-on features can integrate other tools, such as linking a calendar to notify the team of upcoming meetings.
Work matters, updates and policy changes are discussed on Slack, and team building, celebrations and meetings also run through the app’s features.
Using AI for basic copy-editing can save a ton of time.
An effective tool in the Briefly News newsroom is Grammarly. The app offers free and subscription versions: the free version is perfect for basic spell checks, while the paid option does more in-depth editing and offers more suggestions.
“In our newsroom, Grammarly is an invaluable tool that assists with proofreading the copy we put out to our readers. But we never blindly accept the suggested changes, and our proofreaders still manually check whether the suggestions are warranted,” says Cluley.
Content crawling and analytical tools
In the digital news age, news is published and distributed quickly. To stay up to date with trends, content crawling tools are essential.
These tools crawl social media platforms and RSS feeds for trending or breaking news. Furthermore, content crawling tools also have a search option that can aid in fact-checking and gathering more information from other sources.
Analytical tools, like Google Analytics, are also an important part of digital newsrooms. They help newsrooms identify which content their readers enjoy, how long they spend reading articles and where their traffic comes from.
Chatbots have been the talk of the town since they were made available to the public. Writers, editors, journalists and social media managers fear the worst. The question on everyone’s lips is: Will AI take our jobs?
Cluley recently completed an eight-week training programme, the JournalismAI Academy for Small Newsrooms, supported by the Google News Initiative, where it became apparent that no matter how advanced chatbots become, human input will still be required.
“Job loss was one of the main concerns when we began the training. But by the end of the programme, it became clear to us that human input will be needed, no matter how well a chatbot can write content,” she says.
“Chatbots are great tools for content generation, but human emotion is what sets apart a good article from a brilliant one. The chatbots in circulation at present cannot add the emotional touch to an article like a person can.”
Chatbots are used to generate interview questions, find sources, suggest experts to contact and create content ideas and calendars, among other tasks.
Risks of using AI and policies on its use in organisations are vital
As with any new technology, organisations must identify potential risks and put policies in place to avoid them.
Using AI to generate content with the aim of manipulating search rankings violates Google’s spam policies and will affect an organisation’s ranking in search results.
Google Search’s guidance about AI-generated content states: “Using automation—including AI—to generate content with the primary purpose of manipulating ranking in search results is a violation of our spam policies.”
Essentially, Google can downgrade or even ban a website. To rank well in Google Search, creators should produce original, high-quality, people-first content.
“Having clear policies in place for using AI in your organisation is imperative to ensure it is not abused. Briefly News has very clear policies on using AI to create articles. Violating these rules will result in disciplinary measures and, in severe cases, termination of employment,” Cluley adds.
What Briefly News does and doesn’t do when using AI
The policy document clearly indicates the do’s and don’ts of using AI in the newsroom.
Writers are not allowed to:
- Generate articles using AI, even partially
- Rely solely on AI research without conducting their own fact-checking and verifying sources
Writers must:
- Experiment with the prompts they give AI; the final result depends on how well a writer crafts the prompt
- Be mindful that chatbots sometimes provide inaccurate or fabricated data (known as hallucinations)
- Never blindly trust the references chatbots give; some chatbots will cite statistics, for instance, but cannot source them, and these are often out of date (Bing, though, provides real references at the end of an answer)
- Always proofread and fact-check the information a chatbot gives
What does the future hold for content creators in the AI era?
The saying goes: adapt or die. And this could not be truer in the AI era. Newsrooms and organisations that don’t embrace AI’s possibilities will be left behind.
“Focusing on the potential threats of AI may sustain a business for some time, but real growth will only happen once we look beyond the threats.
“So, embrace AI’s boundless potential to create a future for your business,” says Yelena Boginskaya, co-CEO of Legit, Briefly News’ parent company.