News5 Cleveland outlines AI rules limiting tools to assistance, with human oversight and on-air disclosure

News5 Cleveland describes an AI policy focused on support functions, not automated reporting
News5 Cleveland says it has adopted internal rules limiting how artificial intelligence can be used in its newsroom, framing the technology as a tool for efficiency and quality control rather than a substitute for reporting. Station leaders summarized the approach with a central principle: "enhance, not create."
The station operates as WEWS in Cleveland and is owned by The E.W. Scripps Company. Newsroom leadership says the policy is tied to companywide governance that reviews and approves permitted AI uses, including specific AI "agents," before they can be deployed in day-to-day journalism work.
Only approved tools are permitted; internal portal organizes AI applications
News5 says journalists are restricted to using an approved internal AI platform rather than turning to outside tools on their own. The station described this system as an internal portal that provides access to a menu of applications, including mainstream AI products as well as proprietary agents created for newsroom use.
One of the tools demonstrated by newsroom managers is designed to check scripts and other text against the organization’s ethics and style expectations as a quality-control step before editorial review. News5 also described using AI-based assistance for tasks such as catching grammatical issues and digital style errors in web publishing workflows.
What News5 says it does not do: AI-generated journalism, with limited exceptions for demonstrations
News5 leadership says the newsroom does not routinely use AI to generate journalism content such as full articles, images, video, or other publishable material. It described limited exceptions in which AI-generated material may be created for coverage that shows audiences how the technology can replicate a person's voice or otherwise produce realistic synthetic media. In those instances, News5 says it discloses that the demonstration content was AI-generated.
News5 leaders described AI as a tool used to support journalism workflows while keeping reporting and editorial decisions in human hands.
Disclosure remains a trust issue for news audiences
Audience research in the journalism industry has found strong demand for transparency about AI use, alongside persistent skepticism. Surveys have shown that many news consumers want disclosure when AI is used, even as a substantial share report reduced trust after seeing such disclosures. Other national polling has found that many Americans who encounter news through AI chatbots or AI-generated summaries say it can be difficult to determine what is true, and that they sometimes see information they believe is inaccurate.
Regional context: other Northeast Ohio outlets are still developing policies
Across Northeast Ohio, newsroom approaches vary. Some organizations say they are not currently using AI in their news production but are developing policies. Others have declined to discuss internal practices publicly. The uneven disclosure reflects a broader industry challenge: balancing experimentation with new tools against the need to preserve accuracy, accountability, and public trust.
- News5 says AI use is restricted to approved tools and governed by company oversight.
- The station describes AI as a support mechanism for editing and workflow tasks, not automated reporting.
- When AI-generated demonstrations are used to explain synthetic media risks, News5 says it discloses that content.