Not Live Status

This report explores why websites' status changes from “Live” to “Not Live” in Google Publisher Center. The study investigates common signals, such as title length, authority, indexed URLs, content language, transparency, and content quality, and their impact on a website's status.

The status of a website in Google Publisher Center can greatly influence its visibility and accessibility to users. A “Live” status means the publication page is available to users, while a “Not Live” status indicates the publication page is not accessible. This study seeks to identify the reasons behind this change in status, focusing on the common signs that can trigger such changes.

A sample of websites with a history of changing from “Live” to “Not Live” status was analyzed. We collected data from the Google News help community and analyzed it with Netpeak Checker. The following factors were considered for each website: home page title length, authority metrics (Majestic’s Trust Flow and Moz’s Domain Authority), indexed URLs in Google Search Engine Results Pages (SERP), content language, transparency, and content quality.

Main reasons for “Not Live” status

Shifted to “Not Live” Status


Google’s algorithms could not confirm a reliable reputation for any of the domains we selected.

When Google looks for a description of a website via the “About source” feature and cannot find a good match, it means that no notable, reputable sources have published information about the website that could provide insight into its reputation. This matters to Google because it wants to provide accurate, useful information to its users.

Read more: Google’s “Topic Authority” System

Based on the “First indexed by Google” feature, which measures the number of years a website has been indexed by Google, we found that the spammy news sites in our sample have a shorter history of being indexed. This suggests that these sites are either new or have low-quality content.

Title Length:
It was observed that the median title length was 44 characters, and the maximum was under 70 characters, while ideal headlines range between 60 and 100 characters. Inadequate title lengths may reduce visibility and click-through rates, which can affect a site’s status. That said, similarities in the length of home page headers are likely coincidental, so we recommend focusing on other signals.
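The thresholds above can be turned into a simple check. This is an illustrative sketch only: the 60–100 character “ideal” range is taken from this article, not from any Google guideline, and the sample titles are made up.

```python
# Illustrative sketch: classify home page title lengths against the
# ranges discussed above (observed median 44, ideal 60-100 characters).
# Thresholds come from this article, not from any Google documentation.

def classify_title_length(title: str) -> str:
    """Return a rough verdict on a headline's length."""
    n = len(title)
    if n < 60:
        return "short"   # below the 60-100 range cited as ideal
    if n <= 100:
        return "ideal"
    return "long"        # likely to be truncated in search results

# Hypothetical sample titles, not taken from the study's data
titles = [
    "Breaking News Today",
    "How Local Newsrooms Adapted to a Decade of Platform Shifts",
]
for t in titles:
    print(f"{len(t):3d}  {classify_title_length(t)}  {t}")
```

A batch version of this over a site's home page titles would reproduce the median and maximum figures reported above.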

Lack of Authority:
Google News rankings are also based on authority, so a news site with little authority will not appear in them. Authority is a measure of how trustworthy a website is, determined by a number of factors, including the number of links to the site from other authoritative sites, the quality of the content on the site, and the age of the site. You can check your site’s authority with a tool such as Moz (Domain Authority) or Majestic (Trust Flow).

Websites with spammy link profiles

Most of the sites that received “Not Live” status were created in 2020 or later and in most cases had low authority scores, with a Trust Flow of 6 or less. Low authority scores can indicate a lack of credibility or trustworthiness, which may lead to a change in status to “Not Live.”

Trust Flow of “Not Live” sites.

Websites with spammy link profiles might have a lower Trust Flow, higher Citation Flow, and a higher number of low-quality backlinks and referring domains.

Indexed URLs:
Websites in the sample had fewer than 1500 indexed URLs in Google SERP. Limited indexed URLs can be a sign of low online visibility or insufficient content, which may result in a change in status. Pages indexed within the past 2 years have a slightly higher spam score. Spam score has a moderate negative correlation of -0.43 with Google Indexed URLs. Websites that are indexed more tend to have lower spam scores, as Google is less likely to index truly spammy pages.

Content Language and Country DNS:
Most sites in the sample used English as their content language and had a US country DNS. In our view, this suggests that sites with English content aimed at US audiences were the most common among those whose status changed to “Not Live.”

Lack of Transparency:
In the context of Google’s content policies, “transparency” refers to the requirement that publishers must be open and honest about their sources and methods. This means that they must disclose the sources of the information they publish and must provide evidence to support the accuracy and reliability of their reporting.

Transparency is an important aspect of journalistic integrity and helps ensure that the information included in Google’s news experiences is accurate and reliable. It also helps users understand the context and sources of the information they are reading, and allows them to make informed decisions about the credibility of the information.

The sampled websites exhibited a lack of transparency in their:

  • Dates and bylines
  • Information about the authors, publication, and publisher
  • Information about the company or network behind the content
  • Contact information

Transparency is essential for establishing trust with users, and its absence can negatively impact a website’s credibility and status.

Lack of Quality Content:
Google News rankings are based on the quality of the content, so if a news site does not have high-quality content, it will not appear in Google News rankings.

Publishers that deliver informative, timely, original, and relevant content that adheres to Google’s content policies are eligible for consideration in Google’s news surfaces.

The websites in the sample were found to have poor content quality. In many cases, this is non-news content that has been created by artificial intelligence without any proper verification or human effort.

Using automation, including AI, to produce content for the primary purpose of manipulating search rankings is a violation of Google’s spam policies. If you use automation extensively to produce low-quality content on many topics, your content may be seen as search engine-first rather than people-first, which is not aligned with what Google’s ranking systems seek to reward. This can result in a drop in search engine rankings and a decrease in traffic to your site. It’s important to create helpful, reliable, people-first content that aligns with Google’s E-A-T principles to avoid these consequences.

Poor quality elements in website design can indicate poor site and content quality. This may include low-resolution images, unappealing color combinations, inappropriate fonts, unsuitable element arrangement, or an abundance of elements that obstruct user navigation, slow site loading, or cause confusion. For instance, an excess of advertising banners and pop-ups that disturb users could signify poor site quality. Low-resolution images that are too small to view may cause a user to question the content’s quality. Moreover, if the color schemes on a site differ drastically and impede readability, users may experience accessibility issues.

High-quality content is vital for user engagement and search engine optimization. Poor content quality can significantly influence a website’s status in the Publisher Center. If you don’t know how to improve the quality of your content, read Creating helpful, reliable, people-first content.

Evaluating content quality is partly subjective, shaped by the biases of individual reviewers. Nonetheless, our data suggests there are objective criteria by which the quality of a piece of content can be assessed. These factors are as follows:

Veracity: Content must be factually accurate, free of errors, and supported by credible sources and data.

Pertinence: Content must address the topic at hand and the interests of its target audience, offering genuine value and answering readers’ questions or concerns.

Lucidity: High-quality content is comprehensible and well-structured, uses proper grammar and spelling, and employs clear, concise language.

Immersion: Engaging content captures its audience and is presented in a visually appealing way, using multimedia elements such as images, videos, and infographics to enrich the reader’s experience.

Expertise: Reputable content is attributable to an author with authority and expertise in the relevant field, reflects thorough research, and presents unique insights or viewpoints.

By weighing these facets, the quality of content can be assessed. It is crucial, however, to acknowledge that any such judgment remains tied to the specific context and objectives the content is meant to serve.


This study revealed that factors such as authority (trust), transparency, and content quality play a crucial role in determining a website’s status in Google Publisher Center. By addressing these factors, website owners can improve their chances of maintaining a “Live” status and ensure their publication pages remain accessible and visible to users.

Google’s algorithms also analyze the overall search patterns and behavior of Google users to determine which news brands and sites people actively search out to stay informed. The sites that receive the most news-related search interest and traffic are more likely to end up as news sources in Google News.

Google’s news aggregation service, Google News, does not include all news sites and sources on the internet. It only includes news stories from a subset of sites that meet certain criteria.

The main criteria are:

  • The sites are popular news sources that many Google News readers already frequently visit to get news. For example, well-known mainstream media sites like The New York Times, The Washington Post, BBC, CNN, etc. These are sites people specifically search for to find news reports and coverage.
  • The sites publish a high volume of original news stories and reporting. Google News prefers sites that generate a lot of their own journalistic content, rather than just aggregating news from elsewhere or focusing on opinion and commentary.
  • The sites follow certain quality guidelines like proper news writing style, source transparency, accuracy, and objectivity. Google News aims to filter out low-quality “news” sites that publish false stories, conspiracy theories, or very biased propaganda.

Google News also has an open submission process for news sites and blogs to be added to the platform. But the sites still have to meet the overall quality guidelines and also achieve a level of readership and prominence to be permanently included.

So in summary, Google News only aggregates from an algorithmically selected set of news sites that focus on real journalism and reporting, and leaves out many smaller or questionable sources in an effort to prioritize accuracy, objectivity, and reader interest.

By John Morris

John Morris is an experienced writer and editor, specializing in AI, machine learning, and science education. He is the Editor-in-Chief at Vproexpert, a reputable site dedicated to these topics. Morris has over five years of experience in the field and is recognized for his expertise in content strategy. You can reach him at [email protected].
