
A Developer's Guide to Mastering Google's Indexing API and News Sitemaps with Python


As developers, we know that shipping is only half the battle. If Google doesn't discover our content, it might as well not exist. For time-sensitive content like news, documentation updates, or blog posts, relying on Google's standard crawl cycle is inefficient.

The professional solution is a dual-pronged approach: a push notification via the Indexing API and a structured data source via a News Sitemap. Here's a technical breakdown of how to implement both.

[Diagram: the complementary "push" of the Google Indexing API and "pull" of the XML News Sitemap for rapid content discovery by Google.]

1. The News Sitemap: The Source of Truth

First, you need a properly formatted News Sitemap. This XML file lists articles published in the last 48 hours and is a primary signal for inclusion in Google News.

Here’s a valid structure. Note the required <news:news> tags and the W3C date format.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://news.seosiri.com/your-new-article-url</loc>
    <news:news>
      <news:publication>
        <news:name>Your Publication Name</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2025-11-06T13:00:00Z</news:publication_date>
      <news:title>Your Article Title</news:title>
    </news:news>
  </url>
</urlset>
```
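Rather than templating this XML by hand, you can generate it from your article data. Here is a minimal sketch using only the Python standard library; the article tuples, publication name, and language default are placeholders for your own data:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
NEWS_NS = "http://www.google.com/schemas/sitemap-news/0.9"

def build_news_sitemap(articles, publication="Your Publication Name", language="en"):
    """Build a News Sitemap string from (url, title, w3c_datetime) tuples."""
    # Register prefixes so the output uses the conventional default
    # namespace plus the "news:" prefix.
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("news", NEWS_NS)

    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for url, title, published in articles:
        entry = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = url
        news = ET.SubElement(entry, f"{{{NEWS_NS}}}news")
        pub = ET.SubElement(news, f"{{{NEWS_NS}}}publication")
        ET.SubElement(pub, f"{{{NEWS_NS}}}name").text = publication
        ET.SubElement(pub, f"{{{NEWS_NS}}}language").text = language
        ET.SubElement(news, f"{{{NEWS_NS}}}publication_date").text = published
        ET.SubElement(news, f"{{{NEWS_NS}}}title").text = title

    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))
```

Regenerating this file on every publish, keeping only articles from the last 48 hours, keeps the sitemap compliant without manual edits.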

2. The Indexing API: The Real-Time Trigger

Once an article is in your sitemap, notify Google immediately via the Indexing API. This requires a Google Cloud service account added as an Owner of your property in Google Search Console. Note that Google officially documents the Indexing API for pages containing JobPosting or BroadcastEvent structured data, so treat it as a supplement to your sitemap rather than a replacement.

You send an authenticated HTTP POST request to the API endpoint. Here is a simple Python script that handles a single URL (it requires the google-api-python-client and google-auth packages) and can be adapted to submit URLs in batch.

```python
import sys

from google.oauth2 import service_account
from googleapiclient.discovery import build

# --- Configuration ---
# Path to your service account JSON key file.
KEY_FILE = 'service_account.json'
SCOPES = ['https://www.googleapis.com/auth/indexing']
# --- End of Configuration ---

def main():
    # Expects the URL as a command-line argument.
    if len(sys.argv) < 2:
        print("Usage: python submit_url.py <your-url>")
        return

    url_to_submit = sys.argv[1]
    print(f"Submitting URL: {url_to_submit}")

    # Create credentials from the service account key file.
    # (google-auth replaces the long-deprecated oauth2client library.)
    creds = service_account.Credentials.from_service_account_file(
        KEY_FILE, scopes=SCOPES)

    # Build the Indexing API service object.
    service = build('indexing', 'v3', credentials=creds)

    # URL_UPDATED tells Google the page is new or has changed;
    # use URL_DELETED when a page is removed.
    request_body = {
        'url': url_to_submit,
        'type': 'URL_UPDATED'
    }

    try:
        # Execute the API request.
        response = service.urlNotifications().publish(body=request_body).execute()
        print("API Response:", response)
    except Exception as e:
        print(f"An error occurred: {e}")

if __name__ == '__main__':
    main()
```
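To adapt this for batch submission, the googleapiclient library offers `BatchHttpRequest`, pointed at the Indexing API's dedicated batch endpoint. A minimal sketch, assuming the same service account key; the key-file path and URL list are placeholders, and the helper that builds request bodies is kept separate so it works without credentials:

```python
def build_notifications(urls, notification_type="URL_UPDATED"):
    """Turn a list of URLs into Indexing API notification bodies."""
    return [{"url": u, "type": notification_type} for u in urls]

def submit_batch(urls, key_file="service_account.json"):
    """Submit many URLs in a single batched HTTP round trip."""
    # Imported here so the pure helper above is usable without
    # the client libraries installed.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build
    from googleapiclient.http import BatchHttpRequest

    creds = service_account.Credentials.from_service_account_file(
        key_file, scopes=["https://www.googleapis.com/auth/indexing"])
    service = build("indexing", "v3", credentials=creds)

    # The Indexing API uses its own batch endpoint, not the
    # deprecated global www.googleapis.com/batch one.
    batch = BatchHttpRequest(
        callback=lambda request_id, response, error: print(
            request_id, error or response),
        batch_uri="https://indexing.googleapis.com/batch")

    for body in build_notifications(urls):
        batch.add(service.urlNotifications().publish(body=body))
    batch.execute()
```

Each sub-request still counts against your daily publish quota, so batching saves round trips, not quota.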

A Necessary Integration

By combining a persistent News Sitemap with real-time API calls, you provide Google with the two signals it needs most: authority and timeliness. This integrated approach is the definitive best practice for any developer working on a site that publishes time-sensitive content.
