How to use Google Autocomplete API and Places API for Keyword Suggestions with Python


Keyword suggestions are one of the quickest, lowest-effort ways to incorporate query semantics into your research, and specifically to understand the query paths, refinements, and augmentations Google suggests for the terms you’re analysing. They help you uncover not only the top searches related to your seed terms, but also what Google suggests as the next steps in the user’s search journey. In the process, you’ll also stumble onto key n-grams (keyword patterns), entities, and other characteristics that can give you an alternative, enhanced view of your research topics.

All of this can, of course, help you build a better keyword universe, and in turn enable you to understand these queries more comprehensively and craft content that better resonates with your audience.

In this guide, I’ll walk you through three different methods that I use to gather keyword suggestions:

  • Google Search Autosuggest,
  • YouTube Search Autosuggest, and finally,
  • Query and Place Autosuggest using the Places API.

Each method has its own strengths and best-fit use cases, and together they can become a great source of new terms to consider when conducting keyword research.

Rest assured – this tutorial is completely beginner-friendly, and all the code mentioned will be provided in a handy Google Colab Template for you to get started with it straight away.

✨ If you’re interested in taking your keyword research to the next level, consider my course on Semantic ML-enabled Keyword Research, out now on the MLforSEO Academy.

About Google’s Query Autocomplete APIs

Google offers several query autocomplete features, designed to predict and display search queries as you type into the search bar. The user-facing product (autosuggest) is a quick and intuitive way for users to find the information they are looking for faster, but for us, organic marketers, it can be a goldmine of insights into real user search behavior and query semantics.

Autocomplete is a feature that predicts the rest of a query as the user is entering it, which has several benefits:

  • improves the user search experience
  • accelerates the shopping process before checkout
  • improves the search response quality
  • creates higher revenue by providing well-formatted queries.

How autocomplete and predictive text models work

Autocomplete and predictive text technologies predict and complete words as users type, improving speed and efficiency. The concept originated in the 1950s as a way to make Chinese typewriters more efficient, and its development eventually influenced modern word processors and machine learning applications like Google Search. Initially created to help people with disabilities, these models are now commonly used to assist all types of users, especially in fields with specialized vocabulary like medicine.

The basic concept involves language modeling, where the system predicts the next word based on the letters typed. For example, if a user types “med”, the model might predict “medical” or “medicine”. These predictions are driven by statistical probabilities, considering which words or phrases are most likely to follow based on previous inputs.
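
As a toy illustration of the idea (not how Google’s production system works), here is a minimal prefix-based predictor that ranks candidate completions by how often they appear in a small corpus; the corpus and counts are made up for the example:

from collections import Counter

# Tiny, made-up corpus of previously typed queries
corpus = [
    "medical insurance", "medicine for headache", "medical school",
    "media player", "medicine ball", "medical insurance quotes",
]

# Count how often each word appears across the corpus
word_counts = Counter(word for query in corpus for word in query.split())

def predict(prefix, top_n=3):
    """Return the most frequent known words that start with the typed prefix."""
    candidates = {word: count for word, count in word_counts.items()
                  if word.startswith(prefix)}
    return [word for word, _ in Counter(candidates).most_common(top_n)]

print(predict("med"))  # ['medical', 'medicine', 'media']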

In addition, a “frecency” model is often employed, which prioritizes words the user has typed frequently or recently. This is especially useful for organic marketers: suggestions are tailored based on past query formulation and the implicit user feedback Google collects, which is a way to incorporate real user search behaviour into semantic keyword research. Over time, via reinforcement learning, these models learn from the user’s input, adapting to their unique language patterns.
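
To make the “frecency” idea concrete, here is a hedged sketch of one way such a score could be computed, weighting each candidate by how often and how recently it was used; the decay rate and usage data are illustrative assumptions, not Google’s actual formula:

import math
import time

# Hypothetical usage log: (query, unix timestamp of last use, total use count)
usage_log = [
    ("medical insurance", time.time() - 3600, 12),     # used an hour ago
    ("medicine ball", time.time() - 86400 * 30, 40),   # used a month ago
    ("media player", time.time() - 86400 * 2, 5),      # used two days ago
]

def frecency(count, last_used, half_life_days=7):
    """Frequency weighted by an exponential recency decay (illustrative formula)."""
    age_days = (time.time() - last_used) / 86400
    return count * math.exp(-math.log(2) * age_days / half_life_days)

# Rank queries so that frequent AND recent ones float to the top
ranked = sorted(usage_log, key=lambda row: frecency(row[2], row[1]), reverse=True)
print([query for query, _, _ in ranked])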

Google’s APIs and endpoints that can be used for autocomplete suggestions

Here are the different APIs and API endpoints that can be used to collect query autocompletions:

  • the unofficial Google Search suggest endpoint (suggestqueries.google.com/complete/search with client=firefox), used for Google Search autosuggest;
  • the same endpoint with client=youtube and ds=yt, used for YouTube Search autosuggest;
  • the Query Autocomplete and Place Autocomplete services of the Places API (Google Maps Platform);
  • the autocomplete feature of Vertex AI Search for Retail, covered briefly at the end of this guide.

How to get Google Search, YouTube Search, and Google Maps query autocomplete keyword suggestions in Google Colab (Python)

In the following sections, I’ll show you several use cases for implementing text prediction technology (autocomplete) via the different endpoints and APIs that Google offers, with the aim of enhancing your keyword collection and keyword research efforts in digital marketing.

Make a copy of Google’s Autocomplete APIs and Endpoints – Keyword Research for Marketers Use Cases – Google Colab Template by Lazarina Stoy for MLforSEO, as that’s what we’ll be working with for the rest of this guide.

Prerequisites

There really isn’t much you need to get started working with these APIs. But the basics apply:

  • Google Account: Ensure you have a Google account for accessing Google Colab.
  • Google Colab: Get familiar with Google Colab as it will be the platform for executing Python code.
  • Basic Python Knowledge: Have a fundamental understanding of Python programming to follow the code snippets and their execution.

How to get Google Search Autocomplete Keywords from a list of seed keywords

Inspired by this approach, demoed by Michael Van Den Reym, I created a function that lets you input a list of keywords in a CSV file and get an export of your seed keywords, mapped to the Google Autocomplete keyword suggestions, with a shared characteristic of the suggested keywords for each term in a third column.

The goal is to generate a comprehensive list of keyword suggestions based on user-provided seed keywords and organize them into clusters. Each suggestion is linked back to its seed keyword for easy analysis.

By clustering these suggestions (clusters are built in the function, based on common n-grams from the suggestions Google gives), you can identify common characteristics for terms that may not be directly related to your seed keyword but are frequently associated with it. This is particularly useful for uncovering missed opportunities. For instance, when entering “business loan,” Google might suggest related terms like “calculators” or “credit scores.”

This script is great for:

  • SEO professionals and organic search marketers, to build better semantic keyword universes and get data-backed suggestions on related keywords
  • digital marketers, to identify popular search terms and improve things like ad or content targeting
  • content creators, to identify long-tail keywords for blog posts, videos, and other media.

To use the script:

  1. Make a copy of the Google Colab.
  2. Prepare a CSV file with a column named ‘Keywords’ containing your seed keywords.
  3. Run the relevant function in the notebook, and upload the file when prompted.
  4. The script will process the keywords, generate suggestions, and download a clustered keyword file.

Below is the Python code for this section, if you’d like to edit anything in it. You can modify the script to adjust how keywords are clustered, increase the batch size, or refine the stop word list. By doing this, you can scale up your research, making it easier to explore potential keyword opportunities, including semantic clusters, keyword n-grams, or entities.

Python code for this section of the guide: How to get Google Search Autocomplete Keywords from a list of seed keywords
## Step 1: Upload Keyword File

import pandas as pd
import requests
import json
import time
import string
from nltk.tokenize import word_tokenize
from collections import Counter
from google.colab import files
import nltk

# Download necessary NLTK tokenizer data
nltk.download('punkt')
nltk.download('punkt_tab')  # required by word_tokenize in newer NLTK versions

# Prompt user to upload a CSV file with keywords
uploaded = files.upload()
df = pd.read_csv(list(uploaded.keys())[0])  # Read the uploaded file
keywords = df['Keywords'].tolist()  # Assume the column is named 'Keywords'

## Step 2: Define Helper Functions

# Basic stopwords list for English
basic_stopwords = {
    'i', 'me', 'my', 'myself', 'we', 'our', 'ours', 'ourselves', 'you',
    'your', 'yours', 'yourself', 'yourselves', 'he', 'him', 'his', 'himself',
    'she', 'her', 'hers', 'herself', 'it', 'its', 'itself', 'they', 'them',
    'their', 'theirs', 'themselves', 'what', 'which', 'who', 'whom', 'this',
    'that', 'these', 'those', 'am', 'is', 'are', 'was', 'were', 'be', 'been',
    'being', 'have', 'has', 'had', 'having', 'do', 'does', 'did', 'doing',
    'a', 'an', 'the', 'and', 'but', 'if', 'or', 'because', 'as', 'until',
    'while', 'of', 'at', 'by', 'for', 'with', 'about', 'against', 'between',
    'into', 'through', 'during', 'before', 'after', 'above', 'below', 'to',
    'from', 'up', 'down', 'in', 'out', 'on', 'off', 'over', 'under', 'again',
    'further', 'then', 'once', 'here', 'there', 'when', 'where', 'why',
    'how', 'all', 'any', 'both', 'each', 'few', 'more', 'most', 'other',
    'some', 'such', 'no', 'nor', 'not', 'only', 'own', 'same', 'so', 'than',
    'too', 'very', 's', 't', 'can', 'will', 'just', 'don', 'should', 'now'
}

def get_google_suggestions(keyword, lang_code, letterlist):
    """Fetch suggestions for a given keyword and language from Google Suggest."""
    suggestions = []
    headers = {'User-agent': 'Mozilla/5.0'}
    for letter in letterlist:
        URL = f"http://suggestqueries.google.com/complete/search?client=firefox&hl={lang_code}&q={keyword} {letter}"
        response = requests.get(URL, headers=headers)
        result = json.loads(response.content.decode('utf-8'))
        if result:
            suggestions.extend(result[1])
        time.sleep(0.5)  # Reduced sleep for faster processing
    return suggestions

def clean_and_cluster_suggestions(all_suggestions, stop_words, seed_words):
    """Clean suggestions by removing stopwords and tokenize them for clustering."""
    wordlist = []
    for suggestion in all_suggestions:
        words = word_tokenize(str(suggestion).lower())
        for word in words:
            if word not in stop_words and word not in seed_words and len(word) > 1:
                wordlist.append(word)
    return [word for word, count in Counter(wordlist).most_common(200)]

## Step 3: Process Keywords in Batches

lang_code = "en"  # Language code
batch_size = 5
letterlist = [""] + list(string.ascii_lowercase)  # Include empty and alphabetical combinations
all_clusters = []

# Process keywords in batches
for i in range(0, len(keywords), batch_size):
    batch_keywords = keywords[i:i + batch_size]

    # Filter out empty keywords and tokenize seed words
    batch_keywords = list(filter(None, batch_keywords))
    seed_words = [word_tokenize(keyword.lower()) for keyword in batch_keywords]
    seed_words = [item for sublist in seed_words for item in sublist]  # Flatten the list

    # Get suggestions for each keyword in the batch
    for keyword in batch_keywords:
        suggestions = get_google_suggestions(keyword, lang_code, letterlist)
        most_common_words = clean_and_cluster_suggestions(suggestions, basic_stopwords, seed_words)

        # Assign suggestions and common words to their seed keyword
        for common_word in most_common_words:
            for suggestion in suggestions:
                if common_word in suggestion:
                    all_clusters.append([suggestion, common_word, keyword])  # Include the seed keyword here

## Step 4: Save and Download the Result

cluster_df = pd.DataFrame(all_clusters, columns=['Keyword', 'Cluster', 'Seed Keyword'])
cluster_df.to_csv("keywords_clustered.csv", index=False)
files.download("keywords_clustered.csv")
cluster_df
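
As an example of the kind of edits mentioned above, here is a hedged sketch of a few tweaks you could apply to the script (it assumes you keep the variable names used above; the values are only illustrations):

# Process more keywords per batch (larger batches share one seed-word list)
batch_size = 20

# Add domain-specific words you don't want treated as cluster labels
basic_stopwords.update({'near', 'best', 'free', '2024'})

# Query fewer letter variations to speed things up (empty string + vowels only)
letterlist = [""] + list("aeiou")

# To keep more (or fewer) candidate cluster words than the default 200,
# change the `most_common(200)` call inside clean_and_cluster_suggestions.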

How to get YouTube Search Autocomplete Keywords from a list of seed keywords

Extending the functionality into YouTube Search as well, I created a function that enables you to input a list of seed keywords in a CSV file and automatically fetch YouTube’s autocomplete suggestions for each term. The result is an export of your original keywords, along with a set of related suggestions retrieved from YouTube’s search engine, all neatly organized into a downloadable file.


For each seed keyword you provide, the script pulls YouTube’s autocomplete suggestions, links them back to the original keyword, and formats them into an easy-to-analyze CSV file. By organizing the suggestions this way, you can also spot common patterns or emerging search trends that might not be obvious at first glance.

For example, if you input a seed keyword like “funny videos,” the script will return related search queries like “funny videos 2024,” “funny videos for kids,” and “funny videos try not to laugh.” This allows you to quickly spot key video content ideas, helping you create videos or blog posts around these popular search terms.

This script is great for:

  • YouTube Content Creators who want to identify trending topics and optimize their video titles, descriptions, and tags for maximum reach.
  • Digital Marketers looking for data-backed suggestions on keywords to improve ad targeting or to boost content discovery on YouTube.
  • SEO Professionals aiming to uncover long-tail keywords and related search terms for better YouTube video optimization.
  • Businesses that want to gain insights into customer preferences through YouTube’s autocomplete data.

How to Use the Script:

  1. Make a copy of the Google Colab notebook to get started.
  2. Prepare a CSV file with a column titled ‘Keywords’ containing your seed keywords.
  3. Run the provided function in the notebook and upload the file when prompted.
  4. The script will process your keywords, fetch YouTube’s autocomplete suggestions, and generate a CSV file with the results.
  5. Download the CSV file, which will contain your seed keywords, and the related YouTube suggestions organized for easy analysis.

Below is the Python code for this section, if you’d like to edit anything in it. Here are some quickfire ideas for editing the script:

  • merge it with the first function to automatically pull autosuggest keywords from both Google Search and YouTube Search and highlight the overlaps (a sketch of this is shown after the code block below)
  • incorporate clusters based on shared keyword characteristics of the suggestions
  • incorporate entity-based clusters

Python code for this section of the guide: How to get YouTube Search Autocomplete Keywords from a list of seed keywords
import requests
import pandas as pd
import json
import time
from google.colab import files

def get_youtube_suggestions(keyword):
    """
    Fetch YouTube autocomplete suggestions for a given keyword.
    
    Parameters:
    keyword (str): The search query string.
    
    Returns:
    list of tuples: Each tuple contains the seed keyword and its suggestion.
    """
    suggestions = []
    try:
        url = "https://suggestqueries.google.com/complete/search"
        params = {
            'client': 'youtube',
            'ds': 'yt',
            'q': keyword,
            'hl': 'en'
        }
        response = requests.get(url, params=params)
        response.raise_for_status()
        raw_data = response.text
        
        # Extract JSON-like content from the JavaScript response
        start = raw_data.find('[')
        end = raw_data.rfind(']') + 1
        json_data = json.loads(raw_data[start:end])
        
        # Process suggestions
        seed_keyword = json_data[0]
        for item in json_data[1]:
            suggestions.append((seed_keyword, item[0]))
    except Exception as e:
        print(f"Error fetching suggestions for '{keyword}': {e}")
    
    return suggestions

def process_keywords(file_path):
    """
    Process a list of keywords from an uploaded file and fetch YouTube suggestions.
    
    Parameters:
    file_path (str): Path to the uploaded CSV file containing a 'Keywords' column.
    
    Returns:
    DataFrame: A DataFrame containing the seed keywords and their suggestions.
    """
    df = pd.read_csv(file_path)
    print("Uploaded file columns:", df.columns)  # Debug: Print column names
    
    if 'Keywords' not in df.columns:
        raise ValueError("The uploaded file must contain a 'Keywords' column.")
    
    all_suggestions = []
    
    for keyword in df['Keywords'].dropna():
        suggestions = get_youtube_suggestions(keyword)
        all_suggestions.extend(suggestions)
        time.sleep(0.5)  # To prevent hitting rate limits
    
    result_df = pd.DataFrame(all_suggestions, columns=['Seed Keyword', 'Suggestion'])
    return result_df

# Step 1: Upload the keywords file
uploaded = files.upload()
file_path = next(iter(uploaded.keys()))

# Step 2: Process the keywords and fetch suggestions
try:
    suggestions_df = process_keywords(file_path)

    # Step 3: Save and download the results
    output_file = "youtube_autosuggestions.csv"
    suggestions_df.to_csv(output_file, index=False)
    files.download(output_file)

    # Display the first few rows of the DataFrame
    print(suggestions_df.head())
except Exception as e:
    print(f"Error: {e}")

How to get Query and Place Autocomplete (Places API) Keyword Suggestions from a list of seed keywords

Google’s Query Autocomplete service helps users find location-based suggestions as they type. By using the Places API, you can fetch autocomplete suggestions for geographical searches, such as typing the partial query “pizza near” and getting location-completed suggestions like “pizza near Paris” or “pizza near Disneyland.”

Key features of this API that make it so useful for keyword research:

  • Provides query predictions for geographical searches.
  • Returns suggestions in real-time based on partial input.
  • Can include geographic details, like addresses and locations.
  • Supports language customization for better local results.

Here’s a summary of how it works.

A Query Autocomplete request is made via a URL, where users provide an input (search term) and, optionally, other parameters like language, location, and radius. The API returns location-based suggestions based on user input.

Example request format:

https://maps.googleapis.com/maps/api/place/queryautocomplete/json?input=your_search_term&key=YOUR_API_KEY

You can specify additional parameters, such as:

  • language: Set the desired language for results.
  • location: Bias results to a certain location.
  • radius: Define the distance within which to return results.
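
For example, a request biased towards results within 50 km of central London might look like the following (the coordinates and key are placeholders):

https://maps.googleapis.com/maps/api/place/queryautocomplete/json?input=pizza+near&language=en&location=51.5074,-0.1278&radius=50000&key=YOUR_API_KEY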

There are different functions in the notebook for the Query Autocomplete module and the Place Autocomplete module (see below).


Note – there is a prerequisite for using this API and function. In the video lesson associated with this tutorial, I’ve demonstrated how to enable the API in a Google Cloud project (one that already has billing enabled) and how to get your API key. Also, have your answers ready for the language code, location, and search radius prompts.

Python code for this section of the guide: How to get Query and Place Autocomplete (Places API) Keyword Suggestions from a list of seed keywords
import requests
import pandas as pd
import json
import time
from google.colab import files

def get_place_autocomplete_suggestions(input_keyword, api_key, language='en', location=None, radius=50000):
    """
    Fetch Google Places Query Autocomplete suggestions for a given input keyword.
    
    Parameters:
    input_keyword (str): The input text string for autocomplete.
    api_key (str): The Google API key.
    language (str): The language for the query results (default is 'en').
    location (tuple): Latitude and longitude to bias the search (optional).
    radius (int): Search radius in meters (default is 50,000 meters).
    
    Returns:
    list of tuples: Each tuple contains the input keyword and a predicted place description.
    """
    suggestions = []
    
    try:
        url = "https://maps.googleapis.com/maps/api/place/queryautocomplete/json"
        params = {
            'input': input_keyword,
            'key': api_key,
            'language': language,
            'radius': radius
        }
        
        # Add location bias if provided
        if location:
            params['location'] = f"{location[0]},{location[1]}"
        
        response = requests.get(url, params=params)
        response.raise_for_status()
        data = response.json()

        # Process the suggestions from the response
        if data.get('status') == 'OK':
            for prediction in data.get('predictions', []):
                suggestions.append((input_keyword, prediction['description']))
        else:
            print(f"Error fetching suggestions for '{input_keyword}': {data.get('error_message', 'No suggestions found')}")
    
    except Exception as e:
        print(f"Error: {e}")
    
    return suggestions

def process_keywords(file_path, api_key, language='en', location=None, radius=50000):
    """
    Process a list of keywords from an uploaded file and fetch Google Places query suggestions.
    
    Parameters:
    file_path (str): Path to the uploaded CSV file containing a 'Keywords' column.
    api_key (str): The Google API key.
    language (str): The language for the query results (default is 'en').
    location (tuple): Latitude and longitude to bias the search (optional).
    radius (int): Search radius in meters (default is 50,000 meters).
    
    Returns:
    DataFrame: A DataFrame containing the seed keywords and their suggestions.
    """
    df = pd.read_csv(file_path)
    print("Uploaded file columns:", df.columns)  # Debug: Print column names
    
    if 'Keywords' not in df.columns:
        raise ValueError("The uploaded file must contain a 'Keywords' column.")
    
    all_suggestions = []
    
    # Fetch suggestions for each keyword in the 'Keywords' column
    for keyword in df['Keywords'].dropna():
        suggestions = get_place_autocomplete_suggestions(keyword, api_key, language, location, radius)
        all_suggestions.extend(suggestions)
        time.sleep(0.5)  # To prevent hitting rate limits
    
    result_df = pd.DataFrame(all_suggestions, columns=['Seed Keyword', 'Suggestion'])
    return result_df

# Step 1: Request user input for API key and parameters
api_key = input("Please enter your Google API key: ")
language = input("Enter language code (default is 'en'): ") or 'en'
location_input = input("Enter location (latitude,longitude) or press Enter to skip: ")
location = tuple(map(float, location_input.split(','))) if location_input else None
radius = int(input("Enter search radius in meters (default is 50000): ") or 50000)

# Step 2: Upload the keywords file
uploaded = files.upload()
file_path = next(iter(uploaded.keys()))

# Step 3: Process the keywords and fetch suggestions
try:
    suggestions_df = process_keywords(file_path, api_key, language, location, radius)

    # Step 4: Save and download the results
    output_file = "place_autosuggestions.csv"
    suggestions_df.to_csv(output_file, index=False)
    files.download(output_file)

    # Display the first few rows of the DataFrame
    print(suggestions_df.head())
except Exception as e:
    print(f"Error: {e}")

Additional No-code alternatives to Google Autocomplete

Finally, I also wanted to highlight some additional no-code alternatives to Google Autocomplete, as well as the Vertex AI Search for Retail autocomplete feature, which is not no-code but might be worth exploring for those working in e-commerce and web retail.

Key Takeaways on using Google Autocomplete APIs for Keyword Research

Google’s Autocomplete APIs—across Search, YouTube, Google Maps, and Google Merchant—offer easy access to real-time keyword suggestions based on user queries. These APIs are simple to use and provide valuable insights for keyword research, content creation, and digital marketing strategies.

Easy Access to Rich Data

Google’s Autocomplete APIs pull suggestions from a wide range of platforms, including:

  • Google Search: Get insights into trending and related search queries.
  • YouTube Search: Optimize video content by understanding what users are searching for.
  • Google Maps: Gather location-based search data for local SEO.
  • Google Merchant: Access product-related search suggestions for e-commerce businesses… and more!

These APIs allow you to quickly integrate search insights into your strategy without heavy technical work.

Machine Learning-Driven Insights

The power of these APIs comes from Google’s machine learning, which:

  • Analyzes search trends: Suggests rising queries based on popularity.
  • Predicts future searches: Understands query structure and context to offer relevant suggestions.
  • Uses historical data: Tailors suggestions based on past user behavior and search patterns.

Unlocking Keyword Research Potential

Using these APIs, you can:

  • Discover long-tail keywords: Uncover search terms users are typing that you might not have thought of yourself.
  • Find related terms: Gain insights into terms semantically linked to your seed keywords.
  • Stay on top of trends: Spot emerging keywords early and adjust your strategy.
  • Improve targeting: Refine your SEO and ad campaigns based on real user data.

In short, Google Autocomplete APIs give you predictive, machine learning-powered insights to optimize your keyword research, content strategy, and marketing efforts.

Author

    Lazarina Stoy is a Digital Marketing Consultant with expertise in SEO, Machine Learning, and Data Science, and the founder of MLforSEO. Lazarina’s expertise lies in integrating marketing and technology to improve organic visibility strategies and implement process automation. A University of Strathclyde alumna, her work spans across sectors like B2B, SaaS, and big tech, with notable projects for AWS, Extreme Networks, neo4j, Skyscanner, and other enterprises. Lazarina champions marketing automation, by creating resources for SEO professionals and speaking at industry events globally on the significance of automation and machine learning in digital marketing. Her contributions to the field are recognized in publications like Search Engine Land, Wix, and Moz, to name a few. As a mentor on GrowthMentor and a guest lecturer at the University of Strathclyde, Lazarina dedicates her efforts to education and empowerment within the industry.

