
Google search parameters: the complete guide


You are probably using Google wrong.

Most users treat the search bar like a magic box. They type a keyword, hit “Enter,” and accept whatever results Google’s algorithm decides to show them based on their current IP address and history.

For developers, SEOs, and data scrapers, that isn’t good enough.

To get clean, unbiased, and location-specific data, you need to stop using the search bar and start manipulating the URL parameters.

Google’s URL is essentially a public API. By appending specific codes to the end of your search string, you can force Google to simulate a user in Tokyo, filter for results from the last hour, or even toggle the new AI Overviews on and off.

If you are building a scraper or monitoring your brand’s global presence, these parameters are your toolkit.

The anatomy of a Google URL

Before we dive into the specific codes, let’s look at the structure.

A standard Google search URL looks like a mess of gibberish. But it follows a standard query string format:

https://www.google.com/search? + parameter=value + & + parameter=value

The most basic example:

https://www.google.com/search?q=coffee

  • ? starts the query string.
  • q stands for Query. This is your keyword.

Everything else you add refines that query.
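In practice, you rarely assemble these URLs by hand. A minimal Python sketch using only the standard library shows how the q parameter gets encoded (spaces become +, special characters become percent-escapes):

```python
from urllib.parse import urlencode

# Build the simplest possible search URL: just the q parameter.
# urlencode handles the escaping so the keyword survives intact.
params = {"q": "best vpn"}
url = "https://www.google.com/search?" + urlencode(params)
print(url)  # https://www.google.com/search?q=best+vpn
```

Every parameter in the rest of this guide is appended the same way: add another key to the dict, and urlencode joins them with &.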

The “big three” for localization

If you are scraping Google Search results, the biggest challenge is geolocation.

If you search for “pizza” from a server in Germany, you get German results. If you want to see what a user in New York sees, you cannot rely on proxies alone. You must tell Google exactly where you are.

1. gl (geolocation)

This sets the country of the search results.

  • Usage: &gl=us (United States), &gl=uk (United Kingdom), &gl=jp (Japan).
  • Why use it: It forces Google to return results relevant to that country’s index.

2. hl (Host language)

This sets the interface language of Google.

  • Usage: &hl=en (English), &hl=es (Spanish), &hl=fr (French).
  • Why use it: This is critical for parsing. If you don’t set this, Google might return the UI in the language of your proxy IP, breaking your CSS selectors.

3. lr (Language restriction)

This restricts the actual search results to a specific language.

  • Usage: &lr=lang_en
  • Difference from hl: hl changes the buttons and menus; lr changes the blue links.

Cheat Sheet for Localization:

Parameter   Function            Example
q           The search term     q=best+vpn
gl          Country of origin   gl=us
hl          UI Language         hl=en
lr          Result Language     lr=lang_fr
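Putting the cheat sheet together, here is a small helper (a sketch, not an official client) that builds a fully localized search URL:

```python
from urllib.parse import urlencode

def google_search_url(query, gl="us", hl="en", lr=None):
    """Build a localized Google search URL.

    gl pins the result country, hl pins the UI language, and
    lr (optional) restricts result language, e.g. "lang_fr".
    """
    params = {"q": query, "gl": gl, "hl": hl}
    if lr:
        params["lr"] = lr
    return "https://www.google.com/search?" + urlencode(params)

print(google_search_url("best vpn"))
# https://www.google.com/search?q=best+vpn&gl=us&hl=en
```

Setting gl and hl explicitly on every request, even when the defaults look right, is what keeps your results reproducible across proxies.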

The magic of uule

The gl parameter is great for countries. But what if you need city-level precision?

What if you need to track rankings for “plumber near me” specifically in Austin, Texas?

Enter uule.

The uule parameter is an encoded string that represents a specific canonical location name from Google’s own geotargeting data (the same canonical names used by the Google Ads API). It allows you to spoof your search location without actually moving.

How it works:

  1. You find the canonical name for your target city (e.g., “Austin,Texas,United States” — note there are no spaces after the commas).
  2. You look up a single character that encodes the length of that name.
  3. You base64-encode the name and append it after a fixed prefix.

The result looks like this: &uule=w+CAIQICIaQXVzdGluLFRleGFzLFVuaXRlZCBTdGF0ZXM

When you append this to your URL, Google believes you are physically standing in Austin.

Why this is critical for SEO: Local rankings vary wildly by zip code. A business ranking #1 in North Austin might be #10 in South Austin. Without uule, your rank tracking data is just an average, not a reality.

Note: Tools like cloro handle uule generation automatically, so you don’t have to do the base64 math yourself.
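If you do want to do the math yourself, the recipe above fits in a few lines of Python. The "w+CAIQICI" prefix and the length-lookup alphabet below are the community-documented, reverse-engineered format, not an official Google spec, so treat this as a sketch:

```python
import base64

# Reverse-engineered uule recipe: fixed prefix, then a character
# encoding the canonical name's length, then base64 of the name.
# The lookup alphabet covers names up to 65 characters long.
_KEY = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_"

def make_uule(canonical_name: str) -> str:
    """Encode a canonical location name into a uule value."""
    length_char = _KEY[len(canonical_name)]
    encoded = base64.b64encode(canonical_name.encode()).decode().rstrip("=")
    return "w+CAIQICI" + length_char + encoded

print(make_uule("Austin,Texas,United States"))
# w+CAIQICIaQXVzdGluLFRleGFzLFVuaXRlZCBTdGF0ZXM
```

Decoding the example from above confirms the format: "Austin,Texas,United States" is 26 characters, the 26th alphabet character is "a", and the base64 of the name is the rest of the string.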

Controlling the AI

In 2025, Google isn’t just a search engine. It’s an Answer Engine.

Google now injects AI Overviews (formerly SGE) at the top of many results. Sometimes you want to see them. Sometimes you want to kill them.

There is a parameter for that: udm.

1. Force the “web” view (udm=14)

If you want the “Old Google”—just the 10 blue links, no AI, no maps, no fluff—use this parameter.

  • Usage: &udm=14
  • Use case: Great for extracting pure organic rankings without the noise of SERP features.

2. Force “AI mode” (udm=50)

This is less documented, but highly potent. It forces Google into a conversational, AI-heavy interface, often triggering the Google AI Mode layout.
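Both views can be requested by tacking udm onto an otherwise normal search URL. A quick sketch (the udm values are as documented in the section above):

```python
from urllib.parse import urlencode

def search_url(query, udm=None):
    """Build a search URL, optionally forcing a udm view."""
    params = {"q": query}
    if udm is not None:
        params["udm"] = udm
    return "https://www.google.com/search?" + urlencode(params)

print(search_url("best vpn", udm=14))  # classic "ten blue links" view
print(search_url("best vpn", udm=50))  # AI-heavy conversational layout
```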

Filtering and time parameters

Sometimes you don’t want the “best” result. You want the “newest” result.

The tbs (To Be Searched) parameter is a powerful container for advanced filters.

Time-based search (qdr)

You can restrict results to a specific timeframe using tbs=qdr:X.

  • Past hour: &tbs=qdr:h
  • Past 24 hours: &tbs=qdr:d
  • Past week: &tbs=qdr:w
  • Past month: &tbs=qdr:m
  • Past year: &tbs=qdr:y

Why this is useful: If you are monitoring a PR crisis or a product launch, you don’t care about articles from 2023. You need to see what is being indexed right now.

Verbatim mode (li:1)

Google loves to “help” you by correcting your spelling or including synonyms. Sometimes, you hate that.

  • Usage: &tbs=li:1
  • Effect: Forces “Verbatim” search. Google will search for exactly what you typed, no fuzzy matching.
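Multiple tbs filters can be combined into one value with commas, which is the same pattern Google’s own UI produces when you stack filters. A small helper (a sketch):

```python
from urllib.parse import urlencode

# Map friendly timeframe names to Google's qdr codes.
QDR = {"hour": "qdr:h", "day": "qdr:d", "week": "qdr:w",
       "month": "qdr:m", "year": "qdr:y"}

def tbs_value(timeframe=None, verbatim=False):
    """Combine time and verbatim filters into one tbs value."""
    parts = []
    if timeframe:
        parts.append(QDR[timeframe])
    if verbatim:
        parts.append("li:1")
    return ",".join(parts)

url = "https://www.google.com/search?" + urlencode(
    {"q": "product launch", "tbs": tbs_value("day", verbatim=True)}
)
print(url)
# https://www.google.com/search?q=product+launch&tbs=qdr%3Ad%2Cli%3A1
```

Note that urlencode percent-escapes the colon and comma; Google accepts both the escaped and the raw forms.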

Advanced search operators

While parameters live in the URL, operators live in the search box (the q parameter).

Combining these with URL parameters gives you X-Ray vision.

site:

Restricts results to a specific domain.

  • Query: site:cloro.dev
  • Use case: Check how many pages of your site are indexed.

filetype:

Restricts results to a specific file extension.

  • Query: filetype:pdf "annual report"
  • Use case: Finding whitepapers, datasets (csv), or presentations (ppt) that aren’t indexed as normal web pages.

before: and after:

A cleaner alternative to the tbs parameter for date ranges.

  • Query: AI search after:2024-01-01 before:2024-12-31

intitle: and inurl:

Ensures your keyword appears in a specific part of the page.

  • Query: intitle:"guest post"
  • Use case: Finding link-building opportunities.
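Because operators live inside q, they need no special handling in your scraper: put the whole operator expression in the q value and let the URL encoding take care of the rest. A sketch:

```python
from urllib.parse import urlencode

# Operators ride inside the q parameter itself, so they are
# percent-encoded along with the rest of the query string.
query = 'site:cloro.dev intitle:"guest post"'
url = "https://www.google.com/search?" + urlencode({"q": query})
print(url)
# https://www.google.com/search?q=site%3Acloro.dev+intitle%3A%22guest+post%22
```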

Why precision matters for scraping

If you are manually searching Google, you can correct mistakes. If you are building an automated system to monitor ChatGPT mentions or Google rankings, mistakes are expensive.

The cost of bad parameters:

  1. Polluted Data: If you don’t set gl=us, your “US Rankings” report will be contaminated by the location of your proxy server (which might be in France).
  2. Broken Parsers: If you don’t set hl=en, Google might serve the page in Arabic because your rotating proxy is in Dubai. Your scraper looking for the English “People Also Ask” text will fail.
  3. Missing AI Features: If you don’t understand udm, you might completely miss the AI Overview box that is increasingly siphoning clicks away from your organic listings.

The solution:

Don’t rely on defaults. Explicitly define every parameter in your request.

Better yet, use a tool that abstracts this complexity.

cloro is built on top of this infrastructure. When you ask cloro to track your brand in “London,” we handle the uule, the gl, the hl, and the udm parameters automatically. We ensure that the data you get is the exact data a real user in that location would see.

Stop guessing the URL. Start engineering it.