Defending Your Brand:
How Web Scraping
Counters Brand Bidding in
Google Ads


Oleg Boyko

What is brand bidding?

When potential customers search for your brand in search engines, they typically click on the first result to reach your website. However, what if your competitor starts using your keywords in their SEM (search engine marketing) strategy? In that case, their advertisements will appear in the top search results on Google, even if your website ranks first in organic search results.

Brand bidding is the practice of purchasing advertising in search engines using keywords that are the names of other brands. This is done to display the company’s advertisements when users search for information about competing products or services. 

This can pose a significant problem for large brands, as brand bidding can divert a substantial portion of traffic originally intended for the main brand — between 10% and 30%. This effect is particularly noticeable in highly competitive industries, where brands compete for visibility in search engines.

How to deal with it?

1. Legal approach

To address this issue, you can turn to legal measures. Registering your trademark gives you a legal basis to demand that search engines exclude your brand from the keywords competitors use in their advertising campaigns.

However, your trademark must be registered in all the regions of interest to you, as protection applies only to those countries where the trademark is registered. In particularly complex situations, legal action may be required.

2. Technological approach

At first glance, the technological solution seems simple: you just need to outbid your competitors’ advertisements with your own. For example, you can track the presence of competitors’ ads by collecting Google search results (Google SERP) every hour in various geographic locations. If competitors are not bidding on your branded search queries and your organic ranking is high, there is no need to activate advertising campaigns. However, if competitor activity is detected, it’s worth reactivating the corresponding campaigns to maintain your visibility.

However, simply activating ads for all your brand keywords can significantly impact your budget, especially if you have a large amount of organic traffic. And what if you need to address this issue within a group of companies where there are dozens of brands?

The technological approach to solving this problem involves activating and deactivating ads at the right time. This allows you to spend the necessary minimum resources while maintaining all your brand traffic. Today, I’ll talk about the challenges of this approach.
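The core of this approach is an on/off decision made per keyword and location. A minimal sketch of that decision, assuming two hypothetical inputs (our organic position and the number of competitor ads found on the page):

```python
def should_activate_ads(organic_rank: int, competitor_ads: int) -> bool:
    """Activate brand ads only when a competitor bids on the keyword,
    or when our organic position alone no longer guarantees visibility."""
    if competitor_ads > 0:
        return True          # a rival ad sits above our result -> defend the keyword
    return organic_rank > 1  # not first organically -> ads still pay off

# Spend nothing while the SERP is clean:
print(should_activate_ads(organic_rank=1, competitor_ads=0))  # False
# Reactivate as soon as a competitor ad appears:
print(should_activate_ads(organic_rank=1, competitor_ads=2))  # True
```

The thresholds here are illustrative; in practice the rule would be tuned per brand and per market.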

Factors influencing Ad display in Google Ads

Ad display in Google Ads depends on various factors, including bid size, ad quality, time of display, and others. For our purposes, however, the key parameter is location.

This means that brand bidding can only be applied in specific countries or even cities. For example, you might rank first in search results in California, but in Berlin, your competitors could successfully outbid you, targeting the German market.
The second most important parameter is the user’s device type. This means that brand bidding can be targeted at mobile devices only, for example.

How to determine if you’re a victim of brand bidding?

If you have only one brand and you’re interested in just one location, you can easily manage this manually. You just need to check several times a day to see if there’s any advertising for your brand on Google.

However, what if you have 400 brand keywords and you’re interested in 200 locations? In that case, to activate advertisements on time, you need to constantly know your position in Google SERP at least every hour. In this situation, automated monitoring is indispensable.
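The arithmetic behind that requirement, assuming hourly checks across the full keyword and location matrix:

```python
keywords = 400
locations = 200
checks_per_day = 24  # one check per hour

hourly_requests = keywords * locations          # SERP checks needed every hour
daily_requests = hourly_requests * checks_per_day

print(hourly_requests)  # 80000
print(daily_requests)   # 1920000
```

Eighty thousand checks per hour is far beyond what any team can do by hand, which is why the rest of the article focuses on automated collection.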

Google SERP monitoring

Since brand bidding is a fairly common phenomenon, many companies provide information about your position in SERP. Additionally, there are companies like ours that develop custom platforms for monitoring and managing advertising campaigns.

The main challenge in creating such systems lies in ensuring data quality. We conducted market research and will discuss the pros and cons of each solution.

1. Ready-made solutions

There are ready-made SaaS solutions that monitor your keywords and provide ready-made reports. By reviewing these reports, you can manually activate advertising for the keywords you need. Such solutions address brand bidding relatively affordably but often come with functional limitations.

2. SERP Data Providers

If you’re interested in greater automation, such as automatically launching ads in Google Ads when needed or checking your keywords more frequently than six times a day, then it’s better to develop your own system. It all depends on how much money you’re losing due to brand bidding.

Many companies provide SERP data for your keywords in your desired locations. Payment is made per request, and discounts are available for large volumes. Typically, providers offer data via API. With this data, you can develop a system that analyzes it and provides recommendations for your ads.
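As a sketch, a provider’s API response can be reduced to the two signals that matter for ad management: our organic position and the number of competitor ads. The JSON shape below (`organic`/`ads` keys with `position` and `domain` fields) is hypothetical, since every provider defines its own schema:

```python
def our_serp_state(response: dict, our_domain: str) -> dict:
    """Extract our organic rank and the competitor ad count
    from a provider-style SERP response."""
    organic_rank = next(
        (item["position"] for item in response.get("organic", [])
         if our_domain in item["domain"]),
        None,  # we are not on the page at all
    )
    competitor_ads = sum(
        1 for ad in response.get("ads", []) if our_domain not in ad["domain"]
    )
    return {"organic_rank": organic_rank, "competitor_ads": competitor_ads}

sample = {
    "ads": [{"position": 1, "domain": "rival-hotels.example"}],
    "organic": [{"position": 1, "domain": "ourbrand.example"}],
}
print(our_serp_state(sample, "ourbrand.example"))
# {'organic_rank': 1, 'competitor_ads': 1}
```

A recommendation engine then only has to consume this normalized structure, regardless of which provider delivered the raw data.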

We conducted a comparative analysis of three major SERP Data providers to check the quality of the data they provide. We won’t mention names to avoid promoting or discrediting these companies. We tested them on a sample of 50 queries, and the results of our study are denoted as Data Provider 1, 2, and 3.

In addition to the three data providers, we also manually checked the results using a browser and different locations. We also tested whether using a VPN affected the results. For the test, we chose a well-known brand in the hotel industry.
In theory, the data from different providers should have been identical, considering the test was conducted simultaneously, and we provided the same data to all three providers.

We contacted each company on the list and found out the locations where their service is available. Then, we selected the top 15 companies that most comprehensively covered the locations we needed. The result of this work was a table that turned out to be very large, so we won’t display it in its entirety.

However, we encountered unexpected results. The data was inconsistent across providers despite the identical testing methodology. On average, the results from all three providers matched in only 70 percent of cases. This suggests that these providers differ not only in price but also in the quality of the data they provide.

For instance, when querying “hotels com login” in Germany, one provider indicates that we hold the first position, implying that ad activation is unnecessary. Meanwhile, another provider states that we hold the second position in organic search results. Consequently, we need to activate ads for this keyword to avoid losing traffic.
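One way to quantify such discrepancies, and still make a decision despite them, is a majority vote across providers, falling back to the worst reported position so the system errs on the side of activating ads. The position data below is illustrative:

```python
from collections import Counter

# Hypothetical positions reported by three providers for the same snapshot
# (keyword -> our reported organic position).
p1 = {"hotels com login": 1, "brand suites": 2}
p2 = {"hotels com login": 2, "brand suites": 2}
p3 = {"hotels com login": 2, "brand suites": 2}

def agreement_rate(*providers):
    """Share of keywords on which every provider reports the same position."""
    keys = set.intersection(*(set(p) for p in providers))
    agreed = sum(1 for k in keys if len({p[k] for p in providers}) == 1)
    return agreed / len(keys)

def consensus_position(keyword, *providers):
    """Majority vote; without a majority, take the worst (largest)
    position so we err on the side of activating ads."""
    votes = Counter(p[keyword] for p in providers)
    pos, count = votes.most_common(1)[0]
    return pos if count * 2 > len(providers) else max(p[keyword] for p in providers)

print(agreement_rate(p1, p2, p3))                          # 0.5
print(consensus_position("hotels com login", p1, p2, p3))  # 2
```

With a consensus position of 2 for “hotels com login”, ads for that keyword would be activated even though one provider claims the first spot.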

We conducted further analysis and began investigating the reasons for these discrepancies by scrutinizing all available information on the internet and leveraging our 15 years of expertise in developing large-scale data collection platforms.

3. Custom Solution

A custom solution eliminates all limitations in solving this problem and allows for achieving maximum results.

It turned out that, in reality, no one knows exactly which factors affect Google’s search results. The extent to which the IP address, cookies present in the browser, or the browser’s language locale influence the results is unclear. This complicates data collection, but by controlling for these parameters it is possible to achieve approximately 90% data accuracy. This is the approach adopted by many SERP data providers. Data quality is difficult to verify because even two queries made one after the other can yield different results.

What factors influence the SERP? One of the most important is the uule parameter. It is passed in the URL of a request to Google to specify the geographic location from which the search is presumably being conducted, and it helps determine which localized results should be shown to the user.
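A minimal sketch of building a w+ uule value from a canonical location name. Google does not document this parameter officially; the byte layout below follows the publicly reverse-engineered format:

```python
import base64

def build_uule(canonical_name: str) -> str:
    """Build a w+ uule value from a Google canonical location name,
    e.g. "Austin,Texas,United States"."""
    name = canonical_name.encode("utf-8")
    assert len(name) < 128, "single-byte length prefix only"
    # Protobuf-style header: 08 02 = role 2, 10 20 = producer 32,
    # 22 <len> = length-delimited canonical-name field.
    payload = bytes([0x08, 0x02, 0x10, 0x20, 0x22, len(name)]) + name
    return "w+" + base64.b64encode(payload).decode().rstrip("=")

print(build_uule("Austin,Texas,United States"))
# w+CAIQICIaQXVzdGluLFRleGFzLFVuaXRlZCBTdGF0ZXM
```

The value is appended to the search URL as `&uule=...` (URL-encoded). The a+ variant encodes exact latitude/longitude coordinates instead of a canonical place name, which is why it can pinpoint cities more precisely.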

Additionally, the IP address affects the delivery of Google search results (SERP). Google uses the user’s IP address to determine their geographic location, which in turn influences the localization of search results. This is done to provide more relevant, up-to-date, and useful results based on the user’s location.

During our analysis, we found that many companies do not use IP addresses in the selected location, limiting themselves to the uule parameter, with the a+ encoding being more accurate for city-level targeting than w+.
Therefore, to ensure data quality, we need high-quality, fast proxy servers in the locations of interest.

Finding suitable proxy services

We compiled a list of industry leaders providing proxy servers to examine the locations where these companies can offer their services. Data is as of the beginning of 2024.

As a result of our research, we discovered that no company covers the entire list of locations we require. Some companies limit themselves to only countries and do not provide IP addresses in specific cities.

Developing your own proxy service balancing system

To achieve a high-quality result, it is necessary to combine proxy service providers.

The data collection speed directly depends on the number of quality proxies. We cannot rely solely on specific providers, hence the need for continuous search and testing of new proxy servers to ensure the necessary speed and quality of data. When anomalies are detected, it is essential to assess the performance of proxy providers to identify and address any shortcomings.
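A minimal sketch of such a balancing system, assuming a simple success-rate health score per proxy (the proxy addresses are illustrative; a production system would also persist scores and probe proxies continuously):

```python
import random

class ProxyPool:
    """Weighted proxy rotation: healthy proxies are picked more often,
    failing ones fade out but can recover after successful probes."""

    def __init__(self, proxies):
        self.scores = {p: 1.0 for p in proxies}  # start fully healthy

    def pick(self):
        proxies = list(self.scores)
        weights = [self.scores[p] for p in proxies]
        return random.choices(proxies, weights=weights)[0]

    def report(self, proxy, ok):
        # Exponential moving average of success: blocked or slow proxies
        # quickly lose weight without being removed outright.
        self.scores[proxy] = 0.8 * self.scores[proxy] + 0.2 * (1.0 if ok else 0.0)

pool = ProxyPool(["de-provider-a:8080", "de-provider-b:8080"])
proxy = pool.pick()
pool.report(proxy, ok=True)
```

Keeping proxies from several providers in one pool like this is what lets the system cover locations no single provider serves.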

Cost of proxy services

To calculate the cost, we took 400 keywords in 20 locations that we would like to monitor on two device types 24 times a day. Considering that the average Google search response is 80 kilobytes for desktop devices and 16 kilobytes for mobile devices, we need to budget for a minimum of 3 terabytes of traffic per month.

The price of proxy services at the beginning of 2024 ranges from $0.7 to $3 per 1 gigabyte. Additionally, when calculating the cost, it’s necessary to consider the redundancy of IP addresses to ensure an adequate number of quality proxies at any given time.
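A back-of-the-envelope check of these figures. Note that the raw HTML payload alone comes in well under the 3 TB budget; the headroom presumably covers retries, blocked responses, page resources, and the IP redundancy mentioned above:

```python
keywords, locations, checks_per_day = 400, 20, 24
desktop_kb, mobile_kb = 80, 16

requests_per_day = keywords * locations * checks_per_day  # per device type
daily_gb = requests_per_day * (desktop_kb + mobile_kb) / 1024 / 1024
monthly_gb = daily_gb * 30

print(round(monthly_gb))  # 527 GB of raw SERP HTML per month

# At $0.7-$3 per GB, the raw traffic alone costs roughly:
print(round(monthly_gb * 0.7), "-", round(monthly_gb * 3))  # 369 - 1582 USD
```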

In the most basic and cheapest option, proxies will cost about $2.5k per month. Factoring in a small reserve and the average price, the figure rises to roughly $10k per month. Under realistic conditions with redundancy, we estimate this sum should be multiplied by three to five, bringing it to $30k+ per month.

Architecture of a custom solution

In brief, the custom platform enabling SERP monitoring for advertising management consists of the following components:

Proxy server management system, discussed in detail above. This ensures data quality.

Web scraper management and monitoring system. It should provide high-speed data collection and reliability.

A system that analyzes the collected data and automatically activates or deactivates specific advertisements.


In our view, the conclusion is this: the choice of approach should depend on how much damage brand bidding causes your brands. For some, 70% accuracy may well suffice.

The more popular your brand, the more players will be tempted to use brand bidding against you. Monitoring your keywords lets you neutralize brand-bidding attempts and guard against other attacks, such as ad hijacking.
