How to Bulk Remove URLs from Google

Sometimes you need to get multiple URLs out of the Google search results, whether you’re cleaning up old content, recovering from a hack, or you simply want them gone. Google Search Console lets you remove them, but only one at a time, which can take forever.

***Update 2020*** – This still works, but it is now a PAID extension, at $15 per month. After installation, you’ll be prompted to pay for it when you try to use it.

A couple of years ago, a developer in San Francisco wrote this Chrome extension for bulk URL removal and published it on GitHub. It’s easy to use, and there’s no limit to how much time it COULD save you.

To be clear, this will *not* remove them from Google’s index – only from Google’s search results.

As Google’s John Mueller clarifies the semantics: “The URL removal tool does not remove URLs from the index, it removes them from our search results. The difference is subtle, but it’s a part of the reason why you don’t see those submissions affect the indexed URL count.”

How To: The Process

  1. Open Chrome
  2. Visit GitHub to download the Bulk URL Removal Chrome extension
  3. Visit Google Search Console > Google Index > Remove URLs
  4. Enable the Chrome extension (Menu > More Tools > Extensions)
  5. Refresh the Search Console “Remove URLs” page
  6. Select your choice from the dropdown
  7. Upload your list of URLs
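Whatever you use for step 7, the upload list should be clean: one URL per line, no duplicates, no stray text. As a rough sketch (the helper and the example values are placeholders of mine, not part of the extension), you could sanity-check a list in Python like this:

```python
# Sketch: deduplicate and validate a URL list before uploading it.
from urllib.parse import urlparse

def clean_url_list(lines):
    """Return unique, well-formed http(s) URLs, preserving order."""
    seen = set()
    cleaned = []
    for line in lines:
        url = line.strip()
        if not url or url in seen:
            continue
        parts = urlparse(url)
        # keep only absolute http(s) URLs with a hostname
        if parts.scheme in ("http", "https") and parts.netloc:
            seen.add(url)
            cleaned.append(url)
    return cleaned

example = [
    "https://example.com/old-page",
    "https://example.com/old-page",   # duplicate
    "not a url",                      # stray text
]
print(clean_url_list(example))  # → ['https://example.com/old-page']
```

A pass like this avoids wasting any of your daily removal quota on malformed or duplicate entries.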

Here’s a 2-minute video showing how to use the extension:


 Download the Chrome Extension from GitHub

Scott Hendison:


  • I did it a different way, which works with any browser, and makes it very easy to fetch live URLs and remove them at up to 1 per second.
    Let me explain how I did it.
    First of all you need to collect the URLs of yours that are actually indexed, instead of guessing what is indexed and sending requests for pages that aren't.
    You can write a script to collect that info: just open a socket to Google with SSL and send a GET for your search ... or call lynx --insecure https://www.google.com.....
    Don't forget to add "filter=0" !!!
    You can collect up to 100 results per request from Google ( if you ask for more than 100 you will still get just 100 ).
    The search string must be something like "site:yourdomain.xxx + SOME_STRING", where SOME_STRING depends on what you want to remove.
    Then repeat it to fetch as many URLs from Google as possible, typically over 1000 ( I usually got around 10K URLs on each pass ).
    You need to add some delay of course, or Google will decide you are a robot; use at least a 45 second delay.
    Or maybe you have enough public IPs and can do "ifconfig $new_ip" for each search.
    I spent around 24 hours ( 1 search per second, 255 public IPs ) collecting URLs from Google's index for my site.
    Well, now filter the results, sort them and uniq them, to remove duplicates and get a valid list of the URLs you want to remove.
    Now the interesting part ... I use Xlib to:
    open the preferred browser ( Firefox, Chromium, SeaMonkey ), tab to the Submit button, wait 2 seconds and close the window.
    With 3 more or less fast computers you can remove around 1 URL per second, which means that in about 15 minutes you will get the annoying message "you have reached the limit".
    With a faster computer you can save that page, search it for "reached", and then put your script to sleep for an hour.
    You can run a continuous loop ( make sure your software does a good job !! ) between the fetching part and the removal part.
    You can also use the collected URLs to build a sitemap.xml or a page.html to submit to Google, so that if you have already removed those pages from your site, or they return 401, you can speed up not just search and cache removal but index removal too.
    I did all of this when, due to a mistake, I needed to remove around 240000 pages as fast as possible.
    Here is my removal script ( Perl ):

    use X11::GUITest qw/StartApp WaitWindowViewable SendKeys/;
    use Time::HiRes qw/sleep/;   # allows fractional sleep

    my $pre_string = "";   # optional prefix prepended to each URL
    my $counter = 0;

    open(IN, "list_url.txt") or die "Can't open list_url.txt: $!";
    while (my $a = <IN>) {
        chomp $a;
        # Escape characters that would break the quoted shell argument below
        $a =~ s/=/%3D/g;
        $a =~ s/ /%20/g;
        $a =~ s/&/%26/g;
        $a = $pre_string . $a;
        next if $a =~ /'/;   # skip URLs containing single quotes
        $counter++;
        print "$counter $a\n";
        submit_removal($a);
        if ($counter > 999) { exit; }   # stay under Google's daily limit
    }
    close IN;

    sub submit_removal {
        my ($url) = @_;
        StartApp("/YOURBROWSERPATH/seamonkey -new-window '$url'");
        sleep 2;
        my ($win) = WaitWindowViewable('Search Console - Remove URLs - http://YOURSITE.com/ - Seamonkey'); # modify if you use another browser
        die "Couldn't find the window in time !!!" if !$win;
        # send several tabs to reach the Submit button
        for (my $g = 0; $g < 22; $g++) {
            sleep 0.2;
            SendKeys("\t");
        }
        SendKeys(" ");   # send SPACE over the Submit button
        sleep 2;
        SendKeys("^(w)");   # close the window with Control-W
    }

    Now I am able to fetch the live index from Google, split it across 3 fast computers, and send a full day's removal requests in less than 15 minutes, as the maximum allowed is less than 1000 requests per day ( limited by Google ).

    All of this runs automatically, as a continuous process: fetching and removing, building index-removal lists to submit, and even, on the faster computer, watching for responses like "limit reached. Try later" or collecting the URLs already requested for removal.

    Your suggestions, hacking and comments are welcome.

  • Yay, thank you for sharing! It's helpful for my problem, because in Search Console I couldn't find this bulk upload feature.

  • Thanks for this wonderful extension guide, I was searching for exactly this. It makes my life easy.

  • I can't see the Google Index dropdown. I am blocked at step 3 of the process. Is there any further action to be taken? Thanks!

    • You mean you can't see the new field that's added after activating the add-on? The screen doesn't change in Search Console? Maybe try removing it and re-installing / activating it?

        • If you're not running any ad-blocking browser add-ons like Ghostery that could be disabled, then I'd report the issue on the GitHub page where you downloaded it.

  • All I can say is thank you. I had a client with a huge number of URLs that needed to be deindexed and for the past 2 months, I was doing this manually and it was becoming hugely frustrating. Thank you for saving so much of my time :)

  • Your post saved hours of time. By mistake I got duplicate URLs indexed due to a messed-up .htaccess and had to bulk remove all the links.

  • Thank you for helping me. I have 6000 404 and server errors on my site, but I had not taken care of those pages because of my laziness. Today I got a warning email from Google about this issue, so I searched Google and finally found your article. Thanks for sharing this with us.
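For anyone following the approach in the long comment above, the search request it describes (a site: query with filter=0 and up to 100 results per page) can be sketched roughly like this. The parameter names are standard Google query parameters, but the domain and search string are placeholders, and automated scraping of Google results can get you rate-limited or blocked, as the commenter notes:

```python
# Sketch of the site: search URL described in the comment above.
# Heavy automated querying may violate Google's terms of service.
from urllib.parse import urlencode

def build_search_url(domain, extra="", start=0):
    """Build a Google search URL for 'site:domain extra', 100 results per page."""
    query = f"site:{domain}"
    if extra:
        query += f" {extra}"   # narrows results to the pages you want to remove
    params = {
        "q": query,
        "num": 100,      # up to 100 results per request, as the comment notes
        "filter": 0,     # don't collapse "similar" results
        "start": start,  # paging offset for repeated fetches
    }
    return "https://www.google.com/search?" + urlencode(params)

print(build_search_url("yourdomain.xxx", "SOME_STRING"))
```

Paging through results with an increasing `start`, with a long delay between requests, then sorting and deduplicating the extracted URLs, gives the removal list the commenter feeds to his script.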

Related Post

We are all slaves to Google’s wishes, like it or not.