How to Bulk Remove URLs from Google

Sometimes you need to get multiple URLs out of Google's search results — maybe you're cleaning up old content, maybe you got hacked, or for whatever reason you just want them gone. Google Search Console lets you remove them, but only one at a time, which can take forever.

Update 6/2019 – Still works 😉

A couple of years ago, a guy in San Francisco published this Chrome extension on GitHub for bulk URL removal. It's easy to use, and there's no limit to how much time it could save you.

To be clear, this will *not* remove them from Google's index – only from Google's search results.

Google's John Mueller clarifies the semantics: "The URL removal tool does not remove URLs from the index, it removes them from our search results. The difference is subtle, but it's a part of the reason why you don't see those submissions affect the indexed URL count."

How To: The Process

  1. Open Chrome
  2. Visit GitHub to download the Bulk URL Removal Chrome extension
  3. Visit Google Search Console > Google Index > Remove URLs
  4. Enable the Chrome extension (Menu > More Tools > …)
  5. Refresh Search Console “Remove URLs” page
  6. Select your choice from the dropdown
  7. Upload your list of URLs
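The extension expects a plain-text list with one URL per line (see the .txt note in the comments below). A minimal sketch of preparing such a file — the file name and example URLs are purely illustrative:

```python
# Build a deduplicated, one-URL-per-line text file to upload via the extension.
# The file name and URLs below are illustrative, not part of the tool itself.
urls = [
    "https://example.com/old-page",
    "https://example.com/hacked-page?id=1",
    "https://example.com/old-page",  # duplicate, dropped below
]

seen = set()
with open("urls-to-remove.txt", "w") as f:
    for url in urls:
        url = url.strip()
        if url and url not in seen:
            seen.add(url)
            f.write(url + "\n")
```

Deduplicating first matters because Search Console counts every submission against your daily limit, so there's no point spending it on repeats.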

Here’s a two-minute video showing how to use the extension:


 Download the Chrome Extension from GitHub

By Scott Hendison

Comments (49)

  • "Manifest file is missing or unreadable
    Could not load manifest. "

    I'm getting the above error and can't find any solution. :-(

  • The manifest is not loading while I'm uploading the unpacked version of this Google bulk URL removal extension. Please help me figure out how to fix this.

  • Hi Scott,
    Can we remove duplicate URLs using this technique?
    I have many duplicate URLs indexed in Google, because of which I am not able to rank.

    Please help me.

    Thank You

    • Well yes, you can remove any URL this way. In your case you must mean you have duplicate CONTENT, not duplicate URLs - that's when you have the same content on multiple URLs. Yes, you could remove all of the unwanted URLs from the search results like this.

  • Hey guys, Dmitrii here. Anyone had any problems with this tool at all? Any issues or death spirals? :)

    • Hi Regex SEO. I'm the original author of this tool. Please ping me on GitHub should you run into any issues.

  • We run an online store where we sell primarily matching couple shirts and after switching e-commerce platforms a couple times, we were left with a bunch of broken links left behind. I've been having a heck of a time trying to remove all the outdated links that google had for our site and thankfully this trick worked! I literally had hundreds of extra product pages. Thank you!

  • Scott, this looks brilliant! Unfortunately, I'm running into an issue getting this to work at the moment. When I try to upload a list of URLs for temporary removal, the URLs are consolidated and submitted as one URL, rather than being submitted individually.

    Could you share any notes on required formatting within the Excel doc that might be causing this issue?


    • Hi Pat - Use a straight .txt file rather than an Excel document. The video shows .txt, but my Windows adds a stupid little green flag to the icon that made it look like Excel.
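    If your URLs start out in a spreadsheet, one way to get the plain .txt format described above is to export the sheet as CSV and convert it. A small sketch — the file names, header row, and single-column layout are assumptions for illustration:

```python
import csv

# Write a small sample CSV the way a spreadsheet export might look.
# File names and the single-column layout are assumptions for illustration.
with open("urls.csv", "w", newline="") as f:
    csv.writer(f).writerows([
        ["URL"],  # header row, skipped by the startswith check below
        ["https://example.com/a"],
        ["https://example.com/b"],
    ])

# Convert the CSV into the plain one-URL-per-line .txt the extension expects.
with open("urls.csv", newline="") as src, open("urls.txt", "w") as dst:
    for row in csv.reader(src):
        if row and row[0].strip().startswith("http"):
            dst.write(row[0].strip() + "\n")
```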

  • Thanks for the help. I'm currently trying to recover from a hack that added about 2800 bogus pages to my site's index. The hackers found a backdoor in a WP plugin, then claimed ownership of my site in the Search Console. They then uploaded a spoofed sitemap and submitted it in the Search Console.

    Google indexed all of them without ever crawling them, then tells me a day later that I suddenly have a 404 problem. Everywhere I read, someone working for Google says it's no big deal and they'll drop the 404 pages eventually, but then people reply that they can't get them to drop for months, even years. Then people suggest making them all 410, but people say it doesn't matter, Google doesn't seem to believe it and will still try to crawl the fake pages, over and over again, skipping your real pages.

    I'm glad I found this tool you mention here, but it's still not enough. I've apparently hit a limit. The Search Console is telling me "You have reached your submission limit. Please try again later." I think I got about 800 submitted before this popped up.

    I don't understand why Google doesn't have an undo button. Seriously, this kind of hack is common enough that I should be able to tell Google "please undo the last x days of crawling and indexing."

    • Wow, I'm sorry to hear that. Yeah, a limit of only 800 means you may have to do this a few days in a row to get all the URLs removed. Honestly, you should be able to bulk remove without a third-party tool too, but Google just doesn't have that on their priority list.

We are all slaves to Google's wishes, like it or not.