While most SEOs concentrate on achieving the best possible rankings, there are situations where the exact opposite is required: URLs occasionally need to be removed from Google entirely.
For example, you might be dealing with indexed pages that contain sensitive data, a staging environment that has become searchable, or outdated and duplicate content.
You might be here because spam pages created after your website was hacked are appearing in Google, because confidential content that should never have been exposed to Google has been indexed, or because your entire staging environment has been indexed.
Whatever the circumstance, you are probably wondering how to remove these URLs completely.
There is no need to panic, and certainly no reason to look any further: using the approaches listed in this article, you will be able to remove these URLs from Google quickly and easily.
So, let's jump straight into it. Here is how you can remove URLs from Google searches in a snap.
Getting Rid Of URLs That Have Duplicated Or Old Content
Duplicate or outdated content on your site is probably the most frequent reason for removing URLs from Google.
Most outdated content is useless to visitors, yet it may still carry value from an SEO perspective.
Duplicate content, on the other hand, can negatively impact your SEO efforts since Google may become unsure of which URLs to prioritize and analyze.
As we describe below, the exact steps you need to take to remove these URLs from Google depend on the situation of the pages you want withdrawn.
When Visitors Need To Have Access To Content
There are occasions when URLs must still be reachable to visitors, but you do not want Google to index them, since doing so could harm your SEO. This applies, for instance, to duplicate content.
Say, for instance, that you run an online shop and sell sweaters (or any other kind of clothing) that are identical except for their colors and sizes.
These hypothetical product pages differ only in name and picture, rather than having distinctive product details.
In this situation, Google may view the content on the product pages as being nearly identical.
Presenting nearly identical pages forces Google to decide which URL should be indexed as the canonical one, using up your valuable crawl budget on pages that do not improve your SEO.
In this scenario, you must tell Google which URLs should be indexed and which should be de-indexed.
Considering The Value Of The URLs
If the duplicate URLs are gaining organic traffic and have inbound links arriving from other websites, point a canonical link from them to the URL that you want to have crawled and indexed.
Google will then give the preferred URL the highest ranking, while visitors can still view the other URLs.
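One common way to signal the preferred URL is a canonical link in the head of each near-duplicate variant page. Here is a sketch; the URL below is a hypothetical placeholder:

```html
<!-- On each color/size variant page, point Google at the preferred URL -->
<link rel="canonical" href="https://www.example.com/sweaters/classic-sweater" />
```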
If a URL is not getting any organic traffic and has no inbound links from other websites, simply use the noindex robots directive.
This explicitly tells Google not to index the URL, so it will not be displayed in the SERPs (short for 'search engine results pages').
It is crucial to realize that Google will not consolidate any ranking signals in this situation.
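In practice, the noindex robots directive is a meta tag placed in the page's head section:

```html
<!-- Tells Google not to show this page in the SERPs -->
<meta name="robots" content="noindex" />
```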
When Visitors Do Not Need To Have Access To Content
If there is obsolete content on your site that no one should be seeing, there are two different approaches to take, depending on whether or not the URLs have traffic and/or links.
If they do, use 301 redirects to point that traffic and/or those links to the most relevant URLs on your website.
Redirects to irrelevant URLs should be avoided because Google can interpret them as soft-404 errors, and, as a result, Google would not give the redirect target any weight.
If the URLs have no links or traffic, return the HTTP 410 status code, which informs Google that the URLs have been permanently deleted.
When you utilize the 410 status code, Google typically takes very little time to remove the URLs from its index.
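To illustrate, here is a minimal sketch of serving a 410 using Python's standard WSGI interface. The paths in REMOVED_PATHS are placeholder assumptions; a real site would usually configure this in its web server or framework instead.

```python
# Minimal WSGI app: answer 410 Gone for permanently removed URLs so that
# Google drops them from its index quickly. REMOVED_PATHS is a placeholder.
REMOVED_PATHS = {"/old-page", "/outdated-offer"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in REMOVED_PATHS:
        # 410 signals permanent removal (a stronger hint to Google than 404)
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been permanently removed."]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]
```

Any WSGI server (for example, wsgiref from the standard library) can serve an app like this.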
Google Search Console’s Method For Deleting Cached URLs
Google often stores a cached copy of your webpages, and updating or removing it can take a while.
To stop visitors from seeing a page's cached version, you will need to use Google Search Console's Clear Cached URL tool.
All you need to do is follow the steps below.
- To begin with, you must log into your Google Search Console account.
- Next, you will need to choose the appropriate property.
- In the left-hand menu, select the Removals option.
- Select NEW REQUEST from the menu.
- Then select the CLEAR CACHED URL tab.
- Next, you can specify whether Google should clear its cache for a single URL, or for all URLs that begin with a specific prefix.
- Finally, you will need to enter the specific URL for the chosen page, and then click Next.
Please be advised that the noarchive meta robots tag can be used to tell Google not to save cached versions of your webpages.
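The noarchive directive takes the same form as other meta robots tags:

```html
<!-- Tells Google not to store a cached copy of this page -->
<meta name="robots" content="noarchive" />
```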
Getting Rid Of URLs From A Staging Environment
Releases are tested and approved using staging environments.
These environments are not meant to be accessible to search engines, but they frequently are by mistake, resulting in staging URLs being indexed by Google.
This section will describe how to promptly and successfully remove those annoying staging URLs from Google.
When Staging URLs Do Not Outrank Production URLs
Usually, your staging URLs will not rank higher than your production URLs.
If that is the case for you too, all you need to do is follow the step-by-step guide below to solve the problem.
If not, you can skip this section and move on to the next part of this article (When Production URLs Fall Behind Staging URLs).
- First, you will need to access your Google Search Console account by logging in.
- If you have not done so already, choose or verify the staging property. It should open in a separate tab.
- In the left-hand menu, choose the Removals option.
- When you click the NEW REQUEST option, you will be swiftly taken to the TEMPORARILY REMOVE URL tab.
- Select the Remove all URLs with this prefix option, type '/', and then click the Next button. Google will now hide the URLs for 180 days; however, they will still be included in Google's index, so you will still need to take steps to remove them.
- Delete Google's cached copies of the content by following the instructions in the section Google Search Console's Method For Deleting Cached URLs.
- Use the noindex robots directive by adding it to your HTML source code or to your HTTP header using the X-Robots-Tag.
- Add the noindexed URLs to an XML sitemap so that Google can quickly find them and pick up the noindex robots directive.
- Once you are certain that Google has deindexed the staging URLs, protect your staging environment by removing the XML sitemap and adding HTTP authentication. This will stop it from happening again in the future.
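For the X-Robots-Tag step above, here is a minimal sketch assuming the staging server runs Apache with mod_headers enabled (an assumption; other server software has equivalent directives, such as add_header in nginx):

```apache
# Send the noindex directive as an HTTP header on every staging response
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex"
</IfModule>
```

Unlike a meta tag, the HTTP header also covers non-HTML responses such as PDFs and images.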
When Production URLs Fall Behind Staging URLs
If the staging URLs are outranking the production URLs, you must make sure Google assigns the staging URLs' signals to the production URLs. You must also take steps to prevent visitors from landing on the staging URLs.
Below, we have listed the steps that you will need to follow to prevent others from visiting these staging URLs.
- Follow the first six steps of the guide in the previous section before moving on.
- After you have completed the first 6 steps, set up 301 redirects from the staging URLs to the production URLs.
- Create a fresh staging environment on a different subdomain from the one that was indexed, and be sure to enable HTTP authentication there to stop it from being indexed once more.
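The redirect step above can be sketched as a small WSGI app in Python. The hostnames below are placeholder assumptions; in practice you would usually configure these redirects at the web server level.

```python
# Sketch: 301-redirect every staging URL to its production counterpart so
# that Google transfers the staging URLs' signals. Hostnames are placeholders.
STAGING_HOST = "staging.example.com"
PRODUCTION_HOST = "www.example.com"

def app(environ, start_response):
    host = environ.get("HTTP_HOST", "")
    path = environ.get("PATH_INFO", "/")
    if host == STAGING_HOST:
        # Permanent redirect to the same path on the production hostname
        location = "https://" + PRODUCTION_HOST + path
        start_response("301 Moved Permanently", [("Location", location)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]
```

The path is preserved so each staging URL maps to its exact production equivalent, avoiding the soft-404 problem of redirecting everything to the homepage.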
It is crucial to remember that you should never attempt to use a Disallow: / in your robots.txt file to block staging environment URLs from Google.
The reason for this is that it would prevent Google from visiting the staging URLs and seeing the noindex robots tag, meaning the staging URLs would continue to appear in Google.
Getting Rid Of Spam URLs
If your website has been hacked and there are numerous spam URLs on it, you should remove them as soon as you can to avoid further harming your SEO performance and your visitors’ perception of your reliability.
For a speedy repair of the damage, you will need to follow the procedures listed below.
Using The Removals Tool In Google Search Console
You can rapidly remove spam URLs from Google's SERPs using Google's Removals Tool.
Remember once more that this tool only temporarily hides the pages; it does not deindex them.
Resist adding a Disallow: / to your robots.txt file, since, just as with a staging environment, that would prevent Google from recrawling the URLs. Google must be able to see that the spam URLs have been removed.
Below is a step-by-step guide on how to use the removals tool in Google Search Console to hide pages from Google's search results.
- Access your Google Search Console account by logging in.
- Next, you will need to choose the appropriate property.
- In the left-hand menu, select the Removals option.
- When you click NEW REQUEST, you are taken to the TEMPORARILY REMOVE URL tab.
- Select the Remove this URL only option, type the URL you wish to remove, and then click Next. Google will now hide the URL for 180 days; however, keep in mind that the URL will still be included in Google's index, requiring further steps to remove it.
- Repeat as often as necessary. If you are dealing with a lot of spam pages, we advise concentrating on hiding the ones that appear in Google most frequently. Be careful when using the Remove all URLs with this prefix option, because it may hide every URL (perhaps thousands) that matches the prefix you typed in the Enter URL field.
- Delete the spam URLs from Google's cache as well, by following the instructions in the section Google Search Console's Method For Deleting Cached URLs.
Serve A 410 After Removing The Spam URLs
Restore your site to its prior state using a backup. Deploy updates, then add further security precautions to make sure your website is no longer vulnerable.
Next, verify that your website is free of all spam URLs. To make it unambiguous that these URLs are gone and never coming back, return a 410 HTTP status code when they are requested.
Make A Second XML Sitemap
Make a second XML sitemap with the spam URLs in it, and upload it to Google Search Console.
By doing this, Google can swiftly recrawl the spam URLs, and Google Search Console makes it simple to keep track of the cleanup process.
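As a sketch, a second XML sitemap like the one described above could look as follows (the spam URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Placeholder spam URLs that now return HTTP 410 -->
  <url><loc>https://www.example.com/cheap-pills-1</loc></url>
  <url><loc>https://www.example.com/cheap-pills-2</loc></url>
</urlset>
```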
Getting Rid Of Sensitive Content URLs
It is crucial to protect sensitive data that you acquire online, such as customer information or resumes from jobseekers. Under no conditions should Google, or any other search engine for that matter, index this material.
It is important to remember that accidents do happen, and unfortunately, sensitive information might wind up in Google’s search results.
The next section of this article will provide instructions on how to rapidly have this content deleted from Google.
Using The URL Removal Tool In Google Search Console
The quickest way to stop Google from displaying sensitive URLs in its SERPs is to hide them using Google Search Console's removals tool.
Bear in mind that the tool does not actually erase the submitted webpages from Google's index. Instead, it only conceals the pages for a total of 180 days.
That is around six months, or half a year. While this is quite a long time, it would be preferable to conceal the pages indefinitely; unfortunately, this is not an option.
You can repeat these steps after the 180 days so that the pages remain concealed for longer, but you may want to set a reminder so that you remember to do so.
Below, we have listed a step-by-step guide on how you can use the URL removal tool in Google Search Console to temporarily conceal sites from visitors.
- Access your Google Search Console account by logging in.
- Choose the appropriate property.
- In the left-hand menu, choose Removals.
- When you click NEW REQUEST, you will be taken to the TEMPORARILY REMOVE URL tab.
- Select Remove this URL only, type the URL you wish to remove, and then click Next. Google will now hide the URL for 180 days. However, since the URL will still be included in Google's index, you will need to take the extra actions described in the steps that follow. Repeat as often as necessary.
- If the sensitive content is contained in a particular folder, we advise using the Remove all URLs with this prefix option, because you can then instantly hide all URLs in that directory. If you are dealing with many URLs featuring sensitive content but without a shared URL prefix, concentrate on hiding the URLs that appear most frequently in Google.
- You should also remove Google's cached copies of the sensitive content by following the instructions in the section Google Search Console's Method For Deleting Cached URLs.
Serving A 410 After Removing The Content
If you no longer need the sensitive content to be available on your site, you can remove the URLs and return the HTTP status code 410.
This notifies Google that the URLs have been permanently deleted.
Utilizing A Second XML Sitemap
As with spam URLs, you can create a second XML sitemap containing the removed sensitive URLs and upload it to Google Search Console, so that Google quickly recrawls them and the cleanup is easy to monitor.
Preventing The Breach Of Sensitive Data
Finally, take the necessary security precautions to stop sensitive content from being indexed and exposed once more.
Getting Rid Of Content Not Found On Your Website
Here are a few options for getting your content removed from Google if you discover that it is being used on other websites.
Contacting The Owner Of The Website
Contacting the website's administrators should be your initial step. In many of these instances, someone copied your text inadvertently, and they will act quickly to fix it.
You can simply ask them to withdraw the content, request a 301 redirect to your own URL, or offer to let them keep it with a cross-domain canonical pointing to your material along with a link.
What To Do When The Website Owners Do Not Respond/Will Not Take Action
In certain cases, the owners of the website may not reply to your request, leaving you stuck in limbo.
Or, even worse, they may outright refuse to help you through this process.
Thankfully, there are a few ways to ask Google to remove a website if the proprietors are uncooperative and will not abide by your request.
Here is a step-by-step guide you can follow when the owner of a website does not respond to your request, or point-blank refuses to take action.
- You can submit an online form asking for the removal of personal information.
- You can ask Google to investigate a removal request based on a legal violation.
- You can submit a DMCA takedown notice if you discover that certain content is infringing on your copyright.
Additionally, you can use the Remove old content tool to hasten the removal process if content has been removed from another website, but Google has not yet caught up.
You can also use this tool when the material has been updated but Google is still displaying the previous snippet and cache. In these scenarios, Google will be forced to update it, and your issue will be solved.
Getting Rid Of Images From Google Search
Google advises using the robots.txt file to remove indexed images, but advises against using it to remove indexed pages from Google Search.
Google's documentation is not very clear on this, so it can be difficult to find answers.
Again, you don't need to panic! We are here to help you through this seemingly difficult process. In fact, it is not difficult at all.
If you look at the Removals Tool manual, you will see the line 'Do not use robots.txt as a blocking mechanism' in the section that discusses both HTML and non-HTML files.
Imagine that some of the photographs in the /images/secret/ folder have unintentionally been indexed.
To successfully remove these images from Google search, you will need to carry out the below actions.
- To rapidly hide the URLs in Google Search, follow steps 1-6 in the section above.
- Next, add the following lines to your robots.txt file:
  - User-agent: Googlebot-Image
  - Disallow: /images/secret/
The next time Google downloads your robots.txt file, it will see the Disallow directive and remove the photos from its index.
There are several circumstances where you will want to rapidly remove URLs from Google.
Remember that there is no one solution that applies to all situations because each one calls for a distinct strategy.
It is also crucial to remember that the majority of circumstances in which you need to remove URLs are actually avoidable.
We hope you found this article useful and informative, and we hope that you found everything you were looking for within this step-by-step guide to removing URLs from Google searches.