The first word that pops into many website owners' heads when they see a drop in organic traffic is… "penalty." While that may well be the case, you first need to determine whether the issue is actually technical. In my experience, eight out of ten times a site loses organic traffic, it's due to a technical issue.
Technical issues can be scary because the solution isn't obvious. You'll need to do some investigating to figure out where the problem lies. If you aren't very technical, this may seem like an impossible task. It doesn't need to be. I've compiled some of the more common issues, along with a few unusual ones, that I've seen over my 16 years as an SEO.
Check Your Robots File
Start with your own robots.txt file and look for any Disallow rules you didn't intend. Another possible source is the CDN service you use. I've seen instances where a CDN accidentally blocked Google via its own robots.txt file, and that rule carried over to the client's website.
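You can test robots.txt rules yourself with Python's standard `urllib.robotparser`. A minimal sketch, using a hypothetical robots.txt with the kind of blanket Disallow a misconfigured CDN might serve:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- e.g. what a CDN might be
# serving for your domain without your knowledge.
robots_txt = """\
User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A blanket Disallow like the one above blocks every URL for Googlebot.
for path in ("/", "/products/widget"):
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'allowed' if allowed else 'BLOCKED'}")
```

Point the parser at your live file (or your CDN's copy of it) and check a few important URLs; if they come back blocked, you've found your problem.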
Spike In Pages Crawled
Check Search Console for an increase in crawled pages. If your site only has 100 pages but Search Console is reporting 1,200, there's an issue, especially if the spike happens all at once and traffic from Google declines around the same time as the spike in indexed pages.
In many cases, Google has started crawling pages it shouldn't. Typically these are results from your own site's search function, which Google shouldn't be crawling. Also check whether Google has gained access to the back end of your site.
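The usual fix is a robots.txt rule keeping crawlers out of those URLs. A hypothetical example, assuming internal search lives at `/search/` or a `?s=` parameter (as on WordPress) and that the back end is at `/wp-admin/` — adjust every path to match your own site before using anything like this:

```
# Hypothetical rules -- replace these paths with your site's actual
# internal-search and back-end URLs.
User-agent: *
Disallow: /search/
Disallow: /*?s=
Disallow: /wp-admin/
```

Note this only stops crawling; pages already indexed may take time to drop out.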
Soft 404s can become problematic in large numbers, especially when the redirects point to the homepage. If you've removed a section of your site or gone through a recent redesign, don't generate massive bulk redirects to the homepage.
A soft 404 error in Search Console means Google considers the destirnation of a redirect irrelevant to the original URL. Don't get lazy. It's also okay to return a 404 when there isn't a relevant page to redirect to.
Blocked scripts are another culprit many site owners are unaware of. If a page requires certain scripts to render and you block Google from fetching them, it can impact your search visibility.
You can use Google's Fetch as Google tool within Search Console. Google will show you how it sees your web page versus what users see, and it will highlight any blocked scripts. Not all script blocking is bad, but blocking scripts that change how a page renders is.
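You can also run this check yourself: feed your robots.txt rules and a list of render-critical asset URLs through Python's `urllib.robotparser`. The robots.txt content and asset paths below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a scripts directory -- the kind of
# rule that can stop Google from rendering your pages properly.
robots_txt = """\
User-agent: *
Disallow: /assets/js/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical paths a page needs in order to render.
render_assets = ["/assets/js/app.js", "/assets/css/site.css"]
blocked = [p for p in render_assets if not parser.can_fetch("Googlebot", p)]
print("Blocked render assets:", blocked)
```

Anything that shows up in `blocked` is worth an explicit Allow rule or a rethink of the Disallow.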
Check Your Site’s Navigation
When a site goes through a redesign, pages are sometimes removed from the main navigation, either intentionally or by oversight. This can cause rankings for those pages to drop. Placement in the main navigation signals that a page is important; removing a page from the navigation decreases its value. Pages that aren't tied to a navigation or sub-navigation can become "orphaned".
Now you may be saying to yourself, "I'll just add those pages to my XML sitemap. Problem solved." Think again. Just because a page is indexed doesn't mean it will rank in Google. As I stated above, pages that aren't tied to a navigation won't rank as well. Years ago you could place pages in a sitemap and they would rank; today, a sitemap entry alone won't get a page ranking. Creating an XML sitemap and uploading it to Search Console is no guarantee those pages will rank. Indexing and ranking are two totally different things.
Maybe you'll add them to the footer? Wrong again. Google puts less weight on footer links, knowing that pages linked only from the footer rarely carry real SEO value. You get the idea.
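Finding orphans is a simple set comparison: take the URLs your XML sitemap lists and subtract the URLs a crawl of your navigation and internal links actually reaches. A minimal sketch with hypothetical URL sets:

```python
# Hypothetical URL sets: what your XML sitemap lists vs. what a crawl of
# your navigation and internal links actually reaches.
sitemap_urls = {
    "/", "/services", "/services/seo", "/about", "/old-landing-page",
}
linked_urls = {
    "/", "/services", "/services/seo", "/about",
}

# Pages in the sitemap that nothing links to are orphaned: they may get
# indexed, but without internal links they're unlikely to rank.
orphans = sitemap_urls - linked_urls
print("Orphaned pages:", sorted(orphans))
```

Most desktop crawlers (Screaming Frog, for example) can produce both lists for you; the set difference is the part people skip.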
Hopefully this article gives you some ideas of where to look if your site has dropped in indexing or search visibility. If you have any experiences of your own, I'd love to hear about them.