Content marketing is still a shiny new thing in the search industry today, and many online marketers claim that if you just build great content, traffic (and links!) will naturally come. But let me break it to you: no matter how great your content is, none of it matters if search engines aren’t crawling your pages properly.
Optimizing your website’s pages still matters a great deal when it comes to increasing a website’s search visibility.
Technical SEO is an essential part of your digital marketing campaign. First things first: a typical partner that offers SEO services will audit a client’s website to gauge current performance and identify issues that might be directly or indirectly affecting the site’s overall organic performance.
The Importance of Good Technical SEO
Optimization efforts for your website won’t matter if its technical aspects aren’t well-tuned. If you don’t get your site’s technical SEO right, even the three key search engine ranking factors (i.e., content, links, and relevance) might not help you rank well on search engine results pages (SERPs). Good technical SEO improves your page speed, crawlability, site structure, and more.
Additionally, a solid technical SEO foundation complements your website content. It would be a waste if users couldn’t find or see your valuable and expertly written content, wouldn’t it? Simply put, if you have poor technical SEO, search engines may fail to access, crawl, and index your site.
I’ve listed some tips on how to increase your website’s online visibility by dealing with some technical SEO issues that business owners typically ignore. This is specifically helpful for huge campaigns like e-commerce sites or news sites that naturally generate hundreds or thousands of pages at a time.
How to Fix Common Technical SEO Issues
Use Canonical Tags
I often get asked what a canonical tag is and how it works. Nine out of ten of our clients don’t have a single clue what canonical tags are.
A canonical link or tag helps search engines like Google choose the right page that you want to target for your campaign. This tag is mostly used when you have mirror sites or duplicate pages inside your website. It looks like this:
<link rel="canonical" href="http://www.yourwebsite.com" />
Let’s say you have a massive list of indexed pages on your website. Sometimes you don’t care if your site has duplicate pages because your site is doing well anyway.
Here’s some news for you: your site can actually do better when you use canonical tags!
When search engines detect duplicate pages on a website, they tend to dilute authority across those pages, and as a result, the duplicates end up competing with each other in the SERPs.
How to fix this:
- Check for duplicate pages using a site crawler. Another way is to search for exact snippets of your content using a duplicate-content checker.
- Check for the authority of both pages using Moz’s Link Explorer. This will be your determining factor in choosing the right page—the one with more authority, of course.
- Ask your webmaster to include the canonical tag in the <head> of all duplicate pages, pointing to your preferred page.
Once you complete these steps, Google and other search engines will recognize that particular page as your preferred page, and links, traffic, and authority will funnel to it.
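If you want to spot-check pages yourself, here’s a minimal Python sketch, using only the standard library, that extracts the canonical URL from a page’s HTML. The sample markup and URL below are placeholders:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Placeholder HTML standing in for a fetched duplicate page
html = """<html><head>
<link rel="canonical" href="http://www.yourwebsite.com/" />
</head><body>Duplicate page content</body></html>"""

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # http://www.yourwebsite.com/
```

In practice you would fetch each suspected duplicate with an HTTP client and feed its body to the parser; any duplicate missing the tag (or pointing somewhere unexpected) goes on the fix list.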
Decrease Page Load Time
There are a lot of reasons to decrease your webpages’ load time. The most important one: Google counts page speed as a ranking factor. This article from Calibre App explains how Google measures page speed and why it matters.
On a more practical note, website visitors tend to bounce when a page takes too long to load. So dealing with this issue will not only increase your search visibility, it will also decrease your bounce rate and maximize engagement.
How to fix this:
There are many ways to increase page speed. You can start by making your pages "lighter":
- Forget about Flash and use HTML5 instead.
- Cut the file size of large images.
- Limit your image content, if possible. Or, if you must really include images, make sure that they are optimized for the web.
- Monitor your pages’ speed regularly with a speed testing tool such as Google’s PageSpeed Insights.
Technical SEOs would recommend:
- Improve server response time. Monitor where your server is spending the most time, and then reduce it to under 200ms.
- Enable compression on your web server.
- Compress images and use the proper format for each, e.g., .png for logos, .jpeg for photographic content images.
You can find more suggestions at Google’s PageSpeed Insights.
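As a quick illustration of the compression check above, here’s a small Python helper that inspects response headers. The header dicts below are stand-ins for what a real HTTP client would return from your server:

```python
def compression_enabled(headers):
    """Return True if HTTP response headers indicate a compressed body.

    `headers` is a plain dict of response header names to values, as you
    would get from an HTTP client; the encodings checked are the common ones.
    """
    encoding = headers.get("Content-Encoding", "").lower()
    return encoding in ("gzip", "br", "deflate")

# Hypothetical responses from two different servers
print(compression_enabled({"Content-Encoding": "gzip"}))   # True
print(compression_enabled({"Content-Type": "text/html"}))  # False
```

If the second case is what your pages return, compression is likely disabled on the web server and is usually a one-line config change to enable.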
Correct Broken Links
Let’s say you have thousands of pages on your website and you regularly update your content. Because of site revisions and updates, some links on your site may end up broken. This usually happens when your webmaster deletes old pages or outdated blog content.
SEMrush conducted a study and found that 30% of websites have broken internal links, and 26.5% of websites are littered with 404 errors. These errors devalue your SEO efforts and hurt your site’s crawlability, structure, and user experience.
How to fix this:
You’re going to need Link Explorer, Ahrefs, and ScrapeBox for this. If you do technical SEO full-time, I suggest getting premium accounts for these tools. Trust me, they’ll make your life so much easier.
Now, get a list of backlinks from Link Explorer & Ahrefs. Here’s how to do it:
Using Link Explorer
Input your website and click Search.
Scroll down to quick downloads. Look for “All Links” and export to CSV.
You will have a CSV list of links showing each linking page and its Target URL.
Copy all the Target URLs, then check for 404 pages using ScrapeBox. You could check them manually, but that would take forever, so I suggest buying the tool. (I used to check 404s using SEOTools for Excel, which is free; Distilled posted an awesome blog about it.)
Going back, there’s an add-on in ScrapeBox called Alive Checker. Use this to check for the 404 pages.
Open ScrapeBox, then go to Addons, click on Alive Checker. You’ll then be redirected to another window.
Go to Options and then type 404. Click OK.
Save all the URLs in a notepad. I use a single notepad for all my ScrapeBox activities.
Go to ‘Load urls’ and click ‘Load urls from file’. Choose the notepad file where you saved all the URLs. Click Start and wait until it’s done.
Save the results in an Excel file and filter for all "Alive" links. (Don’t be confused: when ScrapeBox detects the status code you entered in Options, it labels the URL "Alive." In this case, all 404 pages are listed as "Alive.")
Compare the 404 links against the Excel file you exported from Link Explorer. And there you have it: you now have all the 404 pages on your site and their backlinks. One last thing: check the Domain Authority of the backlinks pointing to your 404 pages, and filter for DA 40 and above. ("40" is just my personal estimate of a good-enough domain authority; it depends on your website.)
Using Ahrefs
The process is almost the same as with Link Explorer. In Ahrefs’ Site Explorer, input your website and click Search Links.
Click External on the Backlinks Menu and then Export.
You’ll get a list of links similar to the Link Explorer export, only this time the links are in the Link URL column.
Repeat the ScrapeBox process and it’s all done!
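To make the workflow concrete, here’s a small Python sketch with entirely made-up crawl and backlink data (the URLs, domains, and DA values are hypothetical). It does roughly what the ScrapeBox-plus-spreadsheet process above does by hand: filter the crawled URLs down to 404s, then keep only the backlinks from DA 40+ domains that point at those dead pages:

```python
# Hypothetical crawl results as (url, status_code) pairs -- in practice
# these come from ScrapeBox's Alive Checker or your own crawler.
crawl_results = [
    ("http://www.yourwebsite.com/about", 200),
    ("http://www.yourwebsite.com/old-post", 404),
    ("http://www.yourwebsite.com/archive/2012", 404),
]

# Keep only the dead pages
dead_urls = [url for url, status in crawl_results if status == 404]

# Hypothetical backlink export rows: (target_url, linking_domain, domain_authority)
backlinks = [
    ("http://www.yourwebsite.com/old-post", "bigblog.example", 55),
    ("http://www.yourwebsite.com/old-post", "tinysite.example", 12),
    ("http://www.yourwebsite.com/about", "news.example", 70),
]

# Backlinks worth recovering: they point at a dead page from a DA 40+ domain
worth_redirecting = [
    (target, domain, da)
    for target, domain, da in backlinks
    if target in set(dead_urls) and da >= 40
]
print(worth_redirecting)
```

Each row that survives the last filter is a dead URL worth 301 redirecting, because a strong domain is still pointing at it.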
This technique will show you which websites are linking to your dead pages. Imagine if, all this time, a high-DA website had been linking to one of your 404 pages: you’re not only losing a link opportunity, you’re also potentially losing a lot of referral traffic!
So what now? Take the list of your dead URLs and ask your webmaster to 301 redirect them to new or similar pages on your website. This way, search engines will count the backlinks toward the new pages.
Check if there’s an indexation issue
This may seem like a no-brainer, but it’s still worth checking. Go to Google and type “site:yoursitename.com” to see if your website pops up in the search engine results. If it doesn’t appear, chances are there’s an issue with your indexation. If it shows up, you’ll be able to see the number of indexed pages for your site.
How to fix this:
If your website isn’t indexed or has indexation problems, here are some tips to fix it:
- If your site isn’t indexed, submit your URL to Google via Google Search Console.
- If your site is indexed but you find it suspicious that there are more results than you expected, check if there are old versions of the site that are indexed instead of redirects that point to your updated site. Also, check for site-hacking spam.
- Meanwhile, if you see fewer results than you expected, run an audit of the indexed content and compare it to the pages you want to rank. If you’re unsure why some of your pages aren’t ranking, review Google’s Webmaster Guidelines to make sure your site follows them.
- If the results aren’t what you expected in any way, check if your important web pages are blocked by your robots.txt file and if you have improperly configured NOINDEX meta tags.
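You can test robots.txt rules programmatically with Python’s built-in urllib.robotparser. The robots.txt below is a hypothetical one that accidentally blocks the blog section:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt -- the /blog/ rule is the accidental block
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An important page is blocked...
print(parser.can_fetch("Googlebot", "http://www.yourwebsite.com/blog/my-post"))  # False
# ...while other pages remain crawlable
print(parser.can_fetch("Googlebot", "http://www.yourwebsite.com/contact"))       # True
```

Running every important URL through a check like this catches crawl blocks before Google does.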
Use structured data
Structured data, also referred to as schema markup, can give your content "wings" for better visibility. It’s a type of organized microdata that you can add to the HTML code of every post or page on your website. It helps search engines further understand your site’s content and data for better contextualization and crawlability.
Here’s a JSON-LD structured data snippet based on an example from Google, describing a company’s customer service contact point:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "url": "http://www.example.com",
  "name": "Unlimited Ball Bearings Corp.",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+1-401-555-1212",
    "contactType": "Customer service"
  }
}
</script>
How to fix this:
- Each time you put out new content, find opportunities to insert structured data into the page (work with your content creators and SEO team) for a chance to appear in rich result displays in Google Search (e.g., rich snippets). See Google’s Structured Data Guidelines for compliance.
- Once implemented, review your Google Search Console reports regularly to make sure Google has no issues with your data markup.
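A quick sanity check before publishing: JSON-LD must be valid JSON, so you can catch syntax errors by running the markup through a JSON parser. This Python sketch uses a snippet modeled on the contact-point example above:

```python
import json

# JSON-LD markup pasted as a string; json.loads raises ValueError
# if the markup isn't syntactically valid JSON.
snippet = """
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "url": "http://www.yourwebsite.com",
  "name": "Unlimited Ball Bearings Corp.",
  "contactPoint": {
    "@type": "ContactPoint",
    "contactType": "Customer service"
  }
}
"""

data = json.loads(snippet)
print(data["@type"], "-", data["contactPoint"]["contactType"])
```

This only proves the JSON parses; use Google’s Rich Results Test to verify the vocabulary itself is valid for rich results.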
Use SEO-Friendly URLs
A good URL has long been considered a ranking factor by many SEO experts. Some have published case studies to show that it can increase rankings and lead to more traffic for a website. I personally think it’s a ranking factor, considering that Google bothers to talk about it.
Some say dynamic URLs are not SEO-friendly, and I have to agree, but there are certain cases where you have to live with them, e.g., product filters on your e-commerce website, forums, and some content management systems. I think that as long as you keep the parameters in your URL short, Googlebot won’t have any problems crawling them. Indexing might take a while, though.
Here’s an example of how to rewrite a dynamic URL (source: Google). Consider this URL:
www.example.com/article/bin/answer.foo?language=en&answer=3&sid=8971298178906&query=URL
- language=en – indicates the language of the article
- answer=3 – the article has the number 3
- sid=8971298178906 – the session ID number is 8971298178906
- query=URL – the query with which the article was found is [URL]
Not all of these parameters offer additional information. So rewriting the URL to www.example.com/article/bin/answer.foo?language=en&answer=3 probably would not cause any problems as all irrelevant parameters are removed.
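The parameter-stripping step can be sketched in Python with the standard library’s urllib.parse. The whitelist of parameters to keep (language and answer here) is just this example’s assumption:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_params(url, keep):
    """Rebuild a URL keeping only the whitelisted query parameters."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in keep]
    return urlunparse(parts._replace(query=urlencode(kept)))

url = ("http://www.example.com/article/bin/answer.foo"
       "?language=en&answer=3&sid=8971298178906&query=URL")
print(strip_params(url, {"language", "answer"}))
# http://www.example.com/article/bin/answer.foo?language=en&answer=3
```

Session IDs and search-query parameters are dropped while the meaningful parameters survive, which is exactly the rewrite Google’s example describes.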
Static URLs tend to be indexed faster and rank better than dynamic URLs. Dynamic URLs contain variable strings (parameters such as ?, &, and =), while static URLs don’t; they usually contain keywords describing the page itself. Experts love static URLs because of the SEO benefits they bring.
According to Rand Fishkin, the advantages of Static URLs are:
- Higher click-through rates in the SERPs, emails, web pages, etc.
- Higher keyword prominence and relevancy
- Easier to copy, paste and share on or offline
- Easy to remember and thus, usable in branding and offline media
- Creates an accurate expectation from users of what they’re about to see on the page
- Can be made to contain good anchor text to help the page rank higher when linked-to directly in URL format
- All 4 of the major search engines (and plenty of minor engines) generally handle static URLs more easily than dynamic ones, particularly if there are multiple parameters
Use 301 Redirects
Let’s say you are redesigning your website, changing every page, and migrating to new URLs. There is a lot to consider when doing this, and keep in mind that you can lose all the SEO work you’ve invested in if you don’t do it properly. This is where 301 redirects come into play.
Passing on the authority is one of the benefits you can get from redirecting your old webpage to the new one. Google and other search engines would then see all the links to the new page and consider it as the most authoritative one.
See what Matt Cutts has to say about 301 Redirects:
In some cases, Matt said a 301 redirect is better than rel=canonical. For starters, the 301 redirect is also recommended because it’s easier to implement.
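Real 301 redirects belong in your web server or CMS configuration, but the logic is essentially a lookup from old URLs to new ones. Here’s a Python sketch with hypothetical paths that models what a redirect handler does:

```python
# Hypothetical redirect map from a site redesign: old path -> new path.
# In production this would be nginx/Apache rewrite rules or CMS settings.
REDIRECTS = {
    "/old-services": "/services",
    "/blog/2014/seo-tips": "/blog/seo-tips",
}

def handle_request(path):
    """Return (status_code, location) the way a 301-aware handler would."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(handle_request("/old-services"))  # (301, '/services')
print(handle_request("/contact"))       # (200, '/contact')
```

The important design point is that each old URL maps to one permanent destination, so search engines consolidate the old page’s links and authority onto the new one.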
Summing it up
Website owners: the elements working behind the scenes of your website are just as important as what you and your visitors see on the surface, and both need regular monitoring and fixing.
Perform a technical SEO audit to identify and address the areas where you fall short, and to recover your rankings and visibility. This is especially important when you’re planning or going through a site migration.
Find out if your website is optimized and search-friendly with a FREE SEO Audit!