9 Easy Technical SEO Tips to Boost Your Site’s Search Visibility
Tuesday May 23rd, 2017
by Glen Dimaandal
At this point, the value of optimizing a website for greater visibility in search engines can no longer be overstated. Search engines deliver the purest, most highly motivated kind of web traffic, which makes it easier to extract conversions. While a lot of SEO campaigns rightfully expend effort on content creation and link acquisition, SEO professionals sometimes neglect the technical aspect of the optimization process.
Generally, good technical SEO is all about facilitating deep and thorough crawls of search bots on your site. This allows for optimum indexation of pages and good PageRank distribution throughout the site. These days, though, technical SEO has become a little more nuanced than that. As Google expands its list of ranking signals, we now have to take things like site speed, SSL certificates and structured data into account.
All of these elements combine to give Google and other search engines a better idea of which sites provide the best user experiences from a technical perspective. Unlike in years past, when technical SEO was an afterthought, it now holds significant sway in how well your pages rank for their target keywords. In some cases, good technical SEO can make all the difference between a successful online business and an obscure one that folds a couple of years after its launch.
If you want to make sure that your site’s technical SEO is clicking, follow these simple tips:
- Use Secure URLs
A few years ago, Google announced that it was officially making secure URLs a ranking factor. Through Webmaster Trends Analyst Gary Illyes, the search giant cited data integrity as the main reason why Google would reward sites that use HTTPS URLs instead of regular HTTP.
With plain HTTP URLs, the data that goes back and forth between a user and the website they’re accessing can be compromised by an entity lurking between them. HTTPS greatly increases the chances of the user receiving information from a website the way the webmaster intended it to be. This means lower chances of security breaches and fewer injected ads, even on public Wi-Fi.
Initially, website owners were hesitant to make the switch from HTTP to HTTPS. After all, significant work is involved, along with the extra cost of purchasing SSL certificates. Fortunately, the web development community responded to the challenge, and plenty of plugins are now available for popular CMS platforms to make the transition to HTTPS a lot easier.
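Once an SSL certificate is in place, a common final step is to 301-redirect all HTTP traffic to its HTTPS equivalent. As a rough sketch (assuming an Apache server with mod_rewrite enabled), an .htaccess rule might look like this:

```apache
# Permanently redirect every HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Nginx and other servers have equivalent directives, and many CMS plugins handle this step for you.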
- Use an SEO-Friendly CMS
One of the most important decisions you’ll make regarding your online business is the selection of a content management system (CMS). If you get it wrong, you could be in for years of headaches and frustration. If you get it right, you’ll spend more time strategizing about business growth rather than banging your head over getting your website to run properly.
Your choice of CMS will also have a direct impact on the success of your SEO campaigns. Choose a platform that’s built with SEO in mind and you can execute just about every SEO tactic you can think of. Choose a CMS that’s not SEO-friendly and you’ll likely have trouble implementing the simplest of optimizations.
If you’re starting a new website, your first option should always be WordPress. Once merely a blogging platform, WordPress has enjoyed such popularity and support from its community that it now has themes and plugins for almost every conceivable function. This makes WordPress very powerful and flexible from web development, design and SEO standpoints.
Whether you choose an SEO-friendly theme such as StudioPress’s Genesis Framework or prefer Yoast’s WordPress SEO plugin, you can execute just about any SEO strategy on WordPress. If you’re building a blog, a static site or a small online store, I highly recommend this CMS. If you’re building a big ecommerce business, your best bet for SEO would be Magento.
- Optimize for Site Speed
Site speed is the average length of time it takes for your site’s pages to load in a browser. Similar to what it did with secure URLs, Google made site speed an official ranking factor a few years ago. As you might guess, webpages that load faster tend to give users more pleasant experiences. It’s also a signal of good technical health that’s hard for most webmasters to fake, earning it a spot on Google’s list of major ranking factors.
You can check your site speed with a couple of free tools. Google’s PageSpeed Insights allows you to enter the URL of any page and receive a grade for it on a scale of 0-100. Ideally, your pages should score 85 or better to be considered fast. Anything below that will require improvements.
Fortunately, the tool also has you covered on that end. Within the same report, Google lists possible areas where you can implement technical tweaks and improve the speed at which a page loads. Recommendations usually include leveraging browser caching, compressing images and minifying a page’s CSS.
Alternatively, you can also use Pingdom’s website speed test tool. Instead of a page-to-page assessment, this one gives you a general view of your site’s loading performance.
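If you want a quick, scriptable baseline alongside these tools, a small Python sketch can time how long a page’s raw HTML takes to download. This is a simplification of my own, not one of the tools above: it ignores images, scripts and rendering time.

```python
import time
import urllib.request


def measure_load_time(url: str) -> float:
    """Fetch a URL and return the elapsed time in seconds.

    Note: this measures only the HTML transfer, not full page rendering.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()  # download the entire response body
    return time.perf_counter() - start


# Example usage:
# print(f"Loaded in {measure_load_time('https://www.example.com/'):.2f}s")
```

Running it a few times and averaging gives a rough trend line you can compare before and after optimizations.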
- Pay Attention to Your Robots.txt File
The robots.txt file is a small web document that almost every site has by default. It tells search engines which of them are welcome to crawl the site and which pages can and can’t be accessed.
Less experienced webmasters may think that it’s a good idea to have all their pages crawled by search engines and put on display in Google’s search results. However, this isn’t necessarily the case. Some pages are better off being kept off limits to help increase site security. Examples of these are backend pages, administrator login pages and other pages that aren’t meant to be seen by the general public.
You’ll also want to restrict bot access to inconsequential pages that don’t serve the interests of your general audience. Dynamic pages, checkout pages and others that are generated by custom parameters set by users should be blocked with robots.txt. Leaving them open to crawling risks getting your crawl budget consumed by inconsequential URLs that don’t aid your conversion process. It also dilutes and thins out the PageRank that’s passed from page to page, leaving your main landing pages with less potent ranking power.
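For instance, a hypothetical robots.txt that welcomes every bot but keeps the admin backend, checkout and parameter-generated pages off-limits might read:

```txt
User-agent: *
Disallow: /wp-admin/
Disallow: /checkout/
Disallow: /*?sort=
```

The exact paths depend entirely on your own site’s structure, so audit before you block.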
To check your site’s robots.txt file, just go to www.yourdomain.com/robots.txt. You should see a simple plain-text file. Make sure the file allows the search engines you want to be visible on to crawl your site. You should also check whether your site’s public-facing pages are open to access or not. If they aren’t, you’ll need to perform an audit to determine which pages can be displayed on search and which ones are off-limits.
- Optimize Your XML Sitemap
The XML sitemap is an optional but important part of a website from a search visibility perspective. Like the robots.txt file, search engines read this document to guide their crawls. A properly written XML sitemap provides information on which pages you want search engines to index and how frequently they’re updated by the webmaster. It should also give search engines some valuable clues on the hierarchy of pages in your site.
The XML sitemap is not to be confused with the HTML sitemap that some websites have. The former is an XML file made specifically for search engines, while the latter is more akin to a webpage that lists the pages of a site and is meant more for human users.
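For illustration, a minimal XML sitemap might look like this (the URLs and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2017-04-20</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Each `<url>` entry tells search engines where a page lives, when it last changed and how often to expect updates.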
To check whether your site has an XML sitemap or not, try typing one of these common addresses in your browser’s address bar: www.yourdomain.com/sitemap.xml or www.yourdomain.com/sitemap_index.xml.
Most sitemaps live at one of these addresses, but some don’t. If you don’t find it there, you can always check your CMS. Most modern platforms either generate XML sitemaps out of the box or have plugins that add the functionality.
After generating your sitemap, set up a Google Search Console account (if you don’t already have one for your site). You can let Google know about your sitemap by submitting its address under Crawl > Sitemaps. If successful, you’ll get a report on your sitemap within a few days.
This report will give you an idea on whether or not Google is able to crawl and index the pages that you want appearing in its search results.
- Deal with Crawl Errors
A crawl error is reported in Google Search Console when Googlebot is unable to read a webpage that it was once able to crawl and cache. When this happens, normal indexing cycles can be disrupted and a site’s search visibility could suffer especially if the crawl errors affect important pages in your site. If crawl errors persist for months, search engines sometimes take the URLs of affected pages off their search result pages.
While crawl errors are a normal occurrence in just about every site, keeping them down to a minimum and preventing them from happening to your site’s vital pages is essential in maintaining good search visibility. There are two main types of crawl errors, specifically:
- Site errors – These issues often emanate from major technical hiccups that affect a website’s uptime. As such, they tend to disrupt entire sites rather than individual pages. Site error subtypes include:
- DNS errors
- Robots.txt fetch failure
- Server errors
These issues can usually be resolved by working with your domain name registrar and hosting provider. In some cases, the issues may be due to improper site configuration. Enlisting the help of a professional web developer should address sitewide crawl errors.
- URL errors – As the name suggests, URL errors affect only individual web addresses. This type of error usually happens when:
- An indexed webpage is deleted and yields a 404 Not Found error
- An indexed URL is now being restricted by login requirements
If a page was deleted intentionally but has a new version that’s up and running, 301 redirect the old URL to the new one. If the page has no replacement but once held an important place in your site, consider redirecting it to the page that’s closest to it in terms of contextual relevance. If the page has no replacement and was not very important to begin with, just let it remain in its 404 Not Found status and let Google flush it out of its cache in a few months.
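On an Apache server, one common way to set up such a 301 redirect is a single line in .htaccess (the paths here are hypothetical):

```apache
# Permanently redirect the deleted URL to its closest replacement
Redirect 301 /old-product-page/ /new-product-page/
```

Most CMS platforms also offer redirect plugins if you’d rather not edit server files directly.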
If you want to get down to the nitty-gritty of this subtopic, I suggest going over my post on how to address crawl errors. The article will provide you with everything you need to know about this aspect of on-site SEO maintenance.
- Establish Clear Navigation Pathways
Good internal linking is essential to SEO excellence. Search bots need to find their way around your site easily to allow for better indexing and optimum PageRank flow. The more logical your internal linking scheme is, the easier it is for search engines to determine which pages in your site hold the greatest degrees of importance.
You can get this started by making sure your navigation menus are clear and indicative of your site’s page hierarchy. Links from main navigation menus carry the most potent equity within a site, so make sure every major section of your website is represented here.
Typically, the best way to approach this is by following the classic pyramid site architecture model where your home page represents the “tip” of the pyramid and cascades to sets of more specific categories, subcategories and product/content pages. In this model, there is a “chain” of internal links starting from the home page down to the site’s innermost pages.
In practice, you can do this by having your most important categories in the main navigation menu bar and using dropdown-style menus for subcategories. Within the category pages themselves, you can link to the subcategories and individual pages under them.
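As a rough sketch, a pyramid-style navigation menu might be marked up like this (the categories are hypothetical):

```html
<nav>
  <ul>
    <li><a href="/laptops/">Laptops</a>
      <ul>
        <li><a href="/laptops/gaming/">Gaming Laptops</a></li>
        <li><a href="/laptops/ultrabooks/">Ultrabooks</a></li>
      </ul>
    </li>
    <li><a href="/accessories/">Accessories</a></li>
  </ul>
</nav>
```

Because these links appear sitewide, every category and subcategory receives a steady flow of internal link equity from the home page.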
It’s not just in the navigation and sidebar menus that you should link to internal pages. You can also do this in your site’s articles whenever a relevant product, category or person is mentioned.
- Eliminate Broken Links
Links are the conduits by which bots pass and discover new pages. They’re also the “pipes” through which PageRank flows to give pages the ranking power they need. When links are rendered non-functional, PageRank does not flow and ranking performance can take a hit. This is why you need to make sure that links in your site are regularly checked and working properly.
To quickly scan your site for broken links, download and install the Screaming Frog SEO Spider application for Windows or Mac. This tool allows you to simulate a search engine crawl of any site and collect valuable SEO data that can be exported to a CSV file. This application has a ton of great uses, but right now we’ll focus on its ability to find broken links.
To do this, enter your site’s URL in the web address field and hit the Start button. You’ll notice that the tool starts collecting data on every page it finds. Wait until the crawl finishes, then click Bulk Export in the menu and select All Anchor Text.
You will then be asked to name the file. Name it whatever you want and save it. Open it in Excel afterwards.
You’ll see several columns including:
- Type – The value HREF denotes that the entry is a hyperlink
- Source – The URL where the link was found
- Destination – The URL that the link points to
- Alt Text – The alternate text that describes the link
- Anchor – The text to which the URL is attached to form the link
- Status Code – The response code that bots receive when they try to access the link
- Status – The meaning of the status code received
- Follow – Indicates whether or not the link has the rel=”nofollow” attribute
You only need to be concerned with the values under Status Code. Sort them from largest to smallest and investigate any values of 404 or higher. Typically, broken links happen when there’s a typo in the link’s URL or when the destination page has been deleted. Links with typos should be corrected, while links pointing to non-existent pages can either be deleted or edited to point to functional pages.
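If you’d rather script a quick check yourself, the same idea can be sketched with Python’s standard library. This is a simplified illustration with names of my own choosing; a real crawler would also need to resolve relative URLs and throttle its requests.

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html: str) -> list:
    """Return every anchor href found in the given HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def check_status(url: str) -> int:
    """Return the HTTP status code for a URL, including error codes."""
    try:
        with urllib.request.urlopen(url) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code
```

Feeding each page’s HTML through `extract_links` and passing the results to `check_status` flags anything returning 404 or higher, mirroring the spreadsheet workflow above.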
- Add Schema.Org Markups
In their push towards delivering richer and more accurate search results, Google, Bing, Yahoo, Yandex and other players in the search industry formed Schema.org. This is a joint effort to unify structured data markup styles across the Web, making it easier for search engine algorithms to understand human language better than ever.
In plain English, Schema markups are HTML elements that are affixed to certain important words in your site’s content. This helps search engines understand that a phrase like sous vide chicken is a recipe rather than just a random string of words. This also helps search algorithms establish and understand contextual relationships between words they find on the web.
There are lots of Schema markups, and each of them helps search engines make sense of human language regardless of which language you speak. The most common application of Schema is in identifying your local business. You can use Schema to help search engines identify your brand name and associate it with an address as well as a local phone number.
Consider a block of text on a webpage stating the name, address and phone number of a business. On the surface, it looks like perfectly normal text. At the code level, however, Schema markup can be woven into it so that machines can read it too.
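Schema.org supports several syntaxes (microdata, RDFa and JSON-LD). As an illustration with hypothetical business details, a JSON-LD version placed in the page’s HTML might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example SEO Agency",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Sample Street",
    "addressLocality": "Quezon City",
    "addressCountry": "PH"
  },
  "telephone": "+63-2-555-0100"
}
</script>
```

The visible text on the page stays the same; the markup simply labels each detail for search engines.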
You can see that Schema code identifies the business name, address and phone number exactly as they are and refers them back to Schema.org. This data then helps search engines recognize that these are the basic details of a local business.
Schema data for local businesses is believed to influence Google’s local search results as well as its My Business three-packs. In our case, we both applied Schema.org markups to our website and optimized our Google My Business listing. The result is a top ranking when you search for an SEO company in our area.
Adding Schema markups isn’t the easiest thing in the world, but you can study how to do it on the Schema.org site. Also, you’ll be happy to know that some theme developers have already started automatically adding Schema markups to their creations. Plugin developers have also built extensions that automate the addition of Schema elements in content like recipes, addresses and tables.
There’s plenty more that goes into SEO, but applying these nine technical optimizations should put you in a very competitive position whatever your niche is. Winning in the search engine arena is for organizations that sweat the details and do all the little things. If you want to run a campaign that ultimately dominates your market, technical SEO is as good a starting point as any.
About the Author
Glen Dimaandal is the founder and CEO of the GDI SEO Company, a search marketing agency based in the Philippines. He is a former SEO manager at Fortune 500 corporations and is now a full-time entrepreneur.