Ranking on page 1 of Google’s search results is vital, especially since users rarely look beyond the first page. Read on to learn how to get your content onto page 1 in four steps, starting with technical SEO.
Only content that stands out from the mass of search results achieves visibility. Even with paid search engine advertising (SEA), this is increasingly difficult: Google lets you and your competitors bid against each other and drive prices up. However, it is still possible to rank very high in organic search, i.e. without an advertising budget, through search engine optimization (SEO) of websites and their content.
Targeted search engine optimization rests on four pillars. If all four are well maintained and optimized, the relevance of your content increases, and Google rewards this with a correspondingly high ranking:
- Step 1: Technical SEO
- Step 2: Domain Authority
- Step 3: Relevance
- Step 4: User Experience (UX) and Usability
In this four-part series, we show you how to create the best conditions for your content to land at the top of Google’s organic search results. This first part covers technical SEO.
Step 1: Technical SEO
Technical SEO covers the technical optimization of your website itself. Below we present the nine most important factors of technical search engine optimization:
- Good PageSpeed
- No Crawling Errors
- No Broken Links
- No Duplicate Content
- HTTPS
- Clear URL Structure
- Optimized XML Sitemap
- Optimized robots.txt file
- Structured Data
1. Improve your page speed to boost technical SEO
Google’s new page experience signals combine the Core Web Vitals with its existing search signals, including mobile-friendliness, safe browsing, HTTPS security, and guidelines on intrusive interstitials.
Google’s Core Web Vitals consist of three factors:
First Input Delay (FID) – FID measures how long it takes for the browser to respond when someone interacts with the page for the first time. To ensure a good user experience, the page should have an FID of less than 100 ms.
Largest Contentful Paint (LCP) – LCP measures the loading performance of the largest contentful element on the screen. This should happen within 2.5 seconds to provide a good user experience.
Cumulative Layout Shift (CLS) – This measures the visual stability of elements on the screen. Websites should strive to keep their pages at a CLS score of less than 0.1.
These metrics can be monitored in the Core Web Vitals report in Google Search Console, which shows you which URLs have potential problems.
There are many tools available to help you improve your page speed and Core Web Vitals, including Google PageSpeed Insights, Lighthouse, and WebPageTest; a small script for checking these metrics via the PageSpeed Insights API follows this list. Some optimizations you can make include:
- Implement lazy loading for non-critical images
- Optimize image formats for the browser
- Improve JavaScript performance
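If you want to check these metrics programmatically, the PageSpeed Insights API exposes both lab and field data. The sketch below is a minimal example, assuming the Python `requests` package is installed; the URL is a placeholder, and an API key is only needed for higher request volumes.

```python
# Minimal sketch: query the PageSpeed Insights v5 API for Core Web Vitals field data.
# Assumes the `requests` package is installed; the URL below is a placeholder.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def core_web_vitals(url: str, strategy: str = "mobile") -> dict:
    """Return real-user (field) percentiles for LCP, FID and CLS, where available."""
    response = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
    response.raise_for_status()
    data = response.json()

    # "loadingExperience" holds real-user (CrUX) field data when Google has enough of it.
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    return {
        "LCP_ms": metrics.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile"),
        "FID_ms": metrics.get("FIRST_INPUT_DELAY_MS", {}).get("percentile"),
        # The API reports CLS multiplied by 100, e.g. 5 corresponds to a CLS of 0.05.
        "CLS_x100": metrics.get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}).get("percentile"),
    }

if __name__ == "__main__":
    print(core_web_vitals("https://www.example.com"))
```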
2. Crawl your website and look for crawling errors
Secondly, you should make sure that your website is free from crawl errors. Crawl errors occur when a search engine tries to reach a page on your website, but fails.
Many tools can help you with this, for example Screaming Frog SEO Spider, DeepCrawl, or seoClarity. Once you have crawled the site, look for crawl errors; you can also check them in Google Search Console. A small script for spot-checking status codes and redirect chains is sketched after the list below.
When you check for crawl errors, you should…
- Implement all redirects correctly with 301 redirects.
- Go through all 4xx and 5xx error pages and decide where each of them should redirect to.
- Look for redirect chains or loops, in which a URL redirects several times before reaching its final destination.
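As a complement to the tools above, you can spot-check a handful of URLs yourself. The following is a minimal sketch, assuming the Python `requests` package is installed; the URL list is purely illustrative.

```python
# Minimal sketch: spot-check a list of URLs for crawl errors and redirect chains.
# Assumes the `requests` package is installed; the URLs below are placeholders.
import requests

urls_to_check = [
    "https://www.website.com/",
    "https://www.website.com/old-page",
]

for url in urls_to_check:
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
        continue

    # response.history lists every intermediate redirect hop.
    if len(response.history) > 1:
        hops = " -> ".join(r.url for r in response.history)
        print(f"Redirect chain: {hops} -> {response.url}")

    if response.status_code >= 400:
        print(f"{url} -> {response.status_code} error, fix the page or add a 301 redirect")
    elif response.history and response.history[0].status_code == 302:
        print(f"{url} -> temporary 302 redirect, consider a permanent 301 instead")
```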
3. Fix broken internal and external links
Poor link structure can cause a bad user experience for both humans and search engines. It can be frustrating when a user clicks on a link on your website and finds that it doesn’t lead to the correct – or working – URL.
You should pay attention to several factors:
- Links that are redirected to another page via 301 or 302.
- Links that lead to a 4XX error page
- Orphaned pages (pages that are not linked to at all)
- An internal link structure that is too deep
To fix broken links, you should update the target URL or remove the link altogether if the target page no longer exists.
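A simple way to find broken links on a single page is to extract all anchors and check their status codes. The sketch below assumes `requests` and `beautifulsoup4` are installed and uses a placeholder start URL; for whole sites, a dedicated crawler (or one of the tools mentioned earlier) is the better choice.

```python
# Minimal sketch: find broken links on a single page.
# Assumes `requests` and `beautifulsoup4` are installed; the start URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page_url = "https://www.website.com/"
html = requests.get(page_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for anchor in soup.find_all("a", href=True):
    link = urljoin(page_url, anchor["href"])          # resolve relative URLs
    if not link.startswith(("http://", "https://")):
        continue                                      # skip mailto:, tel:, #anchors, etc.
    try:
        # Some servers reject HEAD requests; fall back to GET if needed.
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Broken link on {page_url}: {link} ({status})")
```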
4. Remove duplicate content to improve technical SEO
Make sure there is no duplicate content on your website. Duplicate content can have many causes, including page replication from faceted navigation, multiple live versions of the site, and scraped or copied content. It’s important that you only allow Google to index one version of your site.
For example, search engines treat all of the following domain variants as different websites rather than as one website:
- https://www.website.com
- https://website.com
- http://www.website.com
- http://website.com
Duplicate content can be fixed in the following ways:
- Set up 301 redirects to the primary version of the URL. If your preferred version is https://www.website.com, the other three versions should 301 redirect directly to it (a minimal sketch follows this list).
- Implement noindex or canonical tags on duplicate pages
- Set the preferred domain in Google Search Console
- Set up parameter handling in Google Search Console
- Delete duplicate content where possible
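The redirect rule itself is normally configured at the web server or CDN, but for illustration, here is a minimal sketch of how the first point could look if your site happens to run on a Python framework such as Flask; the canonical host name is a placeholder.

```python
# Minimal sketch: 301-redirect every non-canonical host/protocol variant to the
# preferred https://www version. Flask is used only as an illustration; most sites
# configure this at the web server or CDN instead. The domain is a placeholder.
from flask import Flask, redirect, request

app = Flask(__name__)
CANONICAL_HOST = "www.website.com"

@app.before_request
def enforce_canonical_host():
    url = request.url
    needs_redirect = False
    if request.host != CANONICAL_HOST:
        url = url.replace(f"//{request.host}", f"//{CANONICAL_HOST}", 1)
        needs_redirect = True
    if url.startswith("http://"):
        url = "https://" + url[len("http://"):]
        needs_redirect = True
    if needs_redirect:
        return redirect(url, code=301)  # permanent redirect to the primary version
```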
5. Migrate your website to HTTPS protocol for technical SEO
Back in 2014, Google announced that the HTTPS protocol is a technical SEO ranking factor. So if your website is still served over HTTP in 2021, it is high time to make the switch.
HTTPS encrypts the data exchanged with your visitors, protecting it from interception and manipulation.
6. Make sure that your URLs have a clean structure
Straight from the mouth of Google: “The URL structure of a website should be as simple as possible.”
Overly complex URLs can cause problems for crawlers by creating an unnecessarily large number of URLs that all point to identical or similar content on your site. As a result, Googlebot may not be able to fully index all of your site’s content. Typical sources of such URLs include:
- Sorting parameters: some large shopping sites offer multiple ways to sort the same items, which results in a much larger number of URLs. For example: http://www.example.com/results?search_type=search_videos&search_query=tpb&search_sort=relevance&search_category=25
- Irrelevant parameters in the URL, such as referral parameters. For example: http://www.example.com/search/noheaders?click=6EE2BF1AF6A3D705D5561B7C3564D9C2&clickPage=OPD+Produkt+Seite&cat=79
If possible, you should shorten URLs by trimming those unnecessary parameters.
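As an illustration, the following sketch uses only the Python standard library to trim a URL down to a whitelist of parameters; the whitelist shown is an assumption and would need to be adapted to your own site.

```python
# Minimal sketch: strip unnecessary query parameters from a URL, keeping only a
# whitelist of parameters that actually change the content. Standard library only.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

KEEP_PARAMS = {"search_query"}   # illustrative whitelist; adjust to your site

def clean_url(url: str) -> str:
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in KEEP_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(clean_url(
    "http://www.example.com/results"
    "?search_type=search_videos&search_query=tpb&search_sort=relevance&search_category=25"
))
# -> http://www.example.com/results?search_query=tpb
```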
7. Make sure your website has an optimized XML sitemap
XML sitemaps inform search engines about the structure of your website and which pages should appear in the SERPs (Search Engine Results Pages).
An optimized XML sitemap should include:
- Any new content added to your site (recent blog posts, products, etc.)
- Only URLs with 200 status
- No more than 50,000 URLs. If your site has more URLs, you should create multiple XML sitemaps to maximize your crawl success.
You should exclude the following from the XML sitemap:
- URLs with parameters
- URLs that 301 redirect or carry canonical or noindex tags
- URLs with 4xx or 5xx status
- Duplicate content
You can check the Index Coverage report in Google Search Console to see whether the URLs in your XML sitemap have indexing errors.
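For illustration, here is a minimal sketch of how a sitemap containing only 200-status URLs could be generated with Python; it assumes the `requests` package, and the URL list and file name are placeholders. Most CMS and shop systems generate sitemaps for you, so treat this purely as a demonstration of the rules above.

```python
# Minimal sketch: generate a simple XML sitemap containing only URLs that return
# a 200 status. Assumes the `requests` package; URLs and file name are placeholders.
import requests
from xml.etree.ElementTree import Element, SubElement, ElementTree

candidate_urls = [
    "https://www.website.com/",
    "https://www.website.com/blog/latest-post",
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in candidate_urls:
    if requests.head(url, allow_redirects=False, timeout=10).status_code == 200:
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = url  # only 200-status URLs make it into the sitemap

ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```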
8. Make sure that your website has an optimized robots.txt file
A robots.txt file gives search engine robots instructions on how to crawl your website. Every website has a “crawl budget”, i.e. a limited number of pages that can be crawled in a given period. Blocking unimportant pages helps ensure that this budget is spent on the pages you actually want indexed.
At the same time, make sure that your robots.txt file does not block pages that you absolutely want to have indexed. Typical examples of URLs that you should disallow via robots.txt are:
- Temporary files
- Admin pages
- Shopping cart and checkout pages
- Search related pages
- URLs containing parameters
Finally, you should include the location of the XML sitemap in the robots.txt file. You can use Google’s robots.txt tester to check if your file is working correctly.
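For illustration, the sketch below writes a simple robots.txt that disallows the kinds of pages listed above and references the XML sitemap; all paths, patterns, and the domain are placeholders that you would need to adapt to your own site structure.

```python
# Minimal sketch: write a robots.txt that blocks typical low-value pages and points
# to the XML sitemap. Paths, wildcard patterns, and domain are placeholders.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /checkout/
Disallow: /search
Disallow: /*?*

Sitemap: https://www.website.com/sitemap.xml
"""

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(ROBOTS_TXT)
```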
9. Add structured data or schema markup to boost technical SEO
Structured data helps provide information about a page and its content. It gives Google context about what a page covers and helps your organic listings stand out in the SERPs. One of the most common types of structured data is schema markup, based on the schema.org vocabulary.
There are many different types of schema markup for structuring data about people, places, organizations, local businesses, reviews, and much more.
You can use online schema markup generators, such as the one from Merkle, to create schema markup for your site, and Google’s structured data testing tool to validate it.
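For illustration, here is a minimal sketch that builds a JSON-LD snippet with the schema.org Organization type using only the Python standard library; all values are placeholders, and the resulting script tag would go into your page’s head section.

```python
# Minimal sketch: build a JSON-LD Organization snippet with the schema.org
# vocabulary and wrap it in the <script> tag for the page's <head>.
# All values are placeholders.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Company",
    "url": "https://www.website.com",
    "logo": "https://www.website.com/logo.png",
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(organization, indent=2)
    + "\n</script>"
)
print(snippet)
```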
Preview – Step 2: Domain Authority
In the next post of our four-part series, we describe how you can demonstrate authority to Google. In other words, how you can strengthen the relevance of your content by having other relevant sources link to it, and by linking to them in turn.
Keep reading the series for more great insights on how to boost your content and achieve Page 1 rankings.