In this digital era, most organizations look for accessibility, functionality, and versatility to gain an edge over their competitors. This has made software as a service (SaaS) a viable option for companies. SaaS offers firms access to complicated software and IT infrastructure and combines simplicity, flexibility, and affordability.
However, because of the rapidly developing technology and the changing nature of customer needs, the SaaS domain is rife with competition from tech-savvy firms.
The keyword ‘software as a service’ not only has a high keyword difficulty but also has a high cost per click.
Further, the domain rating of the top ten websites ranking for this keyword is high, nearly 90.
All this indicates that as SaaS grows in popularity, so does the competition in this domain. In this increasingly competitive landscape, finding, attracting, nurturing, and converting qualified leads is a big challenge for SaaS marketers.
That’s where the SaaS technical SEO health of a business website comes into the picture. However, for most of our SaaS clients, content marketing and link-building strategy are the priority. The technical side of SEO gets little attention, usually because technical SEO for SaaS companies is rather too ‘technical’ for them.
This continues until, one day, their page rankings and traffic tank for no apparent reason, or they need to migrate their site with minimal impact on SEO. Firms then tend to make reactive decisions that are either ineffective or take a long time to work.
This is why we recommend our clients do a periodic technical SEO health check that enables us to take proactive measures to improve the site’s technical health.
By the end of this post, you'll know how vast and complex technical SEO for SaaS businesses is. You'll also learn the significance of technical SEO optimization.
If any of the information shared below sounds too convoluted or you are unsure of how to implement it, our team of technical experts is here to help. We are just a call or email away!
So, let’s begin.
Chapter 1: What Is Technical SEO?
Technical SEO focuses on improving the technical aspects of a website, thus ensuring that your site meets the critical technical requirements of search engines. The primary aim of investing resources in SaaS technical SEO is to boost organic rankings by prioritizing technical search aspects like crawling, indexing, rendering, and website architecture.
It goes without saying that when businesses fail to invest in technical SEO, their online presence takes a hit. Google and other search engine bots crawl websites and evaluate pages on a variety of factors. If your website isn't crawl-friendly, these crawlers will never be able to crawl and index critical pages.
Thus, technical SEO for SaaS firms is a sure-fire way to increase your organic reach and improve visibility.
Most businesses, especially SaaS companies, find it challenging to manage technical SEO issues on their site.
- SaaS business leaders are too focused on other priorities like lead generation, customer acquisition, inbound marketing, and hitting their numbers. So, technical SEO is hardly on their minds.
- The SaaS domain is consistently producing new content and assets. This makes technical website management challenging.
- Unearthing technical SEO issues is tough and time-consuming. Often, by the time the issue is spotted, it’s too late. The strategies that follow (reactive strategies) take weeks or months to show effect. Hence, it is wise to proactively take measures to determine your site’s technical health.
- Technical SEO usually refers to the elements that aren’t visible (unlike on-page). For instance, issues in the site structure, load time, and XML sitemaps aren’t known until they hit you hard (or unless you proactively monitor them!).
All this makes technical SEO for SaaS brands tricky.
Growfusely’s Approach to SaaS Technical SEO
At Growfusely, we approach technical SEO for SaaS businesses by investigating four aspects of a website’s health and performance. We ask ourselves these four questions.
- Are the pages crawlable by the search engines?
A technically sound website is organized and structured for search engine crawlers to easily crawl and analyze all the content and code.
Here, we look at your site's cleanliness in terms of information architecture and sitemaps. We apply various tools to boost crawlability and minimize website errors. We also review a few other aspects like the effective use of robots.txt, crawl rule implementation, internal linking structure, sitemap updates, and more.
- Are the pages indexable by the search engines?
Indexing follows crawling and ensures that the search bots have effectively assessed and rendered your content. If a page is indexed, it means that the crawlers have gauged the relevance and context of the content for inclusion in the search results.
- Is the website secure?
Website security is a key aspect of a site’s technical health. Lapses in a SaaS website’s security can erode the trust of not just the visitors but also the search engines. This can hugely limit the website’s visibility and ranking in the SERP.
Here, we check whether the website meets SEO standards and implements security best practices like HTTPS.
- How well is the site performing (speed)?
The speed of a website is a key factor in determining a site’s user experience and performance. Your SaaS website may have great content but if it’s not fast enough it will fail to rank in the SERPs.
In the upcoming chapters, we will talk about the key aspects of technical SEO audit for SaaS businesses and how website crawlers function. We will also share how to equip yourself with enough data to initiate technical SEO analysis for your SaaS website.
Chapter 2: Technical SEO Audit for SaaS Businesses: What, Why, How?
As a SaaS business owner, it’s quite common to feel overwhelmed by the sheer volume of things one needs to take care of when performing a technical SEO audit. But you will be amazed to know that solving the smallest technical issues properly can result in a huge spike in ranking and traffic.
Let’s begin with understanding the various sections of technical SEO. This will help you know what’s not working properly and what needs to be done.
1. Site Structure/ Information Architecture
Your website structure tells Google which pages are important and which aren’t. Search bots use the site structure to crawl and rank content on your site.
The site structure deals with how your content is organized, linked, and presented to a site visitor. Thus, if you structure your website properly, it will benefit your audience and allow the search bots to index your pages well.
Simply put, your site structure can influence which content ranks higher in the SERPs. The site structure is critical because –
- It serves as a guide for Google as it tells the search engine where to find essential information and the relationship between pages.
- The internal linking structure tells Google which posts are important, thus preventing all your content from competing against each other.
- It improves UX, because a clear structure means good site navigation, and people can easily find what they are looking for on your website.
Internal links, navigation, taxonomies like categories and tags, and breadcrumbs are often used to structure the website.
Here are a few best practices for improving your site structure.
- Use Pillar-Clusters or Hub-Spoke for Effective Internal Linking
With so much SaaS content available online, Google wants to know whether or not you are an authority on the domain. One of the best ways to establish authority is to organize your content into pillar pages and topic clusters.
A pillar page gives a comprehensive overview of a topic. A cluster is a collection of interlinked articles centered around an umbrella topic.
Our blog lives by the pillar-cluster content strategy.
This model ensures an interconnected experience for our readers, thus delivering value and establishing us as an authority in the domain. This tells Google that we are the best resource for the topic ‘SaaS SEO.’
- Fix Keyword Cannibalism
Keyword cannibalization is a big issue for SaaS websites where two or more pages compete for the same keyword and intent.
Say Canva, the graphic design platform, wants to rank for the keyword ‘social media graphics.’ So, they share a post titled ‘Best Practices to Enhance Your Social Media Graphics.’ A few years later new social tactics emerge, for which they create another post but target the same keyword.
This is a classic case of keyword cannibalization. Once the new post is live, Google will struggle to decide which page to rank. In most cases, it will rank neither.
Such issues can adversely affect your site’s ranking.
Google Search Console and SEMrush's Position Tracking tool are great for spotting cannibalization issues. Alternatively, you can use the 'site:[domain] keyword' search operator on Google to see if multiple pages target the same intent.
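If you export ranking data from one of those tools, spotting cannibalization can also be automated. Here's a minimal, hypothetical Python sketch — the keyword-to-URL data would come from a tool like GSC or SEMrush, and the example.com URLs are made up for illustration:

```python
# Hypothetical sketch: flag keywords that more than one URL on the same
# site ranks for -- a common signal of keyword cannibalization.

def find_cannibalized_keywords(rankings):
    """rankings: dict mapping keyword -> list of ranking URLs.
    Returns only the keywords that multiple distinct URLs compete for."""
    return {kw: urls for kw, urls in rankings.items() if len(set(urls)) > 1}

rankings = {
    "social media graphics": [
        "https://example.com/blog/social-media-graphics-best-practices",
        "https://example.com/blog/new-social-graphics-tactics",
    ],
    "marketing attribution": ["https://example.com/attribution"],
}

# Only 'social media graphics' is flagged, since two URLs compete for it
print(find_cannibalized_keywords(rankings))
```

From there, the flagged pages are candidates for consolidation, redirection, or re-targeting to distinct intents.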
- Go for Category-Organized URL Structure
Having a pillar page live on the root domain and then building supporting content off that URL can not only improve user experience but also pass link equity through the website.
Check out how Hotjar does this effortlessly.
Pillar page URL – https://www.hotjar.com/heatmaps/
Supporting content URL – https://hotjar.com/heatmaps/examples
Notice how the supporting content extends the pillar page's URL path.
- Create a Website Taxonomy
Website taxonomy allows webmasters to classify content in a logical manner. It defines relationships and similarities among descriptive terms, thus making it easier for users to navigate the pages.
Remember – 38% of visitors will stop engaging with your pages if the content or its layout is unattractive.
Visually, taxonomy looks like sections in a website or categories in a blog.
Let’s take a hypothetical example of a marketing attribution software website. You are aware that visitors are coming to your website for ‘multi-channel attribution solutions.’
In this case, you will want to set up categories that will help the visitor quickly find what they are looking for. So, you may offer marketing attribution solutions as per
- The organizational role – CMO, director, head of marketing, or marketing manager
- The industry segment – Agency, ecommerce, financial services, legal, travel, or SaaS
Thus, a well-planned taxonomy can completely transform how your customers or prospects interact with your website.
2. Crawling

As mentioned earlier, crawling is Google's discovery process, where the search engine sends out its spiders to scour the web for new and updated content. The bots fetch web pages and follow the links on them to spot new URLs. As the crawlers find new content, they add it to Google's index (Caffeine) to be retrieved later when a searcher looks for something similar.
Thus, getting your pages crawled and indexed is critical to showing up in the SERPs. To determine whether your pages are getting crawled and indexed, use the search operator 'site:yourdomain.com.'
The number of results returned will give you a rough idea of how many of your pages are indexed. For more accurate information, use the Index Coverage report in GSC.
Here are a few reasons why your web pages may not be showing up in the SERPs.
- Your site was just launched and Google's crawlers haven't crawled it yet.
- Your site isn’t linked by external websites.
- Your site’s navigation isn’t crawl-friendly.
- Your site carries crawler directives that are blocking search engines.
- Your site has recently received a Google penalty for using spammy SEO tactics.
One way to get your important web pages indexed and in the SERP is by telling Google how to crawl your site.
- Check the Robots.txt Files
Robots.txt files suggest which parts of your site should be crawled by search bots and which shouldn't. Through specific directives (such as crawl-delay, which some search engines honor), robots.txt can also suggest how fast bots should crawl your website.
The robots.txt files are located in the root directory of websites.
Here’s how search bots treat robots.txt files.
- If a bot doesn’t come across the robots.txt file, it proceeds to crawl the website.
- If the bots find the robots.txt file, they will abide by the suggestions shared and proceed with the crawl.
- If the bots encounter an error when trying to access the robots.txt file and cannot determine whether one exists, they will not proceed with the crawl.
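As an illustration, here's a minimal robots.txt sketch for a hypothetical SaaS site — the paths are made up, and the right directives depend entirely on your own site structure:

```
# Applies to all crawlers
User-agent: *
# Keep private app and account areas out of the crawl
Disallow: /app/
Disallow: /account/
# Everything else is crawlable
Allow: /

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

This file would live at the root of the domain (e.g. example.com/robots.txt), as noted above.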
- Flash and JavaScript Navigation

Though Flash and JS navigation can make your site look great, they are pretty bad for search engines. For the bots, JS frames and Flash can behave like broken links that do not let them access the content on a page.

Similarly, though Google can often process them, other search engines cannot. Hence, if you want to improve the crawlability of your site, you need to make sure your JS navigation and Flash content are accessible to search bots.

You may create an awesome Flash animation that carries a strong brand message. But to search engines, it is only a set of images without any keyword-rich content or anchor text to offer context.

So, if you have Flash animations on your website, it's wise to get rid of them or duplicate your navigation bar without Flash.

To tackle this issue, ask your technical SEO team to remove the harmful scripts or include a noscript section.
- XML/ HTML Sitemaps
As you are already aware, every website has two types of visitors – humans and bots.
Google's spiders use XML sitemaps to discover and index your URLs, while humans refer to HTML sitemaps to navigate websites.
XML sitemaps are critical because they –
- Ensure that the search engine knows all the URLs on your website
- Offer additional context about the URL, enabling Google to understand your website better
- Improve your site’s visibility in the index
- Let you monitor the number of sitemap URLs that have been indexed
- Tell search engines about the priority pages
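For reference, a bare-bones XML sitemap looks something like this — the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/saas-seo-guide</loc>
    <lastmod>2022-01-10</lastmod>
  </url>
</urlset>
```

Most CMSs and SEO plugins generate this file automatically; the important parts are keeping it current and submitting it in GSC.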
So, are HTML sitemaps needed?
Yes! Users can benefit immensely from an HTML sitemap. Moreover, an HTML sitemap can link to archived content, so it helps if you publish a lot of content that isn't otherwise linked to.
- Broken Internal Links
Broken internal links or dead links prevent Google’s crawlers from scanning through and indexing your content. Error codes like 404 Page Not Found, 400 Bad Request, and Timeout among others point to a broken link.
Having too many such links on a page may signal that your website is neglected. Further, when crawlers run into broken links, they waste time and crawl budget verifying and categorizing them.
So, use GSC to detect pages that return errors. On the dashboard, go to
Index > Coverage > Errors
Now, it’s time to fix these links. Start by examining the reason for the broken links. Use these questions to guide you in this matter.
- Is it caused by a typo error?
- Have you recently deleted an image, video, file, or an entire web page?
- Did you rename or move a page and forgot to update the links?
- Are you linking to content that’s been moved or deleted?
- Have you changed the domain name or moved the site to a new URL?
Knowing the reason can help you find a solution to fix the broken link.
For instance, if you’ve deleted an old page and created a new one as a replacement, the old page will show a 404 Page Not Found error when a user tries to access it.
In this case, when you remove a page, make sure you use a 301 redirect that sends visitors to the new location and informs search bots that the page has been permanently moved.
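How you implement the 301 depends on your stack. As one hedged example, on an Apache server the redirect could live in the .htaccess file — the paths here are hypothetical:

```apache
# Permanently redirect the deleted page to its replacement
Redirect 301 /old-page /new-page
```

On other stacks (nginx, a CMS plugin, or an edge/CDN rule), the mechanism differs but the effect is the same: a 301 response pointing at the new URL.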
- Internal Redirects
SaaS websites often change content and move pages to make sure their readers get the latest updates. Hence, redirects are relatively common on these sites.
Redirects are often used when –
- Merging websites
- Updating or removing content
- Fixing pages returning 404
- Changing the information architecture
- Migrating the site
Redirects are necessary in many cases. For instance, they are needed if the original URL is indexed or frequented by users. Redirects may also be necessary when the original URL is used in a piece of content like newsletters or white papers.
However, avoid using too many redirects as they add an ‘extra hop’ for search engine crawlers, making them work more and waste crawl budgets to find content.
- Redirect Chains (2 clicks and beyond)
Though redirect chains occur naturally, they significantly affect the UX (slow site speed) and crawl budget and cause indexing issues. Each time the search engine gets an unnecessary 3XX status code, bots have to wait and have less time to crawl priority pages.
Google's bots give up crawling if they encounter too many redirects (after about 5 hops). Hence, John Mueller recommends keeping chains well short of this number.
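If you have a crawler export of your redirects (for instance, from a tool like Screaming Frog), you can audit chain lengths programmatically. A minimal, hypothetical Python sketch — the URL paths are made up:

```python
# Hypothetical sketch: count redirect hops in a chain, given a mapping of
# URL -> redirect target. Chains longer than ~5 hops risk being abandoned
# by crawlers, so we also bail out on loops or excessive length.

def count_hops(url, redirects, max_hops=5):
    """Follow redirects from `url`; return the hop count,
    or None if a loop or the max_hops ceiling is hit."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            return None  # loop, or too many hops -- crawlers may give up
        seen.add(url)
    return hops

redirects = {
    "/old-pricing": "/pricing-2020",
    "/pricing-2020": "/pricing",
}

# Two hops: better to point /old-pricing straight at /pricing
print(count_hops("/old-pricing", redirects))
```

The fix for any chain it finds is the same: update the first URL to redirect directly to the final destination.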
3. Rendering

When Googlebot requests a page, it retrieves the page, runs the code, and accesses the content to understand the website's structure. This process is referred to as rendering. The information the bots collect during this process is used for ranking the web pages.
Rendering occurs between two states of every page –

- The initial HTML, which is the raw response from the server. It contains the page's content along with references to the CSS, JavaScript, and image resources needed to build the page.
- The rendered HTML, commonly known as the DOM (Document Object Model). It represents the initial HTML plus the changes made by the JavaScript that the HTML called on.
To inspect the DOM, you can open your browser's Developer Tools and look at the 'Elements' panel.
At a Duda Webinar, Martin Splitt explains rendering by giving us the analogy of HTML as a recipe.
If HTML is a recipe, it may have various ingredients like a bunch of text, images, and other stuff. But the recipe is merely a piece of paper with instructions.
The website you see and interact with in the browser is the final prepared dish.
In all this, rendering is the process of cooking the dish.
Efficient rendering is critical for securing a healthy Core Web Vitals score.
- Critical Rendering Path/ Lazy loading
As per Google, optimizing the critical rendering path is about prioritizing the display of content that relates to the current user action.
By optimizing the critical rendering path, you can quicken the time to first render.
Lazy loading is a web and mobile app optimization technique that initially renders only critical UI items and defers the non-critical (non-blocking) ones until later. That way, lazy loading shortens the critical rendering path and reduces loading time.
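At its simplest, modern browsers support lazy loading natively via the `loading` attribute on images — a minimal sketch with placeholder image paths:

```html
<!-- Above-the-fold hero image: loaded eagerly as part of the critical path -->
<img src="hero.png" alt="Product dashboard" />

<!-- Below-the-fold image: deferred until the user scrolls near it -->
<img src="customer-logos.png" alt="Customer logos" loading="lazy" />
```

For more complex cases (infinite scroll, deferred widgets), JavaScript-based approaches such as the Intersection Observer API are commonly used instead.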
- Invalid HTML elements in the <head>
A common cause of invalid HTML elements in the <head> is a <noscript> tag containing elements that are only valid in the <body>. The <noscript> tag defines alternate content for users who have disabled scripts in their browser or whose browser does not support scripts. When an invalid element appears in the <head>, browsers and search engines may assume the <head> has ended early, which can cause tags placed after it to be ignored.
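To illustrate: inside the <head>, a <noscript> may only contain head-safe elements like <link>, <style>, and <meta>; visible fallback content belongs in the <body>. A minimal sketch:

```html
<head>
  <!-- Valid: only head-safe elements inside a <head>-level <noscript> -->
  <noscript>
    <link rel="stylesheet" href="no-js.css">
  </noscript>
</head>
<body>
  <!-- Visible fallback content (text, images) goes in the <body> instead -->
  <noscript>
    <p>This site works best with JavaScript enabled.</p>
  </noscript>
</body>
```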
- Broken Resource URL (Image, CSS, JS) Redirects
This issue occurs when a resource URL (an image, CSS, or JS file) redirects to another URL that isn't accessible. A redirected URL (301 or 302) means the location of the resource has changed and the client is sent to a new URL instead of the original one.
When the resource is no longer available, it may affect the rendering and cause a poor user experience.
4. Indexing

Indexing is the process of adding web pages to Google's index after they have been crawled and analyzed for content and context. The process helps search engines organize information, allowing them to offer super-fast responses to user search queries.
So, after the search bot crawls a page, it renders it just like a browser would. While doing so, the search engine analyzes the content and stores it in its index or database.
Let’s understand how you can tell search engines to index your web pages.
- Canonical Tags

When search engines crawl the same content on multiple pages, they do not know which one to index. That's where canonicalization helps. Canonicalization allows you to instruct search engines about your preferred pages.
So, the rel=”canonical” tag allows search engines to better index your preferred version of content while ignoring the duplicates.
When you add the rel=”canonical” tag, it’s like telling the search engine, ‘Hey Google! Don’t index this page. Instead, index ‘this page’ because that’s the master version.’
When done properly, canonicalization makes sure that each piece of content has only one URL. Though there’s no penalty for duplicate content, it can cause indexing issues, thus harming your SEO strategies.
Hence, it’s wise to use the rel=”canonical” tag to encourage Google to choose a canonical and filter the others out of search results.
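The tag itself goes in the <head> of the duplicate page and points at the master version — for example, with placeholder URLs:

```html
<!-- On https://www.example.com/features?utm_source=newsletter -->
<link rel="canonical" href="https://www.example.com/features" />
```

Here the parameterized URL declares the clean URL as the canonical, so the duplicates consolidate rather than compete.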
- Robots Meta Tags
These tags are used within the <head> of the HTML. Here are the common meta directives for robots meta tags.
- index/noindex: This tells search engines whether the page should be kept in the index for retrieval. If you opt for 'noindex,' you are telling the bots to exclude the page from the search results.
Use ‘noindex’ when you don’t want Google to index your thin pages but still want them to be accessible to visitors.
For SaaS websites, ‘noindex’ can be used for user-generated profile pages.
- follow/nofollow: This informs search engines whether or not the links on a page should be followed. If you choose ‘nofollow,’ search engines will not follow the link or pass any link equity.
‘Nofollow’ is often paired with ‘noindex’ when the webmaster is trying to prevent a page from being indexed and crawlers from following the links on that page.
- noarchive: It tells the search engines not to save a cached copy of a page. By default, all search engines maintain a visible copy of the indexed pages.
But you may choose ‘noarchive’ if you have multiple services with changing prices. This will prevent visitors from seeing outdated pricing.
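These directives are combined in a single meta tag. For instance, to keep a hypothetical user-generated profile page out of the index while also telling bots not to follow its links or cache it:

```html
<head>
  <!-- Exclude this page from the index, don't follow its links,
       and don't keep a cached copy -->
  <meta name="robots" content="noindex, nofollow, noarchive" />
</head>
```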
5. Status Codes
HTTP status codes analysis is a critical part of technical SEO audit. Tracking them helps in spotting the errors within the website structure.
As a part of the SEO audit, you need to check these status codes to see if they are as expected and if needed, apply corrections to improve your internal linking structure.
Let’s look at the major status codes.
- 301 Moved Permanently or 301 Redirect
This status code indicates that the resource has been permanently moved. Hence, requests must be redirected to another URL in place of the requested resource.
These status codes are used in cases of site migrations or situations where you need to permanently transfer the SEO value from one URL to another.
- 302 Found
302 found is a temporary redirect that indicates that a resource has been temporarily moved to another location.
A good example of 302 status code: You are running a gated content campaign for a month and use 302 to send users from URL A to URL B. After 1 month when the campaign ends, you remove the 302 redirects.
A poor example of 302 status code: During website migration, most developers implement 302 instead of 301. Thus, signals are not passed to new URLs immediately (it may take months). Consequently, the new URL will not be as successful as the previous one.
There are a few other 3XX redirection status codes you may come across.
- 303 See Other: This tells the client that the response to the request can be found at another URL, which should be retrieved with a GET request (typically used after a form submission).
- 304 Not Modified: This indicates that the requested resource has not been modified since it was last requested. Hence, it will not be returned to the client. Instead, its cached version will be used.
- 307 Temporarily Redirect/ Internal Redirect: This temporary status code explains that the target page is temporarily residing on another URL.
It also tells the client not to change the request method when following the redirect to that URL.
- 401 Unauthorized
This error shows that HTTP authentication has failed. The requested page either needs a username and password combination or isn't allowing access based on the IP address.
- 404 File Not Found
This status code indicates that the requested resource cannot be found. It is a common status code that can be temporary or permanent. To ensure a good UX, create a custom 404 page that –
- States the page does not exist
- Is integrated into website design
- Offers links to accessible content
- Isn’t indexable
Besides, set up a 404 when the page doesn’t exist, it doesn’t have important backlinks, or there’s no equivalent content available.
- 410 Gone
This status code says that the requested page is gone for good. Its difference from a 404 lies in a subtlety: a 410 states that the page existed but has been deliberately removed and will not be replaced. This gives a precise and definitive message to search bots.
- 500 Internal Server Error
This status code indicates that the server had an issue processing the request but isn’t able to explicitly point out anything specific.
- 503 Service Unavailable
This status code indicates that the service is temporarily unavailable and will be available later. Though many developers use it when the website is scheduled for maintenance, we do not recommend this practice.
Correct use of 503: The server is too busy and cannot process requests at present.
Incorrect use of 503: Serving 503 with “Retry-After” value (Try again at a later time) in the past or far in the future.
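As an illustration, a correct 503 response during a brief overload might look like this — the Retry-After value (in seconds) is illustrative:

```
HTTP/1.1 503 Service Unavailable
Retry-After: 120
Content-Type: text/html

<p>We're experiencing heavy load. Please try again in a couple of minutes.</p>
```

A sensible, near-term Retry-After tells crawlers to come back soon without dropping your pages from the index.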
6. Mobile-Friendliness

Responsive designs use the same code to adapt the website to the screen size it is displayed on. Mobile-friendliness is a big part of technical SEO because Google's bots index the mobile version of a website instead of the desktop version.
Thus, having a mobile-friendly design isn't optional today. Google introduced mobile-first indexing early on, telling webmasters that failing to offer a mobile-friendly site would cause their rankings to suffer.
Let’s look at how you can make your website mobile-friendly for users and Google.
a. Use AMP
We all know that the mobile trend has been overtaking the desktop for some time. This trend was enough for Google to come up with the Accelerated Mobile Pages (AMP) project which is Google’s open-source effort to optimize sites for mobile browsing.
AMPs use a specific framework based on existing HTML to streamline information exchange with browsers. This creates a seamless and quick user experience by reducing the loading time.
AMPs work well so long as you use them with caution.
- AMPs come with serious design restrictions. You can only use inline styles, because AMP allows a single inline stylesheet, reducing the style-related HTTP requests to one.
Also, the CSS size is restricted to 50 kilobytes. Though that is enough for a decent design, it forces developers to write efficient code.
- Link-building can be complicated with AMPs. When a website links to your AMP page, the backlinks point to Google’s domain instead of yours.
AMPs are all about speed. So, the other aspects of UX like an ideal web design come second to performance. However, considering the mobile-friendly era we live in, faster loading pages are a valued asset in terms of UX.
b. Progressive Web Apps
Let’s be realistic – users will not install an app for every site they visit on the internet. That’s why progressive web apps were made.
PWAs are quite popular in the SEO realm because they offer a UX akin to that of a native app. They result in improved retention and performance without the complication of maintaining an app.
For instance, features like push notifications appearing in the notification panel allow easy audience re-engagement while ensuring that users add their favorite websites to their home screen without visiting app stores.
PWAs render quickly, thus offering a rich experience on mobile devices and desktop. Moreover, they are responsive, discoverable, independent of connectivity, and ensure better engagement compared to other pages.
Uber, the transport company, built m.uber, a progressive web app that allows riders on low-end devices or slow networks to book a ride.
Similarly, Flipkart Lite is a PWA that combines the best of the web and the native app.
c. Responsive Design
Here's how responsive web design can complement your SEO efforts.
- Site Usability
Responsive design optimizes sites for mobile search, improving your site's functionality and offering a consistent UX across devices. Thus, your site's usability improves, and Google will favor you with better rankings.
- Quick Page Loading
Websites that follow a responsive design usually load fast, thus boosting user experience and ranking in the search results.
- Low Bounce Rate
Having a responsive design for your website will not just make your website more appealing than others but also deliver content in an organized manner. Thus, your website will be easy to comprehend and visitors will stay on longer.
All this ensures improved UX and a low bounce rate.
- No Duplicate Content
Following a responsive website design ensures you use a single URL regardless of the device used. This takes care of the problem of duplicate content which otherwise can confuse Google and harm your rankings.
- Social Sharing
Though social media isn't a direct ranking factor, it helps in building an audience. A responsive website makes it easy for visitors to share content on social media, improving your visibility and authority in a domain.
7. Website Security
Google makes sure that its users find search results that are safe to use and do not compromise user information or personal data. Thus, website security can significantly impact a site’s SEO performance.
Without a secure website, you cannot rank well in the search result pages or dominate your keyword category.
Website security encompasses all the precautionary measures taken to protect the site against malicious threats or attacks. More often than not, unsecured websites allow hackers to access the site code, allowing them to hide keywords, redirects, and links and create subpages that boost the income of their nefarious websites.
Google and other search engines flag malware and unscrupulous websites, thereby warning users not to access them.
Here are a few steps you can take to protect your website from such attacks and improve your rankings in the SERPs.
a. HTTPS

HTTPS is an encrypted version of HTTP that protects the communication between your browser and the server from being intercepted by attackers. It offers confidentiality, integrity, and authentication to website traffic.
The security comes with the use of SSL (Secure Sockets Layer) that encrypts the information being transmitted from the user’s computer to the website they are on. Today, almost all websites use SSL to protect their information, especially when users need to share sensitive data like credit card numbers or personal data.
Installing HTTPS and allowing indexing are top measures you should take; they allow Google to read your website and ensure that the data transmitted is secure. Further, HTTPS enables the use of modern protocols that enhance security and website speed.
To encourage sites to implement this security, Google uses HTTPS (HyperText Transfer Protocol Secure) as a (lightweight) ranking signal. Simply put, if your site uses HTTPS, it will get a small boost in its search rankings.
Follow these tips to implement HTTPS –
- Install an SSL Certificate from a Certificate Authority like DigiCert, Comodo Cybersecurity, or Verisign.
- Decide on whether you need a single, multi-domain, or wildcard certificate.
- Use relative URLs for resources on the same secure domain and protocol relative URLs for other domains.
- Update your website's address in Google Search Console. This will ensure that Google indexes your site's content under the new URLs.
- Avoid blocking HTTPS sites from crawling using robots.txt and allow indexing of your pages by search engines.
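Once the certificate is installed, force HTTPS at the server level. As a hedged sketch, an nginx configuration for a hypothetical domain might look like this — certificate paths and domain are placeholders:

```nginx
# Redirect all plain-HTTP traffic to HTTPS with a permanent (301) redirect
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl http2;
    server_name example.com www.example.com;
    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;
    # ... rest of the site configuration ...
}
```

The single 301 hop keeps the migration clean and passes signals to the HTTPS URLs.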
Besides improving site security, HTTPS improves site speed when protocols like TLS 1.3 and HTTP/2 are used. Let’s learn more about these protocols in the next section.
b. HTTP/2

When you run a Google Lighthouse audit, you may come across HTTP/2. It's green for our website, but it often pops up as an opportunity to improve page loading speed for other sites.
HTTP/2 is a protocol that governs the communication between the browser making requests and the server holding the requested information.
Tom Anthony from Distilled explains HTTP/2 brilliantly using a truck analogy.
The truck represents the client’s request to the server and the road traveled is the network connection. Once the request is made, the server will load the response and send it back to the browser.
The problem with HTTP is that anyone can see what the truck is carrying. With HTTPS, there’s a layer of protection to the responses. So, it’s like the truck is going through a tunnel and no one is able to see what’s inside.
However, two problems remain –
- Small requests take time, and multiple requests mean some have to wait
- Mobile connections increase the latency and round-trip time
HTTP/2 solves this issue by allowing multiple requests per connection while ensuring security when used with HTTPS. Further, HTTP/2 has a feature – server push, where the server can respond to a request with multiple responses at once.
If your server supports HTTP/2, content is automatically served over the new protocol; if it doesn’t, a CDN in front of it can serve your content over HTTP/2 for you.
c. TLS Protocols
TLS or Transport Layer Security is a protocol that ensures data privacy and security of information when it is transmitted. It offers server authentication and encryption of communications between the web application and server.
TLS is like an upgraded and redesigned version of SSL that supports more encryption algorithms. Though implementing TLS doesn’t guarantee increased visibility, the website gains user trust, which reduces your site’s bounce rate and has a positive impact on its rankings.
d. Cipher Suites on Server
Cipher suites are sets of methods or cryptographic algorithms used to encrypt messages between users or servers and other servers. They secure a network connection through SSL/TLS and are typically made of the following algorithms –
- The key exchange algorithm that dictates how symmetric keys will be exchanged
- The authentication algorithm that dictates how server and client authentication will be carried out
- The bulk encryption algorithm that dictates which symmetric-key algorithm will be used to encrypt the actual data
- Message Authentication Code (MAC) algorithm that dictates the method used to perform data integrity checks.
Cipher suites are critical to the security level, compatibility, and performance of your HTTPS traffic. They determine how secure the connection is, which clients can establish it, and how fast users can see your site.
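To see what this looks like in practice, you can inspect the cipher suites your own TLS stack is willing to negotiate. A quick sketch using Python’s standard library (names come from the underlying OpenSSL build, so your output will vary):

```python
import ssl

# Build a client context with secure defaults and list the cipher
# suites it is willing to negotiate.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

for suite in ctx.get_ciphers()[:5]:
    # Each entry reports the suite name and the TLS protocol version
    print(suite["name"], suite["protocol"])
```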
e. HTTP Response Headers
HTTP headers often get ignored during technical SEO audits. These are directives passed along in the HTTP response. Websites using security headers are known to be hardened against notorious security vulnerabilities.
Reviewing HTTP headers can offer valuable insights into your site’s technical SEO performance. Here are five security headers you should be aware of.
- Content-Security-Policy (CSP) helps protect the website from Cross Site Scripting (XSS) and data injection attacks.
- Strict-Transport-Security Header prevents attackers from downgrading the HTTPS connection to HTTP. This is also known as HTTP Strict Transport Security header (HSTS).
- X-Content-Type-Options stops browsers from MIME-sniffing content types, preventing exploits that can happen through, say, user-generated uploads.
- X-Frame-Options stops click-jacking attacks, thus making it a useful security measure to implement.
- Referrer-Policy allows the webmaster to control what information is sent when a site visitor clicks a link to visit another website. This prevents any sensitive information present on the URL of the site referring a visitor from being leaked to a third party.
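As a minimal sketch of how these five headers might be applied to every response (the helper function and default values here are illustrative, not a canonical policy; tune the values for your site):

```python
# Common, sensible default values for the five security headers above.
SECURITY_HEADERS = {
    "Content-Security-Policy": "default-src 'self'",
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Referrer-Policy": "strict-origin-when-cross-origin",
}

def apply_security_headers(response_headers):
    """Return a copy of the headers with any missing security headers added."""
    merged = dict(response_headers)
    for name, value in SECURITY_HEADERS.items():
        merged.setdefault(name, value)  # don't clobber explicit settings
    return merged

print(apply_security_headers({"Content-Type": "text/html"}))
```

In a real deployment these headers are usually set once at the web server or CDN level rather than in application code.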
8. Structured Data
Structured data is the language of the search engines, allowing them to understand the content better. It is an authoritative vocabulary (machine-readable context) that helps search engines contextualize, understand, and accurately match online content with the search queries.
Though structured data isn’t a ranking factor, it allows search engines to rank your content accurately and present more information directly in the SERPs in the form of rich snippets.
In the SEO world, structured data is about implementing some form of markup on the page to offer additional context around the content. Adding structured data allows websites to benefit from enhanced results like rich snippets, rich cards, carousels, knowledge graphs, breadcrumbs, and more.
Usually, search engines support three syntaxes, namely microdata, JSON-LD, and microformats. They also support two vocabularies – Schema.org and Microformats.org.
Let’s learn more about this.
a. Schema Markup
Schema markup, often simply called schema, is a semantic vocabulary of standardized tags that describes the content of a page in an organized way, making it easier for search engines to understand the context of the content.
Though there are 797 schema types available today, we recommend adding the following schemas, which contribute the most SEO value.
The Merkle Schema Markup Generator can help you generate schema markups for your website.
- Person: It shares basic information about an individual mentioned on a page with the search engines. You can apply this markup to the author’s bio section in a blog or a page that mentions team details.
- Organization/ LocalBusiness: This is an important schema to be added to your company website’s About Us page, or the Contact Us section.
- Product: This markup is added to product pages, thereby earning you a detailed search snippet.
- Breadcrumbs: Breadcrumbs allow users to navigate your website with ease and see a page’s position in the site hierarchy. This markup brings that experience to the search snippet.
- Article: This is a popular markup added to website articles and blog posts. It allows Google to get basic information about the content. For instance, Google gets data related to the headline, the author, the publishing date, and so on.
- HowTo: If your SaaS website has how-to content, you must leverage this markup. This markup helps Google understand how to achieve a specific goal using a step-by-step approach.
- FAQPage: The FAQPage markup is used on pages sharing a list of frequently asked questions with comprehensive answers.
- VideoObject: This markup is added to the videos hosted on your website because it offers Google the basic information about the video content. The elements that need to be specified are video description, video duration, thumbnail URL, and upload date.
b. Microdata and JSON-LD
Google supports both JSON-LD and microdata, but since JSON-LD is lightweight and can be implemented as a standalone block of code without disturbing the rest of the HTML document, it is Google’s preferred structured data format.
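For instance, a hypothetical Article markup in JSON-LD (all field values here are placeholders) can be dropped into the page head as a single block:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO for SaaS Companies",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2023-01-15"
}
</script>
```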
Microdata allows you to label various HTML elements on webpages using machine-readable tags. In fact, it is based on a set of tags that highlight items and values on web pages for the structured data.
Since it is embedded in the HTML source code, it takes time to implement, edit, or remove microdata.
c. Rich Snippets
Any organic search result listing that displays additional information alongside the URL, title, and description is a rich snippet.
It is important to remember that the implementation of structured data to the relevant web pages does not guarantee that it will be displayed in rich snippets.
Google considers factors such as website authority and trustworthiness when deciding whether to show a particular result as a rich snippet or a rich card.
9. SEO Migrations
Changing your domain name or implementing HTTPS is a wise move, but doing all this while maintaining your website ranking isn’t easy. It demands rigorous and meticulous research, planning, execution, and monitoring for migrations to be completed smoothly without loss of website traffic or conversions.
That’s what SEO migrations are all about – transferring search ranking, authority, and the indexing signals when conducting a migration.
Let’s look at the various types of migrations that affect SEO if not performed properly.
a. Domain Migrations
This is when you change your present website URL to a new one along with all content and resources. Domain migrations are usually done when rebranding, changing to a location-specific domain, or moving the hosted service to your domain.
They involve content changes, namely content consolidation and rewriting, content pruning, and more. All this affects the website’s taxonomy and internal linking, thereby impacting the site’s organic search visibility.
Here are a few quick tips to change domains without losing SEO.
- Ensure that you know the history of the new domain. For instance, learn about its backlink network (use Ahrefs Backlink Checker for this) and check if it has been red-flagged in the past.
- Redirect the old domain to the new one. This can be coordinated through your domain registrar or hosting provider.
- Verify the new site using Google Search Console. The platform will offer information about keyword performance of each page and point to the top technical issues.
- Update all mentions of the old URL on your business profiles.
- Update Google Analytics with the new URL.
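As a minimal sketch of the redirect step above (assuming an nginx server; both domains are placeholders), every URL on the old domain can be sent to its counterpart on the new one:

```nginx
server {
    listen 80;
    listen 443 ssl;
    server_name old-domain.com www.old-domain.com;
    # A 301 preserves the path and passes ranking signals
    # to the corresponding page on the new domain
    return 301 https://new-domain.com$request_uri;
}
```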
b. Website Relaunch
Relaunching a website is an exciting time for businesses, but it can cause serious harm to your site’s SEO performance. The site may experience issues like missing content, faulty or missing redirects, internal linking structure changes, inbound links pointing to error pages, changes to JS-rendered content, and other technical issues.
For instance, one of the worst issues faced after a website relaunch is crawling and indexing challenges.
Here are a few things you can do to prevent losing traffic during a relaunch.
- Migrate priority content to the new website
- Implement 301 redirects correctly and completely where needed
- Improve internal linking structure and inbound links
- Migrate all on-page optimizations
- Perform a pre-relaunch technical SEO audit
c. CMS Changes
CMS re-platforming occurs when you move from one CMS to another. This is usually done to make the website more interactive and feature-rich and improve its performance.
For instance, moving from WordPress to Drupal or upgrading from Drupal 7 to Drupal 9.
When changing a CMS, a webmaster makes several changes to the website. Though all CMS changes do not entail traffic loss, the process does impact a site’s SEO performance. A CMS change is accompanied by issues like the inability to maintain the old URL structure, different templates for meta tags and H1, and different approaches to service files (robots.txt and XML sitemap).
Here’s how you can overcome these issues.
- Run a crawl using one of the crawlers we’ve mentioned in this post. This will point to the major technical issues.
- Assess CMS features like meta information, URL structure, site architecture, canonical tags, robots.txt, structured data, Hreflang, and more.
- Indicate the URLs to redirect on the URL map.
d. HTTP to HTTPS
This is when you are changing the site protocol from an unsecured ‘http’ to a secure ‘https.’
Here’s how you can ensure a successful SEO-friendly HTTPS migration.
- Use a crawler like JetOctopus or Screaming Frog to crawl the URLs that exist on your current HTTP site and the internal links.
- Leverage SEMRush or Ahrefs to monitor current keyword performance.
- Purchase an SSL certificate from a reliable source and make sure it’s up to date.
- Map each URL to outline where each one on the HTTP site will be redirected on the new HTTPS site.
- Content and design changes can wait.
- Update your hard-coded links to HTTPS within your site. Redirect users and search engines to the HTTPS pages with server-side 301 redirects.
- Scan your website for any non-secure content that might link to third party sites using HTTP.
- Update robots.txt file to get rid of hard-coded links or blocking rules that may still be pointing to http directories or files.
- Avoid performing migration when it is a busy time for your site.
- Put the HTTPS version live.
- Add the HTTPS version to GSC and set your preferred domain status.
- Submit your sitemap on GSC for the new version of the site.
- Verify that the site is being crawled and indexed correctly.
- Update your Google Analytics profile from HTTP to HTTPS in Admin and Property Settings to avoid losing history.
- Routinely monitor performance for targeted keywords but expect fluctuation for a few weeks.
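The URL-mapping step above can be sketched in a few lines of Python (a simplified illustration; a real migration should also account for canonicalization, query strings, and consolidated pages):

```python
from urllib.parse import urlsplit, urlunsplit

def to_https(url: str) -> str:
    """Map an HTTP URL to its HTTPS counterpart, leaving other schemes alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

# Build a redirect map from a crawl export of the old site
old_urls = [
    "http://example.com/",
    "http://example.com/pricing",
    "https://example.com/blog",  # already secure, left untouched
]
redirect_map = {url: to_https(url) for url in old_urls}
print(redirect_map["http://example.com/pricing"])  # https://example.com/pricing
```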
10. Core Web Vitals
Page experience and user experience are two factors central to Google when ranking pages. Core web vitals (CWV) are specific factors that assess page experience scores based on Google’s parameters.
CWV evaluates the loading performance, responsiveness, and visual stability, the three important aspects of page experience. These metrics measure factors directly related to UX when a visitor lands on a page.
- Largest Contentful Paint (LCP): This loading metric measures the time taken for the largest asset on a web page, such as an image or a large block of text, to become viewable.
- First Input Delay (FID): This interactivity metric measures the time before the website visitor can interact with a page, i.e., the delay before a link responds to a click or a field can be completed.
- Cumulative Layout Shift (CLS): This metric relates to the visual stability of a page. Any moving or unstable element while a page is loading can cause the user to click incorrect links, leading to user frustration and poor page experience.
Besides, Google also considers factors like HTTPS, mobile-friendliness, safe browsing, and lack of interstitial popups.
If you want to know more about Core Web Vitals, we recommend reading our detailed post that will throw light on all the aspects of CWV you cannot afford to miss.
Now let’s look at the various measures you can take to improve your CWV.
a. Rich Media and Script Compression
Compression here simply means minimizing or minifying code by removing unnecessary characters from the source: getting rid of white spaces, optimizing conditional statements, joining closely placed variable declarations, and removing newline characters and redundant data.
Rich media and script compression is an effective technique that improves the load time of an application and the overall web performance because of the small file size.
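A toy illustration of the minification idea (real projects would use a dedicated minifier such as terser or cssnano; the regexes here are simplified and would break on CSS containing strings or comments):

```python
import re

css = """
body {
    margin: 0;
    padding: 0;
}
"""

# Collapse runs of whitespace, then drop the spaces that CSS
# doesn't need around braces, colons, and semicolons
minified = re.sub(r"\s+", " ", css).strip()
minified = re.sub(r"\s*([{}:;])\s*", r"\1", minified)

print(len(css), "->", len(minified))
print(minified)  # body{margin:0;padding:0;}
```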
b. CSS Sprites
CSS sprites help in combining multiple images into a single image file for use on a website, thereby improving website performance.
Though it may sound counterintuitive to combine smaller images into a large image, sprites do help in improving performance. Check out these files.
When we add up these sizes, the total comes to 13.2 KB, and combining them into a single sprite file may even weigh slightly more. Yet the sprite is loaded with a single HTTP request. Since browsers limit the number of concurrent requests a website can make, reducing the request count improves site performance.
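A quick sketch of how a sprite is referenced in CSS (the file name and offsets are illustrative):

```css
/* One HTTP request fetches the whole sprite sheet */
.icon {
  background-image: url("sprite.png");
  width: 16px;
  height: 16px;
}

/* Each icon is just a different offset into the same image */
.icon-search { background-position: 0 0; }
.icon-cart   { background-position: -16px 0; }
```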
c. Content Delivery Network (CDN)
By caching static content, a CDN can enhance the overall experience for end users while boosting CWV scores.
- How Does CDN Improve LCP?
The CDN stores the site’s static content in server farms located near end users, so the content has less distance to travel to and from user devices. This significantly reduces load time.
- How Does CDN Improve FID?
The less time a website takes to load, the faster users can interact with its elements. Using a CDN ensures that users engage with website elements at the lowest possible latency, reducing overall delay.
- How Does CDN Improve CLS?
Cumulative Layout Shift is often caused by images with improper sizing or when images fill in slowly. When you use a CDN, static assets are optimized and resized before they get delivered to the user.
Another point to note here is that users are accessing the website using multiple devices and varying internet quality.
d. Server Speed Optimization
The faster a server responds to requests, the greater the improvement in a site’s speed metrics. More often than not, complex enterprise websites have servers that are busy handling multiple requests and serving files and scripts.
It’s best to optimize these processes to bring the loading time down. Here are quick tips for server speed optimization.
- Upgrade your hosting plan. Pick one that offers great performance at a fair price.
- Use a recent version of PHP.
- Research how your databases work and spot opportunities for optimization.
e. Parallel Downloads and Minify
One of the best ways to reduce the page loading time is to reduce the number of resources to be downloaded through sprites. Alternatively, you can combine and minify external resources like CSS/JS.
However, where these optimizations don’t work, parallel downloads can!
Parallel downloads are multiple downloads carried out concurrently across several domains (or subdomains) served by the same host. The primary benefit of parallel downloads is reduced download time, which improves overall page performance.
Browsers usually limit the number of concurrent downloads. But enabling parallel downloads splits the resource requests among several domains. Thus, the loading time reduces significantly.
If your site spends a lot of time blocked waiting on downloads, or most of your users are on older browsers (like IE6/7) that allow few concurrent connections per domain, implementing parallel downloads can offer performance gains.
f. Page Caching
When a user visits your page, the browser talks to the server. This process involves a series of tasks, namely PHP processing, making requests to your database, and sending files back to the browser to be assembled into a fully formed page.
This process can take several seconds or even an eternity.
Page caching can speed up the process of server response and reduce the TTFB (Time to First Byte) or server response time. This translates into super-fast response time, thus improving the page experience.
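The principle can be sketched with a simple memoized renderer (a toy stand-in for a real page cache such as a CMS caching plugin or nginx’s FastCGI cache; the sleep simulates the expensive server-side work):

```python
import time
from functools import lru_cache

@lru_cache(maxsize=256)
def render_page(slug: str) -> str:
    # Stand-in for the expensive work: PHP processing,
    # database queries, and template assembly
    time.sleep(0.05)
    return f"<html><body>Page: {slug}</body></html>"

start = time.perf_counter()
render_page("pricing")          # slow: does the real work
first = time.perf_counter() - start

start = time.perf_counter()
render_page("pricing")          # fast: served from the cache
second = time.perf_counter() - start

print(f"first: {first:.3f}s, cached: {second:.6f}s")
```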
11. Content Optimization
Though technical SEO is hugely non-content related, a part of it overlaps with on-page SEO (heavily content-centric). Hence, we will be talking about content optimization and how it impacts search rankings.
a. Entity Optimization
The role of entities in SEO is huge. With time, Google has become smarter in identifying the true meaning of keyword searches and queries. It has now shifted from term-based search to entities.
Simply put, search engines now increasingly discover content through entity search, a method used by bots to understand user intent when mapping related sources to search queries. Search professionals failing to align with this framework are bound to fall behind.
By categorizing ideas into entities, you can add additional context and relevance to the content. Here’s what marketers need to do to optimize content for entity search.
- Integrate verified semantic elements in the web infrastructure.
- Spot and eliminate ambiguous language in content.
- Engage searcher’s interests in context.
b. Duplicate Content
Duplicate content adds zero value to visitors and confuses search engines. It can also cause several technical SEO issues because it interferes with the way search engines consolidate metrics like authority, relevance, and trust for content.
The best way to tackle duplicate content is by implementing 301 redirects from the non-preferred URL version to the preferred page.
Also, duplicate content is often caused by technical issues like not properly redirecting HTTP to HTTPS. This needs to be checked too.
GSC’s Index Coverage report is quite useful when it comes to spotting duplicate content. With this report, you can easily find –
- Duplicate without user-selected canonical – URLs that aren’t canonicalized to a preferred version.
- Duplicate, Google chose different canonical than user – URLs where Google chose to ignore the canonical you set and assigned a Google-selected canonical instead.
- Duplicate, submitted URL not selected as canonical – URLs where Google chose to ignore the canonicals you defined when submitting through an XML sitemap.
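For duplicates that must stay live, declaring the preferred version explicitly (the URL here is a placeholder) avoids leaving the choice to Google:

```html
<!-- In the <head> of every duplicate or parameterized variant -->
<link rel="canonical" href="https://example.com/features/" />
```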
c. Thin Content
Thin content is when a site’s content, text, or visual elements aren’t relevant to the user intent or don’t offer users what they are searching for. Since it offers little value, thin content often leaves the user looking for more.
Thin content can be easily spotted using an SEO crawler.
Once you’ve spotted these pages, the first thing you should do is to improve your old content. Check out what Gary Illyes from Google had to say about this a few years back.
Besides, use the noindex meta tag on these pages. Instead of removing them entirely, noindex them, telling Google not to include them in its index.
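The tag itself is a one-liner placed in the head of each thin page:

```html
<meta name="robots" content="noindex">
```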
12. User Interface
Technical issues can have a negative impact on a site’s UI and user experience. For instance, at times content will not display properly, a CTA button may not work, or your main conversion page may not get picked up in search results.
The result is not just poor user experience but also lack of user activity and conversions. Hence, it’s important to get your technical SEO audit done in a timely manner.
For instance, you should check for indexing issues, robots.txt, duplicate content, content display issues, structured data errors, and code errors that can reduce the effectiveness of your user interface.
Here are some factors you should pay attention to.
a. Usability Testing
One of the best ways to spot technical errors is through usability testing. During these tests, webmasters recruit people to see how they interact with the website’s elements. They observe where users struggle, what confuses or frustrates them, and whether they run into an impediment.
If people are struggling, chances are that the search engine bots are having issues crawling the site.
Here are a few usability testing questions that will help you identify technical SEO issues.
- Are you easily finding a page on the topic?
- Do the website elements work fine as you click through?
- Are there pages that don’t seem to fit?
- Is the website loading quickly?
- Is the website layout easy to read and understand?
b. Site Navigation
Bots navigate your site the same way a user would. Hence, it’s critical to improve your site’s navigation and make it accessible to all your visitors.
The best way to approach this is by creating a content hierarchy – breaking the content into broad categories and then making narrow subcategories. This will help visitors (and bots!) easily navigate to whatever content they are looking for.
You should also consider leveraging navigation breadcrumbs for optimal UX. Breadcrumbs are also a type of structured data that helps Google organize the content of the page in the SERPs.
c. Pop Ups
While non-intrusive pop-ups are useful to users, intrusive ones can cause long-term UX/UI issues that ultimately affect a site’s SEO. If these pop-ups or banner ads take up more than 15% of a page, they also cause cumulative layout shift (CLS) that significantly interferes with the UX.
Not to mention, Google doesn’t look upon them favorably.
Avoid the interstitials that Google dislikes.
- Pop-ups that cover the main content, forcing users to dismiss them to continue exploring the page.
- Standalone pop-ups that need to be closed before accessing the main content.
- Deceptive page layouts that have sections that look like pop ups.
- Pop ups that launch unexpectedly when the user clicks on the page.
- Overlay modals that are tough to close and redirect visitors if they accidentally click on them.
d. Communicating Trustworthiness
UI is a prime way for websites to communicate trustworthiness. For instance, the design quality of a site can say a lot about whether or not the website’s content can be relied on. Quality website design always stands out, making the site look reputable and creating a deep impression on the audience.
Here are a few ways to communicate trustworthiness through UI.
- Design Quality: Factors like site organization and visual design when done properly can boost a site’s trustworthiness. On the other hand, an increased number of broken links, typos, and other errors quickly eat into a site’s credibility and communicate to the visitor that the website lacks attention to detail.
- Upfront Disclosure: Visitors appreciate websites being transparent about all the information they are looking for. For instance, they want sites to share contact information and customer service details upfront. Similarly, for ecommerce websites, users expect upfront disclosure of shipping charges.
- Correct and Updated Information: Sharing accurate and recent information is a sign that the website is an authority in the domain.
- Connected to Authority Publications: When a website has backlinks from and outgoing links to authoritative websites, visitors trust that the information shared does not exist in a vacuum. This significantly boosts the site’s trustworthiness.
Chapter 3: Everything about Crawlers and How They Crawl a Website
Before we get to the nitty-gritty of technical SEO for SaaS startups and businesses, it’s critical to understand that search engines do not crawl a logged-in application.
About Logged in Web Apps
Here’s how you can prevent logged-in web application data from getting indexed:
- Put it on a subdomain and block that subdomain in robots.txt
- Put any logged-in pages in a subfolder that you can then block in robots.txt (for instance, site.com/app/ and then Disallow: /app/)
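For the subfolder approach above, the corresponding robots.txt entry (using the example’s path) would be:

```text
User-agent: *
Disallow: /app/
```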
Conversely, if you want the content to be crawled, don’t put it behind a login wall!
There are ways to show that you have logged-in content that can still be accessed sometimes by logged-out users (think about news publications with paywalls), but 99.9% of SaaS apps do not have to think about this.
Now, let’s move to the section on what’s crawling all about and how crawlers go about it.
As discussed earlier, building a website that allows search bots to easily crawl your website is critical to your page rankings. But to know how to make your website bot-friendly, you must understand how they crawl a site.
How Do Crawlers Crawl a Website?
Search engine bots, crawlers, or spiders systematically crawl the internet to index pages. They move rapidly reading one page, making copies, and moving to the next. These crawled copies are stored in their index with other pages.
Indexing helps the bots find pages quickly when needed. The crawlers also validate the links and HTML code to make sense of the content present on these pages.
In that sense, ‘crawling’ and ‘indexing’ are two different terms (often used interchangeably) but critical parts of the same process.
Here’s how web crawlers work to improve a site’s ranking in the SERP.
- They Dig Out URLs
One of the tasks that bots perform is to discover URLs. They do this through –
– The web pages crawlers have searched in the past
– Crawling a web link from a page that’s already crawled
– Crawling a URL using the sitemap shared by the webmaster
Excavating URLs helps bots make sense of the content on your web pages, giving context to your pages and boosting their chances of ranking higher in the SERP.
- Explore the Addresses
After finding the URLs, the bots check out addresses or seeds provided by the search engines. The bots visit each URL, copy the links on the page, and add them to their URL bank for future exploration.
The sitemaps help crawlers access information and links they discovered earlier, allowing them to determine the next URLs to explore.
- Entry in the Index
Once the bots have explored the addresses, they add them to the index. They store text documents, images, videos, and other files for future reference.
- Index Updation
Search bots not just index web pages but also monitor content keywords, content uniqueness, and key signals that offer context to a page. They also pay attention to new pages, changes in existing content, and dead links. This ensures that the index is updated at all times.
- Frequent Crawling
Search bots exist to crawl the internet continuously. The search engine software determines which sites and pages they are supposed to crawl and the frequency of crawling.
Factors that determine the crawl frequency are, the perceived importance of the pages, the website crawl demand, recent updates, and searcher interest level among others. So, if your website is popular, the crawlers will crawl it often, bringing it up in the SERP and allowing users to gain access to the newest content.
If you want your website to rank in the SERP, its pages should be crawled and indexed by the search engines.
How do you know if a new web page is indexed?
The quickest way to determine this is by using the “site:” search command. Search Google for “site:yourdomain.com.” This will show every page that’s indexed for that domain.
For more specific information regarding any crawl or indexing issues, submit a sitemap via Google Search Console.
To sum up, if your site structure and navigation are clear bots will be able to access the content on your site, increasing the chances of your page ranking higher.
But search engines aren’t the only ones using crawlers. SEOs can use web-crawling tools to analyze the health of their website and maintain a strong backlink database. These tools can spot major issues in a website, such as broken links, duplicate content, and missing titles and metadata.
Let’s look at a few web crawling tools or web crawlers that can comb through site pages to analyze their SEO health and highlight any bottlenecks or errors.
1. Google Search Console (Free)
The Crawl Stats report on Google Search Console is a free tool that offers data on Google’s crawling history on a site. You can reach this report in Search Console by clicking Settings (Property settings) > Crawl Stats.
This report shares interesting insights on –
- Your site’s general availability
- Average page response time
- The number of requests made by Google
- Crawl responses and purpose
- Googlebot type
2. Bing Webmaster Tools (Free)
Microsoft’s Bing is the number-two search engine with a diverse audience, handling a monthly search volume of 12 billion globally. So, no SaaS business can afford to ignore Bing.
Bing Webmaster Tools is a free service that allows webmasters to improve their site’s search performance in the SERPs. Users can add their website to the Bing index crawler to monitor site performance in terms of clicks and impressions.
The tool allows users to –
- Monitor their site performance
- See how the Bing bots crawl and index websites
- Submit new pages to be crawled
- Get rid of content that you don’t want to be crawled
- Spot and resolve malware issues.
The tool’s Crawl Control feature allows users to control the speed at which Bingbots make page and resource requests.
3. Ahrefs Webmaster Tools or AWT (Free for website owners)
Ahrefs Webmaster Tools helps in improving a site’s search performance, allowing it to attract more traffic. The tool provides a generous amount of SEO insights.
The platform has sections like Site Audit and Site Explorer that monitor, report on, and improve a site’s health, keyword rankings, and backlink profile. The data on AWT is visualized in a user-friendly manner, making it easy for users to prioritize issues.
The Crawl Log in Ahrefs Site Audit allows users to diagnose crawl issues that arise when search engine bots crawl the website.
Further, the Indexability tab under Reports shows which pages can/cannot be indexed. It also shows the causes for non-indexation with other details.
4. Screaming Frog (Free up to 500 URLs)
Screaming Frog SEO Spider is a fast and effective crawler that can help you improve your onsite SEO by spotting technical issues. The tool helps users analyze results in real-time, thereby allowing them to make informed decisions.
The SEO Spider is a flexible crawler that crawls small and big websites to spot broken links and duplicate content, audit redirects, analyze meta data, generate XML sitemaps, and visualize the site architecture. You can easily view and filter the crawl data as it’s gathered in the user interface.
Further, it can schedule site audits and track the progress of SEO issues and opportunities between crawls.
The tool can crawl 500 URLs for free after which you need to buy a license to get rid of the limit and access advanced features. It also allows users to export onsite SEO elements (URL, page title, meta description, and headings among others) to a spreadsheet. This helps in devising a robust SEO strategy.
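To illustrate how such an export feeds into an audit, here’s a short Python sketch that scans a simplified, hypothetical Screaming Frog-style CSV for missing or overlong meta descriptions. The column names mirror the tool’s export format but are illustrative, not its exact schema:

```python
import csv
import io

# Hypothetical sample of a Screaming Frog-style "Internal: HTML" export.
# Real exports contain many more columns; these names are illustrative.
sample_export = """Address,Title 1,Meta Description 1
https://example.com/,Home,Welcome to our SaaS platform.
https://example.com/pricing,Pricing,
https://example.com/blog/post,Very long title,{}
""".format("x" * 170)

def find_meta_issues(csv_text, max_len=160):
    """Flag pages with missing or overlong meta descriptions."""
    issues = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        desc = row.get("Meta Description 1", "").strip()
        if not desc:
            issues.append((row["Address"], "missing meta description"))
        elif len(desc) > max_len:
            issues.append((row["Address"], "meta description too long"))
    return issues

for url, problem in find_meta_issues(sample_export):
    print(f"{url}: {problem}")
```

The same pattern extends to page titles, headings, or any other exported column.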
Overall, Screaming Frog is an easy-to-use tool and ideal for small websites.
5. Sitebulb (14-day unrestricted access)
Sitebulb isn’t merely a website crawler. The tool audits the website and offers intuitive recommendations and visualizations, taking your technical SEO site audit to the next level. Its visual reports prioritize the most important issues and opportunities, suggesting the next steps along the way.
The prioritized hints allow users to see what’s most critical, thus saving time spotting issues and solving them.
Besides looking at the technical side of the website, the tool also dives into metrics like content readability and accessibility. The tool has Google Analytics and Search Console integration, making it a powerful tool to understand SEO opportunities in a website.
6. DeepCrawl
DeepCrawl is a great tool for SaaS websites because it offers a long list of features and deep-diving capabilities. It also offers functionalities that allow agencies to manage multiple client websites and their crawl budgets. Besides, the platform offers advanced settings like crawling password-protected pages and the ability to ignore robots.txt.
Using DeepCrawl for monitoring your site’s technical health can take the guesswork out of spotting issues, especially if you don’t have a seasoned SEO professional on your team. The tool can crawl anywhere from 200-page websites to 2.5 million page websites while effectively unearthing technical issues.
7. Oncrawl (14-day free trial)
Oncrawl is an in-depth SEO crawler and log analyzer that analyzes websites the way Google does, regardless of their size. The tool relies on more than 600 indicators, advanced data exploration, and actionable dashboards to help you understand how search bots see a website.
Using this tool will give you a clear understanding of the technical issues harming your site performance. You can audit your site to improve the content, HTML quality, page speed, site architecture, and more, thereby driving revenue to your site.
Oncrawl Log Analyzer offers insights into bot behavior, thus allowing you to boost crawl budget and organic traffic.
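To demystify what a log analyzer does under the hood, the Python sketch below counts search-bot requests per URL from a few hypothetical access-log lines. Note that serious log analysis should also verify bots via reverse DNS, since user-agent strings can be spoofed:

```python
from collections import Counter

# A few hypothetical access-log lines (combined log format, simplified).
log_lines = [
    '1.2.3.4 - - [10/May/2024:10:00:00] "GET /pricing HTTP/1.1" 200 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '1.2.3.4 - - [10/May/2024:10:01:00] "GET /blog HTTP/1.1" 200 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '5.6.7.8 - - [10/May/2024:10:02:00] "GET /pricing HTTP/1.1" 200 "-" "Mozilla/5.0 (compatible; bingbot/2.0)"',
    '9.9.9.9 - - [10/May/2024:10:03:00] "GET /pricing HTTP/1.1" 200 "-" "Mozilla/5.0"',
]

BOTS = ("Googlebot", "bingbot")

def bot_hits_by_path(lines):
    """Count requests per path for known search-engine bots (by user agent)."""
    counts = Counter()
    for line in lines:
        if any(bot in line for bot in BOTS):
            # The request path is the second token inside the first quoted field.
            path = line.split('"')[1].split()[1]
            counts[path] += 1
    return counts

print(bot_hits_by_path(log_lines))
```

Aggregations like this (hits per path, per bot, per status code) are exactly what commercial log analyzers visualize at scale.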
8. JetOctopus (7-day free trial)
JetOctopus is an effective technical SEO tool that offers a range of flexible options for crawling and analyzing logs. The tool’s Google Search Console visualization makes it easy for SEO professionals to identify issues in a website and solve them.
The website crawler and log analyzer have no project limits and can crawl large enterprise websites within minutes. JetOctopus has a problem-centric dashboard that clearly shows the scale of issues, allowing users to prioritize them accordingly. The visualizations offer information by category, depth, and load times, thus making it easy for users to navigate between these data sets and visualizations.
JetOctopus’s dashboard is devoid of clutter and easy to navigate. The tool focuses on covering website analytics from various angles. Moreover, it offers several filters to drill through the required information. There’s also an option to compare crawls, allowing you to see which issues have been resolved and the ones that are yet to be addressed.
The GSC module in JetOctopus allows users to connect their accounts to Search Console. This helps them track their keyword performance on a daily basis.
All the website crawlers shared above are designed to crawl websites effortlessly and spot major crawl issues. That said, here are a few factors you should consider when investing in a website crawler.
- It should have a user-friendly and clutter-free interface.
- It should highlight the key issues, allowing users to prioritize the most pressing crawl issues.
- It should effectively locate broken links and pages, redirect issues, HTTP/ HTTPS issues, and other issues.
- It should easily detect the robots.txt file and sitemap.
- It should connect with Google Analytics and Search Console for complete data on site performance. This will help you get input about the site’s visitors and investigate your site’s traffic patterns.
- It should support multiple devices and file formats.
- It should have a strong customer support team.
- It should have a transparent pricing structure with no hidden expenses.
Though using one of the above-mentioned crawlers can help you spot the key issues on your website, I would recommend using multiple tools. Doing so audits your website from different angles, so that no issue slips through.
Also, remember to configure your chosen crawlers to specify which portions of the website they should and shouldn’t crawl. They should also be able to handle the various web technologies your website is built with.
SaaS is a dynamic field where firms need to consistently come up with the latest content. This makes updating SaaS marketing websites and managing their technical aspects challenging.
Luckily, relying on an SEO-friendly CMS (content management system) makes website management straightforward. Failing to use a suitable CMS could send the wrong signals to search bots, thereby hurting your ranking and traffic.
By choosing the right CMS you can build a strong foundation for your technical SEO efforts. Hence, let’s discuss how to choose the right CMS to improve your site’s technical health.
Chapter 4: Platforms/CMS for Building SaaS Marketing Websites
A CMS that understands technical SEO features will allow you to control aspects like noindex tags, canonical tags, sitemaps, schema markup, pagination, status codes, and more. Thus, an SEO-friendly CMS will give you the power to control what can or cannot be crawled or indexed by search engines.
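In practice, most of these controls boil down to a handful of per-page markup elements that an SEO-friendly CMS should let you edit. For example (URLs and values are placeholders):

```html
<head>
  <!-- Keep this page out of the index but let bots follow its links -->
  <meta name="robots" content="noindex, follow">

  <!-- Point duplicate or parameterized URLs at the preferred version -->
  <link rel="canonical" href="https://example.com/pricing">

  <!-- Structured data (schema.org) describing the page -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "SoftwareApplication",
    "name": "Example SaaS"
  }
  </script>
</head>
```

If your CMS can’t edit these per page, you’ll be fighting the platform on every technical SEO fix.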
Trending Platforms/CMS SaaS Firms Are Using
1. WordPress
WordPress was long believed to be a traditionally designed CMS. However, over the years, it has evolved to include core features that can cater to any SaaS platform, namely APIs, plugins, themes, extensibility, and scalability. Thus, WordPress is a lifesaver for people who cannot code and want to save time and resources.
It also offers the multisite feature that allows website owners to create and manage a network of websites on a single dashboard.
Grytics, for instance, is an analytical tool for social groups on LinkedIn and Facebook. The service monitors these groups and finds influencers and people who are most engaged. Their marketing website is built on WordPress. Its WordPress backend is clutter-free and customized to the business needs.
Pros:
- Used by thousands of SaaS companies across the globe.
- Ensures easy setup, right from installation to adding plugins of varying complexities.
- Offers a wide scope for customization with countless plugins and themes.
- The Yoast SEO plugin is great at assisting webmasters with all their SEO needs. It is good at managing the technical aspects of a website, including XML sitemaps, canonical URLs, meta descriptions, and more.
- The WordPress REST API is a plus for SaaS firms looking to build their marketing website. The API aims at making WordPress a full-fledged application framework. It allows users to move data in and out of applications using simple HTTP requests. Thus, SaaS firms can build custom WordPress backends and admin panels.
Cons:
- WordPress comes with several security concerns.
- The CMS over-relies on plugins. So, the user is constantly dealing with code that isn’t theirs.
- Every update on WordPress adds to the cost.
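As a small illustration of the REST API mentioned in the pros above, the Python sketch below builds the URL for the standard /wp-json/wp/v2/posts endpoint and defines a fetch helper. Here, example.com is a placeholder, and the fetch helper performs a live network call, so point it at a real WordPress site:

```python
import json
import urllib.request

# Minimal sketch of reading posts through the WordPress REST API.
# "example.com" is a placeholder; /wp-json/wp/v2/posts is the standard
# core posts route on modern WordPress installs.
API_BASE = "https://example.com/wp-json/wp/v2"

def build_posts_url(per_page=5, page=1):
    """Build the URL for fetching a page of published posts."""
    return f"{API_BASE}/posts?per_page={per_page}&page={page}"

def fetch_posts(url):
    """Fetch and decode posts (network call; run against a real site)."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

print(build_posts_url(per_page=3))
```

The same pattern (authenticated POST/PUT requests to the same routes) is what powers the custom backends and headless setups described above.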
Next, you need to think about ideal WordPress hosting. Good WordPress hosting makes the installation and the website more secure, speedy, and simple to manage. In other words, it takes care of all the issues that webmasters do not want to deal with.
The top two WordPress hosting providers we recommend are –
Kinsta is a cloud-based WordPress host that offers plans with server power and optimization. It offers several WordPress management tools and delivers all the speed and scaling advantages you’d expect from a cloud-based host.
WPEngine is one of the fastest and most reliable WordPress hosting providers. It offers WordPress-specific support, automatic site backups, and free CDN and SSL. WPEngine tailors hosting packages to business requirements while making sure that the site loads quickly and securely.
It is an ideal WordPress hosting service for businesses looking to be hands-off on the technical aspects of the website.
WordPress is trusted and easy to configure. Hence, for custom marketing websites, most SaaS firms use WordPress.
Besides WordPress, there are other, less well-known CMS options used to set up early-stage SaaS marketing websites. Let’s look at them.
2. Craft CMS
Craft CMS is a newcomer in the SaaS space but is increasingly being adopted for SaaS marketing sites because it’s easy to use.
Ever since it was launched, Craft has emerged as an impressive CMS with an innovative approach to managing content. The CMS allows users to control the entire process of content creation, strategy, and design and development. This dual-licensed CMS is known to empower the entire creative process while bringing a slew of usability and accessibility benefits.
The platform is integration-ready as it can be easily connected to tools like MailChimp, Salesforce, and more. When it comes to technical SEO for SaaS companies, the CMS puts users in full control of the critical elements that need immediate attention.
Pros:
- More flexible than WordPress as it can manage various types of content, namely events, products, venues, widgets, categories, and more.
- Has a simple and easy-to-use design interface.
- Delivers incredible load speed.
- Has great live preview and content modules. For instance, it can split the screen into two views, namely the CMS panel and page preview.
- Craft CMS websites aren’t known for security vulnerabilities. The CMS is built on the Yii PHP framework, which offers robust security features.
- The Craft plugin ecosystem is well-built. New plugins are being introduced every week, thus extending the functionality of the platform.
Cons:
- A single website license for Craft is about $300.
- Craft CMS offers full creative control, which means there aren’t any pre-built themes. So, beginners may find it tough to work with this CMS.
- It is tough to find Craft CMS developers because of their small community.
Usually, Craft CMS works well with any hosting that’s fast and stable. Here are a few hosting providers that are suitable for SaaS marketing sites built on Craft CMS.
Arcustech is a fully-managed VPS hosting service for Craft CMS and other PHP/MySQL applications and frameworks. Thus, Craft CMS developers do not have to act as ‘around the clock’ server administrators.
The VPS plans offered are quite flexible, allowing webmasters to scale at any time. So, they can start small and grow as per their needs.
AWS is an unmanaged cloud service that offers EC2, Lightsail, S3, and other products. Craft CMS can be deployed on this on-demand cloud computing platform.
DigitalOcean simplifies cloud computing for developers. It allows them to launch a website powered by Craft CMS that’s equipped with a great development environment and workflow.
Cloudways is a superfast Craft CMS hosting platform that allows developers to build, deploy, and manage apps on this CMS with ease. Its advanced hosting features are made for optimized app performance. Thus, you can deploy all your CMS applications without worrying about hosting hassles.
Hyperlane hosting is built for creative agencies and designers who use WordPress or Craft. It is a hyperfast cloud hosting service that allows developers to build and maintain their Craft CMS site.
The hosting also offers developer tools needed to collaborate and manage a website.
3. Wix
Wix is a user-friendly platform that allows non-coders to build professional business websites. Its drag-and-drop functionality offers complete creative freedom, enabling site builders to completely customize the look and feel of their SaaS marketing website.
Pros:
- It offers a massive collection of templates that proves to be a great resource for businesses that want to avoid the hassle of building a website from scratch.
- It offers an intuitive drag and drop interface to let you customize your website effectively.
- The Wix Apps market offers plugins to perform specific tasks.
- Wix has a built-in SEO management system that does all the SEO heavy lifting.
- It allows you to create social content.
- Wix includes an email marketing tool that allows businesses to build up a subscriber list with ease.
Cons:
- The templates aren’t interchangeable. You cannot transfer content from one template to another. So, pick your template with a lot of deliberation.
- If you opt for the free plan, it forces you to have the Wix branding on the website.
- Tracking website performance and access to the analytics board requires a paid plan.
- Once you build your website on Wix, it isn’t transferable.
4. SquareSpace
SquareSpace is a SaaS marketing website builder that offers all-in-one subscriptions for webmasters to design, host, and manage their sites from a single dashboard.
It isn’t self-hosted. Hence, it’s not as extendable as WordPress or Magento, which are self-hosted site builders. However, the CMS is easy to use and includes all the SEO considerations a SaaS marketer would need.
So, if you lack experience with WordPress and want your SaaS marketing website to be up and running in no time, SquareSpace is a good choice.
Pros:
- Its admin section is built for easy navigation and application. It offers an incredibly simple interface with easy-to-find settings.
- Offers great SEO features and its pages follow all the SEO best practices required to rank high in the SERPs.
For instance, it allows you to manage titles, alt text, and keywords, thus making it easy for search bots to find your content.
- SquareSpace handles the basic technical aspects of a website, namely website security, software updates, backups, and more. So, you don’t need to worry about these.
Cons:
- Though SquareSpace offers several SEO features, it’s limited to the basics. It lacks advanced marketing features like A/B testing.
- The platform doesn’t support third-party apps, plugins, or extensions. This makes it tough for beginners to customize their SaaS marketing website.
- It lacks support for mega menus, making it less appealing for enterprise-level SaaS marketing sites that have hundreds of pages.
5. HubSpot CMS
SaaS marketers need a CMS that can help them effortlessly manage their marketing website while ensuring security, affordability, performance, scalability, convenience, and ease of operation. Choosing a CMS like HubSpot will not just ensure all of this but also save you money, limit growth and scaling challenges, and better prepare your business for robust marketing as it matures.
HubSpot is the first and only CMS that allows SaaS marketing website creation while customizing it to the entire SaaS buying journey.
Pros:
- An all-in-one platform (combines CMS and CRM), allowing you to create, manage, and execute all your SaaS marketing content in one place. It offers fully-integrated marketing tools to match your site’s marketing needs.
The platform offers several marketing tools like A/B testing, page optimization, visitor activity records, and more, allowing you to convert your SaaS marketing website into a growth machine.
- User-friendly CMS that can be used by beginners and non-technical users. The platform ensures a marketer-friendly content editing experience through its modules, templates, and drag-and-drop features.
- It offers advanced security features that include a standard SSL certificate and a globally hosted content delivery network (CDN). This ensures a secure browsing experience and 99.99% uptime.
- Usually SaaS content management systems demand huge monthly or annual fees. They also come with hidden fees for additional storage, visitors, or multiple users. HubSpot CMS comes with a flat fee with no extra costs.
Cons:
- HubSpot has limited templates and modules on its marketplace. If you need to add a feature, you’ll need to hire a HubSpot developer, which is an additional cost.
- It is a managed website content hosting service; hence, unlike WordPress, HubSpot users don’t need separate hosting. However, you cannot build a SaaS marketing site elsewhere and host it on HubSpot, or vice versa.
- Though HubSpot’s interface is easy to use, working with templates can get challenging if you lack CSS knowledge.
6. Webflow
Webflow is a cloud-based, zero-code, ‘visual’ CMS and hosting platform that allows SaaS businesses to build a structured, professional, and custom marketing website. So, by using Webflow, SaaS marketers can leverage the features of a visual editor with the added flexibility of creating a custom website from scratch.
So, for SaaS website developers, Webflow can significantly speed up the website development process.
Pros:
- It works for everyone on your team. Right from your designers and developers to content managers and strategists, everyone is enabled to achieve their objectives with Webflow.
For instance, developers can add, update, and delete content from the terminal using Webflow’s REST API. Similarly, content managers can create custom content structures to meet the business’s unique needs.
- You can write and edit in real time. At times, it’s easier to add and update content on the frontend. This allows you to see how these updates affect the overall flow of the page.
- Allows automatic and consistent updates. If you update the headline of a post, the platform will automatically change it across the site.
- The platform allows limitless customization options without the need for coding.
Cons:
- Webflow is more expensive than an open-source CMS like WordPress.
- Though Webflow is no-code, not having prior knowledge or experience with HTML or CSS can limit what you can achieve when designing a perfect SaaS marketing website.
- The platform lacks built-in technical SEO controls.
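To make the REST API pro above concrete, here’s a hedged Python sketch that builds (but doesn’t send) a request listing items in a Webflow CMS collection. The endpoint path, version, and header names are assumptions based on Webflow’s v2 Data API, and the token and collection ID are placeholders — verify the details against the official docs before relying on this:

```python
import urllib.request

# Sketch of a Webflow CMS API call. The endpoint path and headers are
# assumptions based on Webflow's v2 Data API; check the official docs.
API_TOKEN = "YOUR_TOKEN"      # placeholder credential
COLLECTION_ID = "abc123"      # placeholder collection id

def build_list_items_request(collection_id, token):
    """Build (but do not send) a request listing items in a CMS collection."""
    url = f"https://api.webflow.com/v2/collections/{collection_id}/items"
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )

req = build_list_items_request(COLLECTION_ID, API_TOKEN)
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (against a real token and collection) would return the collection’s items as JSON.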
Improving the technical elements of your SaaS website can remarkably improve your site’s performance and ranking in the SERPs.
But if you are new to the technical side of SEO, it’s natural to feel overwhelmed by the sheer complexity and scope of the subject (let alone implementing it!).
Take the sweat and stress out of the process! Get in touch with Growfusely’s team of technical SEO specialists to take your SaaS business to the next level. We will study your case carefully and suggest the best technical solutions to make your website faster, easier to crawl, and more understandable for search engines.
If you have read a lot about technical SEO, you’ll realize that most of the aspects it covers (except website speed) aren’t ranking factors.
So, why should you spend so much time spotting and resolving these issues?
Because technical SEO has a significant indirect impact on whether or not your pages get indexed, which in turn influences their place in the SERPs.
Think about it – If your website’s robots.txt file isn’t validated, search bots will waste your crawl budget in an attempt to access your content. Similarly, in the absence of canonical URLs, duplicate content will dilute your site’s link equity.
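The robots.txt point is easy to verify yourself: Python’s standard library can parse a robots file and report what a given bot may fetch. A minimal sketch using a hypothetical file:

```python
from urllib import robotparser

# A hypothetical robots.txt, parsed with the stdlib robotparser.
rules = """\
User-agent: *
Disallow: /app/
Allow: /blog/
Sitemap: https://example.com/sitemap.xml
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The in-app area is blocked; the blog is crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/app/login"))   # False
```

Running a check like this against your live robots.txt (via `rp.set_url(...)` and `rp.read()`) is a quick way to catch rules that accidentally block pages you want indexed.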
Hence, technical SEO is critical for SaaS businesses looking to get the required eyeballs to their content and services.
The question you may ask is – which is more important, technical SEO or on-page SEO?
Answer – both!
So, here’s what you should do.
- Start with a technical SEO audit to lay a strong foundation for your content. Spend time on technical SEO in the beginning or when migrating/redesigning your site.
This will help you spot the top issues and resolve them, thus saving you a lot of heartburn going forward.
- Once the technical boxes are checked, invest in on-page SEO. Of course, technical SEO is a continuous process.
- Lastly, don’t miss out on off-page SEO which involves promoting your content on social channels and email and earning backlinks through outreach.
I am sure by now you are clear on the significance of technical SEO for your business and will use the tactics shared above to boost your online presence.
If you have any questions or concerns in this matter, we recommend visiting our blog which covers several technical SEO topics. Alternatively, you can get in touch with our team who will be more than happy to guide you on this subject.
Frequently Asked Questions (FAQs)
1. What is included in technical SEO?
Technical SEO includes all the technical aspects of a website that ensure that modern search engines properly crawl, render, and index the content on it.
At times, the scope of technical SEO can be bewildering because it encompasses all the technical aspects of your website from hosting and website performance to meta robots tags and XML sitemaps.
But technical SEO allows the search engines to see your website as a high-quality resource and promises users a great UX when they visit your site. All this ensures that your website is crawled, indexed, and rendered correctly.
2. What’s the difference between technical SEO and on-page SEO?
Technical SEO includes all efforts done to ensure that the search engines crawl and index your content effectively.
In other words, it includes SEO tactics applied to the non-content aspects of the website.
On-Page SEO is primarily concerned with the content on a page. It encompasses every tactic used to optimize the content, whether it’s researching and including relevant keywords or ensuring that the user experience is at par with the visitor’s expectations.
To summarize, here’s the difference between technical SEO and on-page SEO.
3. Is there any technical SEO checklist to follow while migrating a website?
If you are planning a site migration, you must have a clear roadmap with deadlines, failing which your site’s ranking will take a major hit.
We recommend the following steps when going for site migration.
It is a comprehensive SEO checklist that can be used to retain your existing ranking and traffic when redesigning or migrating a website. The post also offers a Website Redesign SEO Spreadsheet to compare all your URLs before, during, and after the migration.
4. How do you gauge the technical health of a website?
Regardless of how great your content is, it is bound to perform badly if your website has unresolved technical issues. Hence, it’s important to regularly gauge the technical health of your website.
Technical SEO encompasses critical elements like crawling, indexing, rendering, and website architecture. Hence, it is assessed using the metrics and tools that better reflect these technical aspects.
For this, follow these simple steps.
- Analyze website crawling using a technical audit. It is a great idea to review your log files and crawl data. All this will tell you how Google and other search engines are crawling your pages and interpreting content.
- Make sure the search engines are rendering web pages correctly. Content that’s hard for searchers and crawlers to find is unlikely to be rendered correctly.
- Review the indexing of pages. This will give you a clear picture of your site’s health, highlighting the pages that were indexed and ignored by the search bots.
- Monitor the top technical SEO metrics to understand how your website is performing.
Image Sources – Ahrefs, Danscartoons, Sitemap FAQs, Uber, Flipkart, Technicalseo.com, Twitter, Bing, Screamingfrog, Sitebulb, Deepcrawl, Oncrawl
Shahid Abbasi is a Senior SEO and Content Marketing Analyst at Growfusely, a SaaS content marketing agency specializing in content and data-driven SEO.