Did you know that 91% of websites get 0 traffic from Google?

Building a website is the first step to establishing an online presence, but it’s a complete waste without an SEO strategy that delivers traffic. Shocking data from Ahrefs shows that 91% of all websites get absolutely 0 traffic from Google searches – a catastrophic failure for the businesses that invest in developing their sites and see no returns from organic search. While many websites are false starters, the “build it, and they will come” web agency brigade has sold many businesses down the river. The bottom line is that SOMETHING’S UP if you get 0 traffic after paying an SEO agency to optimise and hone your online presence. So, why do most websites fail to get any search traffic? By understanding the common pitfalls, you can avoid them and be part of the 9% that succeed.

The Topic Has No Search Demand

The most fundamental reason websites get no traffic is that the topics they cover have no search demand. If people aren’t searching for what you offer, even the best SEO can’t drive visitors. The solution? Adequate keyword research before creating content. Start with Google Keyword Planner, Ahrefs Keywords Explorer, or the SEMrush Keyword Magic Tool, and identify terms that have at least 100 monthly searches and low competition. Without understanding what keywords to target, you’ll waste time on terms that would drive little traffic even if you ranked #1. Ensure sufficient demand for each topic before allocating resources from your content budget.

The Website Has No Quality Backlinks

Paying for sponsorship and advertising is a normal part of the web economy, and carrying paid links does not violate Google’s policies as long as the anchor tags include a rel="nofollow" or rel="sponsored" attribute. So-called “do-follow” links are traditionally much more valuable for SEO because they pass authority to your website. However, Google now treats “nofollow” and “sponsored” as hints rather than strict directives: you still need do-follow links to rank, but a no-follow link from a great website is still powerful. Start an outreach campaign focused on earning editorial links rather than “paid” links of questionable value. Produce genuinely useful content that sites in your industry would naturally want to reference to earn links for free. Proactively building even a few authoritative backlinks can be the difference between ranking and being buried below the fold.
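If you want to see how your existing pages label their outbound links, the rel attributes are easy to audit with Python’s standard-library HTML parser. This is a minimal sketch — the class name and the sample links are illustrative, not from any real site:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collects outbound links and classifies them by rel attribute."""
    def __init__(self):
        super().__init__()
        self.dofollow, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        # Google treats nofollow/sponsored/ugc values as ranking hints.
        if {"nofollow", "sponsored", "ugc"} & set(rel):
            self.nofollow.append(href)
        else:
            self.dofollow.append(href)

auditor = LinkAuditor()
auditor.feed('<a href="https://example.com" rel="sponsored">ad</a>'
             '<a href="https://example.org">editorial</a>')
```

Running this over your own pages gives a quick picture of how much of your link profile actually passes authority.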

The Page Doesn’t Match the Search Intent

Even a well-optimised page will struggle if it doesn’t deliver what searchers actually want. Google rewards pages whose format and angle fit the query: if the top results for your target keyword are listicles or buying guides, a lone product page is unlikely to break in. Study the current first page for each term and align your content type with the intent it reveals.

The Page Doesn't Mention Any Keywords
Without relevant keyword mentions, semantic variations, and topic clustering within a web page, Google has no signal for what the page is about. Perform on-page optimisation best practices like:
  • Getting your keyword in the H1 tag.
  • Creating a compelling page title and meta description.
  • Incorporating your primary keyword in the first 100 words.
  • Including related keywords in H2 and H3 tags.
  • Embedding target long-tail keywords in image filenames and alt text.
  • Achieving a keyword density between 1-3%.
These best practices help Google understand your page’s relevance and crawl and index it for the correct search terms. The key is striking an artful balance between over-optimisation and missing critical terms. Using the right keywords aligns with Google’s evolving natural-language processing capabilities.
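A rough version of these checks can be scripted. The sketch below uses only Python’s standard library to pull the title, H1, and body text out of a page, then reports whether a keyword appears in each and its approximate density; the `audit` helper and its output keys are illustrative, not a standard tool:

```python
import re
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Extracts the title, H1 text, and word tokens from a page."""
    def __init__(self):
        super().__init__()
        self._stack = []
        self.title, self.h1, self.words = "", "", []

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        tag = self._stack[-1] if self._stack else ""
        if tag == "title":
            self.title += data
        elif tag == "h1":
            self.h1 += data
        if tag not in ("script", "style"):
            self.words += re.findall(r"[a-z0-9']+", data.lower())

def audit(html, keyword):
    checker = OnPageChecker()
    checker.feed(html)
    kw = keyword.lower().split()
    n = len(kw)
    # Count exact keyword-phrase occurrences with a sliding window.
    hits = sum(checker.words[i:i + n] == kw
               for i in range(len(checker.words) - n + 1))
    density = 100 * hits * n / max(len(checker.words), 1)
    return {
        "keyword_in_h1": keyword.lower() in checker.h1.lower(),
        "keyword_in_title": keyword.lower() in checker.title.lower(),
        "density_pct": round(density, 2),
    }
```

Feed it a saved page and your target term, and you get a quick pass/fail on the list above plus a density figure to compare against the 1-3% guideline.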

The Page Has Thin Content

Since its Panda update, Google has stressed the importance of high-quality, in-depth content. Pages with only a few hundred words or one generic paragraph rarely provide enough value, so they don’t rank for competitive terms. Set a benchmark of at least 500-1,000 words for blog posts targeting critical keywords and 2,000-3,000 words for in-depth topics. Expand on core topics by linking to supporting pages that cover sub-topics in more detail, and interlink content across your site to encourage visitors to read multiple pages per session. Google treats dwell time as a signal of quality and satisfaction.
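A quick way to find thin pages is to strip the markup and count the remaining words. This sketch assumes locally saved HTML; the 500-word threshold is the article’s own benchmark for keyword-targeted posts:

```python
import re

def word_count(html):
    """Rough visible word count: drop script/style blocks, strip tags,
    then count the remaining word tokens."""
    text = re.sub(r"(?s)<(script|style)[^>]*>.*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", text)
    return len(re.findall(r"\b\w+\b", text))

THIN_THRESHOLD = 500  # minimum words for a keyword-targeted blog post

page = "<html><body><p>" + "useful words here " * 10 + "</p></body></html>"
count = word_count(page)
flagged = count < THIN_THRESHOLD  # this 30-word sample page is thin
```

Run it across every page in your blog directory and prioritise expanding whatever it flags.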

The Website Is Slow to Load

Site speed is an official component of Google’s page experience algorithm, which determines search rankings on mobile. Websites that take over 2 seconds to load on phones are buried below faster competition.

Test your site with Google PageSpeed Insights and identify optimisation opportunities like:

  • Compress images.
  • Use next-gen image formats like WebP.
  • Remove render-blocking JavaScript.
  • Upgrade to faster web hosting.
  • Use browser caching.
  • Defer non-critical CSS/JS.

With page experience carrying significant weight, there’s no excuse for maintaining a lagging site in 2024. Deliver sub-2-second load times or miss out on mobile organic traffic.

Another excellent tool for optimising website performance is GTMetrix – we love this tool’s waterfall feature, which shows the request-by-request loading behaviour of a webpage, including the file size and load status. You can use the waterfall feature to see the files holding your load time back.
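Oversized images are one of the most common culprits a waterfall reveals. As a first pass before reaching for PageSpeed Insights or GTMetrix, you can scan a local copy of your site for heavy image files; the 150 KB limit here is an illustrative threshold, not an official one:

```python
import os

SIZE_LIMIT = 150 * 1024  # flag images over ~150 KB (assumed threshold)
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif"}

def oversized_images(root):
    """Walk a local site directory and list images worth compressing
    or converting to WebP, largest first."""
    found = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if os.path.splitext(name)[1].lower() not in IMAGE_EXTS:
                continue
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size > SIZE_LIMIT:
                found.append((size, path))
    return sorted(found, reverse=True)
```

Anything it returns is a candidate for compression or a next-gen format like WebP.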

The Website Has Technical Problems

Search engines have difficulty crawling and indexing sites with technical barriers. Problems like misconfigured robots directives, missing SSL certificates, excessive downtime, and malware can tank organic visibility. We recommend:

  • Avoid blanket Disallow rules in robots.txt and site-wide noindex or nofollow meta tags.
  • Do not put your website behind a Google-blocking paywall.
  • Install an SSL certificate to activate HTTPS and the padlock icon.
  • Limit downtime from web host outages to less than 1 hour per week.
  • Fix 404 and server response code errors.
  • Upload an XML sitemap to Google Search Console.
  • Clean up malware because compromised sites violate Google’s spam policies.
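Python’s standard library includes a robots.txt parser, which makes it easy to verify that your rules don’t accidentally lock Googlebot out. A minimal sketch (the URL and rules are examples):

```python
from urllib.robotparser import RobotFileParser

def googlebot_blocked(robots_txt, url="https://example.com/blog/post"):
    """Returns True if the given robots.txt rules stop Googlebot
    from fetching the URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("Googlebot", url)

# A blanket Disallow blocks every crawler from the whole site:
blocked = googlebot_blocked("User-agent: *\nDisallow: /")
```

Checking your live robots.txt this way before every deploy is a cheap safeguard against accidentally de-indexing the site.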

Your Website is the Victim of Negative SEO

Negative SEO refers to malicious techniques that deliberately sabotage a website’s rankings. Competitors and other bad actors use tactics like link spamming, scraping content, and reporting sites to Google. The impacts of a negative SEO attack can be disastrous:
  • Google applies an algorithmic or manual action penalising your rankings.
  • Your site gets buried pages deep in the SERPs by an influx of poor-quality links.
  • Hackers compromise your site by finding vulnerabilities and injecting malware.
Recovering from algorithmic penalties requires identifying and disavowing the toxic links to demonstrate your innocence in Google Search Console. For manual actions, submit a reconsideration request showing that you’ve removed any guideline-violating content. Improving site speed, security, and overall performance also aids your case. Use tools for auditing backlinks, identifying content duplication, and staying informed of Google warnings.
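If you decide to disavow, Google Search Console expects a plain-text file with optional `#` comment lines, `domain:` entries for whole domains, and bare URLs for individual pages. A small helper can assemble one from your audit results (the domains shown are placeholders):

```python
def build_disavow_file(toxic_domains, toxic_urls):
    """Builds a disavow file in the format Google Search Console accepts:
    '#' comments, 'domain:' prefixes for whole domains, bare URLs for
    individual pages."""
    lines = ["# Links identified during the negative-SEO audit"]
    lines += [f"domain:{d}" for d in sorted(set(toxic_domains))]
    lines += sorted(set(toxic_urls))
    return "\n".join(lines) + "\n"

text = build_disavow_file(
    ["spam-farm.example", "link-network.example"],
    ["https://bad.example/comment-spam?page=2"],
)
```

Upload the resulting file through the disavow links tool; sorting and de-duplicating keeps it reviewable as the audit grows.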

Getting to the 9%

To be in the elite 9% that drive sustainable traffic, you need a focused game plan:

Research and Target Buyer Keywords

Don’t waste months creating content that ranks yet sends no customers. Use tools to identify mid- and long-tail keywords with enough search volume and low difficulty scores – this indicates an opportunity to own traffic around genuine buyer intent.
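Most keyword tools let you export results as CSV, at which point the filtering is a few lines of code. The column names and thresholds below are assumptions — adjust them to match your tool’s export:

```python
import csv
import io

def buyer_keywords(csv_text, min_volume=100, max_difficulty=30):
    """Filters a keyword-tool CSV export down to low-difficulty terms
    with enough monthly searches. Column names are assumed."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        row["keyword"]
        for row in rows
        if int(row["volume"]) >= min_volume
        and int(row["difficulty"]) <= max_difficulty
    ]

export = """keyword,volume,difficulty
buy ergonomic chair,900,25
chair,50000,88
ergonomic chair under 200,150,12
office furniture history,40,5
"""
targets = buyer_keywords(export)
```

Note how the filter drops both the huge head term (too competitive) and the zero-demand topic, leaving only the winnable buyer-intent queries.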

Create Content that Captivates

Getting links and social shares requires valuable, comprehensive content. Set guidelines for all blog and pillar pages: at least 500-1,000 words, optimised on-page elements, insightful data, and research citations. Publish unique content not found on competing sites.

Promote Content the Right Way

Simply publishing great content isn’t enough. You need amplification through social media, industry influencers, and link-building outreach. Landing just 10-20 high-authority editorial links can position you in the top results for competitive terms.

Optimise for Crawlability and User Experience

Follow technical SEO best practices to avoid issues that block bots from correctly accessing and indexing your pages – redirect broken links, fix security vulnerabilities, and eliminate excessive redirects. If you have a big website, monitor crawl stats in Google Search Console to keep up with everything and find technical problems.
