14/05/2024 • Search Engine Optimisation
Top 15 SEO Problems to Avoid
Find out about some of the most common SEO problems and how to fix them.
SEO (Search Engine Optimisation) is the process of making improvements to your website in order to generate more traffic from organic search results. There are many aspects to SEO and lots of ways that improvements can be made. In this post we’re going to look at common SEO problems to avoid. Many of these can be easily fixed, or become part of your ongoing content strategy. As always, if you’d like some SEO support for your website please get in touch.
1. Blocking Search Crawlers
Every search engine uses search crawlers, sometimes called robots or spiders, to gather information from around the internet. This is generally an automated process. Google operates a wide range of crawlers for different purposes; you can find out more about them here.
Websites can include a file called robots.txt. This text file lives in the root directory of the website and contains instructions which determine what can be crawled by search engines.
Most websites allow all of their content to be crawled, but it is possible to discourage search crawlers from indexing whole websites, or specific pages.
It’s surprisingly easy to end up with a robots.txt file which discourages search crawlers. For example, WordPress has a setting which allows users to ‘Discourage search engines from indexing this WordPress site’. When this setting is activated the robots.txt file is updated, instructing search crawlers not to index any part of the website.
If your website doesn’t appear to be indexed by Google, Bing or other search engines then this is a good place to start. You can check your robots.txt file using a tool like this.
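If you prefer to check this from a script rather than a browser tool, here's a minimal sketch using Python's built-in robots.txt parser (the example.com addresses are just placeholders for your own domain):

```python
# A minimal sketch: check whether robots.txt is blocking crawlers from a
# page. The example.com URLs are placeholders - swap in your own domain.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()  # fetches and parses the live robots.txt file

# Googlebot is Google's main crawler; '*' covers any other crawler
for agent in ("Googlebot", "*"):
    allowed = robots.can_fetch(agent, "https://www.example.com/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```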
2. Thin Content
Thin content is content which doesn’t provide enough information for a search engine to make sense of how it should be ranked. This could be down to a lack of words, or not being specific enough with the language.
This can be a problem for small businesses who struggle to write very much about themselves. While Google doesn't publish exact figures, it seems to like seeing at least 300 words of content on a page. Ideally this would be more like 700 – 1000 words. HubSpot published research which found that the ideal word count for blog posts intended to generate leads was 2,500 words! This may sound like a lot, but there are ways of making it easier to write longer posts – we'll cover that in another article.
It can be helpful to regularly review your website’s content, looking for pages or posts which don’t provide enough focussed content to rank well. We’d recommend refreshing and adding to older content or combining pages or posts. Of all the SEO problems this is probably the main issue facing most websites.
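If you'd like to audit this in bulk, here's a rough sketch of a word-count check written in Python. It assumes the requests library is installed, and the URLs are placeholders for your own pages:

```python
# A rough sketch for auditing word counts across a few pages. Assumes the
# 'requests' library is installed; the URLs are placeholders.
import re
import requests
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

for url in ["https://www.example.com/", "https://www.example.com/about/"]:
    extractor = TextExtractor()
    extractor.feed(requests.get(url, timeout=10).text)
    words = re.findall(r"\w+", " ".join(extractor.chunks))
    print(f"{len(words):>5} words  {url}")  # anything under ~300 needs attention
```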
3. Poor Quality Content
Another SEO problem is that of poor quality content. This might be content which has a high enough word count, but doesn’t rank well because of errors or other problems.
When writing content for your website it’s a good idea to be as specific as possible, using keywords which reflect the primary focus of the page. Tools like the Yoast SEO plugin for WordPress can be helpful as they provide immediate feedback.
As well as being specific and using valuable keywords, it's also important to make use of headings and create content with a user-friendly structure. A blog post of 2,500 words would be very long and difficult to digest as a single block of text. But break it down into sections with appropriate headings and both human visitors and search engines can make better sense of the content.
Another problem can be poor spelling, grammar and punctuation. Don't forget that search engines can 'read' – and with the advent of AI technologies they can make sense of language in much the same way humans do. Check your writing carefully, but remember to write for humans first; they're your primary audience.
4. Slow Page Speed
Slow websites don't often rank well. Google and other search engines monitor the page load times of indexed websites and use this data to help determine how to rank them. Pages which are slow to load are bad news for users. A slow website is frustrating and likely to lead to a high bounce rate. Search engines want to send visitors to websites which offer the best experience, so speed is important.
A good way to check your site speed is to use Google Lighthouse, or Google PageSpeed Insights (basically the same thing, as PageSpeed Insights runs Lighthouse under the hood). You can quickly and simply check any page on your website and get feedback about what needs improving. Check out our handy guide on scoring 100% in Google Lighthouse to find out more.
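For an automated check, PageSpeed Insights also has a public API. The sketch below (Python, assuming the requests library is installed) calls what is, at the time of writing, the v5 endpoint and prints the mobile performance score; the exact response fields may change, so treat it as a starting point rather than gospel:

```python
# A quick sketch using the PageSpeed Insights v5 API. The endpoint and the
# response fields below are as documented at the time of writing - check
# Google's API docs if the structure has changed. Assumes 'requests' is
# installed; the page URL is a placeholder.
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(PSI, params=params, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```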
5. No XML Sitemaps
A website sitemap is useful as it can tell search engines exactly what pages make up the site, and how they relate to one another. XML sitemaps can be submitted directly to Google through Google Search Console. This will ensure that Google is always informed when you add new pages or posts to your site, and will trigger it to crawl and index the new content.
Within WordPress we’d recommend using Yoast to automatically generate a sitemap for your website.
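To confirm that new pages are actually making it into your sitemap, a quick script can list every URL it declares. This minimal sketch uses Python's standard library plus requests; the sitemap address is a placeholder (Yoast typically exposes a sitemap index at /sitemap_index.xml):

```python
# A minimal sketch: list every URL declared in an XML sitemap so you can
# confirm new pages are included. The sitemap address is a placeholder.
import requests
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

xml = requests.get("https://www.example.com/sitemap.xml", timeout=10).text
root = ET.fromstring(xml)

# Works for a plain urlset and for a sitemap index pointing at sub-sitemaps
for loc in root.iter(f"{SITEMAP_NS}loc"):
    print(loc.text.strip())
```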
6. Too Few Internal Links
Internal links are links between pages or posts on your site. They help users navigate between content more easily. These links are often context driven – meaning they relate to the specific sentence or paragraph they exist within.
Adding more internal links to your website can be helpful for search engines. It allows them to determine how the content fits together, perhaps around specific themes or topics.
It's not necessary to add loads of internal links, but try to include a handful on each page. Another tip is to use link text which contains keywords, rather than generic phrases like 'click here'.
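If you're not sure how well linked your content is, a simple crawler-style check can count the internal links on a page. This is a rough Python sketch, assuming the requests library is installed; the domain and page URL are placeholders:

```python
# A sketch that counts the internal links on a page. Assumes 'requests' is
# installed; 'example.com' and the page URL are placeholders.
import requests
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Tallies <a href> values that point back at the same domain."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.internal = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        if href.startswith("/") or host.endswith(self.domain):
            self.internal += 1

page = "https://www.example.com/blog/some-post/"
counter = LinkCounter("example.com")
counter.feed(requests.get(page, timeout=10).text)
print(f"{counter.internal} internal links found on {page}")
```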
7. Duplicate Content
Duplicate content can cause problems for your website as it provides little or no value to your audience and will likely confuse search engines. Duplicate content is a common problem, as it may be tempting to recreate pages as blog posts, and vice versa.
Search engines like to see distinct information on each page of a website. If you’re looking to heavily target a specific keyword consider how you might write about the topic from a range of angles. There are lots of ways to generate ideas for articles around a specific topic, rather than replicating the same information multiple times.
If you discover you have duplicate content on your website you can:
- Edit it to make it distinct
- Combine it with similar content to create a longer article or page
- Delete it
Deleting content may sound like a step in the wrong direction. But if having duplicate content on your website is causing problems for search engines it may well be the right thing to do.
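If you suspect two pages overlap, a quick similarity check can confirm it before you decide which of the options above to take. This is a very blunt sketch using Python's standard difflib module plus requests; both URLs are placeholders, and anything scoring close to 100% is worth a closer look:

```python
# A blunt sketch: compare the source of two pages and report how similar
# they are. Assumes 'requests' is installed; both URLs are placeholders.
import difflib
import requests

url_a = "https://www.example.com/services/"
url_b = "https://www.example.com/blog/our-services/"

text_a = requests.get(url_a, timeout=10).text
text_b = requests.get(url_b, timeout=10).text

# ratio() returns 0.0-1.0; scores near 1.0 suggest near-duplicate pages
ratio = difflib.SequenceMatcher(None, text_a, text_b).ratio()
print(f"Similarity between the two pages: {ratio:.0%}")
```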
8. Poor Mobile Optimisation
Mobile optimisation is important as many users now spend more time browsing the net on their phones than on laptops or desktops. Again, Google Lighthouse provides helpful feedback on mobile optimisation.
The key things to consider are elements like images, which can be slow to load over mobile networks and on low-powered devices. You'll need to ensure that the layout of your website works across all screen sizes. Then there are considerations like tap target size (the physical size of links and buttons on a page), which needs to be large enough to be user-friendly.
An increasingly common approach to website design is 'mobile-first' design, where a website is designed for mobile devices first and scaled up from there. Using this approach ensures that mobile-friendly styles and layouts are applied first. However, if a website is well designed and efficiently built, a strict mobile-first process may not be necessary and will only offer marginal gains. The easy wins with mobile optimisation are:
- Image size
- User-friendly layout & text sizing
- Reducing page load time by removing unnecessary code and scripts
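Image size is usually the quickest win, so it helps to know which images are heaviest. The sketch below (Python, assuming requests is installed) lists every image on a page with its reported file size; it relies on the server sending a Content-Length header, and the page URL is a placeholder:

```python
# A sketch that lists the images on a page with their reported file sizes,
# so the heaviest ones can be compressed or resized. Assumes 'requests' is
# installed and that the server sends a Content-Length header; the page
# URL is a placeholder.
import requests
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImageCollector(HTMLParser):
    """Collects the src attribute of every <img> tag."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

page = "https://www.example.com/"
collector = ImageCollector()
collector.feed(requests.get(page, timeout=10).text)

for src in collector.sources:
    head = requests.head(urljoin(page, src), timeout=10, allow_redirects=True)
    size_kb = int(head.headers.get("Content-Length", 0)) / 1024
    print(f"{size_kb:8.0f} KB  {src}")
```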
9. Broken Links
Broken links are exactly what they sound like: links to other pages on your website, or to pages on other websites, which no longer work. This could be due to a misspelling or technical error made when creating the link. Broken links can also occur when the target content is removed.
There are various ways to check for broken links, such as using a website crawler and analysis tool like Screaming Frog.
Broken links are bad news for SEO because they harm the user experience. Google wants to send visitors to high quality content, from which they might find additional helpful articles or information. Broken links break this chain, which is frustrating for the user.
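A very simple version of what tools like Screaming Frog do can be sketched in a few lines of Python (assuming the requests library is installed). The list of links is a placeholder – in practice you'd gather them from your own pages first:

```python
# A minimal sketch of a broken-link check: request each link and flag
# anything returning a 4xx or 5xx status. Assumes 'requests' is installed;
# the list of links is a placeholder.
import requests

links = [
    "https://www.example.com/about/",
    "https://www.example.com/old-page/",
]

for link in links:
    try:
        # HEAD is lighter than GET; a few servers reject it, so treat
        # unexpected results as worth checking manually
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"BROKEN ({status}): {link}")
```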
10. Messy URL Structure
The URL structure of a site is often determined by the platform the site is built on. It can be difficult to optimise this without specialist knowledge. However, in platforms like WordPress and even Shopify, users can determine the URLs of their pages, which can be a good way of optimising your site.
URLs should be kept clean and tidy, using hyphens in place of spaces. They should be lower case and should avoid special characters. URLs should also follow consistent patterns across a website, which helps users understand their location within the site structure. As far as possible, keep your URLs clean and user-friendly.
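If you're generating URLs yourself, the logic is straightforward. Here's a small Python sketch of the kind of 'slugify' function most platforms apply behind the scenes:

```python
# A small sketch of the kind of 'slugify' logic described above: lower
# case, hyphens instead of spaces, no special characters.
import re

def slugify(title: str) -> str:
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)        # drop special characters
    slug = re.sub(r"[\s-]+", "-", slug).strip("-")  # spaces become single hyphens
    return slug

print(slugify("Top 15 SEO Problems to Avoid!"))  # top-15-seo-problems-to-avoid
```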
11. Meta Tag Problems
Meta tags contain information about your website pages which may be visible to humans, but is often primarily for the benefit of search engines. The most important meta tag is the description tag. This contains a brief description of the page which helps search engines understand what your page is about.
The meta description is often paired with the page title and these are displayed on the SERPs (Search Engine Results Pages). If a search engine does not find a meta description or a well-written page title it will create one itself from the page content. While this may be sufficient it is not ideal, as the search engine may misinterpret your content. This is often one of the easiest SEO problems to solve.
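Here's a rough Python sketch (assuming requests is installed) that pulls the meta description from a page and flags it if it's missing or well outside the commonly quoted 50 – 160 character range; that range is general guidance rather than an official limit, and the URL is a placeholder:

```python
# A sketch that pulls the meta description from a page and flags it if it
# is missing or well outside the commonly quoted 50-160 character range.
# Assumes 'requests' is installed; the URL is a placeholder.
import requests
from html.parser import HTMLParser

class MetaDescription(HTMLParser):
    """Captures the content of <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content", "")

checker = MetaDescription()
checker.feed(requests.get("https://www.example.com/", timeout=10).text)

if not checker.description:
    print("No meta description found")
elif not 50 <= len(checker.description) <= 160:
    print(f"Meta description is {len(checker.description)} characters - worth reviewing")
else:
    print("Meta description looks fine")
```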
12. Missing Alt Tags
The 'Alt' in Alt tags stands for 'alternative'. It is text which is associated with an image so that screen readers and search engines can interpret the image. These days this is primarily an accessibility issue, ensuring visually impaired users can make sense of your content. Google prefers websites which make it easy for users of all kinds to understand the information they contain. Make sure you always include Alt text for your images – most website platforms make this easy to do.
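Checking for missing Alt text is easy to script. This Python sketch (assuming requests is installed) lists any images on a page without an alt attribute; the URL is a placeholder, and note that purely decorative images can legitimately carry an empty alt, so treat the output as a prompt to review rather than a list of definite errors:

```python
# A sketch that lists any <img> tags on a page without alt text. Assumes
# 'requests' is installed; the URL is a placeholder.
import requests
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Records the src of any image with a missing or empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

checker = AltChecker()
checker.feed(requests.get("https://www.example.com/", timeout=10).text)

for src in checker.missing:
    print(f"Missing alt text: {src}")
```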
13. HTTP(S)
Another common SEO problem to avoid is that of website security. Years ago it was complicated and expensive to serve websites over SSL, which encrypts information as it passes between the server and the user's device. However, SSL certificates are now provided for free by almost all website hosting platforms, so there's really no reason not to use one.
Websites which are served from an HTTPS (secure) URL are preferred by search engines. This is because they provide a secure channel of communication, ensuring that any personal details which pass between the user and the website server are kept private.
Make sure your entire website is protected via SSL. Some older websites include images or other assets which are linked via non-secure HTTP URLs. This can cause errors and warnings to be displayed to users, which can be concerning. Tools like Screaming Frog are helpful for finding this insecure content so that the necessary changes can be made.
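A quick way to spot leftover HTTP references is to scan the page source for assets still loaded over http://. This is a simple Python sketch using requests and a regular expression; it won't catch everything (CSS background images, for example), but it finds the obvious culprits. The URL is a placeholder:

```python
# A simple sketch for spotting mixed content: scan a page served over
# HTTPS for assets still referenced via plain http://. Assumes 'requests'
# is installed; the URL is a placeholder.
import re
import requests

html = requests.get("https://www.example.com/", timeout=10).text

# Look for src/href attributes pointing at insecure http:// URLs
for url in re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html):
    print(f"Insecure reference: {url}")
```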
14. AI Generated Content
Another SEO problem to avoid is that of AI generated content. Google says it rewards high quality content however it was produced. They released some very helpful guidance on the topic, which you can read here. What is clear is that they’re looking to prioritise content which demonstrates:
- Expertise
- Experience
- Authoritativeness
- Trustworthiness
Given that AI can only generate content by reinterpreting and amalgamating information which it has access to, it seems unlikely that purely AI generated content would outperform content written by a human.
This is not to say that AI cannot be helpful in providing structures or ideas for content creation. Rather, it should not be depended on to produce high quality content which will rank highly and be attractive to your audience.
15. Title Tags
Title tags are probably the most important on-page SEO optimisation you can make. Titles tell search engines exactly what your page is about, and as such should always be well optimised. In short, page titles should be:
- Concise (50 – 60 characters long)
- Specific (about 1 particular subject)
- Keyword-rich (containing the most important words for your topic)
Check out our post about writing better page titles.
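Checking title lengths is easy to automate too. Here's a tiny Python sketch that applies the 50 – 60 character guideline above to a few example titles; swap in titles pulled from your own pages or CMS:

```python
# A tiny sketch applying the 50-60 character guideline to a few example
# titles - swap in titles pulled from your own pages or CMS.
titles = [
    "Top 15 SEO Problems to Avoid",
    "Home",
    "Everything you could possibly want to know about search engine optimisation in 2024",
]

for title in titles:
    length = len(title)
    if length < 50:
        note = "on the short side - is there room for a keyword?"
    elif length > 60:
        note = "too long - likely to be truncated in search results"
    else:
        note = "looks good"
    print(f"{length:>3} chars  {note}  {title}")
```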
SEO Problems Conclusion
As the internet has developed, search engines have had to become more discerning when it comes to how they rank websites. Years ago it was simply enough to write a few decent pages of content on a particular topic and ranking well was almost inevitable. Not any more.
Solving some of these SEO problems can be a good way to improve your ranking without much difficulty, particularly with older websites. However, if you're building a new site and aiming to rank well for a particular topic, you'll need to develop a comprehensive SEO strategy. This would cover every aspect of SEO, from technical aspects such as page speed to UX considerations like colour contrast.
We’re here to help with that.
If you'd like a thorough review of your website's SEO performance and an SEO strategy developed to get you where you want to be, get in touch.
Photo by Firmbee.com on Unsplash