Since 2017, I’ve been involved as a director for a very talented agency called The Search Initiative (TSI).
The reason I teamed up with these guys was very simple.
These guys are testers. Like me, they rely only on experience, data, and test results for their ranking strategies.
It was a match made in heaven.
Since then, we've had the pleasure of onboarding hundreds of new clients and offering them a wide range of organic SEO consulting services.
Many of these new partners have seen huge growth, others have had penalties removed, and all have clear roadmaps for future growth.
I wanted to share with you the top 10 SEO problems we typically encounter when onboarding new clients. I hope that by sharing with you some of these common mistakes, you can use this knowledge to your advantage and make some serious improvements to your rankings.
The 10 Most Common SEO Issues in 2024
Table Of Contents
Many of you will be familiar with the inverted pyramid writing style, where the most newsworthy content is at the top and the least is at the bottom. I've tried to follow this structure; however, don't sleep on any of the points below. They're all major issues that commonly appear on even the best sites.
If you want to get the most out of this article, check out every point here. As you know, there are no shortcuts when it comes to SEO.
- Index Management
- Localization
- Keyword Cannibalization
- Over Optimized Anchor Text
- Poor Linking Strategy
- Low-Quality Affiliate Content
- User Performance Metrics
- Titles & Meta Descriptions
- Internal Redirects
- Low-Quality Pillow Links
And if you're wondering which SEO mistake is ruining your Google rankings, watch this video.
1. Index Management Problems
The first and most common issue that we’re seeing is accidental devaluation of the website because of indexing issues.
It stems from a common misunderstanding about how Google actually works.
(More on this in a bit…)
Most people think that if they build links and noindex junk pages they’re fine. However, it’s not that simple – and I’m about to show you a real example.
Below you will find a screengrab from Screaming Frog. This is from a crawl of an eCommerce website that had a lot of onsite issues that needed to be fixed:
It's quite hard to see, but notice that I've highlighted the number of HTML pages that were filtered: a whopping 32,064 pages – and, yes, it took us a long time to crawl.
None of the 32,064 pages found in this crawl included a noindex tag, which means (in theory) Google should be able to crawl and index these pages. So, let’s check this against our numbers in the Google Search Console:
When we check Google Search Console, we see 14,823 pages indexed. While this is a large volume, it's still less than 50% of the pages found by Screaming Frog.
This is the first sign that something is seriously wrong, but the next screenshot shows the extent to which our client had been stung by Panda's low-quality algorithm. We use the "site:domain.com" operator to pull up the number of indexed pages:
Despite the website having 32,064 crawlable, indexable pages, and despite Search Console reporting 14,823 of them as indexed – only 664 have made it into the actual index. This site search shows that Google has heavily devalued most of the website.
It is a crawling nightmare.
So, the question is, how can you fix this?
Thankfully the answer for most people is quite simple.
Start by performing a site:domain.com search and auditing Google’s index of your site. If you go to the final page and you’re greeted with the below message, you have work to do:
Take a hard look at which pages shouldn't be indexed and start proactively removing them to reclaim your crawl budget.
The catch is that even after you add a noindex tag, pages remain indexed until Google recrawls them. Some people block these pages in robots.txt to save crawl budget – which is a good idea, but only after the pages have dropped out of the index; if you block them first, Googlebot can never recrawl them to see the noindex tag.
For the rest of us, we’re going to need to use the URL Removal Tool.
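If you're auditing at scale, a small script can confirm which URLs actually serve a noindex directive before you decide what to block or remove. Here's a minimal sketch in Python, assuming the requests library is installed; the URL list is illustrative, and the single regex is a first pass, not an exhaustive parser:

```python
# Minimal sketch: flag which URLs serve a noindex directive, via either the
# X-Robots-Tag header or a <meta name="robots"> tag. The URL list is
# illustrative, and the regex assumes name= appears before content=.
import re
import requests

def is_noindexed(url: str) -> bool:
    resp = requests.get(url, timeout=10)
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True
    match = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)',
        resp.text, re.IGNORECASE)
    return bool(match and "noindex" in match.group(1).lower())

for url in ["https://example.com/", "https://example.com/filtered-page"]:
    print(url, "->", "noindex" if is_noindexed(url) else "indexable")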
To learn more about how to deal with crawl and indexing issues, check out this guide.
2. Localization Issues
The second most common issue we see is with clients that have multiple languages. While it's great to have international coverage and provide foreign users with localized text, it's a nightmare for Panda penalties if not set up correctly.
Many people are familiar with the URL structure you should use for localized text, but many forget to set up hreflang on their website. My buddy Tom talked about it in this interview I did with him here.
If you are looking to set up hreflang codes, I suggest you use this website to get the right language and country code every time.
Below is an example from an eCommerce client. Whereas the previous client had issues with index management, this time it's caused by hreflang – and one more thing that often goes unnoticed…
While the client had included hreflang in their source code, they hadn't included both the language and location code. The one time they tried to, with en-GB, the page no longer existed and redirected to their sitemap.
On top of that, the markup covered just 50% of the languages the website operates in. This created an enormous amount of duplication to be indexed.
However, there's still one more thing that was missed. Each page has the Open Graph locale set to en_US:
This includes the pages that aren’t in English.
While this setting isn't as clear-cut a signal as hreflang, it does give Google information about locale, and a mismatch creates confusion.
If your website has a similar issue, we advise making the og:locale dynamic so it matches the page's current language.
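If your pages are generated server-side, one way to avoid this whole class of mistake is to render the hreflang tags and og:locale from a single language-region mapping, so the two can never disagree. A minimal sketch; the locales and URL scheme are illustrative, not the client's:

```python
# Minimal sketch: generate hreflang link tags and a matching og:locale from a
# single language-region mapping, so the two can never disagree. The URL
# structure and locale list are illustrative.
ALTERNATES = {
    "en-US": "https://example.com/en-us/page",
    "en-GB": "https://example.com/en-gb/page",
    "de-DE": "https://example.com/de-de/page",
}

def head_tags(current: str) -> str:
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in ALTERNATES.items()
    ]
    # x-default catches searchers with no matching locale.
    tags.append('<link rel="alternate" hreflang="x-default" '
                'href="https://example.com/" />')
    # Open Graph uses an underscore (en_US); hreflang uses a hyphen (en-US).
    tags.append(f'<meta property="og:locale" content="{current.replace("-", "_")}" />')
    return "\n".join(tags)

print(head_tags("de-DE"))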
For more help with locality and its importance, check out this page on local SEO solutions.
Client Testimonial
“The guys at TSI make SEO look easy!.. We were a completely new website planning to operate in arguably the most competitive online marketplace which made the task ahead extremely difficult as we were going up against many well-known global businesses which also resulted in many different SEO agencies reluctant to work with us. TSI wasted no time in implementing their campaign and within 4 months our website was ranking on page 1 for some of our most profitable keywords. The guys at TSI are constantly keeping me updated and send me monthly reports on my campaign performance and keyword tracking. I would recommend their services in a heartbeat!” – Jon H
3. Keyword Cannibalization
This is a surprisingly common issue for most websites we encounter. Despite the wealth of resources online to help with cannibalization, you'd be surprised how many people still suffer from it.
Never heard of it?
Quite simply, it’s when you have multiple pages on your site competing for the same keywords.
And guess what? Google doesn’t like it.
The first step is to learn to diagnose the culprit pages, because if you cannot find cannibalization – how can you fix what you can’t see?
At The Search Initiative we have a few ways to find cannibalization, but here's the easiest and most effective.
Use Keyword Tracking Tools
One of the benefits a client gets from working with TSI is that we track keywords up to twice daily with Agency Analytics, one of our partners.
The tool includes an overview of the site’s overall keyword performance, such as below:
Aside from showing us an overview of how this client has performed over the past 7 days, we can use this to track each keyword’s performance independently too:
In this screenshot, you can see a jump from the 3rd page to the 1st page for the client's target term after implementing some of our onsite advice. More importantly, you can also see that the ranking URL had started flipping between their category page and their homepage.
This is an obvious sign of cannibalization, and once we noticed it we jumped into action to fix the problem.
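You don't need an agency dashboard to spot this pattern: any rank-tracker export that records which URL ranked on each date will reveal flipping. A minimal sketch, with illustrative rows standing in for the export:

```python
# Minimal sketch: flag keywords whose ranking URL flips between multiple
# pages over time, a classic cannibalization signal. The sample rows stand
# in for a rank-tracker export of (keyword, date, URL that ranked).
from collections import defaultdict

rows = [
    ("buy widgets", "2024-01-01", "https://example.com/widgets/"),
    ("buy widgets", "2024-01-02", "https://example.com/"),
    ("buy widgets", "2024-01-03", "https://example.com/widgets/"),
    ("blue widgets", "2024-01-01", "https://example.com/blue-widgets/"),
]

urls_per_keyword = defaultdict(set)
for keyword, _date, url in rows:
    urls_per_keyword[keyword].add(url)

for keyword, urls in urls_per_keyword.items():
    if len(urls) > 1:
        print(f"Possible cannibalization: {keyword!r} ranks with {sorted(urls)}")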
To learn more about keyword cannibalization, I have a master guide here.
4. Over-Optimized Anchor Text
There was a significant update in late 2016 as Penguin 4.0 rolled out.
Penguin 4.0 was an update that changed how Google perceives and interacts with links. I even wrote an article for Ahrefs covering how it affected anchor text optimization.
As part of our auditing process for each new client, we analyze your existing anchor text and break it down into the values below (a rough classifier sketch follows the list):
- Branded – an anchor text that includes your brand name or a slight variation, for example: ‘thesearchinitiative’, ‘visit thesearchinitiative’, or ‘TSI’.
- Generic – an anchor that uses a generic term but does not include branding, for example: ‘here’, ‘read more’, or ‘visit site’.
- Image – a link with no anchor text, generally shown as a blank in Ahrefs' export. Another clue is a file extension in the alt attribute: ‘jpg’ probably means an image.
- Miscellaneous – an anchor that does not qualify as generic, but is otherwise unrelated to the website. Forum and comment spam often includes anchors such as ‘Steve’, ‘Stuart’, or ‘Stan’.
- Low Quality – an anchor of more than 100 characters is generally irrelevant, unless it's a long URL. Anchors in a foreign language or made up of symbols also count as low quality.
- Targeted – an anchor that includes the exact or partial term you are trying to rank for. Effective for gaining rankings, but carries a higher risk of tripping a Penguin filter against your site.
- Topical – an anchor that is on topic, but does not include your targeted term. For example, an affiliate site reviewing ‘best running shoes’ might include topical anchors such as: ‘healthy workout’, ‘burn lots of calories’, or ‘high impact sport’.
- URL – this is arguably the most obvious one, but anchors that are naked URLs such as ‘example.com’ and ‘https://example.com’ would count as URL.
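As promised above, here's a rough classifier sketch for bucketing an Ahrefs-style anchor export into these categories. The brand variants, target terms, and ordering of checks are illustrative; real classification still needs a human pass:

```python
# Rough sketch: bucket anchor texts into the categories described above.
# Brand variants, target terms, and sample anchors are all illustrative.
import re

BRAND_VARIANTS = {"tsi", "thesearchinitiative", "the search initiative"}
TARGET_TERMS = ["best running shoes", "running shoes"]
GENERIC = {"here", "click here", "read more", "visit site", "website"}

def classify(anchor: str) -> str:
    a = anchor.strip().lower()
    if not a:
        return "image"           # blank anchors are usually image links
    if len(a) > 100:
        return "low quality"     # unless it's just a very long naked URL
    if re.fullmatch(r"(https?://)?[\w.-]+\.[a-z]{2,}(/\S*)?", a):
        return "url"             # naked URLs like example.com
    if any(b in a for b in BRAND_VARIANTS):
        return "branded"
    if a in GENERIC:
        return "generic"
    if any(t in a for t in TARGET_TERMS):
        return "targeted"
    return "topical or misc"     # needs a human eye to split these two

for anchor in ["TSI", "read more", "best running shoes", "", "Steve"]:
    print(f"{anchor!r:25} -> {classify(anchor)}")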
Here’s an example of a client that recently joined and has an issue with anchor text. The labels match the descriptions above:
In this example, the website in question has chosen to use lots of targeted anchors, but has also picked up lots of low-quality anchors along the way. The solution was to increase the number of topical, branded, and generic anchors so the profile matched the niche average (read more).
By increasing the volume of those anchors, the client regained the lost organic traffic and is now set up to survive future updates.
It’s important to note that most people use low-quality pillow links and press releases to redistribute their anchors.
There are indeed some issues with this that are covered in the next two points.
5. Poor Linking Strategy
Up until this point, 3 of the top 4 issues have been related to onsite. While Issue #5 is indeed another offsite link building issue, it’s important to recognize the connection.
When a website has fixed its technical issues, pumped out valuable content, and improved user performance metrics – link building becomes a lot easier.
Rather than needing hundreds of links to rank a site, you can achieve a lot more with less. Since link building and content both cost money, you may be wondering: why not just spend the money on links?
The answer is simple…
Google has introduced many link building filters to thwart your efforts; the more links you build, the more likely you are to get caught. By delivering better content, you will not only improve your conversion rates but also make it easier to rank higher, permanently.
Check out one of our older clients, who has been relaxing comfortably on page 1 for 2 years:
So, the question is, what makes a link strategy good?
The first thing is to avoid over-optimizing your anchor text, because that will eventually cause a penalty. Use topical terms and branded anchors to hit your pages instead.
The second thing is to target pages other than your main core pages. If you have created a blog post that is valuable and internally links to one of your core pages, throw some links at that page too. Avoid becoming the Black Sheep.
Not only will this help prevent overcooking your page, it will also help you rank for long-tail keywords you hadn't claimed before.
The reason your competition doesn’t do this is because they fail to focus on any pages other than their money pages.
Big mistake.
If you fix user flow on your entire website then every page becomes a money page.
Write that one down. Post it on your wall.
6. Low Quality Affiliate Content
This should go without saying, but if your content is not good then you don’t deserve to rank. Period.
However, what most affiliate sites are guilty of is failing to mop up all the juicy long-tail keywords that are easy to rank for and provide noticeable traffic. It's not that they don't want to rank for those keywords; they just don't know how.
We want to share a couple of images with you that show just how powerful content can be, and if you don’t value your content – what can happen:
This client had recently joined us and suspected a penalty. At first glance there's a dip in visibility, but nothing that seems too unusual – until you zoom into the top 10 positions:
What initially looks like a slight dip is really a huge drop in rankings, and this client had suffered from decreased traffic for about 8 months.
The culprit? Content.
Through no fault of their own, somebody had scraped their entire website and spread duplicate content across the web. We re-wrote the whole site's content, filed DMCA requests, and fixed holes in their linking strategy. Within 2 months, they saw both rankings and traffic return.
While this highlights the power of content in Google’s algorithm, this is slightly different from what I am describing with low-quality affiliate content. The main culprit we see is when every page has an affiliate link and there’s no actual user value.
It’s possible to rank this way, but there are some drawbacks.
Let’s look at User Performance Metrics and how this can help guide our content strategy.
Client Testimonial
“The Search Initiative has helped to grow our business where four other digital marketers have failed. The team is not only flexible, responsive and reliable. They also make decisions that are data-driven with their proprietary tools. They are not testing and guessing with their work as most others do. We will be rounding out the year with them and looking forward to a long and profitable engagement.” – Vik C
7. User Performance Metrics
We have noticed that many affiliate sites and even some eCommerce companies are not focused on user performance metrics. This is bad for several reasons:
Firstly, you're going to be limiting yourself severely.
There is a finite number of people searching each month for what you offer. By avoiding the issue of content, you force yourself to spend money on links to brute-force rankings for terms that are neither relevant nor fruitful.
Instead, we suggest you focus on converting your existing traffic while simultaneously growing your potential traffic. Take this example:
If 1 visitor in every 1,000 becomes a customer, then to triple your customer count you have two options (worked through in the sketch after this list):
- Increase traffic from 1,000 to 3,000
- Increase conversion rate from 0.1% to 0.3%
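As promised, here's the arithmetic spelled out, since the two levers multiply (1 customer per 1,000 visitors is a 0.1% conversion rate):

```python
# Quick sketch: customers = sessions * conversion_rate, so tripling either
# lever triples customers. Numbers match the example above.
sessions = 1_000
conversion_rate = 1 / 1_000          # 1 customer per 1,000 visits = 0.1%

baseline       = sessions * conversion_rate          # 1 customer
via_traffic    = (3 * sessions) * conversion_rate    # 3 customers
via_conversion = sessions * (3 * conversion_rate)    # 3 customers

print(baseline, via_traffic, via_conversion)         # 1.0 3.0 3.0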
We applied this strategy to one of the eCommerce websites mentioned earlier in the article. Their website had been devalued and it was going to take time to grow traffic, so we decided to pursue conversion while fixing the bigger issues.
Here are the results:
In the past 2 weeks, we managed to increase sessions by a modest 4.26% – lower than average for our clients, but one we're happy to take considering the condition of the site.
However, the main things to notice are the 79% increase in conversion rate, the 86% increase in transactions, and the 49% increase in revenue. These gains mean that as we fix the devaluations against the site, this client is primed for a significant revenue jump.
8. Titles & Meta Descriptions
This is like keyword cannibalization in that it's surprising how many websites still get it wrong. It's something I'd expect most people to have right by now, since there are literally millions of pages on the topic.
I cover this in great detail in my Evergreen Onsite SEO guide.
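If you want a quick self-check, export (url, title, meta description) rows from a crawler and group on each field; that surfaces every duplicate. A minimal sketch with illustrative rows standing in for the export:

```python
# Minimal sketch: surface duplicate titles and meta descriptions from a crawl
# export. The rows here stand in for a crawler's (url, title, description) CSV.
from collections import defaultdict

pages = [
    ("/shoes/", "Buy Shoes Online", "Shop our range of shoes."),
    ("/boots/", "Buy Shoes Online", "Shop our range of boots."),
    ("/sandals/", "Buy Sandals Online", "Shop our range of shoes."),
]

for field, index in (("title", 1), ("meta description", 2)):
    seen = defaultdict(list)
    for row in pages:
        seen[row[index]].append(row[0])
    for value, urls in seen.items():
        if len(urls) > 1:
            print(f"Duplicate {field} {value!r} on: {urls}")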
9. Internal Redirects
This is one of the most common issues websites face. A large volume of 3XX redirects seems fine to most people – as long as they're 301s. However, that isn't strictly true, and here's why:
A 301 redirect is designed for when a user requests a page that has been permanently moved. This happens a lot across the internet: after a moment of latency, the server returns a different URL and the page loads as usual.
The issue with the above is that word latency, and it's something most webmasters ignore. Because of the physical distance between a user and your server, even a tiny bit of header information takes time to send and receive.
If you want to improve your user experience, make your website as fast as possible (read this) and remove all 301s that aren't absolutely needed. This will be better for your users and help authority flow through the website unhindered.
However, what’s the difference between a 301 and a 302 redirect?
Whilst a 301 redirect signals a permanent move from one location to another, a 302 signals a temporary one. From Googlebot's perspective, this means:
- 301 redirects – Google should index the new URL rather than the previous one
- 302 redirects – Google should keep the previous URL indexed and ignore the new one
Google has claimed to handle both the same, but that doesn't make sense: the two status codes have different purposes and should be treated differently.
Make sure you’re using the right redirect on your website.
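A quick way to audit this is to request each internally linked URL without following redirects and flag anything that doesn't answer 200 directly; those internal links should be updated to point straight at the final destination. A minimal sketch, assuming the requests library is installed and with illustrative URLs:

```python
# Minimal sketch: flag internal links that respond with a redirect instead of
# a direct 200, and show where they point. The URL list is illustrative.
import requests

internal_links = [
    "https://example.com/old-page",
    "https://example.com/current-page",
]

for url in internal_links:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302, 307, 308):
        kind = "permanent" if resp.status_code in (301, 308) else "temporary"
        print(f"{url} -> {resp.headers.get('Location')} "
              f"({resp.status_code}, {kind}): update the internal link")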
10. Low Quality Pillow Links
What are pillow links?
Pillow links are used to diversify your anchor text ratios. In the audio world, the comparison would be signal-to-noise ratio: when you buy a microphone you want high signal and low noise, but an SEO link profile wants the complete opposite.
If your link profile is all signal and no noise, it's easy for Google's machine learning to analyze your links and pick up unnatural trends. You could think of pillow links as dithering, for the audiophiles out there: you add noise to the signal to improve the overall result.
Here's a shocking figure from a recent client who had built hundreds of pillow links in an attempt to dilute a heavily targeted anchor text distribution:
Despite the client building 315 pillow links over the past 12 months, only 25 of them were indexed. So despite all the time and money spent, the links were providing almost no value.
The solution is to use indexing tools that encourage Googlebot to index your pillow links… or just build quality links throughout, and you won't have this problem.
For indexing, we use our own proprietary tool but there are some online services that can do this for a fee.
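There's no official API for checking whether Google has indexed a third-party page, but a cheap first pass is to verify that each linking page is even indexable: live, returning 200, and free of noindex directives. A failed check guarantees the link is worthless; a passed check is necessary but not sufficient. A minimal sketch with illustrative URLs:

```python
# First-pass audit of backlink source pages: a 200 status with no noindex
# directive doesn't prove Google indexed the page, but a failure here
# guarantees it didn't. URLs are illustrative; the body check is crude
# (it just looks for "noindex" near the top of the HTML).
import requests

backlink_pages = [
    "https://blog.example.org/post-linking-to-you",
    "https://directory.example.net/your-listing",
]

for url in backlink_pages:
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: unreachable ({exc})")
        continue
    suspect = ("noindex" in resp.headers.get("X-Robots-Tag", "").lower()
               or "noindex" in resp.text[:5000].lower())
    print(f"{url}: HTTP {resp.status_code}, "
          f"{'noindex suspected' if suspect else 'indexable on first pass'}")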
Client Testimonial
“TSI are very professional and their SEO knowledge is top level. I’d been doing my own SEO for years before hiring them, and had some decent success with it at times, so I know good SEO when I see it. My site’s traffic and rankings are gradually rising and TSI have answered all my questions. We’re only 2 months in, but so far so good!” – Sam D
Conclusion
So there you have it.
These are the 10 most common issues we have found with client websites in the past few months and I hope this helps you in your own ranking endeavors.
If you’re still having trouble then contact The Search Initiative to find out how we could help you gain results that last and improve your conversion rates in the process.
Get a Free Website Consultation from The Search Initiative:
Why do you use “read more” and “here” anchors for interlinking on this page? Why not use more relevant anchors?
Believe it or not, I don’t try to rank this website. For reasons that might be obvious.
Thought it was some kind of test at first 🙂
??
Hey Matt. Love this article, man! I would like to ask how you would recommend building content for an eCommerce site. For example, let's say I sell wedding rings. Should I build content on the blog and link it to the money page, or build the content directly on the page, like http://www.example.com/wedding rings ?
Both.
Great post Matt.
Thanks, brotha.
Hey Matt,
Thanks for putting it together. 🙂
Lots of valuable insight. As always. 🙂
My pleasure.
This is some high-value info! I have a question about your pie chart for anchor text distribution for the competition. Are you grabbing the top 4-5 competitors' backlinks from Ahrefs (or Majestic), collating them all together, hand-tagging their links as "branded", "URL", "topic", etc., and then using that as a composite profile target for your money site to match up to?
Pretty much.
As I’m looking at competition in Ahrefs, some of the inner pages that are ranking have very low to almost no backlinks, which means the root is powering that page. In that case, where there are no backlinks to really analyze on the inner page, do you just move the analysis to data from the root?
If you’re talking about anchor text analysis, then I would simply move that glitch out of the data. Ignore that page in your averaging.
This one is confusing to me: "If you have created a blog post that is valuable and internally links to one of your core pages, throw some links at that page too."
Can you elaborate this?
Check out this article.
Looking forward to more reading on Index Management, Localization Issues, and Keyword Cannibalization!
Agreed. Should be very helpful.
Great post again, Matt! And thank you!
Waiting for the article about keyword cannibalization. 🙂
Hey Matt,
Thanks for the great info. As always I have picked up some invaluable tips in here.
I believe I may be suffering from low quality content on one of my sites, so it’s great you pointed it out!
Cheers
Killer as always, Matt, and thanks for taking the time out of your busy schedule to crank out the quality content you do. It's funny, many of the issues you addressed could be avoided if a site is managed properly, which your examples clearly weren't. Keep up the stellar work my man 🙂
Thanks, Dana.
Matt,
I always appreciate your posts. I have been guilty of #10 (creating LQ pillow links) too many times. Saved, bookmarked and downloaded for future reference.
Thanks!
-Eric
Thanks Matt, working my way through this now I’ll let you know of any improvements on my site!
Quick question about the anchor text types. If my keyword is ‘best running shoes under $50’ am I right in thinking that any anchor text that contains the word ‘shoes’ will be targeted? Or is it just phrases like ‘best running shoes’ and ‘shoes under $50’ that will be classed as targeted?
That’s how I define it. The important thing is that you pick a convention and stick to it.
Hey Matt
Thank you for the post, looking for a clarification at the 10th point: to make myself clear, I will give you an example: do you think it’s OK to use contextual links (PBN / guest post / etc ) for target anchors, and web / business directory links + social links for pillowing (much cheaper to obtain), as long as they are indexed? Or do links that are not contextual count as low quality? In my country, most of my competitors are (still) using a lot of web / business directory links, so I’m thinking I should just blend in and drop a few PBN links with target anchors?
This whole thing about “use pbns for target anchors and other links for pillow anchors” makes a lot of sense, but I’m yet to see a solid case study on it.
Great post. Which indexing services would you recommend??
Not sure. We use our own proprietary tools.
Hello, great article! Can an American PBN link to a Brazilian money website? Do you have any experience with this? Is there a greater chance of being penalized? Thank you! Excuse my English.
I’ve done it plenty, but supplement with Brazilian citations: read more.
Love the content 🙂 Hope to work with you in the future!
You know where to find us.
Pillow links.
I’ve heard this term before. No one seems to have defined it.
I’m assuming these are weaker links such as blog comments, bookmarking, etc. links of that sort.
Like you mentioned, press releases to fix anchor text ratios.
..on the right path?
Pillow links refer to links with non-critical anchor texts (URL, misc, etc.).
In regards to press releases you need to figure out what problem you are trying to solve. Google have site-wide and page-level algorithms. If a specific internal page is over-optimised and then you do a press release to your homepage – that’s not going to do anything at a page level.
We generally avoid press releases within the agency because there are better ways to acquire large volumes of valuable links. We also research anchor text each month before we get link placements so that we don’t find ourselves in a position of “Woops, over-optimised and now we need to get pillows.”
What Rowan said. ^^
Jimmy, to add on to what Matt said: Low Quality Pillow would be considered what you mentioned above. Bookmarks, blog comments to non-crit anchors; you were on the right path.
“If you fix user flow on your entire website then every page becomes a money page.”
That is a bonus statement, only the message is not packaged for the ordinary SEO. It's for the best. Anyway, from your article, I thought you forgot an important eleventh point: lack of topical relevance, lack of a site architecture that accommodates topical relevance, or lack of both. I think this is an important and common problem on most sites.
So what do you suggest – should we keep the actual post indexed, but not the images or tags that WordPress shows as separate URLs?
I usually only index main pages and posts. Deindex everything else.
So much valuable content here – really appreciate the information & Case studies – looks like I have plenty of work in front of me 🙁
Hello Matt,
Indeed a great article with amazing case studies. You always share stuff with strong proof. This has been an incredibly wonderful article.
Thanks for providing this info.
Hi Matt,
Thanks for sharing this valuable knowledge, you are one of the few SEOs I trust, hence the following question:
Keyword cannibalization has me suffering in silence, and I don't know what to do. I'm on the fence between the following two structures for a small Amazon affiliate site. Let's say I have 3 product categories, 5 products per category, and I want to keep the core content to ~25,000 words, with the plan to add "how to" and other support articles periodically later.
Structure no1:
A 500-word homepage, 3×2,500-word category pages, and then 15×1,200-word detailed product reviews, one for each product.
This structure is awesome for user experience, since you have a conversion-focused homepage and a funnel for each product, but it sucks for Google since you can't get away from keyword cannibalization here (or can you??).
Structure no2:
3×6,000-word mega category posts, and a 3,000-word mega homepage summarizing your category pages. Use the leftover words for more supporting content.
This is the only way I see to avoid cannibalization, and it gives you 4 clear pages you are trying to rank… but it sucks for user experience. You have too much text on your homepage, and product reviews that run too long.
I’m sorry for the wall of text but there is no simpler way to ask this. Any guidance would be sincerely appreciated.
Let's hold off until the upcoming article I'll have on cannibalization.
Thanks, looking forward to it 🙂
Hey Matt,
When are you going to release content on keyword cannibalization?? I think a few of my sites are stuck because of this. Can you create a walkthrough/step-by-step on how you find and solve the issue of KC??
Thanks
Can do.
Hi Matt,
I hope you follow up with an in-depth keyword cannibalization article.
I have often wondered if cannibalization affects rankings even when a page is obviously not flipping between other pages. This is often done unintentionally when “niching down”, or when building topical relevancy for example:
/best-blenders/ <= best blenders
/best-blenders/under-1000/ <= best blenders under 1000
In this case, the second page would have a similar number of on-page mentions of "best blenders" within the title, URL, body copy, etc.
My thought is that having fewer pages about a KW could help the primary page rank, by leaving Google no doubt about which page it should rank for the keyword on your site. Then use phrases related to the niche that will in no way compete with the "money phrase" throughout the site or silo (for topical relevance).
If you or anyone else has experienced this before I would appreciate your thoughts.
These situations are hard to diagnose, especially when there’s no flipping. The only way to figure it out is to delete the child page for about a month and see what happens to the parent.
Thanks for the info, much appreciated.
I’ve thought about reducing the on page etc.
But I think ultimately, your solution to simply delete it and wait a month and see if I’m right or wrong is probably the best approach.
I guess 301’ing the child page(s) to the parent would also be acceptable?
Yeah, that works too.
I can't agree with your explanation regarding indexing, and especially the site: operator. The site: operator is almost never accurate, not even for big, trusted authority sites. Check site:indeed.com for example. After the 10th page it says: "In order to show you the most relevant results, we have omitted some entries very similar to the 92 already displayed." They show 92 (!) results for indeed.com.
However, if you search for any job/location-specific keyword, you will find Indeed everywhere (e.g. junior graphic designer jobs new york, etc.). They probably have dozens of millions of pages indexed, but you will only find a very small fraction with the site: operator. Still, of course, indexing problems are a major reason why a lot of sites are not performing well.
Thanks for your take on the indexing. I’m glad someone mentioned it as it’s a very interesting topic!
You’re right, the “site:” operator doesn’t return all results. Google won’t show you everything it’s got indexed, as it wants to be the only one who knows everything… Cheeky cheeky!
We could literally create case study after case study of Google hiding URLs.
However, you can trick it into revealing more and more by adding the inurl: (or other) operators to your site: query, and then scrape specific sections of the website. That’s what we’re doing with the client websites.
Be aware though, Google will reveal the next batch of pages indexed under the section you’re querying only after the previous batch is removed. For a while this may cause a dance in what results you’re seeing under site: search.
Great piece. I’m actually working on a better anchor strategy at the moment. In regards to point 4: how would you label anchor texts for an exact (or partial) match domain?
Example 1
Domain is https://house.com/. How to label these anchors:
1.1 house.com
1.2 house
1.3 buy at house.com
Example 2
Domain is https://buyhouse.com/. How to label these anchors:
1.1 house
1.2 buyhouse.com
1.3 buy house
I don't know what your target keyword is, but assuming it's "buy house":
Example 1: URL, target, target.
Example 2: target, URL, target.
Great, thanks Matt. See you 3 November.
Looking forward to it!
Hi Matt! I noticed when publishing product reviews, competitors always use 1 out of 2 URL formats: Either straight product name (and nothing else) OR category + product name.
For example, they publish either:
“domain.com/garcinia-xyz” or “domain.com/weight-loss/garcinia-xyz”
I was wondering… Is one of the two more effective than the other?
My guess is that the shorter the url the better… But then, doesn’t adding that category part in the url help you with topic relevance for the review?
Which url format would be more effective for rankings?
Untested but I would go for the shorter one.
I’m fairly certain I’ve seen far more instances of sites ranking highly without a category than with one.
Hi Matt, thank you for this awesome post. I’ve been able to derive a couple of important lessons from it that I’m going to implement. I’m going to bookmark this blog and come read more interesting content later. Thanks.
About indexing problems.
I actually have this problem too, but on a much smaller scale than you described.
Anyway, the pages I see that shouldn't be there are of this kind:
domain.com/wp-content/uploads/2016/04/?SD
What would you do about those?
Tell your plugin (All in One or Yoast) to no-index images.
No-index images at all?
Isn’t that risky?
I mean what about Google not finding any of the images in the page?
Or it wont matter?
The page still references the image, but Google doesn’t need to index the image itself.
What I used to do was block them through the robots.txt file, but I removed that after I saw Google can't see images that way.
How do you explain that this problem occurs on some sites and not all? I mean, I have another site with the exact same theme/settings and it just doesn't happen.
Not sure, man. Maybe historical indexing. Make Google recrawl.
Really enjoyed this post Matt. Focusing on better quality pages as opposed to trying to “brute force” rankings makes a lot of sense.
Hi Matt,
For noindexing with Yoast, do you noindex all of the following pages: author, archive, categories, tags, images?
Are there any other pages you noindex or nofollow too, such as the contact page, privacy policy, etc.?
Thank you.
I keep the contact, about, privacy, etc. indexed. Noindex all the other stuff you mentioned.
Love the blog Matt! I just applied for the links!
Thanks for the great tips! I’m new to online marketing, and this is really helpful! Since getting started, I’ve been bombarded by “spin writers” and such to create a TON of content quickly, but you seem to say that these search engines have become sophisticated enough to determine when your content is crap. Am I understanding that right?
For sure.
Hey Matt!
Thanks for the 10 solutions.
“For indexing, we use our own proprietary tool…” What might such a tool be? I have considered the “onehourindexing” software, but not sure about the quality.
And funny thing: when I got your mail titled "What are the Most Common SEO Problems You'll Encounter?" (which took me to this article), I immediately thought of reporting issues and getting clients to understand the value of SEO. But maybe it's just me who has those problems 😉
Hey Matt,
Great content as always. I have two questions.
1. In mobile search, for some of the blog posts the search result shows additional text (the name of the blog) appended to the end of the SEO title. Have you seen that before, and what may cause it?
2. Do the blog's social accounts help ranking if they are not indexed even after months? If not, how do you get them indexed?
Thank you in advance. Have a great day.
1) Never noticed it.
2) If they’re not indexed, they don’t exist. Try using the Video Sitemap trick.
One more thing.
Please cover "Keyword Cannibalization" as soon as possible – I think this would help a lot of webmasters.
Will do.
How about multiple tags? Have you had experience with multiple tag pages being indexed with the same content?
I deindex tags, categories, authors, etc.
Hi Matt,
Great content you provided here. I have a question about whether keyword cannibalization plays a factor with inner pages as well. For example, I have a dental client that doesn't have "dentist" or his location in his domain, so I made him a xxxx.com/location-dentist page with other pages linking off of it. If I want to rank for the location and "dentist", am I hurting myself with this strategy?
Thanks!
You’ll probably get away with it. But if the homepage somehow gets more relevance and authority on “dentist location” than you intend, they may compete.
Matt,
We have a client that is running a Spanish language only site in the US. Do you see any issues with most of the links coming from sites that are in English? They don’t ever want an English version of the site because they are trying to position their brand as only serving the Hispanic community. Should I be looking for a certain ratio of Spanish language sites in their backlink profile? I’m thinking about keeping the pillow links in English so I don’t have to spend as much time tracking down Spanish link opportunities. Thoughts?
You definitely want links from Spanish content. As to what the golden ratio is… I have no idea.
Great suggestions, Matt. Especially the solution to the indexing problem. I am sure there are many marketers suffering from this issue, and your suggestion is a very effective way to get it solved.
Hi Matt,
this is a great article. Can I ask what happened with that client in the chapter
6. Low Quality Affiliate Content?
You said you were anticipating a return in rankings within the next 3 months, but I'm afraid I couldn't find the result.
Did the positions come back?
I think we could have a similar problem – we've had a similar drop in search to the one in your graph, for 5 months now.
I just reported 10 spammy sites today. What I found is that they had stolen just one piece of our unique content (but there are probably many more scraping all of it). Part of the problem could also be that one of them even outranks us for our own content.
So I'd just like to ask: how did it end? Then I'll know whether I have some hope that our website could come back, or whether nobody at Google reads the spam complaints 🙂
Thanks so much for the answer
In this case, it did well… very well.
Matt, this is a great, thorough article on the key components of great SEO! I particularly liked the idea of diversifying anchor text to ensure that the ratio of targeted vs. non-targeted anchor text is roughly equal to that of the competitors for the same keyword. This is definitely something I had overlooked previously and got dinged for with one of my clients!
Wow, very good information. I'm from Indonesia. In my country, SEO is very important for business, but sometimes we do SEO wrong. Thanks.
Do duplicate meta descriptions harm my SEO?
Likely.
Solid article, Matt.
Nice to meet you, BTW. I'm a fan.
For me, #4 and #5 are super important.
Wow, very good information.
Hi Matt, great tips there. But I'm puzzled a bit by URLs and SEO titles (I did read the evergreen guide). How do I construct them when the domain name contains the keywords I'm after?
Let’s say the client’s domain is singlesholidaysitaly.com. The problem is they are selling weekend holidays and long holidays (each named respectively – singles weekends and singles holidays).
Will they be penalized if my long-holidays page has this URL: singlesholidaysitaly.com/singles-holidays, and this SEO title: Singles Holidays – Singles Holidays Italy?
And how in general should I approach site organization in this case – should I remove the keywords that are in the domain from each page that contains them?
E.g.
singlesholidaysitaly.com/singles-holidays-barcelona
or
singlesholidaysitaly.com/barcelona?
Given a choice, I'd prefer the latter. The URL singlesholidaysitaly.com/singles-holidays with the SEO title "Singles Holidays – Singles Holidays Italy" sounds like old-school keyword stuffing, right?
Hey Matt,
very informative, thank you. URL removal through GSC says "only for pages/images that have already been modified, or removed from the web". What would be the quickest way here – robots.txt > recrawl > GSC?
Tnx in advance.
Katarina.
Yes… that’s what I do in a pinch or if GSC is being stubborn.
That was informative. I found it useful. Thanks
Great article, Matt
Thanks for sharing
So glad I subscribed to you! This is gold! Thanks so much 🙂