The Definitive On-Page SEO Checklist For 2024
https://diggitymarketing.com/on-page-seo-checklist/ — Sat, 22 Apr 2023

As much as great content, a sound technical foundation (e.g. a mobile-friendly website) and link building can get your site on the right track for SEO dominance, sometimes it's the little things that make the biggest difference.

I have over a decade of experience in the SEO industry, and I understand the importance of optimizing on-page SEO elements for achieving search engine dominance. That's why I've created the ultimate On-Page SEO Checklist, so you can ensure that the primary elements of your site's pages are well-optimized.

Quick Summary

What is an On-Page SEO Audit?

Let's get one thing straight: when I refer to "On-Page SEO", I'm not talking about optimizing content for reader retention or writing content to meet search intent – that's for another article altogether. Instead, I'm talking about optimizing the core elements of a web page – the page titles, headings, meta descriptions and images on your website – for the primary keyword that you want to rank for. These are individual factors that can help improve your site's ability to rank and lead to more organic traffic.

For example, get your title and headings wrong, and it doesn't matter if you've written the best piece of content – it'll simply struggle to rank. Why? Because these core elements are what allow crawlers and human visitors to understand what the content is about without much effort.

Why are On-Page SEO Audits Important?

An On-Page SEO Audit is important because it can identify issues with your content that could hurt user engagement and confuse the crawlers. For example, unclear page titles that don't include your primary keyword will decrease click-through rates, while also making it harder for search engine bots to understand the purpose of your content. Poorly optimized URLs or a lack of headings will make your pages harder to navigate for both internet users and crawlers. If there are a lot of images on your website but they lack the alt attribute, search engines won't know what they depict.

Interlinking is another essential part of SEO that should be examined during an On-Page SEO Audit. It is a way of showing Google which pages should be considered the most valuable on your website. It can also help boost the rankings of individual pages, make your website easier to navigate, and improve user experience.

I'll go through each of these elements in the On-Page SEO checklist below.

On-Page SEO Checklist

To make the most of this On-Page SEO checklist, I highly recommend performing keyword research for your core landing pages first. Taking this step will make the next steps much easier, as you'll be able to identify the target keyword (or other search terms) for each of your pages. Here are the SEO tools that you'll need to complete your On-Page SEO checklist:

Page Title Optimization

What is a Page Title?

A page title (or title tag) is the HTML element that defines the title of a web page.
Here's the title for The Search Initiative homepage in its raw HTML form:

<title>The Search Initiative - SEO Agency</title>

Here's how it appears in the search results: And here's how it appears in a browser's tab:

Why Audit Your Page Titles?

Page titles are one of the Three Kings of On-Page SEO, with the other two being the URL and the H1 heading. I'll go into detail about URL optimization and H1 headings later on in this article, but for now, let's take a look at why page titles are so important.

Page titles are a very important part of search engine optimization for both users and robots. The title tag of a web page is one of the first elements that users look at when deciding whether or not to click through to a page from their search. Well-written, engaging titles are known to increase click-through rate (CTR) and engagement. The main purpose of auditing your title tags is to ensure that they inform the audience (and search engines) what they can expect to find on your web page. They're essentially a precursor to the main content on your site.

Page Title Audit Checklist

When auditing the page titles, you should ensure that they are: Avoid page titles like this that are over-optimized:

Cheap Holidays and Cheap Trips to New York | Book Cheap Flights Today

Instead, aim for page titles like this:

Book Cheap Holidays & Flights to New York Today | Tripster

Heading Optimization

What is a Heading?

Headings (aka header tags) are used to convey the hierarchy and structural outline of the content on a page to search engines and users. There are six HTML heading levels, where the H1 tag is the largest and the H6 tag is the smallest. Here's an example of a heading hierarchy:

<h1> This is an H1 heading </h1>
<h2> This is an H2 heading </h2>
<h2> This is an H2 heading </h2>
<h3> This is an H3 heading </h3>
<h2> This is an H2 heading </h2>

Here's what the headings would look like:

Why Should You Audit Your Headings?

Breaking up your content with headings makes it much easier for users to navigate through your page and find what they're looking for. Likewise, from an on-page SEO point of view, headings provide better context for search engines, as they're able to understand the hierarchy and structure. Aside from this, they may also aid your chances of appearing in the Featured Snippets. Here's an example for the keyword "how is vanilla extract made". Let's see where the featured snippet text is taken from… Importantly, notice that the heading is exactly the same as our search intent query.

H1 Heading Audit Checklist

When auditing the H1 headings on your site, ensure that:

H2-H6 Heading Audit Checklist

When auditing the remaining headings on your website (primarily H2 headings), ensure that they:

Meta Description Optimization

What are Meta Descriptions?

A meta description summarizes what the main content of a web page is about in one or two sentences. Here's an example:

<meta name="description" content="This is a meta description">

Why Should You Audit Your Meta Descriptions?

Despite the fact that they are not a ranking factor, meta descriptions should still be given importance because Google often displays them as part of the search results. If you're still not convinced, take it from Google, who say that they should "generally inform and interest users with a short, relevant summary of what a particular page is about. They are like a pitch that convinces the user that the page is exactly what they're looking for".
Essentially, this means that well-written descriptions may be the deciding factor in whether or not a user clicks through to your website. Moreover, if you include your primary keyword or other search terms within your descriptions, Google will bold any words from the keyword within the search results. For example, if our target keyword is "ruby chocolate", you can see that Google highlights all instances of the term within the results.

Meta Description Audit Checklist

During the auditing process, you should ensure that your descriptions:

URL Optimization

What is a URL?

A URL (Uniform Resource Locator) is the unique address of any resource on the Web.

Why Should You Audit & Optimize Your URLs?

You should audit and optimize your URLs because this is one of the things Google recommends in their SEO Starter Guide. The guide recommends using simple, friendly URLs. This is because the URLs of your site are usually among the first things that both users and search engines will see.

URL Audit Checklist

When auditing/optimizing the URLs on your site, you should ensure that they are: This not only makes it easier for users (and search engines) to understand what your post is about, but it also makes it much easier for other sites to link to you.

Avoid URLs like this: https://cycling.com/unicycle/red-2192734i.html

They come across as unnatural, confusing, and unfriendly. A more meaningful and user-friendly URL would be: https://cycling.com/unicycle/red/

After making your URLs user-friendly, you'll want to make them SEO-friendly too, which I'll explain in the following points. For example, https://bestrouters.com/buy-wireless-routers/ could be shortened to https://bestrouters.com/wireless/

Instead, use hyphens (-). You want URLs like this: https://cycling.com/best-unicycles-under-300/ Instead of: https://cycling.com/best_unicycles_under_300

Here's what a standard optimized URL structure would look like: For example: https://bestrouters.com/wireless/ is better than https://bestrouters.com/wireless-routers/

Image Optimization

What is Alt Text?

The alt text (or alt attribute/alt tag) of an image is the HTML element used to textually describe an image on the Web. Here's an example of an alt tag:

<img src="my-image.jpg" alt="This is an image.">

Why Should You Audit & Optimize Alt Tags?

Apart from improving your website's accessibility and enhancing the experience for visually impaired users, there are also on-page SEO benefits to auditing and optimizing the image alt attributes on your site. Search engines can't "see" the images on your pages the way people can, so descriptive alt text tells them what each image depicts.
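To pull the checklist together, here's a minimal audit sketch in Python (my illustration, not part of the original article) that flags the issues covered above: missing or overlong titles, a missing or duplicated H1, a missing meta description, and images without alt text. It assumes the requests and beautifulsoup4 packages are installed, and the URL and the 60-character title threshold are illustrative assumptions rather than hard rules.

# A minimal on-page audit sketch, assuming requests and beautifulsoup4
# are installed. The URL and length threshold are illustrative only.
import requests
from bs4 import BeautifulSoup

def audit_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    issues = []

    # Page title: present, and short enough not to be truncated in SERPs
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    if not title:
        issues.append("missing <title>")
    elif len(title) > 60:
        issues.append(f"title is {len(title)} chars (may be truncated in SERPs)")

    # Exactly one H1, per the H1 audit checklist
    h1s = soup.find_all("h1")
    if len(h1s) != 1:
        issues.append(f"expected exactly one <h1>, found {len(h1s)}")

    # Meta description present and non-empty
    meta = soup.find("meta", attrs={"name": "description"})
    if not meta or not meta.get("content", "").strip():
        issues.append("missing meta description")

    # Images without alt attributes
    missing_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]
    if missing_alt:
        issues.append(f"{len(missing_alt)} image(s) missing alt text")

    return issues

print(audit_page("https://example.com/"))  # hypothetical URL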


Silo Structure & Website Architecture: SEO Silos Made Easy
https://diggitymarketing.com/silo-structure/ — Mon, 13 Jun 2022

If you've been developing blog posts and content on your site without a plan, you may be undermining your goals. You may have already lost months of potential growth and set your website up to grow sluggishly while brand-new competitors zip ahead of you.

As an experienced digital strategist, I can assure you that there is an effective way to structure your website to capture the attention of search engines like Google: applying a silo structure. Different models of siloing can help you create an SEO-friendly site, and choosing the right one can unlock the potential for explosive growth.

In this article, you'll learn what siloing is and how it benefits your website's search engine optimization. Then, I'll take you through five silo structures for your site and show where they excel and fall short. I'll close with some advice on choosing silos and answer top questions.

What Is A Silo Structure?

A silo structure refers to a type of planned site architecture where internal linking connects certain pages to each other based on a thoughtful, standardized pattern. It starts with a site that is organized into a hierarchy where parent pages serve as the general introduction to a topic. Under the main topic pages are supporting pages—blog posts, guides, and other website content that covers narrower branches of the topic covered in the parent page. The silo is created when the pages are internally linked together to create a navigation path that can pass link juice and topical relevance to other blog posts, landing pages or other content on your site.

Link juice is the name given to the authority that is passed along whenever any page in this hierarchy receives a link from another site. It's one of the most important SEO concepts relating to links, and it's one of the reasons siloing is so necessary for your website. A silo can be developed so that the link juice flows throughout the entire site, instead of stopping dead at the original page. This guide is going to focus on teaching you to establish several different configurations of silos through internal linking. When you build your silo solely through the internal linking of each page, it is known as a soft silo.

What About Hard Silos?

In addition to soft silos, there are also "hard silos" (or physical silos). These are silos that are built into your website's structure (your URL structure and directory structure). Near the end, I'll tell you more about building a hard silo that can reinforce the soft siloing on your site. For now, let's focus on what the soft silo can do for your SEO. Developing a link structure for your website takes work, and you deserve to know why you're doing it. The fact is that having a silo structure for your links offers serious SEO advantages for your site that you can't ignore.

How Does Siloing Benefit SEO?

A silo structure benefits SEO by passing link juice throughout your website and providing search engines like Google and their crawlers with a streamlined way to confirm the topical relevance of your content.

Siloing Helps You Enhance the Topical Relevance of Your Site

Search engines like Google look at different signals to determine if your website and its pages are relevant for different search queries.
One way that you can demonstrate SEO relevance on your site is by linking related pages together in a structure. For example, imagine you have a blog post or commercial article on your website about the best protein powder. You can prove the relevance of that blog post to search engines by linking related content on the site (such as "best protein powder for men" or "protein powder supplements") back to it. Applying an SEO silo architecture to your site also helps you more thoughtfully choose the pages and content that you want to enhance with more topical relevance. When your silo is fully developed, you can easily direct topical relevance like a current across your site. That can translate to more attention from Google's search engine spiders—and that can mean higher keyword rankings and overall better SEO.

Siloing Gives You a Lot More Mileage Out of Link Building

Link building has an important relationship with your Google search engine rankings. Securing an external link for one page can be great for its search performance. However, with the right siloing structure for your site, you can ensure that links spread their SEO power through your other blog posts and landing pages. When you have an effective silo architecture in place, the link structure allows the power to flow through to the other pages on your website far more effectively. This can directly improve the search rankings of each page—even the ones that don't receive the link. In several of the siloing models that are covered later, you'll learn how you can structure your internal linking so that the power of a link hits every page on your website.

Siloing Lets You Rank Much More Easily for Long Tail Keywords

Long-tail keywords on your website can benefit from having a silo in place. Developing links to each silo page at the bottom of the site hierarchy can be difficult because these pages often target the most specific, long-tail search terms. However, you don't necessarily need links directly to these pages if you have a silo that can move the juice along from the most popular pages on your site. Silos that are set up properly will allow all of your smaller pages to receive the flow of link juice from each silo page above or below them, and the SEO benefits that come with it. Applying a silo can be so effective, in fact, that you may see pages in your website structure developing better search engine rankings even before they've received any backlinks of their own.

How To Plan & Create a Silo Structure

Implementing a silo structure is simple at the strategic level. I'll illustrate with the website Dietmasters.com. As you can imagine, a site like this is aiming to be a great hub for all types of diet advice. In order to be competitive, it will need SEO-relevant content to cover all the possible pages within a topic like this. The best SEO keyword that a site like this could target is "best diets". That's a broad, SEO-friendly term that could incorporate any number of landing pages, service pages or blog posts. To plan for an SEO-friendly silo structure, we have to be careful about how we set up the website and the first rounds of website content.

Step 1. Start with good keyword research

This first step will help us learn what intent people have when they're looking for search terms like "best diets". Follow your normal process for choosing keywords, and compile what you've learned into a research doc to help you develop content.
We'll need to know the right keywords so that we can break the topic down into narrow SEO keyword phrases that can be used to build blogs and other content. When we're done, we can build content that matches the needs of search engine users and choose thoughtful anchor text. Imagine that you performed this research using a search engine or your favorite research tool, and learned that the search terms with the most keyword relevancy for a site like this were: These three SEO keyword phrases will make up the top-level category pages for our sample site. For the category structure, we'll think of these pages as lateral to one another. They'll have their own child pages, but they won't necessarily link to one another. When developing content for each supporting page on the site, we want to choose keyword phrases that are even more specific. That way, we can target commercial intent.

Step 2. Break the topic into supporting pages

The next step is to take those top-level site topics and break them down into child pages/support pages so that we can create content that will speak to more intentional audiences. This use of keyword phrases reinforces the relevance of topics that are in the top level of the site hierarchy. Let's use the "keto diet" search term as an example. Your research may show you that people who are using search engines to find information about the keto diet are hungry for content about: These keyword terms will make up the next level of supporting content pages for our site, but we're not quite done.

Step 3. Break the sub-topics into long-tail keyword phrases

We can keep going. Let's use the meal plan page as an example, and look at the SEO-friendly search terms that could be developed from it. Your research may show you that people who use search engines to look up keyword phrases like "keto meal plans" may also be looking for content about even narrower sub-topics.
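Since a soft silo is ultimately just a repeatable internal-linking pattern, it can help to see it as data. Below is a small Python sketch (my illustration, not part of the original guide) that models a hypothetical silo for the Dietmasters.com example and prints the internal links the configuration would need. The child page URLs are invented for the example, and the lateral sibling links are one common configuration; some silo models omit them.

# A sketch of a soft-silo plan for the hypothetical Dietmasters.com example:
# each parent topic links down to its children and each child links back up,
# so link juice circulates within the silo. URLs below are invented examples.
silos = {
    "/keto-diet/": [
        "/keto-diet/meal-plans/",
        "/keto-diet/for-beginners/",
        "/keto-diet/side-effects/",
    ],
    "/paleo-diet/": [
        "/paleo-diet/meal-plans/",
        "/paleo-diet/food-list/",
    ],
}

def internal_links(silos):
    links = []
    for parent, children in silos.items():
        for child in children:
            links.append((parent, child))   # parent links down to child
            links.append((child, parent))   # child links back up to parent
        # one common configuration also links siblings laterally within a silo;
        # other silo models skip these lateral links
        for a in children:
            for b in children:
                if a != b:
                    links.append((a, b))
    return links

for source, target in internal_links(silos):
    print(f"{source} -> {target}")

Note that the top-level category pages themselves stay lateral and unlinked from one another, matching the plan above: each silo keeps its juice circulating among its own pages.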


What's The Best Info To Commercial Content Ratio For Affiliate SEO? [Data-Backed Study]
https://diggitymarketing.com/info-to-money-content-ratio/ — Mon, 14 Feb 2022

In December 2020, many affiliate websites completely lost their rankings due to Google's Core Algorithm Update. If your site's rankings were also wiped out by this update, or if you just want to make sure your sites won't be affected by it in the future, this post is for you.

At the time, I crawled over 600 affiliate websites to tease out what commonalities there were between sites that gained rankings and sites that lost rankings following the update. You can watch that analysis here:

One of the core findings was that the ratio of informational content to commercial content seemed to play a part. In particular, affiliate sites with high proportions of commercial content seemed to be hit the hardest. This post is a follow-up and presents an analysis of the ideal ratio of commercial content to informational content. I've uncovered some golden nuggets that I'm excited to share with you!

How This Analysis Was Conducted

With the help of one of my rockstar staff members, we collected data on a total of 1,517 affiliate websites across 32 different industries that are winning after the Dec 2020 core update. The criteria for selecting websites were simple: The goal of our study was to look for sites that either weren't affected or that experienced a positive effect (even if this positive effect occurred down the track, as with the websites that lost and then reclaimed their rankings). As a result, our data-driven study allowed us to reveal the general ratio of info to commercial content that worked for the affiliate sites that were clearly winning after the December 2020 core update.

Average Informational To Commercial Content Ratio

For our analysis, "informational content" is defined as anything that does not include affiliate links and does not target keywords with commercial intent. The sole purpose of info content is to answer questions or to educate the reader. On the other hand, commercial (or "money") content typically has the following features: Of course, these parameters don't cover every single commercial page out there. There are definitely outliers that we probably missed. But for the most part, these parameters will do the job and give a decent picture for us to work from. We ignored the small percentage of "other" pages that did not fit either of these criteria. These include the home, about, contact, and category pages, to name a few.

Key Insights

Average % of Information vs Commercial Content for the Full Set of Surviving Sites

By analyzing the ratio of info to commercial content, we wanted to determine some industry guidelines for affiliate sites moving forward. Thanks to this data, we no longer need to rely on speculation and "gut feeling" – you can see the average ratio of the winning sites above. Naturally, there's a big distribution and a handful of outliers compared to these averages. But, as you'll see in the scatter plot below, the majority of sites that survived the December 2020 update have a heavy focus on informational content. We also looked at the data according to different industries. As you'll see below, there's a broad range of commercial to informational content ratios, with the exception of "Coupons", a clear outlier. However, all other industries had a higher percentage of informational content than commercial content on average.
Note: The percentage of info to commercial content in this bar graph doesn't total 100% due to that small percentage of "other" content we mentioned earlier.

Affiliate sites in the "Automotive" industry have the lowest amount of commercial content, sitting at 20% on average. On the other end of the scale, "Office Supply" and "Music" have 42% and 39% commercial content respectively. A 22-percentage-point spread between industries only goes to show how, like everything else in SEO, the ratio of info to commercial content required for a website to thrive differs with each industry. Now let's look at specific segments of the data and how different factors influence the info to commercial content ratio.

Info To Commercial Content Ratio For Niche Vs Authority Sites

We noticed a correlation between the ratio of commercial content and whether a site was classed as a niche or authority site. For our purposes, niche websites focus on only one specific topic or vertical within an industry. For instance, a website that only focuses on the best table saws is considered a niche site. By contrast, authority sites have content on multiple different topics. For example, a website like Best of Machinery, which reviews all types of garden and power tools, would be classed as an authority site. Within our dataset, on average, niche sites have 6.6% more commercial content than authority sites.

Niche Sites

We also noticed that industries dominated by niche sites correlated with higher amounts of commercial content.

Note: The percentage of info to commercial content in this bar graph doesn't total 100% due to that small percentage of "other" content we mentioned earlier.

For instance, over 80% of total sites in the following industries were niche sites. They're also at the top of the list for industries with the highest amounts of commercial content: Now let's compare to authority sites.

Authority Sites

When we look at the distribution of authority sites in specific industries, it's interesting to note that different industries rise to the top compared with niche sites.

Note: The percentage of info to commercial content in this bar graph doesn't total 100% due to that small percentage of "other" content we mentioned earlier.

It's also clear that, with the exception of "Coupons" (a clear outlier in our study), the winning authority sites have a lower percentage of commercial content compared to niche sites, regardless of industry.

Commercial To Info Ratio For Sites With High E-A-T Factors

There is a subset of websites that require a large number of signals indicating expertise, authoritativeness, and trust (E-A-T). Typically, these sites fall into what are known as "Your Money, Your Life" (YMYL) niches. If your site falls in this category, a high amount of commercial content could get in the way of your trust-building signals. In our study, we analyzed affiliate sites in four YMYL niches including: Typically, YMYL sites fall below the average benchmark of 29% commercial content.

Key Takeaway

Play it safe if your affiliate site is in a YMYL niche. Keep your commercial content to a maximum of 25% and aim for over 70% expert, informational content.

Does Website Size Affect Commercial To Info Content Ratio?

Given the above insights about niche sites, we also checked whether there is any correlation between a website's size and its ratio of commercial to info content.
Smaller sites tended to have higher proportions of commercial content, though our data did not suggest a different threshold for websites of different sizes. It could also be the case that larger websites that publish more content each month simply run out of commercial topics to cover.

Key Takeaway

When you're starting out, you can tip the scales towards more commercial content, even if the ratio is higher than your industry's average. There's no shortage of small websites out there with more than 40% commercial content! However, as your website grows, you should aim to tip the balance towards more informational content. Ideally, you should be aiming for over 65% informational content.

Publishing Velocity & Info To Commercial Ratio

Publishing velocity is the rate at which new articles are added to a website. Like most affiliate marketers, you probably don't have the resources to compete with big-budget sites like NerdWallet or the Wirecutter as far as monthly content production is concerned. So we looked at publishing velocity to see whether you might be at a disadvantage due to publishing fewer articles a month. In our analysis, we looked at six patterns related to publishing velocity based on how many new articles went live over a 6-month period. These include: Regardless of how many articles a month are published, the data indicates a fairly narrow band of 24% – 33% commercial content compared with 62% – 72% informational content.

Key Takeaway

It doesn't matter if you publish more or less content per month; it's still best to stick to your industry's average info to commercial ratio. If you're not sure what that is, aim for a maximum of 30% commercial content and at least 65% informational content.

So, What's The Ideal Ratio Of Info To Commercial Content?

No matter how you cut it, the ratio of commercial to info content that Google is rewarding fits into a fairly tight spread. Authority sites tend to skew towards more informational content, whereas niche sites typically have higher proportions of commercial content. This pattern holds true for most industries with a few exceptions, including:

Is 60/40 A Safe Ratio For Info To Commercial Content?

Some industries have loads of websites with over 40% commercial content. In fact, around 29% of all sites we looked at had more than 40% commercial content.
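For reference, here's a rough Python sketch of how a classification like the one in this study could be automated on your own site. The URL-pattern heuristics standing in for affiliate-link detection, and the page list, are my assumptions, not the study's actual tooling; a real pass would inspect page HTML for affiliate links as the study's definition describes.

# A rough sketch of the study's classification logic: a page counts as
# commercial if it matches a commercial-intent pattern, "other" covers
# home/about/contact/category pages, and everything else is informational.
# The patterns and page list below are illustrative assumptions.
COMMERCIAL_HINTS = ("best-", "-review", "-vs-", "top-")
OTHER_PAGES = ("/", "/about/", "/contact/")

def classify(path):
    if path in OTHER_PAGES or "/category/" in path:
        return "other"
    if any(hint in path for hint in COMMERCIAL_HINTS):
        return "commercial"
    return "informational"

def content_ratio(paths):
    counts = {"informational": 0, "commercial": 0, "other": 0}
    for path in paths:
        counts[classify(path)] += 1
    total = sum(counts.values())
    return {kind: round(100 * n / total, 1) for kind, n in counts.items()}

pages = ["/", "/about/", "/best-table-saws/", "/dewalt-table-saw-review/",
         "/how-to-use-a-table-saw/", "/what-is-a-dado-blade/"]
print(content_ratio(pages))
# Compare the "commercial" figure against the ~30% guideline from this study.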


How To Do Keyword Research for SEO in 2024: Strategy & Tools
https://diggitymarketing.com/keyword-research/ — Mon, 13 Sep 2021

As an experienced digital marketer and SEO expert, I know that no search engine optimization strategy can be truly effective without first performing thorough keyword research. Regardless of how much or how good your content is, or how many backlinks your website has, you're going to struggle to rank if nobody is searching for what you're writing about. That's why I've put together a guide on how to carry out keyword research, along with a helpful checklist and template for you to follow.

Quick Summary

Keyword Research Basics

Before showing you how to go about picking the most important and fruitful keyword ideas for your online business or site, let's define what keyword research is and why it's important for SEO.

What is Keyword Research?

Keyword research for SEO is the process of finding and analyzing the keywords that are most relevant to a particular niche and that can bring the most organic traffic to your site. This way, you can use those keywords to improve a website's position in the SERPs. However, it's not only about finding terms that will drive the most traffic, but also about finding the right search phrases that will help turn organic visits into conversions.

Why Is Keyword Research Important?

Keyword research is important for a number of reasons. Before you create an SEO strategy, you should conduct keyword research. Otherwise, you won't know which search phrases present an opportunity to improve your website's position in the SERPs. There are several reasons why keyword research is essential to your website's growth…

Keyword research allows you to identify a list of keywords with decent search volume that are the most relevant and valuable to your business or website. Even if you have an idea of what internet users are searching for, guessing can only get you so far. By including the most popular keywords, you'll be able to increase your website's organic traffic. Thanks to keyword research, you'll also understand which queries your competitors are ranking for and create a strategy to eliminate this keyword gap and overtake them in the SERPs. At the same time, keyword research can help you achieve the most results with the least effort. How? You might find keywords relevant to your niche that your competitors aren't ranking for, in which case reaching the top spots in the search results pages and driving more traffic becomes even easier. Here are some more ideas for finding easy keywords:

Lastly, analyzing your keyword ideas will help you answer some important questions like:

Keyword Research Checklist

Without understanding what keywords your audience is searching for, your site won't rank anywhere near its potential. For those who want to know how to research keywords, we've put together a checklist outlining the core steps you should follow when carrying out keyword research for SEO. To complete this keyword research process, you will need: We'll go through the template below using thesoundjunky.com as an example.

Get An Overview of Your Website's Keyword Visibility

The first step in this keyword research process is all about getting an overview of your website's current keyword visibility – i.e., identifying which keywords your website is already ranking for.
If you have a brand new website that isn't ranking for any queries, you can either skip these steps or do them for your most important competitors to get an idea of what keywords you should target.

Raw Data

Enter your domain into Ahrefs' Site Explorer tool and click Organic Keywords in the left sidebar. Ensure that you see the keyword ideas for your target location (in this example, it's the USA), then click Export. Then, click Full Export (if you're ranking for more than 1,000 keyword ideas) > Start Export. Once exported, remove all columns from the spreadsheet apart from those that I have kept in the Raw Data tab of the keyword research template. Copy these keywords into the tab and sort them by Volume. You now have a full list of the keyword ideas that your website is currently ranking for, ordered by search volume.

Keyword Map

The next step is to create a keyword map to see which URLs are ranking for which terms. Create a copy of your Raw Data tab. This time, order the columns so that you have the following: Now, sort the data by URL so that your sheet looks something like this: You're now going to create a Pivot Table so that the keyword data looks more presentable. Highlight all of the data by clicking on the blank cell in the top-left corner. Click the Filter icon, then Data > Pivot Table. Select New sheet and click Create. In the Pivot Editor, add in the rows in the same order listed above. Be sure to deselect Show totals. Call this Pivot Table "Keyword Map" and hide your copy of the Raw Data tab by right-clicking on the tab and selecting Hide sheet. You can stretch the URL column to show more text. And that's your Keyword Map complete – a more visually appealing and informative representation of the site's keyword visibility. For example, you can perform useful calculations to see:

Low Hanging Keywords

More often than not, you'll have lots of keywords ranking just outside of the first page (i.e. positions 11 – 30 in the search results). These are potential quick wins that you can target as a priority. To find your Low Hanging Keywords, go back to Ahrefs and this time filter your keywords by Position. After clicking Apply, you can export, copy and paste the terms into the Low Hanging Keywords tab. Alternatively, you can create a copy of your Raw Data tab again and filter the terms by Position using the Filter icon. You now have a list of search terms that you can optimize to climb onto the first page of the search engines.

Long Tail Keywords

The next step is to identify long-tail search terms. These keyword ideas often have a lower SEO difficulty and bring traffic with good conversion potential. Long-tail keywords might have lower search volume than general keywords, but they usually have low competition and are more likely to interest the target readers and generate conversions. Essentially, you want to look for terms that contain words like: To do this, head over to the Ahrefs keyword analysis tool and type the above phrases (separated by commas) into the Include text box. In the Exclude text box, click Any target, deselect URLs and click Apply. This will now show you the keywords that contain the above words/phrases. Export, copy and paste these terms into the Long Tail Keywords tab. Sort by Keyword from A to Z. You now have a filtered list of long-tail terms that your website is currently ranking for. These are great for answering questions that your target audience may have about your niche, business, or site.
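If you prefer scripting to spreadsheets, here's an equivalent Python sketch of the Low Hanging and Long Tail filters using pandas. The CSV filename, the column names ("Keyword", "Current position", "Volume") and the modifier list are assumptions that may need adjusting to match your actual Ahrefs export.

# A sketch of the low-hanging and long-tail filters applied to an Ahrefs
# export, assuming pandas is installed; column names vary by export version.
import pandas as pd

df = pd.read_csv("organic-keywords-export.csv")  # assumed filename

# Low Hanging Keywords: ranking just off page one (positions 11-30)
low_hanging = df[df["Current position"].between(11, 30)].sort_values(
    "Volume", ascending=False)

# Long Tail Keywords: question/modifier words that signal specific intent
modifiers = ["how", "what", "why", "best", "vs", "for"]
pattern = r"\b(" + "|".join(modifiers) + r")\b"
long_tail = df[df["Keyword"].str.contains(pattern, case=False, na=False)]
long_tail = long_tail.sort_values("Keyword")

low_hanging.to_csv("low-hanging-keywords.csv", index=False)
long_tail.to_csv("long-tail-keywords.csv", index=False)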
Potential Keyword Cannibalization

A common problem websites face is when multiple URLs with similar or identical content compete for the same keyword – this is known as keyword cannibalization. Keyword cannibalization can really hurt a site's SEO performance, so it's important to ensure that there isn't any internal competition between your landing pages. Here's a quick way to spot potential keyword cannibalization using Ahrefs. Create another copy of your Raw Data tab and call it Potential Keyword Cannibalization. This time, you want your columns to be ordered with: Sort your keyword ideas from A to Z – this is a super important step! In the Cannibalized column, add the following spreadsheet formula:

=if(A1=A2,"Yes",if(A2=A3,"Yes","No"))

This formula checks whether there are multiple instances of the same keyword in the spreadsheet. Copy this formula across all of the rows in your spreadsheet so that it looks like this: Next, filter out the keywords that aren't cannibalized. Click the Filter icon, click the icon under the Cannibalized? column, deselect Yes, then click OK. This will show you all keywords that aren't cannibalized. Highlight them all and delete the rows by right-clicking and selecting Delete selected rows, leaving you with just the terms that are cannibalized. Hopefully, your sheet will be empty at this stage (as you don't want any cannibalization), but if not, it'll look something like this… Remove the Cannibalized? column and sort by Volume to identify the most important keywords that have been cannibalized. You now have a list of cannibalized keywords. To learn more about analyzing and fixing keyword cannibalization, check out this article.

Finding New Keyword Suggestions & Ideas

Up until this stage of the SEO keyword research process, we've only focused on the keywords that your website is already ranking for in Google's SERPs. In this chapter, I'll show you how to find new keyword ideas to target by using the same keyword research tool (Ahrefs) and other methods.

Identify Seed Keywords by Topic

A great way to build a foundation of new keyword ideas is to brainstorm the key topics that define your website's niche with seed keywords. Seed keywords are the broad, high-level terms that describe your niche and serve as the starting point for deeper research.
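The same cannibalization check can be scripted. Here's a pandas sketch of the duplicate-keyword test above, refined slightly to flag a keyword only when it ranks with more than one distinct URL (a closer match to true cannibalization than repetition alone); the column names are assumed to match the earlier export.

# The duplicate-keyword check from the spreadsheet formula, as a pandas
# sketch; assumes columns named "Keyword", "URL" and "Volume".
import pandas as pd

df = pd.read_csv("organic-keywords-export.csv")
df = df.sort_values("Keyword")  # sorting first, as in the spreadsheet step

# Keep only keywords that appear more than once...
dupes = df[df.duplicated("Keyword", keep=False)]
# ...and flag those ranking with more than one distinct URL
cannibalized = dupes.groupby("Keyword").filter(
    lambda group: group["URL"].nunique() > 1)

# Most important cannibalized keywords first
print(cannibalized.sort_values("Volume", ascending=False)[["Keyword", "URL"]])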


How To Do an SEO Content Audit For Your Website in 2024
https://diggitymarketing.com/content-audit/ — Mon, 09 Aug 2021

You've probably heard the saying, "content is king." While having a technically sound website and a strong link profile are great accelerators for SEO dominance, these days they'll hardly move the needle if you don't have a solid content strategy. As an experienced digital marketer who has built multiple successful affiliate websites, I understand the importance of content in driving results. Making sure that things like your page titles (aka title tags), headings, and meta descriptions are optimized for your primary keyword is important too, but thorough content audits will help seal the deal.

In this article, you'll learn how to perform a website content audit in detail, and I'll highlight why you should make it a part of your SEO and content marketing strategy. You'll find out the importance of writing content that is easy to read and is written with the intent of the core keyword you want to target in mind. I'll also show you how to ensure that you don't have landing pages with thin content, duplicate content, or keyword cannibalization. Lastly, we'll look at how to audit your content for E-A-T signals, which will help build your site's authority. This content audit spreadsheet checklist will help maximize your content marketing efforts and reach your SEO content goals, as we'll make sure that no stone is left unturned.

Quick Summary

What is a Content Audit?

A content audit is the process of analyzing the main text of your web pages to ensure that it is helpful to your users and to crawlers like Googlebot. This process can help find and fix issues such as low readability, duplicate content, or keyword cannibalization, all of which could undo the hard work you've put into your SEO efforts. Remember, content audits aren't about optimizing page titles, headings, and meta descriptions. If you want to audit those, you should use Screaming Frog (or any other crawler of your choice). A content audit spreadsheet ensures that your site's content assets (i.e., landing pages, blog posts, etc.) are created with both Google and the end user in mind.

Why Do You Need a Content Audit?

You need a content audit because it will help you identify and take stock of which pages on your site work well and which don't. Content audits are primarily about improving content metrics such as readability so that users will be more likely to spend more time on your website. For example, if you are running an eCommerce business, the amount of time your users spend browsing your website directly impacts your sales. Conducting a content audit also ensures that you're providing your audience with the best possible experience while making sure that the text is optimized for Google. To put it straight, content audits will help refine your site's content assets as well as increase conversion rates and social shares.

Content Audit Template & Checklist

I've compiled a list of key tools (Google Analytics among them), a content audit data template, and a checklist to make this process as smooth as possible for you.

Content Audit Tools

Search Intent

What is Search Intent?

The search intent of a keyword is the underlying reason for the user's query – i.e., it's about understanding why the user is searching and what they are looking for. There are four main types of intent: informational, navigational, commercial, and transactional. Important: not all informational keywords are questions.
For example, users searching "Elon Musk" are likely looking to find out more about Elon Musk.

Why is Search Intent Important?

Search intent is important because the primary goal for Google is, and always has been, to provide users with the best and most relevant results for their queries. While you could pretty much get away with ranking in the top 5 positions with links and mediocre content before, that content strategy doesn't work in today's SEO climate. Google's understanding of a piece of content and its ability to provide relevant results is continually improving through machine learning techniques like Natural Language Processing and its BERT algorithm. If your content marketing strategy doesn't account for what the end user is looking for, how can you expect your SEO goals to be met? As a result, this is one of the most important steps in the content audit process.

Search Intent Audit Checklist

The best way to "audit" intent is to compare your content to the sites that are ranking in the top positions for your core keywords. Then, you want to check whether your content aligns with what Google is rewarding. It's likely impossible to audit every single page on your site, especially if you have an online store with hundreds or maybe even thousands of products. Therefore, you should start by making a list of the most important content pieces you want to audit first – kind of like a content inventory! A quick Google search for your core terms will show you what kinds of results Google (and in turn, users) is looking for. For example, below you can see that for a transactional query like "Samsung phone cases", Google leads with paid ads/shopping results. During your audit, you should: For example, a user searching for "how tall is the Eiffel Tower" is probably looking for a short, numerical answer, whereas someone searching "why do we yawn" may be looking for a more in-depth answer. Answering these questions as part of your website content audit will allow you to pinpoint exactly what your target audience is looking for from their query. It will also help identify where you should place certain pieces of content. For example, if you target a keyword like "how to do a cartwheel", you may find that the top-ranking results include a video right at the top of the page. Therefore, to match the intent, you would do the same.

Readability

What is Readability?

Readability refers to how easy it is for users (and search engines) to understand and read the content on a web page.

Why is Readability Important?

Making readability a part of your content audit process is important because reading a big block of text isn't great for user experience. This may lead to an increase in bounce rate (which you can track in Google Analytics), as users may simply lose interest and leave because they feel overwhelmed with information. Therefore, you should ensure that your website's content is easy to read so that visitors don't leave.

Readability Audit Checklist

A great (and free) tool for auditing the readability of a piece of content is Hemingway. It'll give you a readability score and provide suggestions on how to simplify the text.

Thin Content

What is Thin Content?

More often than not, "thin content" is described as content that is too short. However, there's more to it than just length. A more accurate definition is content that offers little to no value to the target audience. It just so happens that most landing pages considered thin content by SEOs tend to be short in length.
A quick way to spot these is via the tool Screaming Frog, which allows you to order your pages by word count. The page below has lots of text, but is it actually valuable to the user? Not really. The most common types of pages that are considered thin content contain:

Why Should You Carry Out A Thin Content Audit?

Let's take a look at why your content auditing strategy should include checks for thin content, from both the user's and the search engine's perspective.

For Users

As mentioned above, thin content pages offer little value to the searcher. As a result, they hinder the user experience and will likely lead to a higher bounce rate. You want to create pages with rich content that satisfies what the searcher is looking for. Otherwise, users will simply leave and visit a competitor. Because of this, other engagement metrics like average time on page and conversion rates will also be impacted.

For Search Engines

Google evaluates content on both a page and a domain level, and bases the frequency at which it crawls your site on the quality of your site's content. This means that: If you're still not convinced, a site with too many thin content pages can be penalized by Google with a manual penalty. Check out the video below, where Google's Matt Cutts explains what it means if your site has been penalized with a thin content manual penalty.

Thin Content Audit Checklist

When going through your content inventory for thin content, look out for the following:

Duplicate Content

What is Duplicate Content?

Duplicate content refers to content that is identical or very similar to existing content on the Web. So, this… And this… …would both be considered duplicate content. It's important to note that Google acknowledges that content that is similar or identical is, in most cases, "not deceptive in origin". In other words, duplicate content is primarily an issue when content is "deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic".
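As a starting point for the thin-content pass, here's a short Python sketch (my addition) that filters a Screaming Frog crawl export by word count. The filename and column names are assumptions based on a typical Internal HTML export, and the 300-word threshold is only a heuristic; as noted above, thin content is really about value rather than length, so flagged pages still need a human review.

# A quick thin-content pass over a Screaming Frog crawl export, assuming
# pandas and a CSV with "Address" and "Word Count" columns. The 300-word
# cutoff is a starting heuristic, not a definition of thin content.
import pandas as pd

crawl = pd.read_csv("internal_html.csv")  # assumed export filename
thin = crawl[crawl["Word Count"] < 300].sort_values("Word Count")
print(thin[["Address", "Word Count"]].to_string(index=False))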


The Definitive Technical SEO Audit Guide For 2024
https://diggitymarketing.com/technical-seo-audit/ — Mon, 10 May 2021

To put it simply, if your website's technical SEO isn't on point, you're not going to rank well on Google. As a professional digital marketer with over 10 years of experience in conducting technical SEO audits, I understand this very well. Having great content and backlinks isn't enough when it comes to ranking in the top positions of Google search. If you really want to dominate the search engine results pages, you want to make sure that your website's technical SEO is perfect.

Conducting a full technical search engine optimization audit can be a pretty daunting task. Lucky for you, I've broken down each of these SEO elements and have provided a helpful SEO audit checklist with ready-to-use templates for you to follow. From the most basic level of ensuring that Google can crawl and index your content, to more advanced techniques that look into site speed, mobile SEO, JavaScript SEO, and more, I'll be with you every step of the way.

Quick Summary

What is a Technical SEO Audit?

A technical SEO audit is a process meant to identify and fix the technical issues that could make it harder for Google to crawl, index and rank your site. To put it simply: make life easy for Google, and they make life easy for you. Common issues often discovered during a technical search engine optimization audit include poor site architecture, broken links, slow page load speed, or display issues on mobile devices. The technical SEO audit is an important part of a website's SEO strategy and should be one of the first things that you look into to improve your visibility in the Google search results.

Why Are Technical SEO Audits Important?

Technical SEO audits are important because even if you've spent a long time creating excellent content, your users may never see it if there are issues with your website's crawlability and indexability. And even if your site can be found by internet users, its rankings could be hurt by performance-related technical factors. Page load time is a ranking factor, which means that a slow website is unlikely to reach the top spots in the SERPs (search engine results pages). Internet users are even less patient than Google's crawlers and will leave your website if it takes ages to load. Likewise, a poorly structured website can lead to confusion among your users. A site that is easy to navigate leads to a better user experience and, consequently, generates more leads. During a technical search engine optimization audit, you could also find out that mobile users face numerous problems while browsing your website. Given that mobile devices generate more than half of worldwide web traffic, such issues could lead to a terrible loss of revenue. Let's also not forget that mobile-friendliness is a ranking factor.

Technical SEO Audit Checklist

Here are the tools we recommend in order to complete your technical site audit:

Crawlability & Indexability

As we've already mentioned, making sure that your site's content can be crawled and indexed is a critical aspect of technical SEO. Google "crawls" the internet through links in order to find content. If they can't find it, then it doesn't exist in their eyes. In this section of our technical SEO audit guide, we'll walk you through the various ways in which you can audit your website for technical SEO issues related to crawling and indexing.
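Before digging into the individual checks, a short script can give you a quick first pass on indexability. This is a minimal sketch (my addition, assuming the requests and beautifulsoup4 packages are installed) that flags the three most common blockers for any URL you feed it: a non-200 status code, noindex in the X-Robots-Tag header, and noindex in the meta robots tag.

# A minimal indexability spot-check: flags non-200 responses, an
# X-Robots-Tag noindex header, and a meta robots noindex tag.
import requests
from bs4 import BeautifulSoup

def indexability(url):
    resp = requests.get(url, timeout=10)
    problems = []
    if resp.status_code != 200:
        problems.append(f"status {resp.status_code}")
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        problems.append("X-Robots-Tag: noindex")
    soup = BeautifulSoup(resp.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    if robots and "noindex" in robots.get("content", "").lower():
        problems.append("meta robots noindex")
    return problems or ["looks indexable"]

print(indexability("https://example.com/"))  # hypothetical URL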
Robots.txt

What is the Robots.txt File and Why Is It Important?

A robots.txt file is a file that instructs search engine crawlers which pages or files they can and can't access on your website. For example, if you have an eCommerce site, then you don't want search engines to access sensitive pages like the cart or checkout page. It's worth noting that the robots.txt should not be used to hide pages from Google (or other search engines). Why? Because your web page may still be indexed by Google if other sites have linked to it with descriptive text.

Robots.txt Audit Checklist

Here's what ours looks like: https://thesearchinitiative.com/robots.txt And here's the robots.txt for: https://searchengineland.com/robots.txt

Below are examples of search engines and their respective user-agents (i.e. what each search engine identifies itself as): Helpful tip: use the asterisk (*) wildcard. This is a special character that allows you to assign rules to all user-agents.

# this directive blocks all crawlers
User-agent: *
Disallow: /

# this directive grants access to all crawlers
User-agent: *
Allow: /

Below are some examples of wildcards you can use: For example, the below set of rules prevents all user-agents from accessing URLs in the /products/ subfolder that contain a question mark.

# this directive blocks all crawlers from /products/ URLs containing a question mark
User-agent: *
Disallow: /products/*?

# this directive blocks all user-agents from accessing PDF files
User-agent: *
Disallow: /*.pdf$

For example, if your robots.txt had the following set of rules:

User-agent: Googlebot
Disallow: /subfolder-a/

User-agent: Googlebot
Disallow: /subfolder-b/

Google would still follow both directives. But it could quickly get pretty confusing if your robots.txt has many rules. Therefore, something like the following is much better and cleaner:

User-agent: Googlebot
Disallow: /subfolder-a/
Disallow: /subfolder-b/

Technical SEO fact: the robots.txt file was originally introduced to prevent search engine bots from overloading websites with multiple requests.

XML Sitemaps

What is an XML Sitemap?

An XML sitemap (or sitemap) is an XML (Extensible Markup Language) file used to tell search engines where to find the most important content on a website.

XML Sitemap Audit Checklist

Below, we've outlined some simple checks that you should follow when auditing your XML sitemap as part of your technical site audit. Here's an example of what this might look like: If you want a full list of possible on-page optimizations, I've put together a guide on how to audit the on-page elements of your web pages. Here's the XML sitemap for The Search Initiative: https://thesearchinitiative.com/sitemap.xml If you do not currently have a sitemap, you can create one manually or by using a tool like the XML Sitemap Generator from Screaming Frog. If your website is on WordPress, your life is about to get much easier, as there are many SEO plugins such as Google XML Sitemaps and Yoast that will automatically generate your sitemap(s) for you.

Index Bloating

A very common technical SEO issue that most websites tend to face is index bloating. Sometimes, Googlebot (Google's web crawler) will crawl and index pages that simply offer no value to the end user. These pages "bloat" your index and use up precious crawl budget, as Google spends time unnecessarily crawling and indexing them. Below are some simple checks you can make in order to identify the types of pages that cause index bloating.
Once you've carried out these checks, follow this awesome guide from Ahrefs on how to go about removing them from Google's index.

Pagination

Paginated pages being indexed by Google can cause serious duplicate content issues. This is a common problem for eCommerce websites that may have thousands of products and hundreds of categories. To quickly check whether your paginated pages are being indexed, use the following site searches: In the below example, we can see that the Zara website has over 14,000 paginated pages that have been indexed.

Tags

Adding tags to your WordPress or eCommerce site is useful for organizing the content on your website, but it can also create SEO issues such as duplicate content. To quickly check whether you have any tag pages indexed in the SERPs (search engine results pages), use the following site searches: Don't believe this is important? Here are the before-and-after results of a client we had in the eCommerce space, where one of the things we did was remove /tag/ pages from the index. Check out the full case study here.

HTTP Pages

If your website isn't on HTTPS (you really should move over to HTTPS!), then it's a given that all of your HTTP pages will be indexed. However, if you've made the move to HTTPS, there's still a chance that some of the HTTP versions of your pages are indexed. To check this, use the following site searches: We can see below that the Zara website also currently has over 2k HTTP pages indexed by Google – these are unnecessarily wasting crawl budget and creating duplicate content issues.

Serving Both www. and non-www. Pages

If your website serves pages with www., then it's important that there aren't any non-www. pages being indexed by Google, as this causes further duplication. To check this, use the following site searches: If we look at the River Island website, we can see that by default it serves www. pages: https://www.riverisland.com/. However, there are still almost 56k pages without www. indexed by Google. Having this many duplicate pages can be incredibly problematic and can impact a website's performance in the search engine rankings.

eCommerce Empty Category Pages

As a customer, one of the worst feelings is landing on a category page only to find it empty, with nothing to browse or buy.
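To complement the manual site: searches above, the robots.txt rules from earlier in this audit can also be verified programmatically with Python's built-in urllib.robotparser. The URLs and expectations in this sketch are illustrative; swap in your own domain and the paths your robots.txt is supposed to allow or block.

# A sketch using the standard library's urllib.robotparser to confirm that
# key pages stay crawlable and sensitive ones stay blocked. Example URLs only.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

checks = {
    "https://example.com/": True,            # should be crawlable
    "https://example.com/checkout/": False,  # should be blocked
}
for url, should_allow in checks.items():
    allowed = rp.can_fetch("Googlebot", url)
    status = "OK" if allowed == should_allow else "MISMATCH"
    print(f"{status}: Googlebot {'can' if allowed else 'cannot'} fetch {url}")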


Site Speed Optimization in 2024: The Definitive Guide
https://diggitymarketing.com/site-speed-optimization/ — Mon, 12 Apr 2021

Want to improve your website speed and boost your site rankings? Looking to improve your site's Core Web Vitals? Are you confused by all the conflicting information out there on website speed testing and how to speed up your WordPress site? This is why I've put together this in-depth guide on website speed optimization…

My team and I have optimized over 4,000 sites since we started offering website speed as a separate service under our WPSpeedFix.com brand. That's a pretty decent sample size. In this post, you will learn in detail the state of website speed optimization and where things stand today. You will find out how to speed up a website's performance and, by the end, I'm hoping that you'll have some new insights you won't find anywhere else on the web. Read on to find out more…

Quick Summary

Where To Start

You'll see action steps throughout this post, and I'll link to various resources on website development around the web. I've written this with as little fluff as possible and in a way that each section can be digested on its own, so you can run through that section's action items quickly and easily. I've also included audio in various sections as an easier way to digest this info, as it can be somewhat technical at times.

As a starting point, run your site through our free speed test at https://sitespeedbot.com. It'll take 60 seconds, will give you an initial baseline to start with, and will provide you with insights you probably won't have seen before. Run it a few times and from multiple server locations so you get a good baseline average to start you off. You can track the history at https://app.sitespeedbot.com/domain/YOURDOMAIN.COM

What Is Page Speed or Site Speed?

Page speed and site speed are essentially the same thing and refer to how fast pages load. This is not to be confused with Google PageSpeed Insights or PageSpeed Score, which is Google's own speed test and scoring system for how fast a site loads. Site speed is one of the big things in SEO right now, even more so with Google rolling out the Core Web Vitals metrics, which have made speed metrics more visible in Google Search Console. This video outlines everything you need to know about Core Web Vitals:

There are lots of different tools for measuring website performance and several different timings that measure speed. We'll get into the specifics of what the different metrics mean and which are actually important further down in this article… Also, regardless of what terminology you use, one key thing most people miss when looking at the speed of their site is that the speed of all pages matters, not just the homepage!

How Site Speed ACTUALLY Impacts SEO

This is probably the real reason why you're here and reading this, right? You want better Google rankings and more organic traffic. But first, let me address a myth… Anywhere you see SEO advice dispensed these days, it's treated as a given fact that faster websites rank better. About half the inquiries we get at WPSpeedFix look something like the screenshots below.
Someone will ask us for help with website speed testing because they want "better SEO." The frustrating thing with a lot of these inquiries is that most of them have crappy on-page SEO: missing title tags, missing meta descriptions, and so forth, with even crappier content and next to no keyword-content matching. Speed probably won't make a dent overall if your SEO sucks! You will learn about why this is and how the theory of constraints can help you make more money in a few sections below… But first, before putting massive amounts of time and effort into improving site speed, ask yourself: Below, I address the real reason why faster websites rank better and how you can improve your speed and rankings. Here are the three ways "site speed" will help your SEO. I put site speed in quotes because technically, two of these problems aren't site-speed factors; they're more a case of poor website speed being a symptom of these problems.

1. Uptime and reliability
Uptime and reliability is probably the most important of the three here and often the most overlooked. People go wrong on this one all the time in speed land. The uptime and reliability of your site and hosting provider REALLY matters. This makes logical sense when you think about it for a minute – downtime is essentially zero speed; therefore, we should probably start our website speed optimization process by making sure we have as little downtime as possible. When I see people talking about speed improving rankings, it's most often when they move from a bad host to a good host. They then see their speed improve, see their rankings, organic traffic and conversions improve, and determine that speed is clearly better for SEO. What's actually happened is they've improved their site's reliability, and Google can now crawl it without getting frequent DNS errors or intermittent downtime in the form of 502 and 504 errors. In most cases where there's a clear organic traffic and rankings improvement after website speed optimization work, it's due to the site's reliability being improved in some meaningful way, not because the site is faster. Secondary to that, the other most common reason is that canonical issues with the HTTP and HTTPS versions of the site, or www and non-www, were fixed as part of the speed optimization work – which arguably should have been fixed already if technical or on-page SEO work had been done on the site. So based on this insight, here are some simple action items we can take to improve website performance:

Get uptime monitoring set up for your site… and if you're highly monetized, monitor the top 5-10 pages. Uptimerobot.com has a free plan that checks on 5-minute intervals, so there are zero excuses not to get this set up. We use the paid plan, which checks at 1-minute intervals, and it's still dirt cheap. If you want to take this up a notch, services like Littlewarden.com and Domaincomet.com are useful for monitoring things that also impact uptime, e.g. SSL certificates and domain expiries. If you want to really throw cash at this, Uptrends.com would be the next level again.

Use good hosting close to your primary target market. Every day we see people using Bluehost, Hostgator, Godaddy, and other garbage low-quality web hosts, often in the US when the target market is in Europe or Australia.
A high quality host is the absolute minimum bar to entry here – WPX Hosting is good quality, dirt cheap, and one of the best hosts for affiliate marketing. If you're running a site that needs more processing power, e.g. WooCommerce, then Cloudways is typically the hosting we recommend. If you want to get super technical, Gridpane, Runcloud, or Wordops coupled with a VPS from Vultr, Linode, AWS, or Google will allow you even more granular control over your hosting. These combos will enable you to squeeze out every last drop of website performance. That said, you should probably work on your SEO instead of messing around with dedicated servers – refer to the theory of constraints section further below.

DNS hosting speed and quality really matters. It's possible to have good quality web hosting but still be using shitty DNS hosting, and this will have a massive impact on your site's overall reliability. DNS hosting turns your www.domain.com into an IP address so the browser can find your web server and connect to it. The first thing the browser does when you type in an address is a DNS lookup. The speed of this lookup is critical and hugely overlooked. Slow DNS hosting can take 0.5-2 seconds to answer a query, which means your site takes 0.5-2 seconds to load regardless of how good your hosting is. We use Cloudflare for DNS hosting (it can be used with all other features off) as it's consistently the fastest DNS worldwide as tested by DNSPerf. Because DNS performance is so overlooked, our speed test tool SiteSpeedBot checks DNS hosting speed as one of its metrics. A common mistake we see all the time is using the default DNS hosting provided by your domain registrar. Don't make this mistake! The worst-performing DNS hosting we see is typically run by IT support companies trying to squeeze a few bucks out of their clients by charging for this service. If you do client work you'll probably see this more often than not.

Backup your site and ideally use two backup systems. Over the lifetime of a site, data loss is almost inevitable, and often that means downtime. You can minimize downtime by having a quality backup solution in place that you can quickly restore from. For WordPress, always use two backups, one provided by the host and then Read More
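On the DNS-speed point above: a quick way to sanity-check your lookup time from your own machine is the standard dig utility (assuming you have it installed; the domain is a placeholder). Run it a few times, since the first answer may be slower before resolvers cache it:

dig www.example.com | grep "Query time"
;; Query time: 24 msec

If you consistently see several hundred milliseconds here, your DNS hosting is likely the kind of bottleneck described above.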


45.99% Earnings Increase in 5 Months for a Digital Infoproduct [SEO Case Study] https://diggitymarketing.com/infoproduct-seo-case-study/ Mon, 11 May 2020 04:19:23 +0000 http://diggitymarketing.com/?p=512380

You're about to get the strategy behind one of the most challenging SEO campaigns my SEO agency has ever run. Why was it so challenging? 3 reasons: First, the niche is massively competitive: a make-money-online infoproduct in the financial niche. Nuff said. Second, we only had 5 months to pull this off. Third, just like any other client, they were extremely hungry for results and demanded quality work. In the case study below, you're going to learn the technical playbook, the onsite content strategy, and the link building techniques we carried out to get this 45.99% revenue growth win for this infoproduct business.

The Case Study
Our client takes advantage of the wide reach of the interwebs to teach his students how to earn money trading online. We're talking currencies, forex, stock markets, crypto, etc. The business' revenue is generated solely through the sale of digital download products – in this case, trading guides in an ebook format and video trading courses. When the owner of this profitable business (which had already built some authority in the niche) approached The Search Initiative (TSI) about helping to grow their organic reach and find new students, we were excited to take on the challenge in one of the most competitive spaces there is. To accomplish this, the game plan was to focus hard on a quick-win strategy, while setting the stage for long-term gains post-campaign. Our strategists were certain that the value we could provide would have a considerable impact on his business' bottom line. How? Because… Over the course of the campaign, our technically-focused SEO strategies were able to grow organic traffic by 23.46%. But what did the most for the client's business was the 45.99% increase in the number of conversions, comparing the first vs the last month of the campaign. Sales went up from just over 2,100 a month to 3,095 – this really bumped their monetization. And we did it in time. These gains were achieved within only 5 months of the client signing with TSI and our team starting the campaign. Here's how we did it…

The SEO Playbook for Infoproduct Websites
Phase 1: A Comprehensive Technical Audit
I've said this in every TSI case study we've published so far… and I simply cannot emphasize it enough: before you begin any campaign, always start with a full technical audit. Starting with…

Page Speed
First, our technical SEO strategists started at the bottom of the client's tech stack… and you should too. This starts with you digging into the web server's configuration and running a series of tests to measure the site's speed. This enables you to ensure that the performance of the web server itself isn't causing a penalty or disadvantage on either desktop or mobile connections. So, what tests do we run?

PageSpeed Insights (PSI) – this should be everyone's go-to tool and shouldn't need an explanation.
GTmetrix – it's good to cross-check PSI's results, therefore we use at least one other tool. In reality, we use GTmetrix together with Dareboost, Uptrends, and Webpagetest.
HTTP/2 Test – HTTP/2 is becoming a standard that can greatly improve your page speed, hence it's definitely worth looking into.
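If you'd rather check HTTP/2 support from the command line than a web tool, curl can do it; a quick sketch, assuming a curl build with HTTP/2 support and a placeholder URL:

curl -sI --http2 https://www.example.com/ | head -1
HTTP/2 200

If the first response line comes back as HTTP/1.1 instead, the server (or the CDN in front of it) isn't negotiating HTTP/2.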
If you’re not HTTP/2 enabled, you might want to think about changing your server or using an enabled CDN.You want to see this: Performance Test – I know it might sound like overkill, but we included this in our test suite earlier this year and use it for the sites that can expect higher concurrent traffic.We’re not even talking Amazon-level traffic, but say you might get a thousand users on your site at once. What will happen? Will the server handle it or go apeshit? If this test shows you a steady response time of under 80ms – you’re good. But remember – the lower the response rate, the better! In cases where transfer speeds or latency are too high, we advise you (and our clients) to consider migrating to faster servers, upgrading to better hosting or better yet, re-platforming to a CDN. Luckily, most of the time, you can achieve most of the gains through WPRocket optimization, as was the case with this case study. Your Golden WPRocket Settings Cache → Enable caching for mobile devices This option should always be on. It ensures that your mobile users are also having your site served cached. Cache → Cache Lifespan Set it depending on how often you update your site, but we find a sweet spot at around 2-7 days. File Optimization → Basic Settings Be careful with the first one – it may break things! File Optimization → CSS Files Again, this section is quite tricky and it may break things. My guys switch them on one-by-one and test if the site works fine after enabling each option. Under Fallback critical CSS you should paste your Critical Path CSS which you can generate using CriticalCSS site. File Optimization → Javascript This section is the most likely to break things, so take extreme care enabling these options!! Depending on your theme, you might be able to defer Javascript with the below: Note that we had to use a Safe Mode for jQuery as, without this, our theme stopped working. After playing with Javascript options, make sure you test your site thoroughly, including all contact forms, sliders, checkout, and user-related functionalities. Media → LazyLoad Preload → Preload Preload → Prefetch DNS Requests The URLs here hugely depend on your theme. Here, you should paste the domains of the external resources that your site is using. Also, when you’re using Cloudflare – make sure to enable the Cloudflare Add-on in WPRocket. Speaking of Cloudflare – the final push for our site’s performance we managed to get by using Cloudflare as the CDN provider (the client sells products worldwide). GTMetrix If you don’t want to use additional plugins (which I highly recommend), below is a .htaccess code I got from our resident genius and Director of SEO, Rad Paluszak  – it’ll do the basic stuff like: GZip compression Deflate compression Expires headers Some cache control So without any WordPress optimization plugins, this code added at the top of your .htaccess file, will slightly improve your PageSpeed Insights results: Internal Redirects You know how it goes – Google says that redirects don’t lose any link juice, but PageRank formula and tests state something different (there’s a scientific test run on 41 million .it websites that shows PageRank’s damping factor may vary). Whichever it is, let’s take all necessary precautions in case there is a damping factor and redirects drop a % of their link juice. As we investigated the configuration of the server, we discovered some misapplied internal redirects, which were very easily fixed but would have a considerable effect on SEO performance – a quick win. 
Internal Redirects
You know how it goes – Google says that redirects don't lose any link juice, but the PageRank formula and tests state something different (there's a scientific test run on 41 million .it websites that shows PageRank's damping factor may vary). Whichever it is, let's take all necessary precautions in case there is a damping factor and redirects do drop a % of their link juice. As we investigated the configuration of the server, we discovered some misapplied internal redirects, which were very easily fixed but would have a considerable effect on SEO performance – a quick win. You can test these redirects with a simple tool, httpstatus.io, and see results for individual URLs: But checking URL by URL would be the long way round, right? So your best bet is to run a Sitebulb crawl, head over to the Redirects section of the crawl and look at Internal Redirected URLs: There you will find a list of all internally redirected URLs that you should update to point at the last address in the redirect chain. You might need to re-run the crawl multiple times to find all of them. Be relentless!

Google Index Management
Everyone knows that Google crawls and indexes websites. This is the bare foundation of how the search engine works. It visits sites, crawling from one link to the other. It does this repetitively to keep the index up-to-date, as well as incrementally, discovering new sites, content, and information. Over time, crawling your site, Google sees its changes, learns its structure and gets to deeper and deeper parts of it. Google stores in its index everything it finds worth keeping; everything considered useful enough for the users and Google itself. However, sometimes it gets to pages that you'd not want it to keep indexed. For example, pages that accidentally create issues like duplicate or thin content, stuff kept only for logged-in visitors, etc. Google does its best to distinguish what it should and shouldn't index, but it may sometimes get it wrong. Now, this is where SEOs should come into play. We want to serve Google all the content on a silver platter, so it doesn't need to algorithmically decide what to index. We clean up what's already indexed but was not supposed to be. We also prevent pages from being indexed, as well as making sure that important pages are within reach of the crawlers. I don't see many sites that get this one right. Why? Most probably because it's an ongoing job and site owners and SEOs just forget to perform it every month or so. On the other hand, it's also not so easy to identify index bloat. With this campaign, to ensure that Google's indexation of the site was optimal, we looked at these: Site: Search, Google Search Console. In our Read More


Ultimate Guide to Surfer Onsite Optimization in 2024 https://diggitymarketing.com/surfer-seo-guide/ Mon, 09 Mar 2020 07:45:14 +0000 http://diggitymarketing.com/?p=511034

Do you know the best thing about on-page SEO? Control. As an SEO professional with more than 10 years of experience in the field and founder of multiple 7-figure digital marketing businesses, I know firsthand the importance of on-page optimization in achieving higher search engine rankings. Even with a small budget, you can see real results with on-page optimization. You won't get the #1 spot overnight, but you can still rise in the ranks with fewer backlinks than your competitors. There are two ways to deal with on-page optimization: you can do it on your own and build great content from scratch, or you can use tools that speed up your optimization work. Like we do. In this guide, you'll learn more about Surfer, an SEO tool that specializes in on-page optimization. You'll also get a step-by-step guide on how to use the tool to create better content, fill content gaps, and rank higher for your chosen keywords.

What Is Surfer?
Surfer is an SEO tool that analyzes why top pages are ranking well for your keyword. Based on that information, you'll be able to figure out what you need to do to create content that will help you outrank your competitors. Surfer helps in two ways: creating/outsourcing new optimized content, and optimizing existing pages. The Surfer Content Editor analyzes your content structure, keyword usage/density, phrasing, and more, comparing them to high-ranking results for the same keywords. Then, it provides you with guidelines on how to build content that has the right structure and wording to show up on page 1. There's also the Keyword Analyzer, with Audit and Terms to Use features that help you optimize existing pages. Rounding out the Surfer SEO tool kit are the Keyword Research and Common Backlinks features – I'll discuss all of these in full detail later.

Surfer Recommendations In Action
How does this tool help you optimize your content? And what can you expect from Surfer? Let's take one of my own pages as an example. I have an SEO coaching landing page that had ranked #1 forever. All of a sudden, it dropped to #2. After plugging it into Surfer, I found out that my landing page content was too long… and thus I cut it in half. The Terms to Use feature also let me know that my word usage was off. Tweaking the landing page boosted it back to #1, and now it's at a similar length to other high-performing pages. Here's another example: in November, I turned my Affiliate Networks page into a blog post and adjusted the densities of relevant phrases based on Surfer's recommendations. The next day, I checked my keywords and saw this: My page jumped to the top three after my tweaks. Similarly, Matthew Woodward had an extremely comprehensive review of SEMrush that clocked in at a whopping 26,000 words and was ranked #7. Surfer told him to remove 22,000 words… or almost 85% of his content. While it sounds counterintuitive to reduce your long-form blog post to a "regular-sized" one, his review jumped to the #1 spot… the next day. These are three examples of Surfer giving you insights on pages that are currently doing well with their content so that you can use the same rewarding practices.
Correlation SEO In On-Page Optimization
Now that you know what Surfer can do, you're most probably thinking, "How does Surfer know which ranking factors are the most crucial for SEO?" Unfortunately, Google and other search engines aren't transparent about their algorithms. Enter correlational SEO. Correlation SEO analyzes various ranking factors in order to determine which ones have the biggest impact on ranking. Surfer's data comes from reverse engineering the search engine results page (SERP) – it looks at what top-performing pages are doing that you aren't. Instead of giving you vague advice ("long content is better than short content") or ballpark figures ("aim for 1,000-2,000 words per blog post"), Surfer provides recommendations that are based on pages that already rank for your target keywords. And this extends to more than just word count. Surfer also looks at what kinds of pages rank best (e.g. long-form vs. quick answers), what kind of media they contain (e.g. graphics, lists, etc.), what topics they cover, and what words and phrases are most commonly used. Surfer wasn't the first correlational SEO tool to hit the market. Cora and Page Optimizer Pro came earlier and are both exceptional tools as well. One of the major criticisms of correlation SEO is that correlation doesn't mean causation – just because a competitor is ranking while using certain practices, it doesn't mean that those practices are the reason they're ranking. But by optimizing your content so that it's similar to (but higher-quality than) the content that Google ranks at the top, you are more likely to take the top spot. Check this video to learn more about Google SEO ranking factors. The trick is to know which pages to compare yourself to, so that you aren't introducing the wrong kind of change.

Choosing The Proper Competitors For Your Work With Surfer
Although it sounds sensible to look at the top ten results for your analysis, you'll end up getting a lot of imprecise data and ineffective recommendations. John from Freedom Bound Business found that out the hard way. When he didn't pay much attention to picking the right competitors, his page dropped from rank 25 to rank 41. When he qualified competitors correctly, his affiliate review got bumped up from the second results page to the first. Source: https://www.freedomboundbusiness.com/surfer-seo-review/ What does this tell you? There is no point in comparing oranges to apples, and the same rule applies to competitor analysis. To get the most out of Surfer's correlational SEO tools, look into pages that are similar to yours, and don't compare yourself to websites that are not. Here's a quick guide to choosing the right competitors: Don't compare yourself to high-authority sites like Wikipedia or Amazon (unless you are a site of this level, of course). Find websites, pages, or competitors that are within the same niche or have the same format (e.g. review sites, blogs, etc.). Avoid listings and directories while optimizing for local SEO. Look at the word count of top pages and exclude outliers. Once you do the above, your data will be much more accurate.

Use Case No. 1: Building High-Quality Content From Scratch
Now that you know more about correlational SEO and Surfer, it's time to put your knowledge to the test. Create a new page for your target keyword using Surfer's Content Editor tool.

Automate Your Content Brief
Content Editor lets you create guidelines for your copywriter that include all your requirements, like keywords, topics, and optimum word count.
I’ll show you how to create comprehensive briefs that outline your requirements and make it easy for your writers to understand. Preparing a good brief the traditional way is a lot of work. You have to manually research your competitors, extract some basic data about keywords, and follow good SEO practices—things that can take a lot of time. Here’s an alternative way: 1. Type Your Keyword And Location When working with Surfer, you always start with a keyword and location. In this case, our target keyword is “cordless circular saw” and our location is the United States. You can also choose to turn on NLP Analysis for more phrases and words suggestions from Google API (more on this later). Once the analysis is done, you can find your query in the history log below the input. Open it to access a customization panel. 2. Choose Pages To Compare Against The customization panel has five sections: pages to include, content structure, terms to use, topic and questions to answer, and notes. Let’s start with the “Pages to include” section. By default, Surfer checked the top five pages. These top five pages are your benchmarks. Pick URLs that are organic competition for your page. Exclude pages that rank high because of their extremely high authority, pages for different business models, and pages that target a different search intent. Also exclude word count outliers or pages that have word counts that are way shorter or longer than the others. Basically, everything I already told you about selecting comparisons. Here. Check out this example for an affiliate review: 3. Let Surfer Determine The Word Count Surfer automatically recommends a word count based on your chosen competitors, but you can also customize it if you prefer. However, if you chose your competitors wisely, there shouldn’t be much reason to adjust the number. Content length is critical—Surfer calculates phrase and keyword density based on it, so be cautious when modifying it! After you save your changes, the average length will appear in the requirements section. 4. Incorporate The Suggested Words And Phrases In addition to word count, Surfer also checks the top-performing pages for words and phrases relevant to your page. Surfer uses its own algorithms to reverse-engineer the top words and phrases that you should include in Read More Read More


E-Commerce SEO Case Study: How we 4x'd Traffic and Doubled Revenue https://diggitymarketing.com/e-commerce-case-study/ https://diggitymarketing.com/e-commerce-case-study/#comments Mon, 02 Sep 2019 08:41:18 +0000 http://diggitymarketing.com/?p=9815

Whether you're an e-commerce manager or an SEO specialist, you've invested a considerable amount of time and energy into working out the best-practice approach for tackling organic search for online stores.

An E-Commerce SEO Strategy Walk-Through
In this case study, I'll be showing you how my agency The Search Initiative was able to double revenue by building a custom strategy for one of our e-commerce clients, who operates within a small B2B furniture niche. My goal with this case study is to introduce you to a wide range of new ideas that will help you to expand and improve your e-commerce SEO game and better serve your customers. You'll learn the strategies we used to improve UX, technical stability, onsite optimization, content, and of course backlinks. The approach that I will detail saw our e-commerce client grow their traffic by a massive 417% in 8 months. It also earned them $48k in consistent additional monthly revenue. This took them from generating $43k a month to $91k a month, or a 112% increase in overall revenue.

The Challenge
Our client is in the B2B furniture and equipment business and they offer their products only within specific locations in the UK. As well as offering their products for sale to clients, they also offer their products for hire. The client came to us with a solid foundation. They had an existing e-commerce business, a solid website, and a great brand. However, when setting up their company, SEO hadn't been a top priority.

Establishing E-commerce E-A-T (Expertise-Authority-Trust) & Earning Backlinks
If you have a high-quality site and a keen desire to establish your brand (like our client does), your approach needs to be particularly focused on sustainable, long-term growth. You need to create quality content that represents the brand well and earn backlinks naturally. In addition, focus on signalling trust in the online store and the brand by demonstrating transparency and authority. We'll get to this later. Here's how we did it…

Step 1 – E-commerce User Experience
To enjoy the benefits of some quick wins, first focus on the low-hanging fruit.

User Experience
The client came to us with robust branding already established and a professional-looking website, but we were able to identify a few small tweaks that created a significantly better experience for potential customers.

Visual Changes
Optimize visitor experience by adjusting color contrast (here's a couple of great tools for choosing brand colors and color contrast), adjusting the placement and selection of images, and adding zooming and scaling images to product pages to further improve user experience and increase the likelihood of generating a conversion.

Mobile Optimization
The majority of Internet traffic now originates from mobile devices, so local and mobile optimization are now crucial for small businesses. Make these small changes to your site that make a big difference to those viewing on mobile: making phone numbers clickable; making email addresses clickable; increasing the font size to a minimum of 16px for mobile users, as you can see in the screenshot below. These small tweaks contributed towards significantly increased conversions on mobile.
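For the clickable phone number and email items above, the mechanics are just standard HTML anchors plus a CSS floor on font size; a minimal sketch with placeholder contact details (not the client's real markup):

<a href="tel:+441234567890">Call us: 01234 567890</a>
<a href="mailto:sales@example.com">sales@example.com</a>
<style>
  /* Keep body text readable on small screens */
  @media (max-width: 480px) {
    body { font-size: 16px; }
  }
</style>

The tel: and mailto: schemes are what let mobile browsers hand the tap straight to the dialer or mail app.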
Step 2 – Technical Auditing
The foundation of any SEO strategy is technical optimization of the website, and since we were dealing with an e-commerce site with many functions and pages, there were plenty of opportunities to identify and resolve technical problems. They are as follows…

Google Index Management
This included removing all traces of their old website from the Google index, removing duplication in their category pages, managing index bloat, adding their XML sitemap to the robots.txt, and removing now-defunct meta keywords from their site. For example, the client's login pages were indexed. In some cases, this type of unnecessary indexing can cause more valuable pages to be pushed out of the search results, or skipped over in a routine crawl, thus diluting your message.

HTTP Pages and URL Parameters
We also found HTTP pages and URL parameters in the index. URL parameters are parameters whose value is set dynamically in a page's URL. For example, you may have a search bar on your website where customers can search your catalog. Whenever customers do an internal search, new URL parameters will be created, which ends up bloating the index with a bunch of URLs like: website.com/category/search?pink+scarf In order to make it clear to Google's search algorithm what the different URL parameters do, specify them in Google Search Console.

Cleaning Up Legacy Strategies
Next, we looked at any technical issues caused by legacy strategies and began to clean them up. One example of an issue was that the site included meta keywords on its pages, which have been considered defunct since Google confirmed that these self-defined keywords hold no weight in their algorithm. Worse, competitors could look at your meta keywords and find a perfect list of the keywords and niches that you're targeting. We then looked at how the client's CMS might be causing issues without them even knowing it.

Managing Magento 2
Our client's site is built on the popular Magento 2 ecommerce website builder, which is notorious for not having a robots.txt and sitemap.xml configured out-of-the-box. We created the sitemap ourselves using the Screaming Frog web crawler, added it to robots.txt, and submitted it to Google Search Console, thus helping Google's search algorithm to better understand the layout of our client's site. Finally, we dealt with a considerable site-wide issue. The site used canonical tags that were meant to be self-referencing, but were actually canonicalizing to different pages. This is suboptimal because it confuses Google's web crawler bots, making it a mess when trying to rank. We cleaned it all up, so that Google knew exactly which pages should rank.

Step 3 – Internal Link Building
Once you have done a technical audit, earned some quick wins and solved some user experience issues, start to think about improving the internal link structure.

Adding Internal Links To Existing Content
Quickly, we noticed that while the client did have a blog on their domain, there was very little content on it and much of it was out of date. Also, there weren't many links between their blog and their category and product pages… a huge opportunity for spreading link juice and establishing topical relevance. Our plan was to create more high-quality blog content and expand its scope, allowing us to build more internal links to relevant product and category pages.
We drew up a content strategy that involved producing a consistent number of new content pieces each month and went back through each old blog post, updating them with relevant links to product and category pages. We'll get to the content plan in more detail later, but for now, let's really dig into internal linking.

E-commerce Topical Clustering
Create "topical clusters", which can be thought of as groups of pages that talk about different elements of the same key topic. For example, "protein powder" might be the topical cluster. It would be made up of a cornerstone article that you hope to rank for the keyword "protein powder", as well as several other articles talking about sub-topics of "protein powder". Some examples could be "How to Make Pancakes from Protein Powder", or "Can Protein Powder Help you Lose Weight?" or "10 Side Effects of Synthetic Protein". You would then create a content piece for each of these sub-topics and have each one link to the cornerstone article using anchor text close to "protein powder" (see the sketch after this section). Using this technique, you're able to pass value from the smaller articles to the main piece and have a better chance of ranking the main piece for "protein powder" in Google Search. From these cornerstone articles, we were then able to link back to category and product pages, increasing their perceived authority too.

Step 4 – Content Strategy
Before you can implement a solid external backlink building strategy, you need to create a bedrock of content to be used to support your outreach. I suggest giving your writers the following guidelines for creating content.

Evergreen, Algorithmically Optimized Content
Focus on evergreen content, preferably creating linkable assets such as infographics, slideshows or documents containing industry insights. An example of an evergreen topic would be "why ergonomic chairs are good for your back". Conversely, "the best chairs in [year]" would not be evergreen, as it will obviously lose its relevance at the end of the year. In the same line of thought, avoid using dates in the page title, headings or URL. Look at the people ranking on page 1. Ask yourself: How many words did they write? Find the average and add 20% more. What sub-topics did they cover? When discussing "How to lose belly fat", you'll see that it's necessary to talk about "avoiding trans fats". Do the same. What kind of layout are they going for? Are they presenting in tables? Do the same. And don't forget: write in an easy-to-read manner, and without any grammar mistakes.

E-A-T and E-commerce Content
Create content that references your products and services so that you can funnel users to your Read More
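Here is the sketch promised in the topical clustering section above: the mechanics are plain internal anchors from each sub-topic post to the cornerstone page. The URLs and anchor text are placeholders, not from the client's site:

<!-- In "How to Make Pancakes from Protein Powder" -->
<a href="/protein-powder/">best protein powder</a>
<!-- In "Can Protein Powder Help You Lose Weight?" -->
<a href="/protein-powder/">choosing a protein powder</a>

Varying the anchors while keeping them close to the target term passes relevance to the cornerstone without looking over-optimized.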


Case Study: A 4.5x Organic Traffic Increase Using (What?) Page Rank https://diggitymarketing.com/page-rank-case-study/ https://diggitymarketing.com/page-rank-case-study/#comments Mon, 04 Mar 2019 09:55:34 +0000 http://diggitymarketing.com/?p=7421

Introduction
I've been a director at The Search Initiative for a while now. We've had some crazy results for a whole bunch of clients – all in very different niches. I'm going to share with you how my team and I took a low-authority website and put our foot on the gas to get it moving – fast! I truly believe there was a key ingredient that accelerated this website, but I'm going to share the whole process from the start. Why? Each website is going to be different, so you need to figure out what your site needs. You need to go through the process. Here's a sneak peek of the growth. Now learn how we did it…

Initial Analysis
Since starting TSI's organic SEO services, I've realized that working with my own sites is hugely different from working with clients, especially if the website has weak foundations. I know how I want my money sites to look, so I build them using rigorous attention to detail. But if you take a website that's been developed without a certain level of SEO knowledge, there's normally quite a lot of on-site and off-site work to fix. Here's how my team broke down the initial analysis: Keyword Research, On-Site Audit, Backlink Audit, Competitor Analysis.

Keyword Research
My team tackled keyword research with two main workflows: one is used to monitor the health of a website, and the other is for content gap analysis. When we're looking to track keywords for a website, we want to track some of the core terms, but also terms that are having problems. If a term is suffering from keyword cannibalization that we're trying to fix, it's worth tracking it daily until the issue is resolved. Since this client needed a huge content strategy, we did both a health check and an initial content gap analysis. This approach included breaking down all keywords for that industry into topics of relevant terms. In total, this process took over 20 hours and included thousands of keywords chunked into neat topics. This work later helped with choosing page titles, headings and content. Here's an example of how we did it: Step 1. Search Broad Keywords. Step 2. Review Parent Topics. Step 3. Find Competitors for Parent Topics. Step 4. Reverse Engineer Competitors' Keywords. Step 5. Exclude Outdated Keywords. There is also the option to export all of these keywords into Excel documents and then filter them that way. But most of the time, a lot of the top keywords are fairly similar. Here's an example for the best dog food term: best dog food, best dog foods, healthiest dog food, what is the best dog food, top rated dog food, best food for dogs. While each keyword is unique, they all follow a singular intent: the users are interested in finding out what the best dog foods on the market are.

On-Site Audit
Finding all the technical and content issues with a website requires a full on-site audit. However, while big reports are easy on the eyes, it's small changes that make the difference. We audited the website and found a whole bunch of technical issues, from lack of breadcrumbs, poor internal link structures, and bad quality anchor text to unoptimized titles. A full on-site audit tutorial is too big for this post (perhaps coming soon), but here are some quick tips:

Screaming Frog – A cheap way to regularly crawl your website. There are lots of ways to find errors, redirects, and missing metadata.
You can also use a custom search to find all references to your keywords.
Sitebulb – This tool is more expensive and has a monthly recurring fee. However, it gives you lots of extra data that would be impossible to spot manually and hard to find with Screaming Frog. An example would be empty hyperlink references.
Site Search – By using Google's site search (site:domain.com) and operators, you can find hundreds of issues with index management, outdated page titles, and multiple pages targeting the same keyword. There are a lot of quick wins here.
Page Titles – If you wrote your page titles 1-2 years ago, you may find that they're outdated now. A quick site search with "intitle:2018" will find all your content that is either not updated or not yet crawled by Google.
Internal Links – A major way to pass relevance signals and authority to your core pages is through internal links. Make sure that your pages are well interlinked and you're not using low-quality anchors from your power pages, such as "click here" or "more information".

We focused on fixing around 5 issues at a time, varying from small changes like improving accessibility, to bigger changes like introducing breadcrumbs for a custom-built website.

Backlink Audit
The website had a relatively small backlink profile, which meant it lacked authority, relevance signals and entry points for crawling. It also meant that a full in-depth link analysis was unnecessary for this campaign. In this instance, the initial check revealed there was nothing to be concerned about, so we moved on to technical implementation as soon as possible. Had the website experienced problems with the link profile, we would have done a full backlink audit to try and recover from this. Here's what to look out for:

Link Distribution – Pointing too many links toward internal pages instead of your homepage can cause lots of issues. So make sure that you're not overdoing it.
Anchor Text Analysis – Using exact match, partial match and topical anchors is a great way to pass third-party relevance signals. Too many and you'll be caught out for over-optimizing, but too few and you won't be competitive. Read more about anchor optimization.
Referring IP Analysis – There are a finite number of IPv4 addresses, so this isn't often a big cause for concern. However, it's worth making sure that you've not got too many links from the same IP address.
Autonomous System Numbers – Since a server can be assigned any number of IP addresses, these systems often include an ASN. This is another way that Google could flag large numbers of websites from the same origin.

My team did a case study on how to remove an algorithmic penalty; a lot of these audits come included in any penalty removal campaign.

Competitor Analysis
The difference between a search analyst and a data scientist is how you approach the search engines. An analyst is focused on reviewing the SERPs and finding what is working best today, while a data scientist wants to understand how things work. We built our team to include both, since competitor analysis requires a keen eye for reviewing the SERPs and algorithm analysis requires solid data scientists. If you want to do SEO at a high level, you've got to constantly be reviewing competitors using various analysis tools. You will notice that tons of best practices get ignored in the top positions, and the devil is in the details. In this instance, we found that both more content and more links would be required for long-term success.
Content Strategy
Building any long-term authority website in competitive industries requires both an authoritative link profile and a content plan. My team reviewed their existing content, looked at how other websites in their industry wanted to help users, and then addressed these four cornerstones:

User Intent – before we did anything, we wanted to nail the user intent on every page. This research meant that we identified three pillars of content for their site. We'll get into this in further detail below.
Service Pages – these pages were dedicated to explaining what service was offered, how to get in touch, and what was included with that offering.
Blog Content – these posts were dedicated to providing non-commercial, informative content that was interesting to the reader.
Resource Center – this section was dedicated to giving basic information about topics in their industry. Instead of using Wikipedia for all our links to authority content, we wanted to use internal links instead.

Here's a little bit about each section and our strategy for them:

User Intent
The biggest mistake I see all the time is the simplest thing to check: what types of content is Google ranking in the top 10 positions? If you're serving 10,000 words of content in a huge blog post, but Google is only interested in serving service pages with 50 words of content, you've missed the point. Another common problem we find at The Search Initiative is including too much content in a single post when your competitors have several shorter posts. One of the main attractions in Thailand is yoga retreats. If you're searching for this (yoga retreats) in America, you're expecting to find destinations. Let's take a look: The first position is called Yoga Journal and includes almost no content aside from images and headings. That's exactly what the users were looking for. There are other websites offering a similar service that can help you make bookings. While others Read More


TF*IDF for SEO: Test Results and Tool Comparison https://diggitymarketing.com/tfidf-for-seo/ Mon, 07 Jan 2019 08:08:38 +0000 http://diggitymarketing.com/?p=6293

You may have seen the term TF IDF being tossed around in the last year or so, but no one could blame you if you haven't started paying attention yet. A lot of SEO fads come and go, and some of the most interesting concepts just end up attracting penalties later on, right? But TF IDF is something a little different. It's not a manipulation of Google search results; it's a method of analyzing the topics in content, and it's built on the same principles as the search engines themselves. Because of that, it has amazing potential for SEOs who need a truly objective method to measure and improve content. I just recently wrapped up a case study into exactly what it's capable of, and the results are quite interesting. In case some of you are where I was only a few months ago, I want to make sure that I cover what I learned about TF IDF and how it's used before I get to what I learned from my personal experiments with it. The crash course starts in the next section, but if you're an experienced user already, you can find the results of my personal tests and some comparisons of the top TF IDF tools near the end. Looking forward to your questions and comments.

What is TF IDF?
Term Frequency times Inverse Document Frequency (TF*IDF) is an equation that uses the measurement of how frequently a term is used on a page (TF), and the measurement of how rarely that term appears across all pages of a collection (IDF), to assign a score, or weight, to the importance of that term to the page. I know… nerd alert, right? We'll look at why this is so important to SEOs in a bit, but first, let's look at where it came from. The equation has a very long history in academia, where researchers in fields as diverse as linguistics and information architecture have used it to analyze massive libraries of documents quickly. It's also used by information retrieval programs (including all search engines) to sort and judge the relevance of millions of search results efficiently. There is an important difference between what you want to do and what the search engine wants to do with this same information. The search engine wants to consider a collection made up of all the search results on the web, while you want to compare one page or website to just the sites that are out-performing it… namely the top 10. Let's look at TF and IDF in more depth…

The Equations that take you to TF*IDF
You need to do a little more math to get both of the measurements involved, that is, TF and IDF, but I promise it won't be difficult. Depending on the application, the equations for TF IDF can get a lot more complicated than the examples I'm using below. Simplified or not, you generally don't want to be caught doing this work by hand if you're trying to optimize a site. These equations will help you understand how TF IDF functions, but it's the content optimization tools I'm discussing at the end that really open up the potential. Solve the first one, Term Frequency, by doing a raw count of the number of times a term appears on one page. Then, plug that number into the equation below:

Term Frequency (term) = (raw count of term) / (total word count of document)

Alone, the TF score can tell you whether you're using a word too rarely or too often, but it's only really useful when weighed against the other measure.
Calculate the Inverse Document Frequency by dividing the total number of documents in the chosen collection by the number of documents the term appears in, then taking the log of the result, like so:

Inverse Document Frequency (term) = log (number of docs / docs containing keyword)

With the IDF score, you can now measure the importance of a target keyword/phrase to a page, not just its number of uses. This is important because it's putting you in the mindset of the people who are building search engine algorithms.

Why Does TF*IDF Matter to SEOs?
TF*IDF matters because, with the help of this equation, SEOs are able to assign an actionable relevance score to their content. Using the TF IDF tools available, you can then compare your scores to the scores of the top-performing pages for any term. By grading pages on this measure, you can nearly pull back the curtain on how Google might grade sites dedicated to the same topic. It's unknown if Google uses TF IDF in its algorithm, and if it does, whether it's a mutated form of it or not. That said, there have been some private correlation studies that I've been privy to whose data suggests that it's likely. TF IDF analysis allows you to optimize the balance of terms in your content according to what is already being rewarded by the algorithm. That's huge for the SEO community because it marks the return of something all the old hats knew and… loved?

Keyword Density Returns?
Nope. No one loved the days when keyword density reigned. However, using TF IDF for SEO could mark a return to the primacy of phrases and keywords as an important marker – just in a very different way. Keyword density as an SEO strategy was an early attempt to game out how Google was really using TF IDF for its indexing and recall. People were keyword stuffing their blog posts, so then algos and filters came out to combat it (hi, Panda). So, in a way, keyword density is back. It ran away from home as a surly teen and has returned as a mature adult with a degree in the sciences. It was an early, limited tactic that mostly encouraged bad habits. Measuring keyword use with TF IDF will give you an idea of balance (at least as far as the top search results are concerned). It reveals what terms are considered natural, in a very precise way.

Using TF IDF Analysis to Enhance Keyword Research
TF IDF analysis goes a step further than just the density of keywords. In this way, it opens you up to insights about whole families of words on a website, which can take your keyword research to the next level. For example, imagine that you've already completed keyword research to optimize a page for "DUI lawyer Chicago". Most research tools for keywords will spit out keywords like "DUI lawyer in chicago", "chicago DUI attorney", etc. When you use the TF IDF tools that I'm covering later on, you'll also be able to find related non-SEO terms that are being used by the top-ranked pages that you would never have found before using normal keyword research. Terms like "legal", "experienced", "rights" and "practice". These words wouldn't have shown up in keyword research tools because the articles themselves aren't ranking for them, yet they're needed to tell the story of the search intent. Let's put the equation to use. Fortunately, you won't need to do it by hand for your sites. There's always a tool to use, and you're only a few scrolls from seeing the on-page SEO tools I've tested for results.

Putting TF*IDF Analysis to Use
Oh, no. More math.
At this point you may be having high school flashbacks, twisting around in your chair looking desperately for the wall clock that will tell you when you're free. Don't worry, this time, I'm going to do the math. Immediately after this, we'll get to the juicy stuff: how to use TF IDF for SEO purposes. Let's take a look at the equation in action… Say that a document, such as a client's landing page you're examining, contains the keyword "PPC" 12 times, and is about 100 words in length. If you wanted to begin analyzing this piece of content, you would begin by plugging that into the term frequency equation from earlier.

TF (PPC) = 12 / 100 = 0.12

Now, say that you wanted to understand how this usage compared to the usage of this term on the rest of the web. From a sample size of 10,000,000 pages, at least some of these pages are going to be about web services and will include references to PPC. Let's say 300,000 of them. We can use those numbers to finish the Inverse Document Frequency equation.

IDF (PPC) = log (10,000,000 / 300,000) = 1.52

Now you score your page based on that term with the TF*IDF equation:

TF*IDF (PPC) = 0.12 * 1.52 = 0.182

That's a great score. Or is it? The truth is, it's not really a matter of meeting a limit. You want to bring your score for targeted terms into balance with the best-performing URLs on page 1. A high score for a certain term isn't necessarily a good thing (12 uses in 100 words is a lot, after all).

What about Common Terms like "the" and "of"?
You may be wondering, what about the noise? What about all Read More
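If you'd rather replicate the arithmetic above in code than by hand, here is a minimal sketch in PHP (matching the PHP snippets used elsewhere in this collection). The function names are illustrative, not from any library, and the numbers are the "PPC" figures from the worked example:

<?php
// Term frequency: raw term count divided by total words on the page
function termFrequency($termCount, $totalWords) {
    return $termCount / $totalWords;
}

// Inverse document frequency: log of total docs over docs containing the term
function inverseDocumentFrequency($totalDocs, $docsWithTerm) {
    return log10($totalDocs / $docsWithTerm);
}

// The worked example: 12 uses in a 100-word page;
// 300,000 of 10,000,000 docs contain "PPC"
$score = termFrequency(12, 100) * inverseDocumentFrequency(10000000, 300000);
echo round($score, 3); // 0.183 (0.182 in the article, which rounds the IDF to 1.52 first)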


The 4 Pillars of Mastering Google Website Crawl https://diggitymarketing.com/site-crawlability/ https://diggitymarketing.com/site-crawlability/#comments Mon, 04 Jun 2018 09:45:42 +0000 http://diggitymarketing.com/?p=5093

Foreword by Matt Diggity: In a quick moment I'm going to hand things over to Rowan Collins, the featured guest author of this article. Rowan is Head of Technical SEO at my agency The Search Initiative. He's one of our best technicians. Other than being well-rounded in SEO overall, Rowan is a beast when it comes to the technical side of site audits… as you'll soon learn.

Introduction: Rowan Collins
Without question, the most overlooked aspect of SEO is a site's crawlability and indexability: the secret art of sculpting your site for the Googlebot. If you can do it right, then you're going to have a responsive site where every small change can lead to big gains in the SERPs. However, if done wrong, then you'll be left waiting weeks for an update from the Googlebot. I'm often asked how to force Googlebot to crawl specific pages. Furthermore, people are struggling to get their pages indexed. Well, today's your lucky day – because that's all about to change with this article. I'm going to teach you the four main aspects of mastering site crawl, so you can take actionable measures to improve your site's standing in the SERPs.

Pillar #1: Page Blocking
Web crawlers are essential tools for the modern web, and Google assigns a "crawl budget" to each site. To make sure Google is crawling the pages that you want, don't waste that budget on broken or unimportant pages. This is where page blocking comes into play. When it comes to blocking pages, you've got plenty of options, and it's up to you which ones to use. I'm going to give you the tools, but you'll need to conduct an audit of your own site.

Robots.txt
A simple technique that I like to use is blocking pages with robots.txt. Originally designed after somebody accidentally DDOS'd a website with a web crawler, this directive has become unofficially recognized by crawlers. Whilst there's no ISO standard for robots.txt, Googlebot does have its preferences. You can find out more about that here. But the short version is that you can simply create a .txt file called robots, and give it directives on how to behave. You will need to structure it so that each bot knows which rules apply to itself. Here's an example:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://diggitymarketing.com/sitemap.xml

This is a short and sweet robots.txt file, and it's one that you'll likely find on your own site. Here it is broken down for you:

User-agent – this specifies which robots should adhere to the following rules. Whilst good bots will generally follow the directives, bad bots do not need to.
Disallow – this tells the bots not to crawl your /wp-admin/ folder, which is where a lot of important documents are kept for WordPress.
Allow – this tells the bots that, despite being inside the /wp-admin/ folder, you're still allowed to crawl this file.
The admin-ajax.php file is super important, so you should keep this open for search engine bots.
Sitemap – one of the most frequently left-out lines is the sitemap directive. This helps Googlebot to find your XML sitemap and improves crawlability and indexability.

If you're using Shopify then you'll know the hardships of not having control over your robots.txt file. Here's what a good site structure will most likely resemble: However, the following strategy can still be applied to Shopify, and should help:

Meta Robots
The meta robots tag is HTML code that can be used to specify crawl preferences. By default, all your site's content pages will be treated as index, follow – the page is visible to search engines and its links are followed – even if you don't specify a preference. Adding this tag won't help your page get crawled and indexed, because it's the default. However, if you're looking to stop a page from being indexed, then you will need to specify it:

<meta name="robots" content="noindex,follow">
<meta name="robots" content="noindex,nofollow">

Whilst the above two tags are technically different from a directives perspective, they don't seem to function differently according to Google. Previously, you would specify noindex to stop the page from being indexed, and you would also choose to specify whether the page's links should be followed. Google recently made a statement that a noindexed page eventually gets treated like a Soft 404, and they treat its links as nofollow. Therefore, there's no technical difference between specifying follow and nofollow. However, if you don't trust everything that John Mueller states, you can still use noindex, follow to specify your desire for the links to be crawled. This is something that Yoast has taken on board, so you'll notice that in recent versions of the Yoast SEO plugin, the option to noindex pagination has been removed. This is because if Googlebot is treating the noindex tag as a 404, then doing this across your pagination is an awful idea. I would stay on the side of caution and only use this for pages with broken links and server redirects – the pages you don't want crawled or followed.

X-Robots Tags
Another directive that people rarely use is the X-Robots tag. It's powerful, but not many people understand why. With the robots.txt and meta robots directives, it's up to the robot whether it listens or not. This goes for Googlebot too – it can still ping all the pages to find out if they're present. The X-Robots tag, however, can be enforced by PHP or by Apache directives, because both are processed server-side – with .htaccess being the preferred method for blocking specific file types, and PHP for specific pages.

PHP Code
Here's an example of the code that you would use for blocking off a page with PHP. It's simple, and it will be processed server-side instead of being optional for Google's crawlers:

header("X-Robots-Tag: noindex", true);

Apache Directive
Here's an example of the code that you could use for blocking off .doc and .pdf files from the SERPs without having to specify every PDF in your robots.txt file:
<FilesMatch "\.(doc|pdf)$"> Header set X-Robots-Tag "noindex, noarchive, nosnippet" </FilesMatch> Pillar #2: Understanding Crawl Behaviours Many of the people who follow The Lab will know that there are lots of ways that robots can crawl your site. However, here’s the rundown on how it all works: Crawl Budget When it comes to crawl budget, this is something that only exists in principle, but not in practice. This means that there’s no way to artificially inflate your crawl budget. For those unfamiliar, crawl budget is how much time Googlebot will spend on your site. Megastores with 1000s of products will be crawled more extensively than a microsite. However, the microsite will have its core pages crawled more often. Rather than trying to force crawls on pages, you may need to address the root of the problem. However, for those that like a rough idea, you can check the average crawl rate of your site in Google Search Console > Crawl Stats. Depth First Crawling One way that bots can crawl your site is on the principle of depth-first. This forces the crawler to go as deep as possible down the hierarchy before returning back up it. This is an effective crawl pattern if you’re looking to find deep internal links and relevant content in as short a time as possible. An effective internal link structure helps website visitors navigate easily to find the content they are looking for, and helps search engines understand the relevance of the website’s content. Internal link structure also helps to build page authority, which can result in better rankings for your pages. However, with depth-first crawling, core navigational pages get crawled later in the queue. Being aware that Google’s crawler can behave in this way will help when monitoring your website pages, including doing a site audit of your internal link structure. Breadth First Crawling This is the opposite of depth-first crawling, in that it preserves site structure. It will start by crawling every Level 1 page before crawling every Level 2 page. The benefit of this type of crawling is that it will likely discover more unique URLs in a shorter period. This is because it travels across multiple categories in your website while passing over old or deleted URLs. So… Read More
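To make the breadth-first idea concrete, here’s a minimal Python sketch of a crawler that works level by level (a FIFO queue) and respects robots.txt, tying back to Pillar #1. The seed URL and page cap are hypothetical placeholders, and this is an illustrative sketch of the crawl pattern, not a claim about how Googlebot actually works:

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

SEED_URL = "https://example.com/"  # hypothetical placeholder
MAX_PAGES = 50                     # hypothetical "crawl budget"

class LinkExtractor(HTMLParser):
    # Collects href values from <a> tags.
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

robots = RobotFileParser(urljoin(SEED_URL, "/robots.txt"))
robots.read()

queue = deque([SEED_URL])  # FIFO queue = breadth-first: every Level 1 page before Level 2
seen = {SEED_URL}
while queue and len(seen) <= MAX_PAGES:
    url = queue.popleft()
    if not robots.can_fetch("*", url):  # honor the robots.txt rules from Pillar #1
        continue
    try:
        page = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    except OSError:
        continue
    extractor = LinkExtractor()
    extractor.feed(page)
    print(url)
    for href in extractor.links:
        absolute = urljoin(url, href).split("#")[0]
        # stay on the same host and skip URLs we've already queued
        if urlparse(absolute).netloc == urlparse(SEED_URL).netloc and absolute not in seen:
            seen.add(absolute)
            queue.append(absolute)

Swapping the deque’s popleft() for pop() would turn the same sketch into a depth-first crawler – the queue discipline is the only difference between the two behaviours described above.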

The Ultimate Guide to SEO for E-commerce Websites https://diggitymarketing.com/ecommerce-seo/ https://diggitymarketing.com/ecommerce-seo/#comments Tue, 03 Apr 2018 09:11:37 +0000 http://diggitymarketing.com/?p=4341 Foreword by Matt Diggity: Impressive.  Knowledgeable. Underrated. These are a few words that I would use to describe Brendan Tully. B.T. is one of the first SEOs I ever met in Chiang Mai.  He’s a veteran in the game and it shows.  I mean… the man used to be commissioned by the Australian government to teach SEO. The guy is good. He’s a beast at E-commerce SEO, as you’re about to find out in this monster of a piece. Introduction It’s tough when Matt Diggity asks you to do a guest post. The guy sets a seriously high standard and regularly wows the SEO space by putting out stuff that nobody is talking about or has talked about before publicly. When I first started in this game in the early 2000s, I had zero idea what SEO was, but I knew that if I changed certain things on my site it would move up and down the SERPs in a particular way. We grew that biz to 7 figures in the mid 2000s and ultimately ended up in client work at some point in 2008. Now through our services, one-on-one consulting, and in-person training workshops, we’ve worked with over 3000 different businesses or sites in some capacity, which has given me an awesome opportunity to try and test different approaches to determine what works and what actually moves the needle. SEO has changed a lot since then (strangely, stuffing a website footer with 200 suburbs and cities doesn’t seem to work anymore), with the SERP landscape changing week to week in some cases. While the tactics and SERP layout change massively over time, I think it’s safe to say search is here for the long term – which brings me to the next point. The more I speak with different SEOs and clients, the more ecommerce SEO in particular seems completely misunderstood. Because tactics change so rapidly, I’ve tried as much as possible in this article to stay away from short-term tactics that have a use-by date. Instead, I focus on core fundamentals and strategies that are time-tested and supported by solid business, sales, and marketing principles – this article is not limited to just ecommerce SEO but instead is more broadly focused on ecommerce marketing and optimization – which is where you ultimately need to be playing if you want to stay competitive. I’ve included action items as we work through this article – some are going to be totally obvious but are things that are regularly missed, and some you may not have heard of before but have the potential to make a huge impact with a small amount of work. I’ve also included an audio breakdown for most sections and a handful of videos too, as some things are just easier explained that way. Please bear in mind that the bigger your site, the chunkier the action items or todos; it wouldn’t be unreasonable to create 3 months of SEO work here for a 7-figure site. Ideally, if I can have you walk away after digging into this article with one easy actionable tactic or quick win, one new broad SEO strategy you can apply to your site, or one principle or SEO foundation you weren’t aware of before – then I’ll be happy to say this article was a success. If you have a question about a particular point here, post in the comments section and I’ll be happy to clarify for you.
Ecommerce vs Local SEO vs Affiliate SEO Before we get into the meat and potatoes, let’s look at some of the key differences between ecommerce, local small business, and affiliate sites. There are some fundamental differences between them, so let’s make sure we’re all on the same page. Ecommerce Sites Typically an ecommerce site has a lot more pages than other types of sites, even if there’s only a handful of products. Most of these pages are auto-generated from the CMS, for example category, tag, cart, and checkout pages. When you add tags, especially if you’re using a platform like Shopify, tons of pages are auto-generated off those tags and can create a canonical and keyword cannibalization mess. For bigger sites, cleaning this mess up can take some serious work but is a key component of getting the site to rank. It’s easier to fix these issues on smaller ecommerce sites, but for bigger sites, it’s tough to handle auto-tags without automation or SEO techniques that work at scale. Local & Affiliate Sites Local SEO sites may offer 5 to 10 services or products. What that translates to is 5 to 10 target keywords or groups of keywords. Affiliate sites target more keywords but generally still not as many as a mid-size ecommerce site – for example, an ecom site that offers 100 products will typically have 100 or more target keyword groups. It’s difficult to hand-optimise a large number of keywords without using automation, templating (there’s a small title-templating sketch at the end of this piece), or SEO techniques that scale well when you’re dealing with a lot of pages and keywords. SEO is not just backlinks What I often see is that when SEOs do ecommerce SEO, they do it poorly. A lot of the time the default thinking is SEO = links, but that’s not the case at all. Particularly with ecommerce sites, on-page SEO is vital and you need to get it nailed before chasing backlinks and internal links if you’re going to get serious SEO traction. This should be common sense, but you’d be surprised how often we see well-established ecommerce sites with glaring on-site issues. On-page SEO should be your initial focus, especially for established ecommerce sites. Generally, as an ecommerce SEO strategy, I handle backlinks last. Backlinks are usually the most expensive part of SEO but actually give you the least control, and there’s often no direct correlation between a link and a result. With on-page, there’s close to a 100% direct correlation between doing the work and getting results. ACTION STEPS: Run a Semrush Site Audit across the site. I’ve tried dozens of onpage audit tools and right now Semrush is my favorite. It picks up a ton of things other ecommerce SEO tools won’t and presents them in a way that makes them easy to get fixed. Semrush will pick up many of the easy technical problems that are roadblocking rankings. Run the site through siteliner.com – this is a fantastic tool for uncovering content duplication and cannibalisation issues. Get a Littlewarden.com account set up. It’ll monitor your site on an ongoing basis and detect basic but critical onpage SEO problems that you wouldn’t otherwise notice until your rankings start to tank (shout out to Kevin from Bulk Buy Hosting for this reco). A Note On Ecommerce Client SEO… It can be tough working with smaller ecommerce sites as clients, brand new sites, or sites where the average sale size is small. Generally I’ll stay well away from these types of clients. When we do client SEO for ecommerce sites, we never do SEO only.
It’s just too hard to get them fast results, and if the client is not commercially mature, often the expectation is that you’ll make them a millionaire overnight. Whether I’m talking with a prospect about Adwords, SEO or something else, one of the first questions I’ll ask is what their budget is. You can usually tell by the way they react what they’re comfortable spending and how commercially mature they are. A common problem with client SEO is the expectation that SEO is free, so that’s where the focus should be versus ads. But that’s the wrong way to look at it. At least half of the SERP is paid ads now. If you don’t include paid traffic in your ecommerce SEO campaign strategy, you’ll have a hard time matching the competition. Ecommerce SEO – The Current State of Play: SEO is not enough any more Let’s talk about the current state of play with ecommerce SEO. As you’re well aware, the SERP is constantly evolving, which means your SEO approach MUST evolve, otherwise it’s simply going to be less effective as time goes on. SEO is becoming increasingly fragmented. 10 years ago, the #1 result on Google was an organic result. There might have been one or two Adwords results at the top of the page, but now the landscape has totally changed. The #1 search result could mean many different things today: Regular paid Adwords Google Shopping Google Maps Featured snippet (learn how to get it) Knowledge panel Paid Google Maps, which changes depending on location And more To top it off, the SERP also changes based on device AND the location of that device at the time of the search. The traditional #1 result has been drowned out by ads and other SERP elements now. A lot of SEOs haven’t realised it yet: Google is a paid search engine with some free results. Google wants to sell ads, they don’t care about searchers. Read More
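On the templating point from earlier in this piece: here’s a minimal Python sketch of generating title tags for a large product catalog. The CSV filename, column names, and title pattern are hypothetical placeholders for illustration – not B.T.’s actual process – so adapt them to your own catalog export:

import csv

TITLE_LIMIT = 60  # rough character budget before titles get truncated in the SERP

def build_title(product, category, brand):
    # Primary template; falls back to a shorter pattern if it runs long.
    title = f"{product} - Best {category} | {brand}"
    if len(title) > TITLE_LIMIT:
        title = f"{product} | {brand}"
    return title

with open("products.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):  # assumes Product, Category, Brand columns
        print(build_title(row["Product"], row["Category"], row["Brand"]))

The point of the fallback pattern is that templated titles fail at the margins – a handful of long product names – so building the exception into the template is what lets one rule cover a 100+ product catalog.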

The Definitive Guide to Keyword Cannibalization: How to Diagnose and Fix it https://diggitymarketing.com/keyword-cannibalization/ https://diggitymarketing.com/keyword-cannibalization/#comments Mon, 05 Mar 2018 02:20:41 +0000 http://diggitymarketing.com/?p=4005 As an SEO veteran since 2009 and the founder of multiple 6-figure SEO businesses, including Diggity Marketing, LeadSpring, The Affiliate Lab, and The Search Initiative, I’ve encountered numerous SEO challenges. One such challenge that’s been causing a stir in the industry is keyword cannibalization. Despite the controversy surrounding it, I believe it’s crucial to address this issue as it can hinder your ranking success. In this article, I’ll show you how to assess internal keyword cannibalization through tools that we use in our SEO agency, and most importantly, the three most effective ways to fix the problem. Quick Summary What Is Keyword Cannibalization? Keyword cannibalization is when two or more pages on your website have the same target keyword, appear for the same search queries, and as a result, have difficulty ranking. Why did this topic become so popular? Namely, because keyword cannibalization has quickly become an increasingly prevalent SEO issue that can hold back pages from ranking. If you optimize multiple pages for the same search query (intentionally or not), they will likely hurt each other’s chances to rank. Why Is Keyword Cannibalization Bad For SEO? Keyword cannibalization is bad for SEO because when multiple pages compete for the same keyword, they split up the power you’ve earned, and every single page becomes less visible on Google. Your goal as an SEO is to make your site as visible as possible for your chosen keywords. If you have an X and a Y landing page fighting each other, for example, on page 2 of the search results—instead of a single one that appears on page 1—you’ve messed up. Fortunately, it’s not a problem that’s hard to fix. I’m going to show you ways of identifying and diagnosing keyword cannibalization using several common tools. How to Diagnose Keyword Cannibalization 1. Agency Analytics To illustrate keyword cannibalization with an example, I set up an experiment to track an utterly random site that clearly exhibited keyword cannibalization issues for one keyword across multiple pages. Agency Analytics is a keyword tracking tool that we use to track the day-to-day Google positions of keywords across your core pages. When used correctly, it can double as a keyword cannibalization tool. It’s also a great way to track on-page health, and it can diagnose a lot of other cannibalization issues too. Below, you’ll see a screenshot of the average overall Google positions over time for the entire website: Since the start of tracking, the focus keywords we chose to track have continued to decline consistently, and on October 23rd, we can see a huge fluctuation. Could it be the result of keyword cannibalization? Let’s look deeper. One of the great features of this tool is that we can track the progress of a specific keyword over time, not just a net combination of all the keywords. So, let’s dive into the broad term ‘acoustic.’ Since we began tracking this keyword, Google has selectively ranked a total of 3 different pages targeting the same keyword. There’s potentially a 4th page competing if we consider Bing choosing another page. This is the first and easiest way to pick up keyword cannibalization—by monitoring one keyword daily and tracking the URL changes. 2.
Ahrefs Ahrefs is by far one of the most versatile and powerful digital marketing and SEO tools available – if you haven’t got a membership, then you should get one. One of the great features of Ahrefs is its keyword explorer, giving you an option to audit your keywords versus your competitors’. Yet, an often-overlooked feature is to the right: When you use the Organic Keywords feature on Ahrefs, you suddenly have access to historical data on all your keywords and can instantly spot keyword competition issues. I only have 6 months of historical data with my plan, so perhaps an upgrade is in order.  *Cough* Tim Soulo? Click on “Show History Chart” to drop down your ranking graph. Each color on the graph represents a different URL’s rank in Google (as denoted by the legend in the lower left), so if you see more than one color, you’re cannibalized. Notice how the keyword is constantly dropping out of the index, and there have been multiple pages ranking for this term? That is keyword cannibalization and what it does to your keyword rankings. 3. SEMRush One of my favorite SEO keyword cannibalization tools is SEMRush. To do this, export a large chunk of your keywords, perhaps only including your core pages or keywords with high search volume. You do that here: Take a sample of the top keywords with the highest search volumes, which will help you get a holistic view of your site. Throw all of these into a spreadsheet and set it up so that yours looks something like the one below; if you’re having doubts, you can copy our template here. Once you have set up your spreadsheet, you will want to sort these five columns by Keyword (column B) in alphabetical order. This means that any cannibalized keywords will sit next to each other with a different position and URL. You can then scan from top to bottom to determine which keywords have multiple URLs competing.  But you don’t want to waste time scanning, right? Use a little bit of spreadsheet magic with the following formula: =IF(B2=B3,"Cannibalized",IF(B1=B2,"Cannibalized","Unique")) If you use this formula correctly, you can duplicate the cell down and turn Column A into a long list of automated checks – without the manual work. This means you just checked 10,000+ keywords in less than 2 minutes. Win. 4. SerpLab Edit: Late suggestion from Prince Olalekan Akinyemi [thanks for the contribution]. SerpLab is one of the few SEO rank trackers out there with a freemium plan. One of the features of SerpLab is that it tracks which actual URL of yours holds the top rank in Google’s SERPs for each keyword, and as such, it is excellent for diagnosing keyword cannibalization. To use SerpLab for a cannibalization diagnosis of your pages, here are the steps to follow: 1. Log in to your SerpLab account and select the project you want to diagnose 2. On the page that opens, click on any of your keywords that have been experiencing wild fluctuations. 3. Then click on “View Full Keyword Details” as shown below 4. On the page, you will see a graph showing the SERP overview of the keyword in view. When you notice too many fluctuations time and time again (as shown in the image below), then your pages are probably cannibalized in Google. 5. Scroll down a bit to the bottom of the page, and you’ll see a list of URLs that have once ranked for that keyword and the positions they occupied.  In my case, I discovered three different pages of my website that were trying to rank for the same keywords.
The only issue with this method is that it might be time-consuming, as you have to go over each of your keywords one after the other. Aside from that, you need to have been tracking your keywords for a considerable amount of time before you can use it to identify SEO keyword cannibalization. 5. Google Search Console (GSC) Edit: Late suggestion from Joe Kizlauskas [thanks for the contribution]. The most observant search engine junkies will notice that all four tools above (Agency Analytics, Ahrefs, Semrush, SerpLab) are focused on the top 100 positions. They only report on cannibalization detected within these positions. Using GSC (formerly Google Webmaster Tools) to diagnose this issue will give you access to the top 300 search positions. GSC provides a far larger data set to work with than most SEO tools. It’s also based on all the queries that your pages are returned for within Google’s search results, so you aren’t likely to miss anything. Plus, it’s free. Here’s how you get started: 1) Login at https://www.google.com/webmasters 2) Choose your website from the right-hand side 3) Select the options below: 4) Filter by keyword to narrow down the search results: 5) Enter your keyword as an exact match: 6) View the Pages that are being returned for the filtered keyword: 7) Scroll down and you can see all of the pages that rank for this keyword. You’ll see that the one I’ve highlighted in red is indeed outside the top 100. 6. Google Search Operators For proactive SEOs, there’s a sixth technique that can also reach outside the top 100 positions. It takes time, but you can check the entire Google index to find duplicate pages by using Google search operators.  Using search operators, you can track down multiple pages that have content issues like duplicate content. For example, take a look at some keyword duplication on Diggity Marketing.  Here are some duplicate keywords on my own site: If I wanted to rank for “pbn link,” I have 17 pages that all include this term. I can cross-reference this with Ahrefs and see that this is indeed one page… Read More
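If you’d rather script the SEMRush spreadsheet check from step 3 above, here’s a minimal Python sketch that flags any keyword returning more than one ranking URL. The CSV filename and column names are hypothetical placeholders – match them to your actual export:

import csv
from collections import defaultdict

urls_by_keyword = defaultdict(set)
with open("organic_positions.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):  # assumes Keyword and URL columns
        urls_by_keyword[row["Keyword"].strip().lower()].add(row["URL"])

# More than one distinct URL ranking for a keyword = cannibalized
for keyword, urls in sorted(urls_by_keyword.items()):
    if len(urls) > 1:
        print(f"Cannibalized: {keyword} ({len(urls)} URLs)")
        for url in sorted(urls):
            print(" ", url)

Grouping by keyword replaces the alphabetical sort-and-scan: like the spreadsheet formula, it only compares rows for the same keyword, so a 10,000-row export still checks out in seconds.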

The Story of 10Beasts.com – An Uncensored Interview with Luqman Khan https://diggitymarketing.com/interview-with-luqman-khan-of-10beasts/ https://diggitymarketing.com/interview-with-luqman-khan-of-10beasts/#comments Mon, 08 Jan 2018 08:00:29 +0000 http://diggitymarketing.com/?p=4032 About a year ago, I was introduced to a site called 10beasts.com. At the time, it was an 8-page affiliate website in the technology niche that quickly busted out of the sandbox into six-figure profitability within 8 months.  In December 2016, it broke $80k. This site became incredibly popular when it was featured on Glen Alsop’s Gaps.com.  I mean, how often does someone go public with an affiliate website of this level? Fast forward one year… 10Beasts grew in size and earnings and flipped for over half a million dollars. And then the unspeakable happened. It got penalized with an unnatural links manual action in Google Search Console. And guess what? It recovered from the penalty in 5 days. Meet Luqman Khan. Luqman is the creator, builder, and recoverer of 10beasts. In this no-holds-barred interview, Luqman discusses the entire story of 10beasts: how he got it ranked, how he sold it, and how he recovered it.  In this interview we get into: The story of Luqman Keyword research Content planning Onsite optimization Backlink strategy Social signals The huge flip for $500k+ …and the miraculous 5-day recovery Resources: Tools Keyword Finder – Keyword Research CrazyEgg – Heatmap Monitor Backlinks Services AddMeFast – Social Signals Upwork Fiverr: Character Images AllTop.com Empire Flippers Guides An SEO’s Guide to Flipping  Blogs NichePie Backlinko Cloud Living Gaps.com NeilPatel.com Transcript Matt: Hey, Luqman. How’s it going, man? Thanks so much for coming on. Luqman: Hey, nice to meet you, Matt. You’re absolutely welcome and thanks for inviting me for this interview. Matt: Absolutely. For the people that are watching that don’t know who you are, can you give us a brief introduction, like what’s your name, how old are you, where you came from? Luqman: Well, my full name is Mohammed Luqman Khan and I’m from Lahore, Pakistan. Lahore is actually the second biggest city of Pakistan. I was born here and I’ve been living here since I was born, and I have been to England, Turkey, Egypt, and a few other countries. And for now, I am still living in England as a computer science student at the University of Manchester. Matt: Oh, great, awesome, so you’re well-traveled. Tell me a little bit more about the home city you grew up in. Lahore, right? Luqman: Yeah. Matt: Is that a city where people are doing like what you’re doing, working online? What’s it like where you’re from? Luqman: Well, Pakistan is actually the second biggest country for freelance work – freelance jobs are very common here in Pakistan. And what I’m doing here, a lot of people are doing here. Actually, I’m inspired by a Pakistani guy called… his name is Salman Baig. He’s from another city called Peshawar. It is in the north of Pakistan. So, yeah, that’s all. Matt: Yeah, cool. And what do your parents think about what you’re doing? Luqman: Well, my parents really don’t like what I’m doing. They want me to really work. They ask me what I do and I really try to explain it to them. They don’t know what you can do on the internet.
They want me to get a physical job, they want me to be seen doing something, because people think that I’m just a lazy guy who sits at home all the time and does nothing – but yeah, that’s actually what’s going on here. Matt: It’s not one of their familiar lawyer, doctor professions, so it’s garbage. I get that too. So tell me a bit about your background. You said you were getting a degree in computer science, right? Luqman: Yeah. Matt: Okay. And are you working on your bachelors or masters? Luqman: I’m doing bachelors yet. Matt: Okay. How are you doing there? Luqman: It’s not that great. It has nothing to do with my career, so I’m actually just doing it to get a degree to satisfy my parents, that’s all. Matt: I hope your parents don’t watch this, and if they do, I apologize for instigating this guy. Okay, cool. Have you ever had a jobby job? Have you ever worked for someone else? Luqman: I had a job in a call center. It was in the sales department for some kind of product, I think a security installment product in Canada, but the call center was here in Pakistan. Matt: That sounds fun. Luqman: Yeah. I had it back in 2011 maybe. I don’t remember. I don’t really … yeah. Matt: Okay, all right. Luqman: So, I only worked for like one month. My back was already completely trashed from sitting on a chair for eight hours continuously. Matt: Yeah, I can agree. Luqman: It’s like … Matt: Mm-hmm (affirmative). And so, when did you get into SEO and how did that happen? Luqman: When I was a freelancer, I started, you know, I figured out online earning from an ad. It was a PTC website, I don’t know. I think smartphones were newly introduced back then and I was looking for a smartphone on GSMArena.com, and there was an ad – “earn by clicking” – and it was a PTC website. I hope you know about PTC websites. Matt: Mm-hmm (affirmative). Luqman: There you click on an ad and you get a few cents and things like that. And that was a scam website, but I ended up with an idea that earning, online earning, is quite a possible thing. So I started research, I learned HTML, CSS, and WordPress. By the passage of time, I started to work on upwork.com, fiverr.com. And I had a project on Fiverr, they were the client who had a website. I think an Amazon associate website. That’s how I figured out about Amazon associates, and by the passage of time, I figured out about search engine marketing – how you can get visitors to your website – and that’s how I ended up on backlinko.com, cloudliving.com. And I saw that guy, Salman Baig, whom I told you about earlier from Peshawar. I saw him. I know he was posting some things on his Facebook wall so it was good. Matt: Okay, so you were doing some online freelance work. You started working for a website. And you’re like, “Okay, if he’s paying me this much, how much is he making?” Luqman: Yeah. Matt: Then you went down the rabbit hole, I’m guessing. Luqman: Exactly. Matt: And where have you learned from in the meantime? Do you read blogs? Luqman: Yeah, the main learning source is backlinko.com for branding. And a few other Facebook pages, Facebook groups, sorry, and Neil Patel. You know neilpatel.com and Quick Sprout also.
These famous blogs, they are really helpful. Matt: Awesome. And this was how long ago when you first started getting into SEO? Luqman: I think in 2013 or ’14. Maybe … I don’t really remember. Matt: So like maximum like four, four-and-a-half years ago. Luqman: Yes. Matt: And I would definitely say you classify as what I would call a very successful SEO. I’d say you’re probably in the 1% considering what you’ve done with 10Beasts. How does that sound to you? Luqman: Oh, thank you. Matt: Like how does that make you feel? Luqman: That sounds great. That sounds really great, man. Matt: I’m not just saying that because it’s coming from me… but just like you were not an SEO four years ago and now you’re … I would say you’re in the 1%. That … you’re awesome. Luqman: I really do feel awesome actually. Matt: That’s good, that’s good. You deserve it. You did a lot of hard work and I’m excited to talk about that site, but not quite yet. On the way to where you are now, did you ever face any setbacks or any big issues that kind of … roadblocks that got in your way? Luqman: The biggest issue I faced was dropping out of college in November 2015. I had a fight issue with my ex’s boyfriend and it turned into a really rough fight. Matt: Okay. Luqman: So you know, actually, that guy, he brought a few guys from outside the college to beat me up, those who weren’t students. So the students of the college, they found out that people came from outside the college to beat me, a student of the college. So the fight really turned into a big scenario, like there were more than 50 to 60 students fighting in the hockey ground. Matt: Oh, my goodness. Luqman: And it really turned bad. They suspended like more than 16 students, including me and that other guy. I was suspended for five years. I cannot [inaudible 00:07:47]. Matt: Wow. So, I mean, that probably didn’t just affect your school life. It probably affected every aspect of your life, including the relationship with your parents. Luqman: Yeah, exactly. The relationship with my parents, my family, my teachers, so it was really bad. Matt: How did you bounce back from that? Luqman: I flew to England. Read More

The Definitive Guide to Content for SEO and Conversion https://diggitymarketing.com/content-for-seo-and-conversion/ https://diggitymarketing.com/content-for-seo-and-conversion/#comments Mon, 04 Sep 2017 09:30:20 +0000 http://diggitymarketing.com/?p=3061 As a seasoned digital marketer and content creator, I understand the critical role that content plays in SEO and conversion optimization. Crafting quality content is vital for driving traffic to your website, building brand awareness, and ultimately, converting visitors into customers. However, creating effective content that serves both SEO and conversion goals requires a strategic approach and a deep understanding of your target audience. In this article, I will share my knowledge and expertise to provide you with a comprehensive guide on how to create content that drives SEO and conversion success. I will focus mainly on conversion rate optimization (CRO). Whether you’re a seasoned content creator or a beginner, this guide will equip you with the necessary tools to create compelling and effective content that drives results. Why You Need to Start Taking Your Content Seriously I hire only the best writers.  Each page of content on my affiliate sites costs at least $200.  On some sites the content ends up costing over $10,000 on its own. Here’s why I’m so serious about content… Reason #1: To Get People to Read Your Article in the First Place Let me tell you a story about an average web user named Jeff. Jeff is a dental assistant.  He works 40 hours per week in his office and enjoys fishing on the weekends. One day Jeff gets home from work, sits down with his family for dinner, and then decides to browse the web, looking to purchase a new fishing rod.  He searches “best fishing rod” and lands on your affiliate site. At that instant, the clock starts ticking: you have less than 1 second to convince Jeff that he’s in the right place. Luckily you pass the test.  You have enough images of people fishing and a huge H1 that reads “Best Fishing Rods in 2017: Reviewed”.  You’re in the clear. Now, you need to convince him that this article is worth reading.  This is where copy comes into play. If you have a boring introduction, Jeff is going to bounce and spend his time elsewhere. Something like the following introduction paragraph doesn’t inspire anyone to dedicate 10 minutes of their life to reading… Choosing a new fishing pole is a difficult process.  We have reviewed the best poles to come to the market in 2017.  The criteria we have used is… blah blah blah The first paragraph is your “hook”.  It’s your chance to hook your reader into staying and reading.  Don’t blow it. Typically I use three different approaches (sometimes combining them) in order to get folks like Jeff to stay on my sites. The Knowledge Bomb With this approach, you need to teach the reader something they didn’t know before.  Show them that you’re an expert on the subject and let them know that if they read your article, they’re about to learn something new. In the case of fishing poles, you could lead with an introductory paragraph like this… As of 2017, over 55 new fishing pole models have become available on the market.  Many of these new rods have introduced new technologies never seen before.  This makes purchasing quite difficult, without having the resources to actually test … See the difference? In the above paragraph’s copy, you’ve shown that your market knowledge is up-to-date, and by throwing around some industry terms you’ve shown that you’re a fisherman.
The Show In this approach, you want to entertain them in the first paragraph of copy. Do whatever it takes: crack a joke, use a lot of slang, and be silly. The idea is to convey to the reader that if they go on and read the rest of this article, they’re going to have a fun time doing it. This is quite hard to do for the fishing niche, but I’ll do my best… Hey boss.  The name’s Karl.  Let set things straight.  I’m not much of interweb guy, but I do know a hell of a lot about fishing.  This year I’ve spent over 200 days on the water and tested over 40 different fishing poles.  Most of them sucked (like this piece of crap) but some of them really fit the bill … So maybe this isn’t Louis C.K. stand-up material, but you get the idea.  Karl’s article is probably going to be fun to read. Fear Factor The third approach I go with, which is often the most effective, is to use fear to make the reader feel like if they don’t read my article, something bad is going to happen. Horrid things such as… They’re going to get ripped off They’re going to buy the wrong product They’re going to get hurt If you’ve ever read a headline like “Do NOT Buy Garcinia Cambogia Before You READ THIS REVIEW!!” then you know what I’m talking about. Let’s try with the fishing niche: Welcome to Fishermen First’s 2017 Fishing Pole Review Last year I spent $600 on a new carbon/graphite composite pole.  It was stiff to the point that it actually broke in half on its first deep sea run. At that point I decided there wasn’t enough information out on proper poles, so I decided to take matters into my own hands and I built this review site… Playing to someone’s desire to avoid getting ripped off is a HUGE emotional motivator.   More on this later… Putting these Approaches to the Test The key thing about these different approaches is that there’s never a one-size-fits-all solution to choosing which one will work best for your eCommerce site. Depending on the niche, one approach might be better than the other. This is why it’s extremely important to A/B split test to find the best approach for your niche. For the following experiment, I used Optimizely to put The Knowledge Bomb, The Show and Fear Factor against each other in one of my niches. Original = The Knowledge Bomb Variation #1 = The Show Variation #2 = Fear Factor Each of these gets served to various readers 1/3 of the time. For the test criteria, I’m looking at “Engagement.” An engagement event occurs when someone does something on my page, such as click.  It’s a good indicator that someone stayed on my page due to my “hook” paragraph. As you can see here, The Show is clearly superior to the other approaches, resulting in an 8.0% increase in engagement over The Knowledge Bomb.  Each time a new visitor comes to my site through the conversion funnel, the chance that they stay and read – and the site’s conversion rate – increases by 8%.  This adds up quickly when you have traffic of thousands of visitors per day. This is both surprising and not; surprising because The Show hardly ever wins, and not surprising because of the niche that I tested this on (which is a pretty light topic). You can see now how important it is to test your intro content.  If you want to keep people from bouncing once they get to your site, it’s worth taking seriously. Now that we’ve gotten people to agree to read your page, let’s see how we can use content to keep them there.
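One quick aside before moving on: if you want to check that an engagement lift like the 8% above isn’t just noise, a two-proportion z-test is a fast sanity check. A minimal Python sketch with hypothetical visitor and engagement counts (not the actual test data from the experiment above):

from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    # Compares engagement rates between two variations of a split test.
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 450/1000 vs 486/1000 engagements = an 8% relative lift
z, p = two_proportion_z(450, 1000, 486, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")

With these made-up numbers the p-value lands around 0.1, meaning an 8% relative lift at 1,000 visitors per variation still isn’t conclusive – a reminder to let tests run long enough before declaring a winner.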
Reason #2 – To Get People to Continue Reading Until They Get to Your Call-to-Actions It’s no secret: we all have internet attention deficit disorder. When was the last time you fully read every single word of an article? Hopefully, you’re doing that right now, but I’d be surprised (and honored) if you were. Most people read the first paragraph of copy and then go into skim mode, rapidly scrolling their eyes down the page until they see something they like. Or they bounce when they see a lack of things that they like. Let’s work on that… Eliminate Walls of Text As mentioned above, people don’t like to read.  Ain’t nobody got time for that. When you present a top-to-bottom, browser-filling wall-of-text, it’s a huge turn-off to your visitors. Instead, break up your content with more visually digestible pieces of content along the way to your CTAs. Here’s a list of techniques that I commonly use: Media: At any given point in the vertical scroll of your article you should see some kind of image, video or one of the following… Headings: Use H2, H3, H4 headlines to notify a skimming reader when there’s a new section that he might be interested in. Call Outs: Use call-out boxes to highlight important information that they might like Formatted Content: Use structured markup like tables and lists to present information in easily digestible ways. Also works well for capturing featured snippets. Scroll up to see what I’ve done in this article. Short Paragraphs Loooooong paragraph blocks look intimidating.  They’re hard to read. Do you ever see that guy on Facebook that doesn’t use his “Enter” key?  He writes a post that very well may be the best piece of literature ever to grace the internet, but you witness this textual behemoth and say… Read More

How to Rank In Google Images in 5 Steps https://diggitymarketing.com/how-to-rank-in-google-images-in-5-steps/ https://diggitymarketing.com/how-to-rank-in-google-images-in-5-steps/#comments Thu, 27 Apr 2017 09:25:59 +0000 http://diggitymarketing.com/?p=2843 Foreword by Matt Diggity: If you’ve looked at the SERPs recently, you might have noticed that featured snippets have slightly changed. Whereas before, if you ranked for the snippet, the image was pulled from your site – now, the image will most likely be pulled from the #1 ranking image in Google Image search. Sounds annoying, right? Not if you know how to rank in Google Images. In the video below, we’ll cover how to rank #1. For this sneaky post, I’ve brought on Dino Gomez to show you exactly how to do just that. Benefits of Ranking Images 1) Branding Include a logo watermark on your image and it can be seen front and center on Google for a competitive target keyword. 2) Traffic Depending on your niche, image optimization can drive a fair bit of extra traffic. In particular, ecommerce websites have much to gain. I ranked a client in the fashion apparel industry and their designs showed up first for competitive terms in addition to their website. Since people look closely at the design of clothing as part of the buying consideration, it helped drive click-through rate from Google image results. “How to” articles also have much to gain from image optimization. When people are looking to learn something they often want to see a “visual” of how it is done. 3) Rich Snippets There’s been a lot of recent discussion around how to rank for rich snippets and rich snippet images. Results are showing that rich snippets sometimes pull images from websites not even on page 1 of Google for that query.  The truth is that most of the time, like Matt said, the image is getting pulled from Google Image rankings. Imagine your competitor does all the work to gain a rich snippet but your image is sitting with the snippet at position zero. There’s a good chance you’re going to steal some of that traffic and brand awareness again. 4) Amusement I ranked myself for “best looking guy in San Diego”.   My competition (see the images next to me) was obviously fierce, as I’m barely beating out the fellas that went shirtless. Needless to say,  I showed my girlfriend. And my mom. And my buddy who spams me with fake political stories. Their reaction? And the result… Girlfriend thinks I’m amazing. (win) Mom has most proud moment of her life. (win) Friend stops spamming me with fake news. (win) Who would’ve thought image optimization could be so damn amusing and powerful. How to Rank Images in Google (Step by Step) Ranking images in Google is very similar to ranking a Youtube video. In fact, ranking images is actually easier because the competition is lower (not many people are intentionally aiming to rank photos). Here are the steps to rank an image… 1) Alt text The alt text of the image needs to match or be a partial match of the keyword you’re aiming to rank for. 2) Content The content surrounding where your image is hosted needs to be closely related to your target keyword. 3) Authority This is relative to the competitiveness of the keyword you’re trying to rank. If you’re aiming to steal a rich snippet image spot then the hosting website of your image should have some comparable authority to the competitor holding the actual text snippet. Here’s an example of stealing the image snippet.
If you check the alt text of the image you’ll verify it’s a partial match of the keyword.    You’ll also see the website stealing the image snippet has very similar authority to the website with the actual text snippet. (below) 4) Syndication Take your image and syndicate it out to all the image sharing websites. This includes Google Plus, Flickr, Google Picasa, & others. Link back to the original image. Here’s a link to the gig that I use: click here. 5) Juice The Image Take your image and embed it on a few PBN sites just as you would if you were aiming to rank a Youtube video. Make sure the embed references the hosting website’s image source as the point of origin. That’s it. Depending on the competition level and authority of your image-hosting website you’ll hit the top of Google Images quickly. Finally, what image search do you plan to rank for?
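As a quick companion to step 1, here’s a minimal Python sketch that lists the images on a page missing alt text – a fast way to find optimization gaps before you start targeting image rankings. Stdlib only; the page URL is a hypothetical placeholder:

from html.parser import HTMLParser
from urllib.request import urlopen

PAGE_URL = "https://example.com/best-fishing-rods/"  # hypothetical placeholder

class ImgAltAuditor(HTMLParser):
    # Prints the src of every <img> tag that has no (or an empty) alt attribute.
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                print("Missing alt:", attrs.get("src", "(no src)"))

page = urlopen(PAGE_URL, timeout=10).read().decode("utf-8", "ignore")
ImgAltAuditor().feed(page)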

How to Steal the Featured Snippet in 3 Sneaky Steps https://diggitymarketing.com/how-to-steal-the-featured-snippet/ https://diggitymarketing.com/how-to-steal-the-featured-snippet/#comments Wed, 08 Feb 2017 13:11:08 +0000 http://diggitymarketing.com/?p=2136 One of the most impactful developments to the Google SERP has been the introduction of the featured snippet. But what exactly is it? You’ve seen them plenty of times before – most likely whenever you’ve asked a “how to” question to Google. As search features become increasingly important, the featured snippet, also known as the quick answer or knowledge graph, is essentially an area at the top of the first page where the searcher’s query is answered right then and there. With years of experience in SEO and a deep understanding of Google’s search algorithms, I am here to help. And by the time you finish this article, you’ll not only know all about these coveted SERP positions, but you’ll know how to steal them from your competitors as well. Quick Summary Types of Featured Snippets Snippets come in various forms. Most commonly… Answers to how-to Questions Recipes Weather Branding Queries Videos Are Featured Snippets Actually Good for us SEOs? In short, both yes and no. Let me explain… No – because snippets essentially try to answer the question without the user having to leave the SERP by clicking through to your page. If people aren’t coming to your page, then they’re not useful to your clients, they’re not generating leads, and they’re not clicking on your affiliate links. Yes – because if you do manage to take over this critical spot on the first page, you now take up the biggest chunk of real estate on the first page. I mean, look at the size of these things.   It’s hard not to click them. The greatest aspect of the snippet is that you don’t have to seize the #1 position in order to get the snippet’s coveted “position #0” in the SERP. Simply being somewhere on the first page is enough. (Courtesy of Moz) While the debate may continue on whether or not featured snippets are a good thing for us SEOs, the fact of the matter is that they’re here to stay.  So it’s better that we take them over, rather than our competitors. This guide will show you how to do exactly that. I’ve spent hours testing various techniques for snippet snatchery and have laid them all out here for you. BONUS: Download My FREE Onsite SEO Tools Guidance for All My Onpage Tactics By following these 3 steps, you’ll not only learn how to create a web page that will attract snippets, but also how to steal featured snippets when they’re not already yours. Step 1 – Set the Baseline Featured snippets are highly attracted to structured markup. There are essentially three different types of structured markup that are especially effective at triggering featured snippets. Structured Markup that Featured Snippets Love Unordered Lists Unordered lists are basically bullet point lists that can be used to clarify different points that aren’t sequentially related to each other. These types of lists are great for answering questions that might have multiple answers. For example… Query: “What are the signs of aging skin?” As can be seen here, when you steal the featured snippet for this query, you will quite easily draw buyers for affiliate products in the anti-aging skin care niche. Ordered Lists Ordered lists are similar to unordered lists but the list items follow a sequence.  They’re especially useful for specifying steps in a series, such as following a recipe for cooking.
Or… Query: “How do I unclog a drain?” Capture this featured snippet for your plumbing lead generation site and you’ll end up with more leads than you can deal with. Tables Tables attract different types of featured snippet, and this trend is increasing over time. Tables are useful for displaying a matrix of data, such as “who played in the last five Super Bowls?” Or… Query: “Best Treadmill?” (Note: there’s something special about this table featured snippet which we’ll visit later) Needless to say, your Amazon Associates account would love the clicks that this featured snippet attracts. Headings Many times you’ll notice that the featured snippet is preceded by an H2 or H3 heading that actually includes the keywords that you’re searching for. Let me explain with an example: “Who are the premier league winners?” Make it a rule-of-thumb to “kick off” your lists and tables with headings. Although you’ll mostly be using H2s and H3s for your featured snippets, if you have questions on writing killer headers, check out what is an h1 tag. Where to Insert Structured Markup onto Your Pages Take an hour to look at every web page on your website that you’re actually trying to rank.  You can ignore your supporting pages for topical relevance. Get a notepad and write down every section of your content that answers a question. Selectively choose a handful of these sections and display their content with the particular type of structured markup that best gets the job done. For example, if I were to have a page called “What is Garcinia Cambogia?”, then I might have the following subsections: Introduction to Garcinia Benefits of Garcinia Cambogia Ingredients of Garcinia Cambogia How to take Garcinia Where to Buy It Sections 2, 3, and 4 are great candidates for structured markup. Unordered List Example <h2>Benefits of Garcinia Cambogia</h2> The benefits of garcinia include the following: Increased Metabolism Reduction of Body Fat Lessens Cravings for Sugar and Dense Carbohydrates Easy to Digest Table Example <h2>Ingredients of Garcinia Cambogia</h2> To create my tables, I use the WordPress plug-in TablePress, which in my opinion is the best and easiest to use with most themes. Protip: When implementing tables where you’re reviewing a list of products, always list your products from top-to-bottom in the rows, while the specifications and score information are listed from left-to-right in the columns.  In my experience, tables which follow this format (like the treadmill snippet you saw before) are far more likely to capture the featured snippet than the opposite.  In fact, I’ve only seen one instance of it captured the other way. Ordered List Example <h2>How to Take Garcinia</h2> Consult with your doctor before taking the supplement Ensure that you’ve had a decently portioned meal Drink with plenty of water Avoid using heavy machinery or participating in S&M for 24 hours Insert structured markup where it fits in the content, but don’t overdo it. Keep your sections moderately small (around 50-150 words), as lengthy sections tend not to be snippet-friendly. In terms of markup frequency, I try to make sure that there’s no more than one visible piece of marked-up content on a given vertical snapshot of the page. Now what? Now that you’ve set the baseline, it’s time to wait. If your site is new then you need to wait until you have some ranking pages on the first page and become eligible for the snippet.
If your site has pages already ranking on the first page, force a re-crawl and wait a few weeks to see if Google liked your changes enough to trigger featured snippets. How do you figure out which types of featured snippets you obtained? How do you figure out which types of featured snippets you didn’t obtain? Onto the next step. Step 2 – Snippet Discovery Phase In this phase you’ll be using SEMRush to reverse engineer which of your niche’s search queries have featured snippets.  You’ll be able to see which featured snippets you’ve already obtained and which snippets are not yet yours. First, open SEMRush and run “Organic Research” on your website.  Make sure it’s a site that already has at least some page 1 rankings. Click on “Positions” in the left-hand column to bring up the following view. Notice on the right-hand side, you’ll see “Featured Snippet”.  Go ahead and click on it. (No, this is not my website.) Here you’ll find a list of keywords that are featuring a snippet. Some of these you might already own, some you won’t.  But essentially these are the keywords you’re going to optimize for. Start to manually check each keyword and mark down whether you have the snippet or you don’t: Do note that this tool isn’t entirely accurate, but it’s not the tool’s fault. Featured snippet results are very dynamic.  Sometimes the result disappears entirely for a few days and then pops back with a new winner. For now, you’re just looking for a general list of keywords where you can steal Google featured snippets from your competition. Step 3 – Steal the Featured Snippet Now that you know which keywords to target, it’s time to snatch these bad boys up. This process is simple in nature. You’re essentially going to mimic what your competition has already done – but you’re going to do it better. And why will you beat them? Because I’m about to share some snippet-stealing ninja tactics with you that they don’t know about. Step 3a: What is the form of the answer? For each of your keywords that… Read More
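Tying back to the table protip in Step 1 (products top-to-bottom in rows, specs left-to-right in columns), here’s a minimal Python sketch that generates an HTML table in that orientation. The product data is hypothetical; in practice you’d recreate the output in TablePress or paste it into your CMS:

from html import escape

headers = ["Product", "Motor", "Max Speed", "Score"]
products = [  # hypothetical example data
    ("Treadmill A", "3.0 HP", "12 mph", "9.5"),
    ("Treadmill B", "2.5 HP", "10 mph", "8.7"),
]

# Spec names go left-to-right in the header row...
rows = ["<tr>" + "".join(f"<th>{escape(h)}</th>" for h in headers) + "</tr>"]
# ...and each product gets its own row, top-to-bottom.
for product in products:
    rows.append("<tr>" + "".join(f"<td>{escape(v)}</td>" for v in product) + "</tr>")

print("<table>\n" + "\n".join(rows) + "\n</table>")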

How to Get Unstuck by Establishing Topic Relevance https://diggitymarketing.com/how-to-get-unstuck-by-establishing-topic-relevance/ https://diggitymarketing.com/how-to-get-unstuck-by-establishing-topic-relevance/#comments Wed, 01 Jul 2015 07:17:12 +0000 http://diggitymarketing.com/?p=619

Along this continual road as a student of SEO, there have been a handful of breakthrough techniques that I’ve picked up which have helped me rise to a new level. Actually, I hesitate to call them techniques, since some of them are so simple they could really be coined “tricks” or “hacks”. Nevertheless, before I had these breakthroughs I was stuck with a particular problem, and after I integrated them into my knowledge base, my entire suite of client and personal money sites experienced huge gains.

Some examples of these breakthroughs can be found in my posts on:

- Anchor text selection
- PBNs that actually rank
- The black sheep principle
- Sandbox reduction technique

The particular hack I’ll be discussing in this article is how to create “On-Page Topic Relevance”. If you have any of the following issues, this article is especially for you:

- Trouble ranking a niche micro-site
- Couldn’t rank a new silo/product/review page on an authority site
- Had a client who wanted to rank in a new city, and you couldn’t get past page 2

Each of these issues has the same root problem: not enough pages on a specific topic. Google doesn’t like thin content or thin sites. These aren’t the old days when you could rank single landing pages.

Here are the two key takeaways that you must understand:

1. Google now prefers to rank domains that have multiple pages on a given topic.
2. When you try to rank a specific page on a particular topic, Google expects to see other supporting articles on the same site using it as a reference (i.e. linking to it).

This is called ‘establishing relevance’ on a topic. Once you understand this concept, you’ll know the bare-minimum page count you need to rank, plus you’ll have full control over which page ranks by using contextual inner-page linking.

Here are some real-life client examples of how this plays out…

Example #1: Authority site won’t rank for anything new

A client of mine had a massive authority site related to fitness. Each day, for the past 6 years, he’d personally posted a handwritten workout-related article. The client had built up so much authority and topic relevance on fitness that he could toss up any brand-new article about ‘crossfit’ (highly competitive) and it would rank in less than a week.

This client wanted to make more money from affiliate commissions by writing a ‘Top 5 Protein Supplements’ article. He’d written a killer page with perfect onsite elements, and it simply wouldn’t rank, even after hitting it with backlinks. Google knew his site was about fitness (there was no doubt about that), but he was an amateur on the protein supplement topic.

To resolve this, I simply created 4 random articles about supplements and linked them contextually to his review page. It ranked in 2 weeks without the need for additional backlinks.

Example #2: Local SEO client wants to expand into a new city

Establishing relevance often comes into play in local SEO. I have a surgeon client in San Francisco, CA who was ranking fine in San Francisco. He came to me because he was having trouble ranking in the neighboring city of Oakland. Despite having an excellent Domain Authority (DA) of 33 and 20+ PBN links pointed at the Oakland page, it simply wouldn’t break into page 1.
All I did was throw up 4 (random-ass) articles that talked about Oakland and linked them to his landing page:

- Oakland Nightlife Review
- Best Restaurants in Oakland
- Budget Hotels in Oakland
- Famous People from Oakland

Google knew his site was about his particular form of surgery. They also knew his site was all about San Francisco because of the Schema, title tags, and About and Contact pages, which all discussed San Francisco. His site just didn’t have enough to do with Oakland, so I gave them what they wanted. Even though the articles had nothing to do with his service, that didn’t matter. He needed to establish topic relevance for the city of Oakland, and that was all.

The Tech: How to set up your site for topic relevance

Step 1: Create at least 4 supporting articles for a topic on a site

Generally, I’ve found that Google wants to see at least 5 indexed pages on a topic in order to establish enough topic relevance. This includes the landing page that you want to rank, plus 4 additional supporting articles (500+ words) that you’ll be linking to it. Sure, you can get away with less; I just find that 5 articles seems to be the magic number that makes things a lot easier.

Step 2: Contextually link from the supporting articles to the main landing page

A contextual link is a link placed in the middle of the body of an article. Why is it important? Of all types of links (contextual, sidebar, footer, navigation bar, etc.), it passes the most link juice. Furthermore, it’s another way of saying that the content of the supporting article is reinforced by the master topic article – namely, the one we’re trying to rank.

What kind of anchor text should you use? Onsite anchors follow the same rules as offsite anchors, but you can be much more relaxed in terms of diversity. Out of 4 anchors, I’d make them as follows:

- Target anchor: DUI lawyer in Tampa
- Target anchor: Tampa DUI attorney
- URL anchor: site.com/tampa-dui-lawyer
- Misc anchor: click here
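To make this concrete, here’s a minimal sketch of what one of these contextual links might look like inside the body of a supporting article. The landing page URL comes from the anchor list above; the surrounding copy is made up for illustration:

<p>If you've been charged after a night out in Ybor City, a
<a href="https://site.com/tampa-dui-lawyer">DUI lawyer in Tampa</a>
can walk you through the next steps.</p>

The point is that the link sits mid-paragraph, surrounded by related copy – not parked in a sidebar or footer.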
Step 3: Get the supporting pages indexed

This part is crazy. The supporting pages don’t even need to be accessible from the navigation bar. As long as they’re in the XML sitemap and you’ve indexed them (see the sitemap sketch at the end of this post), they’ve played their part in establishing relevance for the master landing page. That’s why I’m able to get $2 iWriter articles written that have nothing to do with the client’s sales copy. No one will ever read these pages.

Step 4 (Bonus): Make sure all pages about topic ABC only link to other pages about topic ABC

This part is the icing on the cake. For each of the supporting article pages, try to make sure that they link only to the other pages of this particular topic, and don’t cross-link to unrelated pages. This includes links from the navigation bar, sidebar, and footer. Once you do this, there is no confusion at all about your various silos and their topic relevance. I pull this off by turning off the navigation bar on the supporting article pages, while installing the “Custom Sidebars” plug-in to create page-specific sidebar navigation.

The concept of topic relevance is often completely missed. Most of my coaching students are blown away when they realize how simple it is, yet the results are so huge and so quick. Try it out, and be sure to leave a comment with your result.
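One last reference for Step 3: a bare-bones XML sitemap entry for a supporting page might look like the sketch below. The URL slug is a made-up example based on the Oakland articles above, and in practice most WordPress sitemap plugins generate this file for you automatically:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per supporting article; hypothetical slug shown -->
  <url>
    <loc>https://site.com/oakland-nightlife-review/</loc>
    <lastmod>2015-07-01</lastmod>
  </url>
</urlset>

As long as entries like this exist and the pages get crawled, the supporting articles count toward relevance – no navigation links required.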


Attention Affiliates: Cloak Your Links https://diggitymarketing.com/attention-affiliates-cloak-your-links/ https://diggitymarketing.com/attention-affiliates-cloak-your-links/#comments Sat, 08 Nov 2014 07:06:52 +0000 http://diggitymarketing.com/?p=288

It’s no secret: Google isn’t a big fan of affiliate websites. Look at it from their point of view: they’re trying to rank the sites with the best information. An affiliate site’s primary purpose is selling a product, so they know you’re going to be saying damn well anything to get people to click on those affiliate links – which is likely far off the mark from “quality content.”

Leaving your affiliate links looking the way they usually do (i.e. long and sloppy) is a dead giveaway that you’re an affiliate – not only to Google, but to your visitors as well. If your site is an “unbiased” product review site, you’ll lose credibility when a reader sees that you’re directly making money off the product you reviewed.

So what’s the solution?

Cloak those Affiliate Links

Step 1: Create a sub-folder on your site

This can be easily accomplished by going to your cPanel, opening your file manager, and creating a new folder at the root level of your site. This is the directory where we’ll place your links, so name it something relevant. Some examples:

- www.site.com/products/
- www.site.com/recommendations/
- www.site.com/links/

Step 2: Edit your robots.txt

Add a “Disallow” line for the directory you just made:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /products/

Note: Some WordPress XML sitemap plugins interfere with the robots.txt by overwriting it. You don’t want that to happen.

Step 3: Open WordPress and install the plugin: Redirection

Redirection can be found here. Once installed, go to Tools > Redirection. For each of your affiliate links, specify how you would like the link to look in the “Source URL” field, and specify your gnarly affiliate link in the “Target URL” field. That’s it. (If you’d rather handle the redirects without a plugin, see the .htaccess sketch at the end of this post.)

Note: If you’re an Amazon Affiliate, cloak at your own risk. Amazon is pretty unforgiving about Terms of Service violations, and this is definitely a bannable offense.
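As an alternative to the Redirection plugin, the same cloaking can be done at the server level on Apache. Here’s a rough sketch of equivalent rules in an .htaccess file – the /products/ path matches the robots.txt example above, but the link slug and affiliate URL are made-up placeholders you’d swap for your own:

# One rule per cloaked link: pretty local path -> real affiliate URL
Redirect 301 /products/blue-widget https://affiliate-network.example.com/track?id=12345
Redirect 301 /products/red-widget https://affiliate-network.example.com/track?id=67890

Some affiliates prefer 302s here instead of 301s, since browsers cache 301s aggressively and a 302 lets you swap the destination later without stale redirects – either works for cloaking purposes.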
