Case Study: The Programmatic SEO Approach that Got Attention from Oracle and Google
https://diggitymarketing.com/case-study-programmatic-seo/ (15 Jan 2024)

What if I told you there's an SEO tactic that allows you to create hundreds of thousands of high-quality pages at scale, and at a fraction of the cost of doing so manually? Welcome to Programmatic SEO.

In this case study, you'll learn how my team at The Search Initiative grew our client's organic sessions by 38%. This was achieved by adopting a programmatic approach to SEO. Instead of hiring writers, we designed a single page template and let programming do the rest, creating 500 pages. On top of this, the site received links from over 700 referring domains, including industry heavyweights like Oracle and Google – without us doing any link building ourselves.

In this case study, you'll learn how to: If you'd prefer to watch rather than read, I cover some of the key insights in this video. Before that, here's some more information about our client's goals and the main challenges that we overcame during the campaign.

The Challenge

The client operates in the software industry, offering a platform to secure and manage open-source development for processes that involve custom combinations of tools and platforms. The client didn't have any visibility for keywords that would make them money, which meant that a large pool of lucrative keywords was being missed out on. Ranking one or two new pages wasn't going to cut it. So we created a long-term plan to attain tens of thousands of keywords without manually optimizing each page… programmatically. Find out how you can overcome these challenges for your website by following the steps below.

Generating Content Through Programmatic SEO

Traditional SEO focuses on enhancing a website's search engine visibility through high-quality content and keyword-centric landing pages and blogs. In most cases, it's a perfect way of building keywords, links, and mentions, as long as your content is well-crafted and adds value to the end user. But what if there was a way to do this at scale?

What is Programmatic SEO?

Programmatic SEO (often referred to as pSEO) involves creating landing pages at scale through automation, with the aim of ranking them on the search results pages. Each page is uniform, targets a single unique keyword, and is created using automated tools and a database. You've probably already used websites that have created pages in this way. Real estate websites, eCommerce sites, and even content sites (examples below) have been using this method to pass information from one system to another, allowing them to create thousands of property or product pages populated with data from suppliers, with generated titles, headings, images, descriptions, etc.

Here's an example on Realtor.com. If you were to search "houses for sale in [location]", you're likely going to find the following kind of pages in the results: Los Angeles, New York. Note how the layout of each of these pages is identical, but the content (i.e. the properties) changes. A quick Google search reveals that there are 19,495 cities, towns and villages in the USA.
Realtor.com can't possibly create a page for every location manually. Instead, they use programmatic SEO.

What's the Difference Between Traditional & Programmatic SEO?

Although the goal of both is the same – to grow organic traffic – the difference lies in how this is achieved. Traditional SEO focuses on growing search visibility over a long period of time, with a focus on producing high-quality, unique content that targets higher-competition keywords. Programmatic SEO achieves the same goal, but at a quicker pace, as you're creating multiple pages with an identical layout at scale from a database, templates and automation.

In most cases, if your template takes a page and duplicates it, changing only the keyword, that sucks for users and will likely also get penalized by Google for having large amounts of duplicate content. However, if you use a template to create thousands of product pages and generate helpful content that accurately describes each specific product, you're creating value for the user, which is what Google is looking for. However, when it comes to local SEO, you do indeed get a free pass and only need to change the city name in order to be "unique enough".

Example Uses of Programmatic SEO

Apart from real estate websites like Realtor.com, here are some other examples where websites have employed programmatic SEO to build their pages.

Informational Sites

Informational sites like Nomad List use programmatic SEO to help users find the best places to live, work and travel. The site takes data about things like cost of living, internet speeds, etc. for places around the world and produces landing pages in a way that is easy to understand and read. The site is essentially taking readily available data and repurposing it to provide valuable insights to its audience.

Directories and Workflow Apps

Zapier, a workflow automation tool that integrates thousands of applications and tools, programmatically made a landing page for every single tool they work with, showing custom workflows that can be built from the unique combination of the user's chosen tools. In the financial sector, Wise made a landing page template for every single currency they work with, helping their clients convert between any number of currencies.

Travel Itinerary Planners

Platforms like TripAdvisor and Expedia that help users plan their travel itineraries implement programmatic SEO to dynamically generate destination guides, optimize meta tags for popular attractions, and create content tailored to specific travel preferences. For example, TripAdvisor aggregates hotels based on location, so that when you search for keywords like "hotels in japan" or "restaurants in london", you'll see pages like this:

Job Portals

Job search websites like Indeed or Glassdoor often use programmatic techniques to generate job listings, company profiles, and other content for search engine optimization. The template for their job listing pages remains the same. The difference here is that the content is generated by their users, i.e. companies and individuals uploading job listings to their system so that they can be displayed on the relevant listing pages.

These are just a handful of examples, but regardless of your industry or the size of your website, you can make use of pSEO to generate pages at scale.
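To make the template-plus-database idea above concrete, here is a minimal sketch of how one page template can be filled from a dataset. This isn't how Realtor.com or the sites above actually build their pages; the CSV columns, template fields and output paths are hypothetical placeholders, and the sketch assumes Python with only the standard library:

    import csv
    from string import Template
    from pathlib import Path

    # One HTML template is reused for every page; only the data changes.
    PAGE = Template("""<html>
    <head><title>Houses for Sale in $city | Example Realty</title>
    <meta name="description" content="Browse $listing_count homes for sale in $city, $state."></head>
    <body><h1>Houses for Sale in $city, $state</h1>
    <p>There are currently $listing_count listings in $city.</p></body>
    </html>""")

    def build_pages(data_csv: str, out_dir: str) -> None:
        out = Path(out_dir)
        out.mkdir(exist_ok=True)
        with open(data_csv, newline="", encoding="utf-8") as f:
            # Expected (hypothetical) columns: city,state,listing_count
            for row in csv.DictReader(f):
                slug = row["city"].lower().replace(" ", "-")
                (out / f"houses-for-sale-in-{slug}.html").write_text(
                    PAGE.substitute(row), encoding="utf-8")

    build_pages("locations.csv", "pages")  # one page per row in the dataset

Swap the CSV for a spreadsheet export or an API feed and the same loop scales from 500 pages to 19,495 without touching the template.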
Pros and Cons of Programmatic SEO

Here are some of the advantages of Programmatic SEO: But before you commit to pSEO, here are some things you need to be aware of: Despite these disadvantages, even if you don't have a massive website, you can still incorporate pSEO into your SEO strategy. Read on to find out how…

How to Generate Content Through Programmatic SEO

Creating truly programmatic content requires sourcing data that can feed content for hundreds, if not thousands, of pages. We'll start by identifying keywords that you can target at scale, and data sets that you can integrate into your content, before using Google Sheets and ChatGPT to build out your web pages.

Finding Scalable Keywords

For programmatic SEO to be effective, you need to be able to target hundreds or thousands of similar keywords using a single page template. This requires identifying relevant keywords that have many similar variations. One way to do this is by entering a seed keyword relevant to your website into Ahrefs' Keywords Explorer tool. For example, if you had a website about everything vegan, you might use terms like "vegan restaurants". Then click on the Matching Terms report, which shows an expanded list of possible keywords related to the seed keyword.

Top tip: consider refining the search results by adjusting the Keyword Difficulty to less than 20 and the Domain Rating (DR) of top-ranking sites to less than 30. This filter will show keywords that are easier to rank for, even with limited backlinks.

Identify patterns within the keywords to see which of these can be used to create pages programmatically. In this case, many of the keywords refer to a particular location, preceded by the term "best", i.e. "best vegan restaurants in chicago". You can do this by sorting the keywords by "Term" as opposed to "Parent Topics". You now have 176 keywords (in the USA) that follow a similar pattern for you to create pages for.

Identify Modifiers

In the previous example, the keywords identified can be split into two categories: head terms and modifiers. The head term is the top-level category of the keyword that you'll aim for, i.e. "best vegan restaurants". The modifier is what turns the head term into long-tail keywords (highly specific search terms that have lower search volumes), i.e. "in chicago".

Here's another example… For a travel website, you could have the following head term and modifier combination: "things to do" + "in [location]". You could also narrow things down further by adding a secondary modifier: "things to do" + "in [location]" + "for [target audience]". By the end of this process, you'll have a long list of keyword variations that you can now start to generate content for.
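To illustrate the head term + modifier step above, here is a small sketch that expands one head term against two modifier lists into a full keyword matrix. The example terms mirror the travel example; the specific locations and audiences are made up:

    from itertools import product

    head_terms = ["things to do"]
    locations = ["in bangkok", "in lisbon", "in austin"]
    audiences = ["", "for couples", "for families", "for solo travelers"]

    # Every (head term, location, audience) combination becomes one target
    # keyword, which in turn maps to one templated page.
    keywords = [
        " ".join(filter(None, parts))
        for parts in product(head_terms, locations, audiences)
    ]
    for kw in keywords:
        print(kw)  # e.g. "things to do in lisbon for families"

Three locations and four audience variants already yield 12 keywords; with a few hundred locations the list grows into the thousands, which is exactly the scale a single page template is meant to serve.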

88% Monthly Traffic Growth & Outranking Amazon (SEO Case Study)
https://diggitymarketing.com/content-optimization-seo-case-study/ (2 Oct 2023)

Going up against big competitors in your industry, like Outlook India for affiliate sites, Amazon for eCommerce sites or Wikipedia for informational queries, isn't easy, but it's not impossible either. With careful content optimization, following the right UX best practices and building authority with quality backlinks, there's nothing stopping you from doing what we did with our client's rankings.

In this case study, you'll learn exactly how my team at The Search Initiative increased our client's organic traffic from 72.4k to 136.4k monthly sessions in under a year – and outranked the likes of Amazon for important keywords.

In this article, you'll learn how to: If you'd prefer to watch rather than read, I cover some of the key insights in this video. Before that, here's some more information about the website's goals and the main challenges that we overcame during the campaign.

The Challenge

The client is an online adult store primarily targeting the United States. The main objective for this campaign was to grow visibility for commercial intent keywords so as to drive organic traffic to important category pages, with the ultimate aim of generating more online sales through eCommerce.

One of the first things we did to build the site's domain authority and ranking power, so that it could compete against the likes of Amazon – one of their main competitors for adult toys – was to leverage the power of AI to execute a blogger outreach strategy. Didn't think Amazon ranked for these kinds of keywords? Think again…

At the same time, with the client operating in such a lucrative market, we needed to build topical authority and present them as a credible source of information. To address the lack of topical authority, we implemented a content strategy that focused on creating informative blog articles related to the adult toys that the client sells, and to the niche in general.

For any website looking to improve conversions, you need to ensure that you're providing users with a great experience, and that everything works as expected. The client's website lacked basic UX best practices, which was costing them both money and traffic.

Finally, to improve the positions of the client's "money" keywords (i.e. the ones that have commercial intent and bring in the most conversions), we focused our attention on optimizing existing category pages and creating new ones (more on this later). Learn how to tackle these challenges for your website by following the steps below.

Building Backlinks With Blogger Outreach (& AI)

When it comes to boosting the rankability of a web page, few SEO tactics come close to being as powerful as link building. One such link building tactic that we employ (and have continued to get great success from) at TSI is blogger outreach. Blogger outreach involves finding topically relevant websites that you can then reach out to for a link back to content their audience will find useful. Let's break down the main steps of blogger outreach – with the help of the generative AI tool ChatGPT.

Link Prospecting

There are a number of ways you can find link prospects, but in this case study, I'll show you the cool new way to do it – using ChatGPT. Link prospecting is about finding niche-relevant websites that can link back to your website.
I recommend using GPT-4 with the WebPilot plugin enabled. To do this, go to Settings and enable plugins. Go back to the chat, open the Plugin dropdown and select WebPilot – this is a plugin that enables GPT to scan live websites as opposed to relying on its training data from 2021. So here's a prompt that you can use to find these prospects:

Using the WebPilot plugin for ChatGPT, please give me a list of 20 popular websites within the [enter your niche].

Here's an example of a response from GPT providing websites from the audio tech niche. It was able to find an article that lists 30 popular audio tech sites, and it asked if that's what I wanted. I said yes, and it listed the sites along with their links. Now, AI tools like ChatGPT aren't perfect (yet), so it's important to manually check each of these sites to ensure they're relevant to your website.

You can also apply a similar approach to identify online communities within your niche. Here's a prompt:

Using the WebPilot plugin for ChatGPT, please give me a list of 10 popular online forums/communities within the [enter your niche].

Here's the response from ChatGPT: By the end of this process, you should have a considerable list of potential websites to get a backlink from.

Getting Contact Details

Once you've prepared your list of websites that you want to reach out to, the next step is to find their contact information, i.e. an email address of a writer at the site, or their general contact form. You can usually find email addresses on the contact us page of the website (see why having one is important?), but if you're unable to find one, you can use a tool like Hunter.io – they have a Chrome extension that allows up to 25 searches per month for free. If you're using their website, enter the domain into Domain Search and hit enter. You can see that the tool has found 21 email addresses for the example site.

Pitch Creation

The most important step in this process is the pitch – this will be the deciding factor in whether or not you get a backlink from your prospective site. The key to crafting a great outreach pitch is adding personally relevant information about the website that you're pitching to. This doesn't mean that you can't template the process, but you still need to personalize the pitch based on who you're pitching to. In general, you need to explain:

To create a pitch worthy of securing backlinks, why not use the help of ChatGPT? Here's an example of a prompt that you can use to do this:

Using the WebPilot plugin for GPT, please write a blogger outreach pitch to [enter prospective website URL] (might want to include their about page) with the goal of writing a guest post for their site. Please include a subject line. My website is [enter your URL], and it's on the topic of [insert your niche]. Guidelines:

Here's what ChatGPT came up with: GPT even provides a compelling subject line that you can use.

Monitoring Progress

Once the emails have been sent out, all is not done. You should monitor your progress to identify which pitches (and subject lines) worked best. Blogger outreach is something that constantly evolves over time, especially as not every website will get back to you on the first go. You can also ask GPT to provide a template for follow-up emails… Learn more about how to carry out blogger outreach in detail here.
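If your prospect list grows large, the contact-finding step can also be scripted: Hunter.io exposes a REST API alongside the Chrome extension mentioned above. The sketch below is a hedged example of querying its Domain Search endpoint; the endpoint path and response shape are based on Hunter's public v2 API docs (verify against the current docs before relying on them), and the prospect list and API key are placeholders:

    import requests

    API_KEY = "your-hunter-api-key"  # placeholder
    prospects = ["example-audio-blog.com", "another-niche-site.com"]  # assumed list

    for domain in prospects:
        resp = requests.get(
            "https://api.hunter.io/v2/domain-search",
            params={"domain": domain, "api_key": API_KEY, "limit": 5},
            timeout=30,
        )
        resp.raise_for_status()
        for email in resp.json()["data"]["emails"]:
            # Prefer named contacts (writers/editors) over generic inboxes.
            print(domain, email.get("value"), email.get("position"))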
Here's a look at the kind of results we got from this link building campaign:

Building Topical Authority with Engaging Blog Content

Apart from optimizing and creating new content for your website's core landing pages (i.e. the pages that drive your business), it's important to grow your topical authority – a concept that's been thrown around a lot in recent years. Read on to find out what it is, why it's important and how you can build it for your own site.

What is Topical Authority?

Topical authority refers to a website's perceived expertise and credibility on a specific topic or subject area. Search engines like Google aim to deliver the most relevant and high-quality content for searchers' queries and use signals like topical authority to determine the ranking of web pages in search results. Websites that consistently produce high-quality, informative, and relevant content on a specific topic are more likely to be seen as having topical authority.

Here's an example… Imagine you're venturing into the realm of sustainable fashion. Crafting a single article that revolves solely around 'sustainable clothing' might not be sufficient to thrive in this niche. Why? Because sustainable fashion is a broad subject, and attempting to encapsulate its entirety within a single article would be an overwhelming task. To establish yourself as an authority in this field, your content needs to cover a wide range of topics within sustainable fashion, including:

Why is Topical Authority Important?

Google associates a website with a topic in order to rank it as a relevant resource for search terms about that particular topic. By having a lot of content about a specific topic on your website, you create opportunities to add internal links between the pages, so that Google and users can find your content more easily. Doing so also helps increase your topical authority in the eyes of Google and helps show that your website is the go-to…

How to Grow Traffic by 368% (Market Disruption Case Study)
https://diggitymarketing.com/market-disruption-case-study/ (14 Aug 2023)

Whether you're starting out from scratch or growing your existing brand's organic search presence, there's nothing stopping you from disrupting the search results pages of your industry one step at a time. The key to doing this is keyword selection. Combine this with killer, shareable content and a solid link building strategy, and you're well on your way to boosting your organic traffic.

In this case study, you'll learn exactly how my team at The Search Initiative increased our client's organic traffic by 368% and monthly revenue by 445%, despite their industry being dominated by a big competitor. If you'd prefer to watch rather than read, I cover some of the key insights in this video.

In this article, you'll learn how to: Before that, here's some more information about the website's goals and the main challenges that we overcame during the campaign.

The Challenge

The client produces and sells nicotine alternatives that use natural ingredients instead of tobacco. The brand and product had close to zero online presence when they joined the agency, and their main competitor had already monopolized the market – so the main objective for the campaign was to build organic traffic from the ground up and generate online sales through eCommerce.

One of the main challenges was the fact that keywords with any substantial search volume were extremely competitive and dominated by their main competitor. As a start-up, the client's site lacked the topical and domain authority required to compete for such terms. Instead, we focused on building visibility by targeting informational search queries about quitting smoking, such as smoking alternatives.

With the focus on producing more content on the blog, we needed to make sure that the client's products were showcased in the articles so that users were enticed to click through and place an order. This was achieved by implementing a related posts section (designed to keep users on the site longer), a sidebar (so that we could add calls to action) and internal links (to drive users towards relevant product pages).

To build the brand's online presence, we focused our link building efforts on digital PR, so that people would start to learn that the brand and product range existed. Check out the video below to learn more about digital PR.

Finally, we rounded off the strategy by making sure that any technical issues holding back the site's performance were addressed. The two main issues we tackled were implementing a more user-friendly URL structure (so that URLs were easy for users to read, remember and share) and improving page performance (by compressing and minifying large files). Find out how you can overcome these challenges for your website by following the steps below.

Informational Keyword Targeting

With the commercial keywords out of bounds due to high competition and a lack of topical authority, we focused on building the site's keyword visibility for informational search terms instead, by producing content around question-based keywords related to smoking.

How to Find Question-Based Keywords to Target

I'll show you two ways to find question-based keywords that are relevant to your site's niche.

Ahrefs Keywords Explorer

The first is using Ahrefs' Keywords Explorer tool.
You can now go ahead and write an article about the lifespan of cats.

People Also Ask

If you don't have Ahrefs, you can still find informational keywords from the People Also Ask section in Google's search results. In the above example, Google added the following two questions, "What is the friendliest cat to own?" and "What is the kindest type of cat?", when expanding the original question "What is the best cat for first time owner?".

Writing Content for Informational Search Terms

Once you've found the informational keywords that you want to target, the next step is to start writing the content. Here are some tips on how you should do this: Here's what you need to look for: Use your competitors as a baseline for what you need to include within your page. You should also look at how they have optimized other on-page elements, such as the title tag, H1 heading, meta description and URL, for the target keyword.

For example, here's the featured snippet for the keyword "how long do cats live". And here's where the answer is taken from within the ranking page. As you can see, the page provides a clear and concise answer to the question right at the beginning of the article. Then it goes into much more detail about the lifespan of cats later on in the article by covering more specific topics, like which types of cats live the longest, etc. The short answer caters to users who just want an instant answer to their search query, whereas the rest of the article is for users who want to learn more.

This is especially important if you're writing about a topic that can impact a user's physical, mental or financial well-being. But even websites that don't fall into this category need to demonstrate their expertise in order to build trust. Here are a few ways you can achieve this: Following the above steps will help grow your keyword visibility for informational search terms.

Redesigning The Blog for SEO

Apart from optimizing and expanding the content on your blog, it's also important to follow best practices that make it as functional as possible for users. Below are things that you should include within your blog to improve navigation and overall usability.

Add Related Posts

A Related Posts section provides readers with additional relevant content that they might be interested in. It helps users discover more of your blog's content, increasing their engagement and time spent on your website. In addition to this, related posts help:

How Should I Choose Related Posts?

When selecting the related articles, make sure that they are indeed relevant to the current post so that readers are naturally encouraged to continue reading.

Where Should the Related Posts Be Added?

The most common place to put your related posts section is towards the bottom of the page.

How Many Related Posts Should I Include?

When it comes to how many related posts to include, I recommend aiming for 3 to 5. Adding too many will confuse readers as to what to read next and dilute the importance of each related article.

How Can I Add Related Posts?

One of the most popular WordPress plugins for related posts is the Contextual Related Posts plugin. It's a free plugin that automatically generates a list of related posts based on contextual cues from your blog post's title and content. All you have to do is install and activate the plugin. If you're looking for something that allows for more customization, check out the Custom Related Posts plugin, which allows you to pick your related posts manually.
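If you'd rather choose related posts programmatically than rely on a plugin, the contextual matching these plugins perform can be approximated with TF-IDF cosine similarity over your post titles and bodies. This is my own sketch of the general technique, not the Contextual Related Posts plugin's actual algorithm; it assumes scikit-learn and a hypothetical set of posts:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    posts = {  # hypothetical slugs mapped to title + excerpt text
        "how-long-do-cats-live": "How long do cats live? Average lifespan of indoor cats...",
        "best-cat-breeds-for-first-time-owners": "Friendly cat breeds for new owners...",
        "kitten-feeding-guide": "What and how often to feed a kitten...",
    }
    slugs = list(posts)
    matrix = TfidfVectorizer(stop_words="english").fit_transform(list(posts.values()))
    scores = cosine_similarity(matrix)

    for i, slug in enumerate(slugs):
        # Rank every other post by textual similarity; keep the top 3-5 as "related".
        ranked = sorted(
            ((scores[i, j], slugs[j]) for j in range(len(slugs)) if j != i),
            reverse=True,
        )
        print(slug, "->", [s for _, s in ranked[:3]])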
Add A Sidebar

The purpose of most blog posts is to provide information, not to sell. However, that doesn't mean that you can't convert users looking for information into customers. One way to do this is to leverage the space in your sidebar by adding calls to action (CTAs) and other enticing content. Here are some examples of what you can add to boost engagement and conversions: Building out your sidebar with these elements can transform the engagement levels on your website.

Add Internal Links

You've already learned about internal links in the form of Related Posts, but let's focus on internal linking in its most traditional form. The idea of internal linking is to add a hyperlink from one page on your website to another. These links help establish connections between different pages, improve website navigation, and assist search engines in understanding the structure and hierarchy of your website's content. Here are some tips on how to implement internal linking on your blog (a short script for spotting missed link opportunities follows this section):

Use Relevant Anchors: use descriptive anchor text (the clickable part of the link) that accurately reflects the linked page's content. If possible, use anchor texts that include the main keyword you're trying to rank the linked page for.

Strategic Placement: incorporate internal links so that they fit naturally within the body of your content.

Link to Relevant Pages: distribute your internal links across relevant pages to ensure a well-connected website structure.

Don't Overdo It: avoid excessive internal linking on a single page, which can confuse users and dilute SEO value.

Link Back to Your Blog Post: don't forget to also add internal links back to your blog post, so that it too can benefit from other pages' authority.

Building Brand Awareness With Digital PR

Considering that the client's brand had no online presence and the product range itself had zero search volume, the site needed a boost in domain authority. The best way to do this is with backlinks. Instead of general blogger outreach, we showcased the client's product range with content that highlights smoking alternatives, targeted towards people looking to quit smoking, and built awareness of their…
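As promised in the internal linking tips above, link opportunities can be found mechanically: scan each published post for unlinked mentions of a target page's keyword. A minimal sketch, assuming posts are available as local HTML files (the paths and target keywords are hypothetical) and using BeautifulSoup:

    import re
    from pathlib import Path
    from bs4 import BeautifulSoup

    # Hypothetical mapping of target pages to the keyword each should rank for.
    targets = {"/products/nicotine-pouches/": "nicotine pouches"}

    for page in Path("site/blog").glob("*.html"):  # assumed export location
        soup = BeautifulSoup(page.read_text(encoding="utf-8"), "html.parser")
        existing = {a.get("href") for a in soup.find_all("a")}
        text = soup.get_text(" ")
        for url, keyword in targets.items():
            if url not in existing and re.search(rf"\b{re.escape(keyword)}\b", text, re.I):
                print(f"{page.name}: mentions '{keyword}' but doesn't link to {url}")

The output is a review list, not something to auto-apply; each suggested link still needs the relevance check described above.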

How to Grow Traffic by 60.99% & SEO Topical Authority (Case Study)
https://diggitymarketing.com/seo-topical-authority-case-study/ (8 May 2023)

Whether you're working on a brand new website or building your existing site's organic presence, it's important to showcase your topical authority to both Google and your audience. They want to see that the information you provide, or the products that you sell, are of great quality and come from a credible source. That's why it's essential to present yourself as an authority through your content and your backlink profile.

In this case study, you'll learn the exact steps that my team at The Search Initiative took to increase our client's organic traffic by 60.99%.

In this article, you'll learn how to: Before that, let's find out a bit more about the website's goals and the main challenges faced during the campaign.

The Challenge

The client is a website selling health supplements in the United States, who joined the agency having already decided that they wanted to migrate their dated website and build a fresh site from the ground up with SEO in mind. One of the main challenges of this campaign was to make sure that the migration was carefully planned and executed so that the site's original rankings and traffic were sustained. We also added E-E-A-T signals that established the client as a trustworthy and authoritative voice within the industry – read on to learn how to do this.

The next focus was to produce well-optimized content for the product pages by writing compelling descriptions that convert, as well as producing supporting blog content to build topical relevance and improve keyword visibility for informational keywords. Finally, to build the backlink profile of the new domain, we boosted the newly optimized product pages by executing a link building strategy, including HARO (you'll learn all our tricks soon). Find out how you can overcome these challenges for your website by following the steps below.

Executing an SEO-Friendly Site Migration

The decision to migrate your entire website shouldn't be taken lightly. In some cases you might not have a choice, which is why it's worth being aware of the risks and challenges that you might face along the way. Apart from potentially losing or breaking parts of your website entirely, a site migration can also have significant impacts on your SEO. I'll walk you through the process of an SEO-friendly site migration below, but for now, let's look at what a site migration actually is.

What Is a Site Migration?

A site migration is the process of making significant changes to a website's structure, technology or design. Here are some examples of why you might want to consider a site migration: Unsure about which hosting provider to use? Check out my video below.

How Site Migrations Can Impact SEO

A site migration is a substantial task that requires rigorous and thorough research and planning before you even begin to execute the strategy. With many moving parts and teams involved, it's important to be aware of the SEO impacts that this can have. For example, without careful planning, you may lose your site's precious rankings and traffic due to incorrect URL mappings and redirects, or you may encounter slow site-wide loading times as a result of incorrect server configurations.
Even after the migration is complete and everything has gone to plan, it's crucial to monitor progress to ensure that your site maintains and/or builds as much organic visibility as possible post-migration.

SEO Site Migration Checklist

I've put together this checklist so that you can be sure that your migration will be executed in an efficient and SEO-friendly way.

Due to the scope of the work and the risks involved, it's good practice to plan your site migration for a time when, should things go wrong or there be a temporary dip, the impact is minimal. This is, of course, assuming that your business has some seasonality to it. A site migration before or during the holiday season is not advised. The last thing you want is to lose out on all of the potential traffic from a peak season as a result of your migration not being executed properly. Likewise, some days of the week may be quieter than others. For example, you may find that the start of the week is the best time to do your migration, as fewer people browse your site at this time. Personally, I always migrate on Saturdays. Therefore, make sure to pick a smart migration date that allows time for you to monitor progress and iron out any issues that present themselves post-migration.

Using an SEO web crawler tool of your choice (i.e. SEMrush, Screaming Frog, Sitebulb, etc.), crawl your website so that you have a comprehensive record of everything that is on your website before the migration. For example, this will ensure that you have: You should also create a backup so that if, for whatever reason, the launch of your new site doesn't go to plan, you can always revert back to the original. Here's a comprehensive guide on how to back up your website. This precautionary step is vital in making sure that you have peace of mind before launching.

The next step is to create a copy of your website so that you can test and revise all of the changes that you intend to make – this is called a staging site. The staging site should be uploaded to a new server and, ideally, should be on a separate domain or subdomain, so that you can compare your new website to the original site before the launch. This'll allow you to check and verify important things, like making sure that the URLs from your old site are correctly redirected to the new site.

Another important step is to block Google's access to this staging website. This is to prevent both the staging site AND the original site from being indexed by search engines and, in turn, potentially competing against each other for rankings. There are several ways you can do this: Add the following line of code to your pages: <meta name="robots" content="noindex">

You can't carry out a migration without a clear and well-organized URL map. Mapping your URLs involves keeping a list of all of your existing URLs so that you can match them up with the new URLs on your new website and implement the necessary redirects correctly. This is where crawling your original website comes in useful. In addition to your crawl, you should also compile your URLs from your XML sitemap (a file that serves as a roadmap for Google to access your pages). You can access your sitemap via the following URL: example.com/sitemap.xml

Your URL map can be as simple as this:

Original URL | New URL | Redirected?
olddomain.com/old-url/ | newdomain.com/new-url/ | Yes/No

If you're changing the structure of your URLs or combining old pages into a single page, remember to map all of the combined pages' URLs to the correct new URL.

When it comes to actually migrating the content (i.e. your HTML files, images, etc.), I strongly recommend doing so in small chunks. The more pages you move at once, the higher the chances of something going wrong. So take your time and carry out the migration while performing checks and tests at each stage. This'll make it easier for you to identify any potential issues that may arise. This is also your chance to update any of the content if you need to.

Without implementing proper 301 redirects, your new pages will not receive the traffic and PageRank (ranking power) of the old pages, which will significantly impact the new site's rankability. 301 redirects are a way of indicating to users and search engines that the location of a web page has permanently moved. Using your URL map, you can go through each of your pages and implement a 301 redirect from the old URL to the new URL. If for whatever reason you have an old URL that is no longer needed, you can redirect it to another relevant page on your website, or serve a custom 404 page to let users know that the page no longer exists.

Once you've redirected your URLs, you also need to remember to update all of the internal links (hyperlinks from one page on your site to another) on your new website. You can find all of the internal links in the website crawl that you did right at the start of the process. You should change all of the internal links from the old URLs to the new URLs to avoid unnecessary internal redirects. Imagine the following scenario: if you don't update the internal link, Page A has an internal link to page b but, when clicked, page b redirects to page B. This creates an unnecessary internal redirect that, when multiplied across thousands of requests, can seriously affect page load times. Therefore, you should update the internal link so that you have:

Although your…
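One practical way to act on the URL map above is to generate the 301 redirect rules rather than hand-writing them. Here is a hedged sketch, assuming an Apache server and a two-column CSV exported from the mapping spreadsheet (the file names and column names are placeholders):

    import csv

    # url_map.csv columns (assumed): old_url,new_url - paths from the mapping sheet.
    with open("url_map.csv", newline="", encoding="utf-8") as f, \
         open("redirects.conf", "w", encoding="utf-8") as out:
        for row in csv.DictReader(f):
            old, new = row["old_url"].strip(), row["new_url"].strip()
            if old and new and old != new:
                # Apache syntax; nginx would use: rewrite ^/old-url/$ /new-url/ permanent;
                out.write(f"Redirect 301 {old} {new}\n")

Generating the rules from the same sheet you use for QA keeps the "Redirected?" column honest: every row either produced a rule or was deliberately skipped.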

How to Grow Traffic 114.25% With a Content Driven SEO Plan (Case Study)
https://diggitymarketing.com/finance-lead-generation-seo-case-study/ (13 Feb 2023)

Great content is the bedrock of SEO. Without it: In this actionable case study, you'll learn the exact strategy that my team at The Search Initiative took to increase our client's organic traffic by 114.25% in 12 months.

In this article, you'll learn how to: But first, let's find out some more about the site's goals and the main challenges that were faced during the campaign. And if you prefer to consume your content via video, you can watch the full breakdown here:

The Challenge

The client is a finance lead generation website targeting the United Kingdom. The main objective of the campaign was to drive further organic traffic to the core landing pages and diversify the types of keywords that the client was ranking for by targeting terms with informational search intent as opposed to just commercial intent. The site had many core landing pages targeting commercial intent keywords, but was lacking content focused on ranking for informational search terms. This was a missed opportunity to address different stages of the user's journey, as well as to create content-rich pages that could be used to build backlinks and improve internal linking. We created a content plan for groups of topically related articles known as Power Hubs (aka Content Hubs), which you'll learn how to do soon. The content plan also involved creating new core landing pages (aka "money pages" – the pages most important for your service, i.e. product pages for an eCommerce website) to target informational keywords.

As the content strategy primarily focused on creating new pages via the Power Hubs, the next challenge was to carry out an outreach strategy with the aim of boosting their rankability while simultaneously building the overall authority of the domain. Finally, we identified two core technical drawbacks facing the website: poor main menu navigation, which resulted in a bad user experience, and lots of orphan pages (i.e. URLs with zero internal or external links pointing to them), which prevented those pages from being crawled by Google. Follow the steps outlined below to find out how you can overcome these challenges for your own website.

Targeting Informational Keywords With Power Hubs

There are four main types of keyword groups, each of which has a specific purpose or intent. They are: It's important to optimize your website to target a wide variety of keyword types, as they correspond to different stages of the searcher's journey. For example, users searching for informational keywords tend to be at the beginning of their search journey, as they're likely researching a topic that they aren't familiar with, whereas those searching transactional terms are further along the journey, as they're closer to making an actual purchase or performing an action (i.e. subscribing to a newsletter). This client's website was lacking informational content that catered to the former group of searchers, which meant that they were missing out on ranking for a large subset of keywords related to their niche. Instead of writing standalone informational articles on your blog, you can grow the amount of informational content on your website by writing and publishing Power Hubs.

What Are Power Hubs?
Power Hubs (aka content hubs) are content pieces that comprise a single pillar page supported by a group of supplemental pages, each of which is topically related to the others. These pages are then connected to each other with hyperlinks, i.e. the pillar page links to each of the cluster pages and vice versa.

What Are the SEO Benefits of Power Hubs?

Publishing Power Hubs has the following SEO benefits:

How to Find Potential Topics for Power Hubs

I'll show you three different techniques to find potential topics for Power Hubs – two of which are free!

Ahrefs Keywords Explorer

The first method is via Ahrefs' Keywords Explorer tool. 💡 Top Tip: use the Terms filter on the left-hand side to see groups of topically related keywords that contain similar (or common) words. This is a great way to find your core topic for the pillar page.

Let's look at the keyword opportunities for "art". In this case, you can see that there are several keyword variations related to what NFT art is, with a parent topic of "nft art" – making it a great keyword choice for our pillar page. This is because the pillar page should target a broader topic ("nft art") whilst cluster pages target more specific subtopics. As you did before, use the groups in the left sidebar to find topically related keywords. Without even clicking on any of the groups in the left sidebar, you can start to spot certain keyword groups and topics for your cluster pages. Each of the above could be a cluster page that goes into detail about how people can create, buy and sell NFT art. For example, here are 133 additional keywords that you can target for "how to make nft art".

Related Searches

If you don't have Ahrefs, you can also use Google's Related Searches section to find pillar and cluster topics. In this case, we have "nft art" again. As you can see, Google's and Ahrefs' results align, as similar keywords appear in both. The pillar page can target "nft art", with cluster pages targeting specific terms like "how to create nft" and "how to sell nft art".

People Also Ask

Another great (free) way to find keyword ideas for your pillar and cluster pages is via Google's People Also Ask section. This is especially useful for finding informational keywords. All three methods can be used to put together a comprehensive list of keywords to target for both the pillar page and the supplementary cluster pages. Before you know it, you've got the building blocks for your Power Hub…

💡 Top Tip: If you really wanted to take things to another level, you could repeat the same process to create a Power Hub within your Power Hub. I'll explain… Let's go back to Ahrefs Keywords Explorer and look at the additional keywords that we could target for "how to create nft art". Use a similar approach to finding cluster page ideas by looking at whether there are further ways to group the keywords. In this case, we have groups for "digital" and "3d" NFT art. This means you can turn your cluster page about "creating NFT art" into its own pillar page, with its own cluster pages. After identifying these keywords, the next step is to start writing your Power Hub!

How to Write Content for Power Hubs

Let's take a look at how you should go about writing your content hubs: Here's what you should look out for: Based on this, write your content so that it aligns with the content that Google has already rewarded on the competing pages.
Always remember to write content that:

Internal Linking Strategy for a Power Hub

One area where Power Hubs differ slightly is the internal linking strategy.

What Is Internal Linking?

Internal linking is the process of adding a hyperlink from one page on your website to another page on your website. Importantly, in the context of SEO, the links you add to other pages on your website should be relevant to the linking page. For example, if you have a website that sells men's socks, you might have an internal link from a blog article about Christmas gift ideas for men to one of your product or category pages.

What Are the Benefits of Internal Linking?

Internal linking has benefits for you, the user and search engines like Google.

How to Structure Internal Links for a Power Hub

Power Hubs provide many internal linking opportunities; for example, you should add internal links from the: Following the above will create a network of links for Google and users to follow (see the sketch at the end of this section for how this link map can be laid out).

Frequently Asked Questions About Power Hubs

What Can a Power Hub Include?

Apart from written text, Power Hubs may also include a range of other types of content that improve the overall user experience, such as:

What's the Goal of a Power Hub?

The main purpose of a Power Hub is to provide informative and educational content that is suitable for all users. This means that you should aim to include content that applies to beginners and advanced users alike. Ultimately, you want to include as much information as possible that showcases your expertise within a particular topic.

How Do Pillar and Cluster Pages Differ?

The main difference between pillar and cluster pages is that:

Improving Keyword Visibility for Informational Keywords

As mentioned before, it's important to ensure that your website targets a range of keywords so that you capture users at different stages of their search process. The client's site was well optimized for transactional intent keywords (i.e. where users are looking to perform some sort of action, such as making a purchase or subscribing to a newsletter), but the site was lacking pages that targeted informational intent. This…
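As referenced above, the hub-and-spoke pattern (pillar links to every cluster page, every cluster links back to the pillar and, where genuinely relevant, to its siblings) is simple enough to lay out as a link map before any content is written. A sketch with hypothetical URLs based on the "nft art" example:

    # Hypothetical Power Hub structure: one pillar, several cluster pages.
    pillar = "/nft-art/"
    clusters = ["/nft-art/how-to-make/", "/nft-art/how-to-buy/", "/nft-art/how-to-sell/"]

    links = []
    for cluster in clusters:
        links.append((pillar, cluster))   # pillar -> each cluster
        links.append((cluster, pillar))   # each cluster -> pillar
        # Sibling links only where the topics genuinely relate; prune by hand.
        links += [(cluster, sib) for sib in clusters if sib != cluster]

    for src, dst in links:
        print(f"{src} should link to {dst}")

Handing a list like this to the writers alongside the content brief makes it much harder to end up with orphaned cluster pages.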

How to Grow Organic Traffic by 131% With Cold Email (Case Study)
https://diggitymarketing.com/link-building-outreach-email-study/ (26 Dec 2022)

Cold email outreach is still widely used for link building. However, conversion rates have decreased drastically, as many link builders have used cold outreach the wrong way. Spending time on prospecting, finding email addresses, crafting email copy, and sending emails just to receive a couple of responses is very frustrating. So, what can you do to increase response rates and get those high-DR links? Below, you'll find the cold email outreach strategy that landed us more than 1,500 backlinks with an average domain rating (DR) of 68. Discover the tactics you can replicate to get featured in HubSpot, Ahrefs, G2, Zapier, and many other high-authority websites.

The goal of this case study is to share tips that you can use in your cold email campaigns to get higher conversion rates. You'll learn how to: Before you jump into the details, it's important to understand a bit of context about the goals, approach, and tools used.

Cold Email Is a Powerful Tool for Building Backlinks & Growing Organic Traffic (If Used Correctly)

Creating good content that delivers value to the audience helps attract backlinks, but it is not enough. Competing in the cold email software niche is extremely difficult, and most competitors publish a lot of fresh, quality content that is supported by backlinks from high-authority websites. The solution was to start actively building backlinks through cold outreach. The biggest issue with cold outreach is that everyone is doing it, and to build backlinks at scale, you need to be creative with your approach. Given that Hunter is a tool that helps you with cold outreach, it was a no-brainer to use Hunter for all aspects of the process – from finding valid email addresses to automating cold email campaigns. You can go crazy and spend quite a lot of budget on tools, but in a nutshell, you only need Ahrefs and a system for tracking backlinks (Google Sheets does the job).

This is how it looked before starting with active link building. The domain had quite a strong backlink profile, but organic traffic and rankings had plateaued, which wasn't enough to keep up with the competition. We started actively building backlinks through cold outreach in January 2021. This is how it looks today. In just a year and a half, more than 6,000 relevant backlinks were built, and organic traffic increased by 2.3x. All that in an extremely competitive industry (it's hard to find any keyword with a keyword difficulty lower than 25). Below, you'll find the key takeaways learned along the way and how you can apply them to your own link building strategy.

Takeaway 1: Guest Posting and Claiming Unlinked Mentions Have 13.7% Conversion Rates

Key takeaway: When starting with guest blogging, getting featured in high-authority publications in your industry will be tough without a strong writing portfolio. Most editors ask that you share a couple of writing samples published under your name. The best approach here is to start writing for smaller but relevant blogs in your niche with positive traffic and authority growth trends. Slowly build up your writing portfolio and start pitching to higher-authority websites.
We'll cover how to pitch to these higher-authority sites shortly…

Key takeaway: Before we get into tracking mentions, it is essential to note that this tactic doesn't work for everyone. Hunter was an established, recognized brand before we started with this tactic, so for it to yield results, you need to have an established brand to start with. Here is what you can do instead…

Create image link bait content and reclaim unlinked attributions. The simplified process looks like this: create custom images and upload them to stock image sites such as Pixabay. Make sure to include all the details for attribution, so that if people use your images, they can properly attribute them to your website. The biggest challenge is that many people will use your stock images, but the link will often lead to the stock image website instead of yours. That's where tracking mentions comes in handy. By setting up mention alerts, you can reach out to the sites that have used your image but linked to the stock image website, and ask them to link to your website instead. Additionally, you can periodically use a reverse image tool to find where your images have been used without proper attribution. Tips on implementing this strategy:

Key takeaway: Although the open rate was good and the response rate was the highest compared to other campaigns, the conversion rate is noticeably the lowest. The main reason is that many prospects respond with unrealistic paid options.

Extra tip: How to achieve a 40% response rate for your guest blogging campaign

Another tactic that works well is collaborating with link-building partners and introducing each other to editors with whom you have previously published guest posts. This is how you can get into a conversation with high-quality websites such as Zapier and G2 (and publish a guest post there). Here's a template you can try. In the email opener, mention a shared connection and compliment their blog post. Also, if they have already published a guest post of yours before, there is a high chance that they are interested in publishing another one.

Key takeaway: The skyscraper tactic was popularized by Brian Dean from Backlinko almost 10 years ago. People tend to scrape lists and use the shotgun method – sending as many generic emails as possible with minimal personalization. It worked before, but now it will likely hurt your email deliverability and sender reputation. What you can do, however, is send fewer emails and put your time into personalization. We do this, but over time we noticed that open rates, as well as conversions, were slowly declining. This is, so far, the lowest-performing campaign we have run.

Takeaway 2: Personalized Subject Lines Outperform Generic Ones

Crafting a winning subject line can be tricky. It really depends on the industry and context. At Hunter, we tried generic, one-word subject lines, as well as longer and personalized ones. Here's what you can take away from the results: When reaching out to high-authority websites to ask for a guest post opportunity, the more personalized your subject lines are, the better. Our best-performing subject line for guest post pitching is:

{{company:"your company"}} x Hunter.io collab idea?

Our worst-performing subject line for guest post outreach was:

Content collaboration 🤝

For the skyscraper method (it still works, but performance is decreasing), short but personalized subject lines prove to work best:

{{first_name:"Hey"}}, collaboration?
Longer personalized subject lines don't work as well:

{{first_name:"Hello"}}, I think you may find it useful.

Key takeaways: Subject lines that worked best for us:

Takeaway 3: Follow-Ups Can Increase Response Rates by 66%

According to Backlinko, follow-ups are an essential part of cold outreach. Sending multiple follow-ups can increase replies by 66%. Here's how it works for us: Roughly 65% of all replies come from follow-ups. Here's another campaign where we sent emails to 600 recipients: Roughly 47% of all replies come from follow-ups. It's safe to say that follow-ups are mandatory in any cold outreach campaign. Maybe the person you're reaching out to is getting hundreds of emails per day and simply skipped your email. Or maybe they skimmed through their inbox quickly and forgot to open it. There can be many reasons why someone doesn't get back to you. We tested multiple sequences, and what works for us is the following sequence: If you don't get a reply at all, you can snooze the conversation and reach out again in a couple of months. Alternatively, you can find other potential decision-makers and try reaching them.

Takeaway 4: Find the Right Decision-Maker & Don't Quit After the First Sequence

One of the most challenging parts of cold outreach (apart from crafting a personalized email) is reaching out to the right decision-maker. Cold outreach is not a one-size-fits-all approach, and it depends on the context. For instance, if your outreach goal is to score a backlink from a specific website, then the right decision-maker would be someone who manages content at the company. There are cases when you can't find the contact information of a decision-maker because the company is too small, or their only available email address is a generic one, such as info@company.com. In these instances, it's fine to reach out to the CEO or the founder, or simply to any available email address. The open and response rates will be lower, of course, but many times they will refer you to the right person. Or you can ask who the right person to contact is. Here is an example of a follow-up email we used: And here you can see the results this email achieved: As you can see, 20.5% of all replies come from the second follow-up, where you ask for an alternative contact. Most of these replies are helpful and point you…
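A side note on the {{first_name:"Hey"}} placeholders quoted earlier: they are mail-merge variables with a fallback value for when a field is missing from the recipient record. If you wanted to reproduce that behavior outside a campaign tool, a minimal renderer might look like this (the parsing below is my own sketch of the syntax as shown, not Hunter's actual implementation):

    import re

    PLACEHOLDER = re.compile(r'\{\{(\w+):"([^"]*)"\}\}')

    def render(template: str, fields: dict) -> str:
        # Replace each {{name:"fallback"}} with the field value, or the
        # fallback when the recipient record doesn't have that field.
        return PLACEHOLDER.sub(
            lambda m: fields.get(m.group(1)) or m.group(2), template)

    subject = '{{first_name:"Hey"}}, collaboration?'
    print(render(subject, {"first_name": "Maria"}))  # Maria, collaboration?
    print(render(subject, {}))                       # Hey, collaboration?

The fallback is what keeps a half-empty prospect list from producing broken subject lines like ", collaboration?".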

How To Grow Traffic 90.97% With A Custom SEO Plan (Case Study)
https://diggitymarketing.com/custom-seo-case-study/ (10 Oct 2022)

If you aren't keeping every aspect of your website's SEO performance in tip-top shape, you'll struggle to see much organic growth. Every SEO campaign is different – some websites require focus on just one or two of the core components of SEO (content, backlinks, and technical factors), whereas others may require optimization and improvements across the board. These SEO issues can be discovered by carrying out an SEO audit of your website, which can uncover a whole range of action points that need to be addressed. That's why it's always essential to audit, analyze and optimize what's already on your site.

In this case study, you'll learn the exact steps that my team at The Search Initiative took to increase our client's organic traffic by 90.97%.

In this article, you'll learn how to: Before that, let's find out a bit more about the website's goals and the main challenges faced during the campaign.

The Challenge

Before joining The Search Initiative, the site was struggling to break into the first page of the search results for many important keywords. Therefore, the main goal of this campaign was to grow the site's organic traffic, with a focus on optimizing the editorial content. The client is a real estate website targeting people who want to rent and/or buy properties in Southeast Asia.

The site saw a spike in referring domains. This was an attempted negative SEO attack, which is when a competitor intentionally attempts to sabotage your SEO efforts by building many poor-quality backlinks. If you believe your site has been deliberately attacked in this way, it's best to audit and tidy up your link profile – read on to learn how to do this.

The website had a lot of content, with many articles that were 10k+ words long. However, there were lots of keywords that these pieces of content were struggling to rank for. In cases like this, you should carry out a content optimization strategy that focuses on improving these long-form pieces for low-hanging keywords ranking just outside the first page.

Finally, we identified two core technical drawbacks facing the website: hundreds of internal redirects and missing breadcrumb navigation. For a real estate website with hundreds of listings across multiple cities and locales, missing breadcrumb navigation resulted in an unnecessarily poor user experience, as it made it much more difficult for visitors to navigate the website. Find out how you can overcome these challenges for your website by following the steps below.

Pruning Your Link Profile With A Backlink Audit

Your backlink profile is like a tree. Now and then, you want to snip off and prune a few faulty branches (low-quality backlinks) to ensure that the rest of the tree (your link profile) is healthy, i.e. that there are no spammy backlinks. This is a procedure you should carry out periodically by checking only the most recent links pointing to your site. But sometimes, the number of backlinks may suddenly shoot up, which could signify foul play. Google is able to ignore these spammy links in "most cases", but that means some might slip through. If you get 1,000 spam links, how many of them weren't ignored? 50? 100? It's not worth the risk; you need to take action.
You’ll find out how to analyze the quality of a backlink pointing to your website later, but first, let’s see how you can identify whether your site’s link profile has seen unnatural growth, as described above.

Identifying A Negative SEO Attack

What Is A Negative SEO Attack?

Your backlink profile may be the victim of a hostile SEO attack, where your competitors (or another entity) purposely build hundreds, if not thousands, of unnatural, poor-quality links towards your site. Such an attack aims to trigger a Google penalty (or manual action) so that you lose rankings. If this happens to you, you will want to conduct a backlink audit to identify and disavow (ask Google to ignore) these malicious links.

How To Identify A Potential Negative SEO Attack

To manually identify a potential negative SEO attack on your website’s backlink profile, you can use the Ahrefs Site Explorer tool. If you see a sharp spike like the one above, your site has likely been hit by a negative SEO attack. As mentioned above, Google’s algorithms are getting better at identifying and ignoring poor-quality backlinks – but they aren’t perfect, so you’ll still need to cover your bases to make sure that your tree doesn’t have any faulty branches. This unnatural link velocity isn’t ideal – so it’s still worth seeing which links you can preemptively tell Google to ignore by disavowing them.

Top tip: you can set up alerts on Ahrefs to automatically monitor new (and lost) referring domains to your website. This will enable you to quickly spot any unnatural increases (or decreases) within your backlink profile and allow you to act sooner to prevent any potential SEO damage to your site’s performance. Head over to: Alerts > Backlinks > New alert > Enter domain > New backlinks > Set email interval > Add

Once you’ve identified that your site’s seen an unnatural link spike, the next step is to identify the bad links. It’s also worth noting that the spike may actually be a good thing. For example, one of your articles may have gone viral, so you may have naturally received a bunch of links. Either way, here’s how to identify whether these links are good or bad for your SEO.

Investigating A Potential Negative SEO Attack

Here’s a step-by-step breakdown of how to investigate a potential negative SEO attack using Ahrefs Site Explorer: When filtering these results, you’ll likely see some patterns cropping up. Let’s go through some of the most common culprits regarding link-based negative SEO attacks and how to spot them.

Patterns To Look Out For: The Usual Suspects

When identifying faulty backlinks from a negative SEO attack, there are a few usual suspects that you can look out for as giveaways of potential foul play. Why? Because these types of links are extremely cheap and easy to get in large quantities – meaning they aren’t too sophisticated. Remember, most of these websites aren’t legitimate. They’re often built especially for these tactics and won’t have any SEO value.

Blogspot Domains

Blogspot domains are incredibly cheap and easy to build. This makes them perfect for webmasters to exploit and use to build spammy links to your domain. You can identify these quickly by clicking on “More filters” on the top right of the Backlinks report. Select Domain name and type in “blogspot”. Click Apply and Show results. You’ll now see all backlinks from Blogspot domains. In this example, there are 66 poor-quality Blogspot domains pointing to this website within the specified timeframe. A scripted version of this kind of pattern check is sketched below.
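If you’d rather script the pattern check than click through filters, here’s a minimal sketch in Python. It assumes a CSV backlink export (e.g. from Ahrefs) with a “Referring page URL” column – the column name and file name are assumptions, so adjust them to match your export.

import csv
from urllib.parse import urlparse

SUSPECT_PATTERNS = ("blogspot", "directory")  # the cheap-link patterns discussed here

def flag_suspect_links(export_path):
    # Print referring domains whose names match known spam patterns.
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row["Referring page URL"]  # assumed column name
            domain = urlparse(url).netloc
            if any(pattern in domain for pattern in SUSPECT_PATTERNS):
                print(domain, "->", url)

flag_suspect_links("backlinks-export.csv")  # hypothetical file name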
Web Directories

In most cases, adding a listing to a web directory is free. Again, this makes it easy for spammers to build hundreds, if not thousands, of links on irrelevant and/or suspicious-looking directories. They generally look something like this: Find potential spammy web directories by filtering the backlinks using the same method above, but instead, search for domains that contain “directory”.

Comment Spam

Comment spam usually results from automated software being used to place lots and lots of comments on blogs or forums linking towards a particular website. These links generally use exact-match anchors, i.e. the clickable text of the link is made up of keywords you are likely targeting. This can be problematic because a high number of keyword-rich anchor texts within your link profile will likely raise some flags in Google’s eyes as being unnatural, which could lead to your website being penalized via a manual links-based penalty. Here’s an example of a forum link taken directly from Google’s guide on Link Schemes.

Scroll down to the Anchors report right at the bottom of the Site Explorer tool on Ahrefs – this shows you an overview of the most commonly used anchor texts linking to your website. If you spot lots of exact-match (keyword-rich) anchor texts linking to you, these are likely comment or forum spam links. In some cases, you may even find some extremely unnatural or even suspicious anchors used to link towards you. These are much easier to spot, as they’ll likely have nothing to do with your website.

Next, click on “View full report” to dig a little deeper. This is because the report only shows you the top 10 most common anchors (see screenshot below) – and there may be lots more! Here, you want to order by dofollow links, because dofollow links are the kind that specifically instruct search engine bots to “follow” the link, whereas nofollow links are largely devalued by Google’s crawlers. Click on Details, then Referring domains, to explore further. In most cases, a website will only link to you once – so if you see that a site is linking to you a lot, it’s a strong indicator that something’s not quite…
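Once you’ve confirmed which domains are spam, you can compile them into a disavow file. The one-entry-per-line layout and the domain: prefix below follow Google’s documented disavow file format; the domain names are placeholders.

def write_disavow_file(bad_domains, path="disavow.txt"):
    # Google's disavow format: one full URL or "domain:" entry per line;
    # lines starting with "#" are comments.
    with open(path, "w", encoding="utf-8") as f:
        f.write("# Spam domains identified during the backlink audit\n")
        for domain in sorted(set(bad_domains)):
            f.write(f"domain:{domain}\n")

write_disavow_file(["example-spam.blogspot.com", "cheap-web-directory.example"])

You’d then upload the resulting file through Search Console’s disavow links tool.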

How To Increase Traffic By 96% (SEO Case Study) https://diggitymarketing.com/saas-seo-case-study/ Mon, 11 Jul 2022 12:09:41 +0000 https://diggitymarketing.com/?p=2033106

Regardless of your website’s size, the keys to a successful SEO campaign are content (on-page), links (off-page), and technical factors. Your content acts like the body of a race car, and its referring backlinks act like race fuel. The technical factors act like the nuts and bolts allowing everything else to perform at its best. If you miss any of these, your website will struggle to rank, much like a race car struggling to get down the track. In this case study, you’ll learn the exact steps that my team at The Search Initiative took to increase our client’s organic traffic by 96%. You will get a crash course in the three pillars of SEO (on-page, off-page, and technical factors) through the lens of this case study. In this article, you’ll learn how to: Before getting into the details of the strategy, here’s some important information about the website’s goals and the main challenges that were faced.

The Challenge

The main objective for this campaign was to increase the amount of quality organic traffic on the site to grow the number of leads. The client is a US-based SaaS (Software as a Service) B2B company that builds and offers cloud software, with web pages targeting a range of countries, including English-speaking countries such as the U.S.A., as well as Japan, China, Korea, and France.

With this in mind, one of the main challenges was index bloat. There were over 30k crawlable URLs on the English version of the website alone – quite excessive for a SaaS website. Fixing these crawl budget issues and uploading the XML sitemap, which was missing when the client joined TSI, was a priority. More on that below…

The client’s hreflang (a way to tell Google about the language and target location of your content) setup had not been implemented correctly. This is a very important element of international SEO and needs to be addressed to avoid potential duplicate content issues.

Although the core landing pages of the site were relatively well optimized, there was a lack of supporting content to drive traffic towards them. This is because the client’s blog was not active, with just a handful of articles published. This was tackled by researching and writing informational blog articles to target long-tail keywords. This helped build the client’s topical relevance within the niche as well as provide internal linking opportunities towards the main pages on the site.

The final step was to build authority with a link-building strategy that focused on building page authority on the website’s most important pages: the homepage and service pages. Follow the steps below and find out how you can also overcome these challenges for your own websites.

Crawl Budget Management

What Is Crawl Budget & Why Is It Important?

Google only has a limited amount of time and resources that it can allocate to crawling and indexing the World Wide Web. Therefore, Google sets a limit on how much time it spends crawling a given website – this is known as the crawl budget. The crawl budget is determined by two elements: If you have a large website with hundreds of thousands of pages, you’re going to want to make sure that only the most important pages are being crawled, i.e. that you aren’t wasting your crawl budget on unimportant URLs. Crawl budget management is about making sure that you’re stopping Google from crawling irrelevant pages that cause index bloat.
How To Fix Index Bloat

Index bloat occurs when Googlebot crawls and indexes too many pages of poor quality. These pages may offer little to no value to the user, be duplicated, be thin in content, or may no longer exist. Too many unimportant and low-quality pages being crawled wastes precious crawl budget, as Google spends time crawling those URLs instead of the important ones. Our client’s English site had over 30k legacy event pages indexed – these were pages that included very little content about industry events within the client’s niche, i.e. a flier for the event along with essential information such as dates and times.

Let’s look at some of the most common culprits that cause index bloat and how you can find them using a site search: You may also come across these kinds of pages: There are several ways to tell Google which pages you want crawled, and which ones you don’t: Here’s the robots.txt file for my site: You can generally find your robots.txt by accessing: yourdomain.com/robots.txt

The basic format for blocking Google from crawling your page(s) is:

User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

Here’s an example:

User-agent: *
Disallow: /author/

The above rule prevents all robots from accessing any URL whose path begins with /author/. Find out more about the best practices for your robots.txt file here.

<meta name="robots" content="noindex">

If you have a WordPress website, you can do this easily via a plugin like Yoast SEO. On any page, scroll to the Advanced tab on the plugin, and where it says “Allow search engines to show this Post in search results?” select No. An important thing to remember is that you need to ensure that these pages haven’t also been blocked in your robots.txt file. Otherwise, Googlebot will never see the “noindex” directive, and the page may still appear in the search results if, for example, other pages link to it. (A quick way to check for this conflict is sketched at the end of this section.) Find out more about how to manage your crawl budget here.

XML Sitemaps

While your robots.txt file is used to prevent search engine bots from accessing certain pages, there’s another important file that you need to guide Google in the right direction regarding which pages you do want it to find and index. That’s the XML sitemap – which our client happened to be missing from their website.

What Is An XML Sitemap & Why Is It Important?

The XML sitemap is a “map” of URLs using Extensible Markup Language. Its purpose is to provide information about the content on your website, i.e. the pages, videos, and other files, along with the respective relationships between them. XML sitemaps are important because they allow you to specify your most important pages directly to Google. Here’s an example of what a sitemap looks like: https://diggitymarketing.com/sitemap.xml

Providing this information makes it easier for crawlers like Google to improve crawl efficiency as well as understand the structure of your web pages. Think of it as a table of contents for your website. By doing this, you’re increasing the chances of your web pages getting indexed more quickly. Here’s an example of a basic XML sitemap:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://domain.com/</loc>
  </url>
  <url>
    <loc>https://domain.com/blog/</loc>
  </url>
</urlset>

More often than not, your XML sitemap will likely look like this auto-generated one from Yoast: https://lakewoodrestorationpro.com/page-sitemap.xml Why?
Because it’s much easier to use a plugin/tool to generate your sitemap than to hardcode it yourself manually.

How To Create An XML Sitemap

There are many ways to create an XML sitemap for your website, depending on your CMS (a scripted version is sketched at the end of this section). Your sitemap should now be automatically generated and available at either yourdomain.com/sitemap.xml or yourdomain.com/sitemap_index.xml. Name your file “sitemap”, and it will be saved in .xml format. There are also XML sitemap generators like XML-Sitemaps.com, where all you need to do is: One thing to note about the methods detailed above is that they may contain URLs or pages that you do not want to be included. For example, crawlers like Screaming Frog may include paginated pages or /tag/, /author/ pages – which, as you learned above, cause index bloat. So, it’s always good practice to review the generated files and make sure that only the right pages are there. There’s also the option to code your XML sitemap manually; this is fine for small websites with very few pages, but perhaps not the most efficient for massive sites. Regardless of which method you choose, remember to upload the sitemap.xml file to the public_html directory so that it will be accessible via domain.com/sitemap.xml.

How To Submit Your XML Sitemap To Google

To submit your XML sitemap to Google, go to your Google Search Console and click Sitemaps > enter the location of your sitemap (i.e. “sitemap.xml”) > click Submit. That’s it! Remember to also add a link to your XML sitemap within your robots.txt file using the following directive:

Sitemap: http://www.example.com/sitemap.xml

If you have multiple sitemaps, simply add another directive, so you have something like this:

Sitemap: http://www.example.com/sitemap-1.xml
Sitemap: http://www.example.com/sitemap-2.xml
Sitemap: http://www.example.com/sitemap-3.xml

This final step makes it that extra bit easier for Google (and other crawlers) to find your sitemap and crawl your important pages. Many plugins like Yoast automate this process for you – they’ll automatically add your sitemap to your robots.txt file.

Implementing hreflang Attributes Correctly

Implementing hreflang attributes correctly is an advanced technique that should only be done by experienced web developers, SEOs, and those who understand the risks. However, if your website’s content is available in multiple languages, then the hreflang attribute is especially important for you. If you set this up correctly, you could essentially clone your…
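Here’s the scripted sitemap sketch mentioned above: a minimal builder that produces the same structure as the basic example earlier in this section. The URLs are placeholders.

from xml.sax.saxutils import escape

def build_sitemap(urls):
    # Build a minimal XML sitemap, one <url><loc> entry per page.
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_sitemap(["https://domain.com/", "https://domain.com/blog/"]))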
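And going back to the noindex warning earlier in this section, here’s a quick sketch that flags noindexed pages that are also blocked in robots.txt (so Googlebot can never see the directive). It uses Python’s built-in urllib.robotparser, which handles simple prefix rules but not Google’s wildcard extensions; the domain and URL list are placeholders.

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://yourdomain.com/robots.txt")  # placeholder domain
rp.read()

noindexed_pages = ["https://yourdomain.com/author/admin/"]  # pages carrying a noindex tag
for url in noindexed_pages:
    if not rp.can_fetch("Googlebot", url):
        print("Blocked in robots.txt - Googlebot will never see the noindex:", url)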

Doubling Organic Traffic (YMYL SEO Case Study) https://diggitymarketing.com/ymyl-seo-case-study/ Mon, 11 Apr 2022 05:01:57 +0000 https://diggitymarketing.com/?p=2030822

Over the past few years, Google has focused on ranking pages based on the authority, relevance and trust conveyed by a website’s content and backlinks. As a result, increasing a website’s authority, relevance and trust has become more difficult than ever before. If you have concerns that your site is falling behind your competition when it comes to these three factors, you’ll want to follow the steps outlined in this case study. Below, you’ll find out how my team at The Search Initiative more than doubled our client’s organic traffic. You’ll learn how to: But, before you implement the strategy used, it’s important you know the site’s background, goals, and the main challenges that you’ll need to overcome as well.

The Challenge

The primary goal for this campaign was to increase the traffic from non-branded search terms (i.e. keywords that don’t include your brand name in them). Considering that the client is an independent insurance broker targeting the UK market, one of the biggest challenges we faced was the fact that the search volumes for many keywords in this niche are extremely low. After all, most people tend to search for life insurance only once in their life. However, what we quickly identified is that despite the low search volumes, the revenue potential for each visitor could be substantial.

As a YMYL (Your Money Your Life) website, the client’s site was lacking in E-A-T (Expertise-Authority-Trust), which is extremely important if you want to rank. We focused on building more E-A-T by optimizing the content on the site with text that established the client as a trustworthy source of information. We also added informational content, specifically FAQs, to the client’s website to increase their chances of ranking for featured snippets. The final piece of the puzzle was to build the site’s authority with a competitor-based link-building strategy. While E-A-T is essential for YMYL niches, it’s also an important factor for all websites that are looking to stay ahead of the competition. Below, you’ll find out, step-by-step, how to achieve these results so you can replicate them on your own websites.

Optimizing For E-A-T (Expertise-Authority-Trustworthiness)

If the content on your website could potentially impact a person’s happiness, health, financial stability, or safety, then it qualifies as Your Money Your Life (YMYL). In August 2018, Google introduced a new set of guidelines for its third-party Quality Raters (QRs) – real humans who check the quality of web pages – to measure the expertise, authority, and trustworthiness (aka E-A-T) of a site. The guidelines outline certain signals that the raters should look out for based on characteristics of the content, along with other ranking signals like backlinks, performance etc. In the case of this client, we needed to establish E-A-T within the insurance market so that Google and readers could see that the information provided was written by a credible source, from a respected firm. What does E-A-T actually mean, though? Let’s break it down: There are certain steps that you can take to earn the trust of both Google and your visitors. Take a look at some ways that you can optimize your website for E-A-T.

Building Authorship

Including information about who’s responsible for the content on your website is crucial for building authorship. Why?
Because it’s something that Google specifically asks its QRs to find. Adding author information, like a short description and links to other online profiles, helps to raise your author’s (and your website’s) expert status. This helps Google (and your readers) see that the information you’re providing is written by a credible, trustworthy source. For example, if an author is visible on other authoritative pages within your niche (or on their own social media channels), where they post high-quality content, then mentions or direct links from those sources can be beneficial for improving your website’s E-A-T. Here are some tips to help build more authorship on your website: Here’s an example of the About page from What Hi-Fi: Following the above steps makes it easier for both your readers and Google to establish the content creator’s E-A-T.

Adding Contact Information

User experience and transparency are important factors that Google considers when looking at E-A-T. You should make it as easy as possible for your visitors to get in touch with you. Here are some tips on how to do this: This includes contact information such as your address (if you have one), email address, telephone number, etc. on your website. Here’s an example from the TSI website, where we’ve included our contact information in the footer of each page.

A Positive Reputation

According to Google, a positive reputation is one of the characteristics of “high-quality content”. Google instructs its quality raters to conduct “reputation research”. In other words, what people say about your business or brand online matters. Google is trying to gauge the overall public sentiment for your brand and website. The odd negative review of a product or service isn’t going to hurt you – Google acknowledges that no website will have a perfect set of purely positive reviews. But if, for example, a particular product is receiving mostly negative reviews, or your brand is being talked about negatively in the press, then this is likely going to affect Google’s assessment of the quality of your website. You can’t control what people say or write about you, but you can monitor it. For example, you can set up a Google Alert to notify you when there are new mentions of your brand (or individuals from your website) online. Learn more about how Google perceives reputation here and find out how to manage your online reputation in the video below.

Building Topical Authority

Topical Authority is all about a website’s perceived authority within a particular niche or subject. It signals to Google that you have a deeper understanding of your subject matter and, therefore, can be trusted. Having strong Topical Authority is vital if you’re looking to improve your E-A-T and, as a result, rank in the SERPs. To build Topical Authority, you need to create content that showcases your knowledge across every aspect of your niche. In our client’s case, this was insurance. Look at how your website’s content compares to that of the competition – i.e. identify information that is missing from your website but is included on your competitors’ sites. It’s important to remember to cover all bases of the user’s journey. For example, a law firm will have service pages that target specific commercial keywords like “personal injury lawyer london”. But to build Topical Authority, you’ll also want to target informational keywords that help support these pages, e.g. “what is personal injury law”.
You now have a list of relevant keywords that you can create content around to further expand your site’s Topical Authority. To find out more about Topical Authority, watch this interview with Koray Tugberk.

Reviewing & Updating Old Content

As “freshness” is one of the ways Google judges the quality of content, regularly updating older content is a good way to get back on the search radar. Doing so can give you a ranking boost and subsequently increase traffic. This is especially important for YMYL sites (it’s specifically mentioned in Google’s guidelines – see below), or for time-sensitive content where facts, regulations, etc. may change over time. For example, if you have a website where you review the latest turntables, you should avoid having articles like “Top 10 Turntables Under $500 for 2021”, as Google and your readers will see that it’s out of date. For this example, one way to identify outdated pages is by using a site search like this:

site:yourdomain.com intitle:2021

Note: you can replace ‘2021’ with any year. In the example below, you can see that the site’s page title includes 2021 instead of 2022. You should also update the “post-last-modified” date to be as recent as possible. If your website primarily contains informational content, it’s important to make sure that you regularly review it to ensure that it’s as up-to-date as possible.

Content Optimization: Dealing With FAQs

Regardless of what kind of website you have, you’ll likely need to rank for informational keywords. For example: Many informational keywords tend to be posed as questions – which means that you can add valuable content to your website to answer queries that users may have about what you’re covering. Below, you’ll find out how to find FAQ keywords to target, how to optimize them to increase your chances of appearing as a featured snippet, and how to use structured data to mark up your FAQs (a sketch of that markup follows at the end of this section).

How To Find FAQs

Here are three simple ways to find FAQ-related keywords to target on your website:

People Also Ask

To find potential FAQs using People Also Ask:

Related Searches

To find potential FAQs using Related Searches, scroll down to the bottom of the search results page to find Related Searches. Google presents these related searches to help searchers find more information related to their original…
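As a sketch of that FAQ structured data: the FAQPage, Question, and Answer types below come from schema.org’s documented vocabulary, while the question and answer text are placeholders. The output is what you’d embed in a <script type="application/ld+json"> tag on the page.

import json

def faq_jsonld(faqs):
    # Build schema.org FAQPage structured data from (question, answer) pairs.
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {"@type": "Question",
             "name": question,
             "acceptedAnswer": {"@type": "Answer", "text": answer}}
            for question, answer in faqs
        ],
    }, indent=2)

print(faq_jsonld([
    ("What is personal injury law?",
     "Personal injury law covers claims arising from accidents or negligence."),
]))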

How to Double Monthly Traffic Within Six Months – [eCommerce 2024 Case Study] https://diggitymarketing.com/sneakers-ecommerce-case-study/ Mon, 10 Jan 2022 11:40:10 +0000 https://diggitymarketing.com/?p=2028263

Do you have an eCommerce website with hundreds of category pages and thousands of product pages? Learn how to grow your organic Google traffic in this follow-along case study. I’ll guide you step-by-step so you can learn the same tactics that my team at The Search Initiative used to double our client’s organic traffic on their eCommerce sneaker website within six months. You’ll learn how to achieve similar results by: Before jumping into the strategy, here’s a summary of the main challenges that the site was facing when the client joined our agency. And if you prefer to consume your content via video, you can watch the full breakdown here:

The Challenge

This eCommerce site operates in one of the most saturated and competitive markets – fashion. More specifically, the site sells sneakers from the biggest brands across the globe, which means that it has thousands of product pages and hundreds of category pages. The main objective of this campaign, which commenced in April 2021, was to increase the site’s keyword visibility and search traffic in time for Thanksgiving and Christmas in 2021. To achieve this, we focused on optimizing the content and technical aspects of the site to provide the best experience for customers.

The product and category descriptions on many of the client’s pages were poorly optimized, with standardized text that didn’t convert. We therefore had to focus on making sure that these descriptions were well optimized to engage and inform the user. In addition, the site had hundreds of product pages that were not categorized properly, which meant that navigating the site was difficult for customers. After optimizing the product pages with better descriptions, we built backlinks towards them in order to improve the site’s and pages’ authority. Below, you’ll find out, step-by-step, how to achieve these incredible results so you can replicate them on your own website(s).

Category & Product Page Optimization

The category and product pages are probably the most important pages on your eCommerce site. These are the pages where you’re convincing users to buy what you’re selling. So it’s crucial that the content on these pages is optimized for both search engines (keywords) and users (content). In this section, you’ll learn how to optimize the content on your category (or collection) pages, as well as how to write descriptions for your products.

Optimizing Category Pages

The main purpose of category (or collection) pages is to organize the products you sell into logical groups, making it easier for visitors to “browse” and find what they’re looking for. Optimizing isn’t just about ensuring that your content includes the keyword(s) you want to target. You want to make sure that the content on your category pages is optimized to address the user’s search intent for the primary terms you want it to rank for. If the answer to both of these questions is no, here’s what you need to do: You may come across some keywords with mixed intent, i.e. the results displayed may address more than one type of intent. This is usually the case for broader search terms with a higher search volume, like “cheese”. The top result from the above example is informational. If you scroll down a little further, you can see that Google ranks pages that satisfy an informational intent.
Scrolling down even further, you can see that the results have changed and now address a transactional intent instead. In cases like this, you want to optimize your content to match the pages closest in intent to yours. For example, if you are selling cheese on your website, you should align your content with similar eCommerce pages that are ranking in the top positions. For instance, for an eCommerce website selling sweatshirts and looking to rank for “mens sweatshirts”, you may find that the competing pages have a lot less content than you do on their collection pages.

The above top-ranking page only has a couple of sentences at the top explaining the kind of products that the user can find, but yours (like our client’s) may have lots of content at the bottom of the page. Therefore, to align your page with this competing page, you would look to move your content “above the fold” (i.e. above the products) and add a “Read more” functionality. Why? Because it’s clear that having lots of text at the bottom of the page offers little value to the user, as they may never reach the bottom. Adding the “Read more” functionality allows you to include your most important keywords within the first paragraph, which is displayed to all users. It also provides a better user experience, because visitors who want to learn more about the products you offer have the choice to click through and read more without impeding the experience of those who don’t. For more insights into how to audit and optimize your content (for any page), check out this article.

How To Write Product Descriptions

Your product descriptions are a deciding factor in whether a visitor purchases on your site or not. The best product descriptions convey the product’s value to the visitor and convince them to buy it. Here are the top tips on how to write well-optimized product descriptions for your eCommerce website:

1. Write For Your Target Customer

Write descriptions with your target customer in mind, not search engines. Remember that different products may be suitable for different types of customers. Factors like budget and experience will determine which products your customers look at. For example, using language like “great for beginners” or “perfect for beginners” can help make the user’s decision much easier, as they’ll be able to quickly decide whether the product is right for them or not. The length of your product descriptions should also be based on your target customer’s needs, i.e. the buyer’s awareness.

High Buyer Awareness

Customers who have high buyer awareness will already have a good understanding of the product and why they need to purchase it. Here’s an example of a product description from Asos for men’s plain t-shirts that targets a customer with high buyer awareness. There’s no need to include hundreds of words of content to describe a product that everyone knows. Instead, the user’s likely going to skim the key pieces of information about the product, such as the color, sizing, fit etc.

Low Buyer Awareness

Users with low buyer awareness will need a lot more information about the product in order to be convinced to buy. Here’s an example of a product description for a TV on John Lewis’ site that targets a customer with low buyer awareness. This product description contains considerably more information about the product than the t-shirt example. The site also includes drop-down sections detailing the product specifications and delivery information.
This is because choosing a television is more of an investment than a t-shirt, so the description needs to be thorough enough to convince the customer to click through and make a purchase.

High And Low Buyer Awareness

There’s nothing stopping you from writing descriptions that cater to both types of customers. You can have a shorter description with the most important facts/features about the product to cater to customers with high buyer awareness, followed by more detailed information about the product. Here’s an example of a camera from Jessops, which caters to both types of users… There are short bullet points summarizing the key features and specs of the camera for high buyer awareness… …and more detailed information underneath for those with low buyer awareness.

2. Be Informative

Users want to know everything they can about the product before deciding whether to buy or not, so ensure that you include as much information about the product as you can. Apart from the basics like the product name, company name, and manufacturer name, here’s a list of all the kinds of things that you should consider including (if applicable):

3. Make Your Descriptions Unique

It’s no secret that having duplicate content on your site will affect your chances of ranking in the search results. Therefore, the biggest mistake you can make with product descriptions is copying and pasting the manufacturers’ descriptions onto your site. Here’s a description taken directly from Dyson’s page for its hairdryer: Here’s an eCommerce site that’s copied and pasted the description word for word. Now, how many web pages have the exact same copy? 731, to be precise… Unsurprisingly, the above page doesn’t rank for any keywords that are related to “dyson hair dryer”. In fact, the page only ranks for 1 keyword (“hair dryer david jones”). This goes to show that Google doesn’t want to display multiple versions of the same content to its users, and that it’s crucial that you ensure every product has its own unique description. (A quick duplicate-description check is sketched below.)

4. Think About The Layout

How you structure and format your product descriptions is just as important as the copy itself. Make use…
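Here’s a minimal sketch of that duplicate check for your own catalog: it hashes each description after normalizing whitespace and case, so copy-pasted manufacturer text groups together. The product data is made up for illustration.

import hashlib
from collections import defaultdict

def find_duplicate_descriptions(products):
    # Group product names whose normalized descriptions are identical.
    groups = defaultdict(list)
    for name, description in products:
        normalized = " ".join(description.lower().split())
        digest = hashlib.sha1(normalized.encode("utf-8")).hexdigest()
        groups[digest].append(name)
    return [names for names in groups.values() if len(names) > 1]

print(find_duplicate_descriptions([
    ("Hair Dryer A", "Fast drying with no extreme heat."),
    ("Hair Dryer B", "Fast  drying with no extreme HEAT."),
    ("Camera C", "A compact mirrorless camera."),
]))  # -> [['Hair Dryer A', 'Hair Dryer B']]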

What a Data Dump From Empire Flippers Told Us About the Best Niches for Affiliate Marketing https://diggitymarketing.com/best-niches-for-affiliate-marketing/ Thu, 12 Aug 2021 04:28:47 +0000 https://diggitymarketing.com/?p=1524286

As a seasoned affiliate marketer and founder of several successful niche websites, I understand the challenges that come with choosing the right niche for your affiliate marketing business. That’s why I was excited to partner with Empire Flippers to conduct a comprehensive data study of more than 300 affiliate, display advertising, and Amazon Associates sites sold since 2019. We’ve broken down this data to give you some niche ideas and help you choose the most lucrative affiliate marketing niches for your 1st or 50th site. We’ll start by using the data to highlight the most important findings. Then we’ll take an eagle-eyed view of how all the sites in our study handle content, links, and time spent. Finally, we’ll reveal and analyze our favorite niches for affiliate marketing (in our opinion). If you’d prefer to watch rather than read, I cover some of the key insights in this video.

Quick Summary

What We Learned—Insights and Revelations

We took the 300+ sites we received as part of the data dump and organized them by affiliate marketing niche, assets, and monetization model. From there, we compared the groups against each other to get some answers that can be carried into your next affiliate marketing campaigns. Which niche brings in the most affiliate revenue? Which niche benefits most from a content-heavy or link-heavy affiliate marketing strategy? Here are some of the things we’ve learned. We’ll tell you more about the context of each of these insights in the sections below. Tables have been included to help you understand more about how these sites stand up to other niches. You may have questions about how some affiliate marketing niches performed. We’ll cover our list of some of the best niches for affiliate marketing in more detail right after these insights.

The News & Education Niche Reported the Highest Average Adjusted Monthly Revenue at $8,439 Per Month

News & Education held the lead for revenue after we tidied up the data a little. That tidying was necessary. You can find affiliate programs in the education niche that pay over $80 to $120 for each sale. At first, the Religion & Spirituality niche appeared to be one of the most profitable niches by a long shot, with a commanding lead at $24,842. Sure, this is an interesting niche (and probably an evergreen niche, if the last 5,000 years of history are any indication), but we just didn’t have the faith to place it among the most profitable affiliate marketing niches without some digging. Before we simply took that number as proof of a trend, the team dove a little deeper into the data. What we found was that only two sites sold in that niche for the life of the data. One was making well over $20k, and the other was barely clearing $1,000 a month. Unfortunately, our view of the Religion & Spirituality affiliate niche was badly distorted by the high-earning whale of a site. However, there are multiple metaphysical affiliate programs out there with lucrative commissions. We found a few other niches that had the same problem. We didn’t get enough data points to assess the following affiliate niches: Because of this, we removed the above niches from the list and came up with the “Adjusted” list below. So, beyond news, which niches have the best revenue when you adjust for measurable niches?
Our data gave us this top 10 list of the highest revenue-earning niches for affiliate marketing.

The Adjusted Top 10 Revenue-Earners

These sites report the most revenue, but that doesn’t solve our whole ROI equation. The most profitable affiliate marketing niches are the ones that also have low costs. We’ll be looking deeper into the best affiliate marketing niches later. Let’s look at the data about the baking hobby niche and whether low revenue means it can’t still be a profitable niche.

The Baking Niche Reports Some of the Lowest Monthly Revenues at $773 Per Month

The baking niche had some of the lowest reported revenue among the affiliate marketing niches that could be examined. Don’t hold that against it, though. Remember, our data only includes sites that sell. The decent number of baking sites that made it into our analysis suggests that these sites don’t need much revenue to make it to market. That could make baking a safe (if possibly a little saturated) niche market for your first successful affiliate marketing business. These sites mostly host recipes and often monetize through affiliate offers for the ingredients or kitchen accessories you need to complete the recipe. Sub-niches and micro-niches may focus on a single type of recipe, such as cocktails. You may be interested to know who the baking niche is sharing the bottom of the ladder with. The table below covers the five lowest-earning niches and the average monthly income reported at the time of sale.

The Bottom 5 Revenue-Earners

Again, remember that this table only assesses revenue. Low-revenue niches can still be profitable niches. If you need niche ideas, a less competitive niche with lower revenue can be a great place to start your affiliate marketing business.

The Medical Niche Reports the Best ROI on Content With a Revenue/Content Ratio of $33.14 Per Published Page

The Medical niche was the standout winner for revenue/content ROI among all the other lucrative affiliate marketing niches. Affiliate marketers reported approximately $33 a month for each published page on their site. A lot of factors may help explain why medical niche content performs so well compared to content on other affiliate business sites. For one, you don’t need that many pages of content to target some narrow medical issues (for example, snoring). If you’re looking for some alternatives for your next investment, the image below shows the rest of the niches that get great ROI for content.

The Top 10 Niches for Content ROI

For several other reasons, we believe that the medical niche is one of the most profitable affiliate marketing niches, and the best overall. After a few more insights, we’ll take a deeper dive into what this lucrative niche is doing right.

The Office Niche Reports the Best ROI on Link Building With a Revenue/Backlink Ratio of $5.61 Per Backlink

The office supply niche significantly beat out its nearest competition to establish itself as the profitable affiliate marketing niche that gives the best revenue return for backlinks built. Bear in mind, we’re talking about overall backlinks to a site, not individual referring domains. This niche is a bigger player than the name may suggest. It’s less about staplers and paper and more about high-end PCs, standing desks, and custom ergonomic chairs. Many of these items have great commission rates. Office supply was already a healthy niche in the past. Now that more people are working from home, it is growing significantly faster than before. (The ratio math behind these rankings is sketched below.)
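Here’s a sketch of the ratio math behind these rankings. The CSV and its columns (niche, monthly_revenue, pages, backlinks) are hypothetical stand-ins for the Empire Flippers dataset, and the minimum-sites cutoff mirrors the “Adjusted” filtering described above.

import pandas as pd

df = pd.read_csv("site-sales.csv")  # hypothetical export of sold sites
df["rev_per_page"] = df["monthly_revenue"] / df["pages"]
df["rev_per_backlink"] = df["monthly_revenue"] / df["backlinks"]

stats = df.groupby("niche").agg(
    sites=("monthly_revenue", "size"),
    avg_revenue=("monthly_revenue", "mean"),
    rev_per_page=("rev_per_page", "mean"),
    rev_per_backlink=("rev_per_backlink", "mean"),
)

# Drop niches with too few sales for an average to mean anything -
# this is the step that filtered out the single-whale Religion & Spirituality niche.
adjusted = stats[stats["sites"] >= 3].sort_values("avg_revenue", ascending=False)
print(adjusted.head(10))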
This could be a profitable niche for you with a little market research. For comparison, we’ve included our list of the top 10 niches that deliver revenue for backlinks.

The Top 10 Niches for Backlink ROI

You may notice there is some crossover with the particular niches featured here. The medical, bed & bath, apparel, and music niches all found their way into both the top content ROI and top affiliate link ROI lists. That’s a good indication that these could be some of the most profitable niches for new affiliate marketers or affiliate program veterans who want to boost their margins.

The Survival & Security Niche Reports the Best ROI on Time With a Revenue/Hour Ratio of $2,860.25 Per Hour

Wow! Quit your job today, because survival is quite possibly one of the most profitable affiliate marketing niches, offering $100,000 for a solid 40-hour workweek. Right? Unfortunately, probably not. As I talk about more below, you should be careful about trusting hours-worked numbers. They aren’t based on anything except self-reports, and even if they weren’t, we have a problem… Once again, this is one of those particular niches where a whale undermines our data. In this case, it’s a giant affiliate business site that was earning tens of thousands. In the same period, only a few other (much smaller) sites sold in that specific niche. For your interest, we’ve included the rest of the top 5 niches for this measurement.

The 5 Best Niches for Revenue/Hour ROI

Non-Amazon Sites Earn the Most Compared to Other Monetization Models, With a Monthly Average of $5,540

Among the monetization models of the studied sites (affiliate marketing, display advertising, and Amazon Associates), affiliate marketing was a clear winner. The affiliate marketing sites in this study brought in more than $5,000/month in average revenue. I would argue that it’s no surprise to see this model do this well. Affiliate marketing has long offered better commission rates than other monetization models. With Amazon Associates, there’s one game in town. They set the commission rates, and you can like it or take a hike. With affiliate marketing, there are a million offers out there, and big…

Marketing Services SEO: Growing Monthly Website Traffic By 3012% – Case Study https://diggitymarketing.com/marketing-services-seo-case-study/ Mon, 14 Jun 2021 11:59:54 +0000 https://diggitymarketing.com/?p=1023064

Starting a new website from scratch? I’m going to share with you the steps my team at The Search Initiative took to grow a relatively new marketing services website’s monthly organic traffic by a whopping 3012%. From this case study, you will learn how to achieve these results by:

The Challenge

This client operates in the marketing industry in certain US states, specializing in several different marketing services. The client came to The Search Initiative with the aim of becoming an authoritative voice within their niche. This meant making sure that the content on the site, in particular, expressed topical authority and E-A-T (Expertise-Authority-Trust). Although keyword difficulty wasn’t going to be too high for the areas that the client covered, we knew that since we were pretty much starting from scratch, there was still some catching up to do in all aspects of SEO. The following is a step-by-step guide on how you can achieve incredible results with your SEO campaign.

Focus On Content

If you’ve ever had trouble ranking a website focusing on a specific niche, or couldn’t rank a new product/review/location page on an authority site, it’s very likely that the root cause is a lack of topical authority (which is also sometimes referred to as topical relevance). This video goes into detail about how to build topical relevance with interlinking:

What do I mean by “topical authority”? For that, you need to understand that these days, Google ranks domains that have several pages targeting a specific topic, whereas before, you could rank with just a single landing page. This means that if you’re trying to rank a page on a topic, Google wants to see additional supporting content on your site that links to it. By building a cluster of pages to support the main page you want to rank, you’re essentially establishing topical relevance in Google’s eyes. Topical authority is currently one of the hottest topics (for lack of a better word) in Search – and rightfully so. Working toward topical authority can help you expand your site’s keyword visibility by ranking for a broader net of search terms. Let me show you how…

Building Topical Authority With Supporting Content

In order to establish and build topical authority, you need to make sure that you’re answering as many questions as you can about your topic, new product, or, if you have a local SEO site, a new location that you want to target. So, instead of purely focusing on the core landing pages of your website, you can build topical relevance by creating supplemental content. To find out more about Topical Authority, watch this interview with Koray Tugberk. Here’s a step-by-step breakdown of what you’ll need to do:

Step 1: Identify The Core Topic

It’s likely that you’ll already have a topic in mind. For example, you may be launching a new product/service for your online business, or you may be branching out your local business to operate in a new city or region. Alternatively, like us, you may have a new site that targets a micro-niche. For example, you may have something like the following: If you already have a topic in mind, you can skip to Step 2. But what if you don’t fall into either of the above scenarios and don’t know where to start?
Let’s say you have a site that reviews turntables and are looking for a new topic to cover. Here’s how you can find new topics and keywords to cover using Ahrefs’ Keyword Explorer tool:

1. Search For Broad Keywords

2. Review The Parent Topics

Looking at the parent topics, you may identify that you don’t have any articles about “pioneer turntables”.

Step 2: Create Supporting Articles For The Chosen Topic

Now that you have your chosen topic, you’ll want to create some supporting articles for it – I recommend around 4 articles to support your main landing page on the topic. To find relevant keywords for these supporting articles, use Ahrefs’ Keyword Explorer to:

1. Find Competitors for the Topic

2. Reverse Engineer Your Competitors’ Keywords

Pick a primary keyword for your main landing page, along with primary keywords for the supplemental articles too.

Step 3: Add Contextual Links From Supplementary Articles to the Main Page

This step is really important. Google follows links to discover new content to rank. Therefore, you need to add contextual links from all of your supplementary articles to your main article within the topic. Add your contextual links in the middle of the body of an article. Doing this will help Google see that your main article is being reinforced by your supporting articles on that topic. When it comes to deciding what anchors to use for onsite links like this, you don’t have to be as careful as you would be for off-site link building. For example, here’s one way you may want to link to your main page from the four supporting articles: With this method, you’re essentially creating clusters of content that are topically related. This allows you to pass value from the smaller articles to the main content piece and, as a result, gives that core page a better chance of ranking.

User Intent

A significant element of your content strategy should be focused on nailing the user intent (or search intent) of your pages. For example, if you search for “buy nintendo switch”, Google displays results that allow you to purchase the games console – which is what you’d expect. User intent is about making sure that your pages reflect what the searcher is looking for. Unsure about how to identify what kind of content you should include on your pages to address the search intent for the keyword you want to target? A simple search for your core keywords will tell you exactly what kind of content Google is rewarding for that term. For example, you can see below that for “best turntables”, all 10 of the top-ranking pages are review/opinion-based articles. Doing this will help refine your content so that it satisfies the search intent of the keyword you want to target.

On-Page Inventory

After you create a content plan and start executing it, don’t forget about the existing content you already have on the site. Every word matters, and your old content needs to be held to the same standard. Therefore, you should also perform an on-page inventory of the core landing pages of your site.

Page Titles

Your page titles should be engaging, descriptive, not too long (max 65 characters) and include your primary keyword as close to the beginning as possible. Avoid page titles like this: Long Sleeve Sweatshirts For Men & Short Sleeve Sweatshirts For Men | SweatShirts.com It’s too long and repetitive.
Aim for page titles like this: Buy Long & Short Sleeve Sweatshirts For Men | All Sizes On Sale (A small title-audit sketch follows at the end of this section.)

Headings

Headings help users quickly find what they’re looking for, as well as help search engines understand the layout and structure of your content. Here are some actionable steps you should take when optimizing your headings:

Meta Descriptions

Meta descriptions won’t help your website rank, but they should still be optimized, as they are displayed within the search results. A well-written meta description can be the deciding factor in whether a user clicks through to your site or not.

URLs

One of the first things that Google mentions in its SEO Starter Guide is URLs.

Establishing E-A-T (Expertise-Authority-Trust)

Google measures the perceived expertise, authority and trustworthiness of a site by looking at certain signals, such as the characteristics of the content, in addition to other ranking signals like links. Establishing E-A-T becomes increasingly important for YMYL (Your Money Your Life) websites, where the information or services provided on a site may have a direct impact on a visitor’s wellbeing. For example, in our case, the client wanted to establish E-A-T within the marketing landscape. This means that the client’s potential customers (and Google, for that matter) would expect the content on the website to be written by a marketing expert from a well-established and respected agency. Let’s break this down a little further. Expertise is achieved by the writer or creator of the content expressing a higher level of knowledge about the subject at hand. Authority is achieved if the content is created by an established or well-respected agency, as opposed to a student or a lesser-known/less experienced organization. Trust is achieved by ensuring that the content is factually correct or can be backed up by other sources to prove its credibility. Let’s take a look at how you can demonstrate E-A-T on your site: In fact, it’s specifically mentioned in Google’s Quality Raters Guidelines. For example, if you have a website that reviews turntables, you’re going to want to ensure that you’re reviewing and talking about the latest models. Therefore, you want to avoid having articles like “Top 10 Turntables Under $500 for 2020”, as Google and your readers will…
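Here’s the title-audit sketch mentioned above: it fetches each page and flags titles over the 65-character guideline. It assumes the requests and beautifulsoup4 packages are installed; the URL is a placeholder.

import requests
from bs4 import BeautifulSoup

def audit_titles(urls, max_length=65):
    # Flag page titles that exceed the recommended length.
    for url in urls:
        html = requests.get(url, timeout=10).text
        tag = BeautifulSoup(html, "html.parser").title
        title = tag.string.strip() if tag and tag.string else ""
        if len(title) > max_length:
            print(f"{len(title):>3} chars  {url}  ->  {title}")

audit_titles(["https://yourdomain.com/"])  # placeholder URL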

The Definitive Technical SEO Audit Guide For 2024 https://diggitymarketing.com/technical-seo-audit/ Mon, 10 May 2021 06:28:17 +0000 https://diggitymarketing.com/?p=1021666

To put it simply, if your website’s technical SEO isn’t on point, you’re not going to rank well on Google. As a professional digital marketer with over 10 years of experience in conducting technical SEO audits, I understand this very well. Having great content and backlinks isn’t enough when it comes to ranking in the top positions of Google search. If you really want to dominate the search engine results pages, you want to make sure that your website’s technical SEO is perfect. Conducting a full technical search engine optimization audit can be a pretty daunting task. Lucky for you, I’ve broken down each of these SEO elements and have provided a helpful SEO audit checklist with ready-to-use templates for you to follow. From the most basic level of ensuring that Google can crawl and index your content, to more advanced techniques that look into site speed, mobile SEO, JavaScript SEO, and more, I’ll be with you every step of the way.

Quick Summary

What is a Technical SEO Audit?

A technical SEO audit is a process that is meant to identify and fix the technical issues that could make it harder for Google to crawl, index and rank your site. To put it simply: make life easy for Google, and they make life easy for you. Common issues often discovered during a technical search engine optimization audit include poor site architecture, broken links, slow page load speed, or display issues on mobile devices. The technical SEO audit is an important part of a website’s SEO strategy and should be one of the first things that you look into to improve your visibility in the Google search results.

Why Are Technical SEO Audits Important?

Technical SEO audits are important because even if you’ve spent a long time creating excellent content, your users may never see it if there are issues with your website’s crawlability and indexability. However, even if your site can be found by internet users, its rankings could be hurt by performance-related technical factors. Page load time is a ranking factor, which means that a slow website is unlikely to reach the top spots in the SERPs (search engine results pages). Internet users are even less patient than Google crawlers and will leave your website if it takes ages to load. Likewise, a poorly structured website can also lead to confusion among your users. A site that is easy to navigate leads to a better user experience and, consequently, generates more leads. During a technical search engine optimization audit, you could also find out that mobile users face numerous problems while browsing your website. Given the fact that mobile devices generate more than half of worldwide web traffic, such issues could lead to a terrible loss of revenue. Let’s also not forget that mobile-friendliness is a ranking factor.

Technical SEO Audit Checklist

Here are the tools we recommend in order to complete your technical site audit:

Crawlability & Indexability

As we’ve already mentioned, making sure that your site content can be crawled and indexed is a critical aspect of technical SEO. Google “crawls” the internet through links in order to find content. If they can’t find it, then it doesn’t exist in their eyes. In this section of our technical SEO audit guide, we’ll walk you through the various ways in which you can audit your website for technical SEO issues related to crawling and indexing.
Robots.txt

What is the Robots.txt file and Why Is It Important?

A robots.txt file is a file that instructs search engine crawlers which pages or files they can and can’t access on your website. For example, if you have an eCommerce site, then you don’t want search engines to access sensitive pages like the cart or checkout page. It’s worth noting that the robots.txt should not be used to hide pages from Google (or other search engines). Why? Because your web page may still be indexed by Google if other sites have linked to it with descriptive text.

Robots.txt Audit Checklist

Here’s what ours looks like: https://thesearchinitiative.com/robots.txt And here’s the robots.txt for Search Engine Land: https://searchengineland.com/robots.txt Below are examples of search engines and their respective user-agents (i.e. what the search engine identifies as): Helpful tip: use the asterisk (*) wildcard. This is a special character that allows you to assign rules to all user-agents.

# this directive blocks all crawlers
User-agent: *
Disallow: /

# this directive grants access to all crawlers
User-agent: *
Allow: /

Below are some examples of wildcards you can use: For example, the below set of rules prevents all user-agents from accessing URLs in the /products/ subfolder that contain a question mark.

# this directive blocks all crawlers from /products/ URLs containing a question mark
User-agent: *
Disallow: /products/*?

# this directive blocks all user-agents from accessing PDF files
User-agent: *
Disallow: /*.pdf$

For example, if your robots.txt had the following set of rules:

User-agent: Googlebot
Disallow: /subfolder-a/

User-agent: Googlebot
Disallow: /subfolder-b/

Google would still follow both directives. But it could quickly get pretty confusing if your robots.txt has many rules. Therefore, something like the following is much better and cleaner:

User-agent: Googlebot
Disallow: /subfolder-a/
Disallow: /subfolder-b/

Technical SEO fact: the robots.txt file was originally introduced to prevent search engine bots from overloading websites with multiple requests.

XML Sitemaps

What is an XML Sitemap?

An XML sitemap (or sitemap) is an XML (Extensible Markup Language) file used to tell search engines where to find the most important content on websites.

XML Sitemap Audit Checklist

Below, we’ve outlined some simple checks that you should follow when auditing your XML sitemap as part of your technical site audit. Here’s an example of what this might look like: If you want a full list of possible on-page optimizations, I’ve put together a guide on how to audit the on-page elements of your web pages. Here’s the XML sitemap for The Search Initiative: https://thesearchinitiative.com/sitemap.xml If you do not currently have a sitemap, you can create one manually or by using a tool like this XML Sitemap Generator from Screaming Frog. If your website is on WordPress, your life is about to get much easier, as there are many SEO plugins, such as Google XML Sitemaps and Yoast, that will automatically generate your sitemap(s) for you.

Index Bloating

A very common technical SEO issue that most websites tend to face is index bloating. Sometimes, Googlebot (Google’s web crawler) will crawl and index pages that simply offer no value to the end user. These pages “bloat” your index and use up precious crawl budget, as Google spends time unnecessarily crawling and indexing them. Below are some simple checks you can make in order to identify the types of pages that cause index bloating.
Once you've carried out these checks, follow this awesome guide from Ahrefs on how to go about removing them from Google's index.

Pagination

Paginated pages being indexed by Google can cause serious duplicate content issues. This is a common problem for eCommerce websites that may have thousands of products and hundreds of categories. To quickly check whether your paginated pages are being indexed, use the following site searches: In the example below, we can see that the Zara website has over 14,000 paginated pages indexed.

Tags

Adding tags to your WordPress or eCommerce site is useful for organizing the content on your website, but it can also create SEO issues such as duplicate content. To quickly check whether you have any tag pages indexed in the SERPs (search engine results pages), use the following site searches: Don't believe this is important? Here are the before-and-after results of a client we had in the eCommerce space, where one of the things we did was remove /tag/ pages from the index. Check out the full case study here.

HTTP Pages

If your website isn't on HTTPS (you really should move over to HTTPS!), then it's a given that all of your HTTP pages will be indexed. However, even if you've made the move to HTTPS, there's still a chance that some of the HTTP versions of your pages are indexed. To check this, use the following site searches: We can see below that the Zara website also currently has over 2k HTTP pages indexed by Google – these are unnecessarily wasting crawl budget and creating duplicate content issues.

Serving Both www. and non-www. Pages

If your website serves pages with www., it's important that no non-www. pages are being indexed by Google, as this causes further duplication. To check this, use the following site searches: If we look at the River Island website, we can see that by default it serves www. pages: https://www.riverisland.com/. However, there are still almost 56k pages without www. indexed by Google. Having this many duplicate pages can be incredibly problematic and can impact a website's performance in the search engine rankings.

eCommerce Empty Category Pages

As a customer, one of the worst feelings is landing on a page to find that the website whose…
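All of the index-bloat checks above boil down to a handful of site: operators, so it's worth batching them up. A tiny helper (the domain is a placeholder, and the operators mirror common patterns rather than the exact searches shown in the screenshots):

def index_bloat_queries(domain):
    """Build site: searches for the pagination, tag, HTTP,
    and non-www checks described above."""
    return [
        f"site:{domain} inurl:page=",   # indexed paginated pages
        f"site:{domain} inurl:/tag/",   # indexed tag pages
        f"site:{domain} -inurl:https",  # leftover HTTP pages
        f"site:{domain} -inurl:www",    # non-www duplicates
    ]

for query in index_bloat_queries("example.com"):
    print(query)

Paste each printed query into Google and eyeball the result counts; anything in the thousands deserves a closer look.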

SEO Testing: The Secret To Maximize Your Website Traffic and Profits https://diggitymarketing.com/seo-testing/ Sun, 07 Feb 2021 13:25:57 +0000 https://diggitymarketing.com/?p=518503

Search engine optimization (SEO) requires constant testing to be effective. If you don't understand what changes drive growth (or why visitors respond the way they do), you'll waste your time guessing at solutions while your competitors leave you in the dust. In this article, as someone with significant SEO expertise, I will provide a comprehensive understanding of SEO testing, the challenges of acquiring actionable data from Google Analytics, and how to conduct SEO tests that effectively improve your crucial performance indicators. Upon concluding this article, you will not only recognize the transformative potential of SEO testing but also benefit from three of my recent test results that defy traditional SEO principles. To begin, let's briefly define SEO testing and examine its different manifestations.

What is SEO Testing?

SEO testing refers to the experiments you perform to measure how search engines and users react to your site. There are many tools you can use to get data about each web page (such as Surfer, Ahrefs, and other free tools), but no Google Analytics tool or SEO checker can tell you what effect changes will have. The purpose of SEO testing is to manipulate different search traffic factors to determine whether the changes result in an increase or decrease in organic traffic or your position in search results. There are many ways you can run SEO tests, including:

Like your high school science classes, SEO testing relies chiefly on the concepts of experimental and control groups. The experimental groups are the cases (pages or entire sites) that experience your test variable (such as trying a new link strategy). The cases in the control groups are untouched. Their rankings and organic traffic give you the reference point to determine whether or not the experiment worked. Great testing depends on well-built tests. Instructions on how to create great SEO tests will be covered a little later… But first, it may help you to understand precisely what kind of benefits you can enjoy if you do the work to develop reliable SEO tests.

What Impact Does Testing Have on SEO?

The impact that testing has on your SEO strategy is immeasurable—no pun intended. It provides all of the following concrete benefits:

Testing Gives You the Power to Be First

Whenever you run an SEO test, you put yourself in a limited category of people on the industry's cutting edge. Some of the most effective SEO tactics ever discovered have been neutered by Google shortly after they reached the wider community. If you're a tester, you get to be in the rare company of people who use the latest techniques while they're still powerful. That enables you to operate in the most profitable niches.

Testing Helps You Develop a Ranking System That Works

Repeated testing gives you the power to separate what you can and can't know about SEO, based on experience. This allows you to skip time-wasting techniques and move directly to those with the most significant, most reliable impact on SEO performance. That makes you a far more efficient SEO. This ability is one of the critical differences between amateur SEOs and the kind who can successfully dominate even the most competitive niches.
Testing Can Make You Algorithm-Proof

If you test SEO changes often enough, you can develop a second sight for the trends that are guiding search engine updates. You'll spot trends before they're officially announced and adjust your practices to avoid penalties.

What are the Challenges of Performing Accurate SEO Tests?

The biggest challenge with SEO testing is collecting untainted data. Your results become tainted when they're affected by outside factors or wrong assumptions. These obstacles make it impossible for you to say which results are a product of your experiments.

Challenge #1: You Must Make Sure Your Variables Are Stable

To understand what's happening on each web page in the experiment, you need to isolate which changes affect your data. Stabilizing your variables means working to prevent changes that are outside of your test. For example, if you are trying to measure the effect of content upgrades, you need to prevent your link profile from changing during the testing phase. If a lousy link gets disavowed or a useful link gets added, your site's changes could easily mask any result of the content upgrades.

Challenge #2: You Need to Make Test Cases as Similar as Possible

The first thing you need to do is make each experimental and control group nearly identical. This will allow you to start measuring the changes once you start altering the test site. When you come back to check on website changes, you'll have a better idea of how they happened. Here are just a few examples of why you'll have trouble with that:

Challenge #3: You Need to Expect the "Random Ranking Factor"

The random ranking factor is a term coined by SEO Terry Kyle (founder of WPX Hosting). It is best illustrated with an example… Imagine that you launch five identical sites or landing pages on the same day. They are in the same niche, use corresponding keywords, and employ the same design style. You will likely notice the following: This is not a precise rule, but many SEOs have recorded the effect over the years. For reasons that are nearly impossible to measure efficiently, some sites simply behave as if they are blessed while others act as if they are cursed.

What's the Solution?

There is one primary solution to the challenges that SEO testing poses: increase the number of test cases. One control page is never going to be enough, and neither is one SEO test page. Adapt to this challenge by creating larger groups for each test group and control group. Have 10 URLs in the control group and 10 in the experimental group… or better yet, 50. Then, measure the results by taking the average of the changes in both groups. Maximizing the number of test cases helps to resolve our three biggest challenges in the following ways:

- With 50 test cases, we know that a single random backlink hitting one of them isn't going to throw off our test.
- It helps you average out the minor differences caused by URLs, age, and other factors that play a role in SEO.
- We now have enough information to identify sites that behave differently, which helps prevent us from either getting overconfident or giving up on a great technique just because it failed randomly.

Let's look at all this through an example.

Example Test

Pretend you have a blog in the wellness niche, and you want to test the effect of double-counting keywords in the title tag.
If you have 30 posts with organic traffic, a simple experiment might involve creating two groups: If you're curious about how a test like this might play out, don't miss the 'clickbait title' test that will be revealed near the end. For now, let's focus on what roles these groups play: Naturally, testing like this is going to depend on the funding you have. If you can afford it, testing 10 sites instead of 5 (or 20 instead of 10) will allow you to create a more accurate average. If your budget is very limited, you can test across multiple pages with only 1 or 2 sites. However, there will be more room for error with those numbers than most experienced testers will tolerate. The more web pages you can create for a test, the more effectively you can cancel out noise. Sometimes, the most effective way to cancel out noise is to remove important factors altogether.

Best Practices for SEO Testing

Follow these rules to draw better data from each SEO test:

Set Aside Enough Time

Beyond your money budget, you need to put a significant time budget in place. Once again, the more time you can give your experiment, the surer you can be of the accuracy of any data you collect. Some results (especially offsite-related ones) may take months to yield a difference.

Exercise Attention to Detail

If you can't measure it, you can't manage it. The more information you're tracking, the more awareness you'll have of how sites differ, so you must be tracking even when you're not testing. I can illustrate this with a real-life example. At one point, I had a great site in the Brazilian testosterone niche. One week, we suddenly jumped from page two to page one. I was only able to figure out why because of my tracking. It turns out, some links I snagged from a local citation package were worth a lot more than I imagined they would be. Thanks to my tracking, I caught the factor, and with that insight I had a new strategy to use for many other sites.

Recent Findings in SEO

Nothing illustrates the impact…
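To tie the methodology together, here's a minimal Python sketch of the 30-post experiment described above: random assignment into an experimental and a control group, then the average position change of each group. All URLs and numbers are made up; in practice the positions would come from your rank tracker.

import random
from statistics import mean

# Hypothetical: 30 wellness-blog posts with organic traffic.
posts = [f"https://example.com/post-{i}" for i in range(1, 31)]

random.seed(7)             # fixed seed so the split is reproducible
random.shuffle(posts)
experimental = posts[:15]  # these title tags get the change
control = posts[15:]       # left untouched as the reference point

# (position_before, position_after) per URL; dummy numbers stand
# in for real rank-tracker data collected after the test window.
results = {url: (random.randint(5, 30), random.randint(5, 30)) for url in posts}

def avg_movement(group):
    # Positive = moved up the SERPs (position number got smaller).
    return mean(results[u][0] - results[u][1] for u in group)

print(f"Experimental avg movement: {avg_movement(experimental):+.1f}")
print(f"Control avg movement:      {avg_movement(control):+.1f}")
print(f"Net effect of the change:  {avg_movement(experimental) - avg_movement(control):+.1f}")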

E-Commerce SEO Case Study – How to Increase Traffic, Transactions, and Revenue https://diggitymarketing.com/chocolate-case-study/ Thu, 10 Dec 2020 09:11:47 +0000 https://diggitymarketing.com/?p=518715

Throughout this case study, you will learn the techniques that were used to increase transactions on an e-commerce website by over 90%. This was done by building a custom, but replicable, strategy for a long-standing client of 2 years. You will soon learn the technical, onsite, and backlink approach that allowed this e-commerce client to grow their traffic by 48% year on year. This traffic growth saw transactions increase by 93.20%, which generated an additional $49k for the year for the client – a 39.45% increase in overall revenue (from $123.6k up to $174.5k).

The Challenge

The client is a niche specialist in the confectionery industry, offering high-end chocolates to customers in the USA and around the world. The kind of chocolate you eat until you explode. They specialize in both wholesale and retail chocolate sales and want to attract professional clients from the food industry as well as "off the street" buyers. Building relationships, authority, and a brand following are very important in this business. The client approached The Search Initiative two years ago, looking to increase conversions, develop a solid link building strategy, and get an in-depth, on-site SEO audit to improve their traffic metrics. The following is a walk-through of the steps you can take as an e-commerce site manager to achieve similar gains to our favorite chocolate client.

Perform a Technical SEO Audit

Crawl Management

One of the more common issues faced by e-commerce sites is crawl management. If Google crawls areas of the site that have no use to the bot or to users, it can be the result of faceted navigation, query strings, or sometimes Google's temperamental flux. Crawling such pages is a waste of Google's time, as these pages are generally very similar to, or duplicates of, an original page. Since Google only has a finite amount of time on a website before it bounces off, you want to control that time as much as possible and make Google spend it only on pages that have value. That way, valuable pages are crawled more often, and new changes on the site are picked up quicker.

What's even better: Google tells you what its algorithm "thinks" of your pages! How? In Search Console -> Coverage report! One of the areas especially worth inspecting with the greatest care is Coverage report > Excluded > Crawled but not indexed. When reviewing Search Console, you should be looking for URL patterns. In the "Crawled but not indexed" section for our client's site, we found many random query-string URLs that Google recognized but wasn't indexing. In Google's eyes, these URLs had no value. After manually reviewing them, we discovered that Google was right.

To prevent the search engine from spending more time on these URLs and wasting its crawl budget, the easiest approach is to use robots.txt. The following directives were included in the robots.txt file:

User-agent: *
Disallow: /rss.php*
Disallow: /*?_bc_fsnf=1*
Disallow: /*&_bc_fsnf=1*

This was enough to take care of it! Please bear in mind that when you are cleaning the index with the use of robots.txt, one part of the Search Console Coverage report will start going up: Blocked by robots.txt. This is normal.
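Spotting those URL patterns by hand gets tedious on larger sites. Here's a short sketch that counts which query parameters appear most often in a "Crawled but not indexed" export (assuming you've saved the report as a plain list of URLs, one per line; the filename is hypothetical):

from collections import Counter
from urllib.parse import urlparse, parse_qs

counter = Counter()

# One URL per line, e.g. exported from the Coverage report.
with open("crawled-not-indexed.txt") as f:
    for line in f:
        url = urlparse(line.strip())
        for param in parse_qs(url.query):
            counter[param] += 1

# The most common parameters are your robots.txt candidates.
for param, count in counter.most_common(10):
    print(f"{param}: {count} URLs")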
Just make sure to review the URLs reported under Blocked by robots.txt every now and again, ensuring that only the pages you meant to block are coming up. If you suddenly see a big spike, or URLs you did not want blocked, it means either you made a mistake or Googlebot crawled its way somewhere you did not know about.

Index Management

Index management involves removing pages that contribute no value to the user from Google's index of your site. Google's index of a site is the list of all the pages of that site it could return to users in the Google SERPs. Unlike crawl management, pages that should not be in the index are not always pages that present no value to Google. For example, "tag" pages are useful for internally linking articles or products, and therefore have value in that they can help Google understand the relationship between pages. At the same time, these pages are not the type of pages you want to see in the SERPs, and by having them indexed, Google will crawl them more regularly. Consequently, 'bloating' the index in this way holds your site back, as the search engines use their limited resources – or crawl budget – to assess pages that don't convert or are naturally thin in content.

The client had the site set up in such a way that internal search results and tag pages were also being indexed. These provided no value to a user whatsoever, nor would they effectively contribute to better rankings from the search engine's perspective. The most common pages that usually mess up index management include: The tricky part is that you have to identify all URL parameters and types of pages that have no value in the SERPs; then you can noindex these pages. As a quick note, it is important to understand that there are cases where index management and crawl management fall under the same umbrella. For example, Google may be crawling non-value query strings and indexing them at the same time. As a result, this is both an indexation issue and a crawling issue. Double the fun.

Broken Links

Broken links are a troublesome issue that needs to be resolved if you want a well-oiled website with authority flowing freely across your important pages through PageRank. Broken links prevent users from navigating the site easily and effectively. They can also result in users missing the opportunity to navigate their way to valued e-commerce pages! A broken page or 404 page is, in essence, a page that returns an error due to the URL not existing or no longer existing. It's commonly caused by old pages being deleted while internal links from within your site still point at them.

The client had 404 errors in abundance, and many internal links were broken – the result of changing their site in the past and not updating the link structure (or doing a proper URL mapping). To find and resolve these, you need to crawl the website. Any popular crawler like Sitebulb, Ahrefs or Screaming Frog will do the trick. Here's how you can do it using Sitebulb. Under Link Explorer > Internal Links > Not Found, you can identify where the internal links to these 404 URLs are. After this, you should go through these URLs one by one and remove the broken links manually. Where possible, replace the links pointing at non-existent pages with a link to a relevant, working page. This is particularly beneficial if you are replacing a link from an old, no-longer-existing product page with a link to a new, functioning product page. You may need to fix hundreds of these broken links using this manual technique.
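If you'd rather script the first pass instead of clicking through a crawler, a bare-bones checker can flag 404s among a page's internal links. A sketch only (placeholder start URL; a real crawler also needs scope rules, deduplication across pages, and politeness delays):

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

START = "https://example.com/"  # placeholder site
domain = urlparse(START).netloc

resp = requests.get(START, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# Collect the internal links found on this page.
links = {urljoin(START, a["href"]) for a in soup.find_all("a", href=True)}
internal = [u for u in links if urlparse(u).netloc == domain]

# Report anything that ultimately returns a 404.
for url in internal:
    status = requests.head(url, timeout=10, allow_redirects=True).status_code
    if status == 404:
        print(f"BROKEN: {url} (linked from {START})")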
All this effort is to ensure no link equity gets lost between the pages you want to rank, especially the money-making pages. Yes, it's mundane. Yes, it's necessary. Yes, in most cases, it's worth it.

Internal 301 Redirects

In addition to finding broken links, crawler tools are also great at picking up internal redirects. These cause hops through intermediate URLs instead of going directly to the final page, which is the optimal route. It looks something like this: If you follow the red arrows: the link points from the source to a page which is redirected, using a 301 HTTP response code, to only then, finally, land on the correct page (returning a 200 OK code). In short: not good! Now follow the green arrow: the link points from the source directly to the correct page (200 OK). There is no interim "hop" by way of a redirected page (301). In short: good!

With this, don't get me wrong. One internal redirect is normally not an issue in itself. Sites change, things get moved somewhere else. It's natural. It becomes a problem when a site has many internal redirects – this starts to impact the crawl budget of the site, because Google spends less time on core content pages that actually exist (200) and more time trying to navigate the site through a thicket of redirected pages (301). Similar to solving broken links, you have to run a crawl, go through the links identified, and manually replace them with the correct, final page URL (there's a scripted alternative below).

Page Speed and Content Delivery Optimization

I cannot stress enough how important speed optimization is. In this day and age, it's a non-negotiable must for a site to be responsive to users. A slow site that takes time to load, in most cases, results in users bouncing off the site, which is not only a loss in traffic but also a loss in potential sales. And guess what…
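Circling back to redirect chains for a moment: the requests library records every hop on a response, which makes this audit easy to script (the URL is a placeholder):

import requests

def show_chain(url):
    """Print every hop a URL goes through before its final destination."""
    resp = requests.get(url, timeout=10, allow_redirects=True)
    for hop in resp.history:            # each intermediate response
        print(f"{hop.status_code} {hop.url}")
    print(f"{resp.status_code} {resp.url} (final)")

show_chain("https://example.com/old-product-page")  # placeholder URL

If more than one 301 shows up in the chain, update the internal link to point straight at the final 200 URL.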

SEO Agency to Affiliate: How Julie Adams 12x’d Her Income https://diggitymarketing.com/seo-agency-to-affiliate-interview/ Mon, 09 Nov 2020 04:59:31 +0000 https://diggitymarketing.com/?p=517906 I’m about to interview one of my favorite SEOs in the world, Julie Adams. I first learned about Julie’s story in The Affiliate Lab, where she shared her experience of quitting the digital marketing agency that she worked at and switching over to affiliate SEO. Now, she’s making more money in one month than she did in an entire year at the agency! Stick around for this interview because we’re going to get into not only her story but her entire process on how she ranks websites — from backlinks to on-site SEO. Basically, everything she does to get her affiliate portfolio ranked today. If you prefer video content, check this interview with Julie Adams.  Transitioning from Client to Affiliate SEO Matt: Hi Julie. Before we get started, why don’t you give everyone a little heads up on who you are and what you do in this world of SEO? Julie: Sure, thanks for having me, Matt. I’ve been in SEO for about seven or eight years. I started at the bottom, not knowing much, and worked in an agency. I began working with content and quickly fell in love with everything that goes into SEO — including the technical aspects, then seeing results and making money. So, I worked at the agency during the day, then I would go home and do affiliate SEO at night. So I just kind of stumbled my way into SEO, and I’ve had pretty good success. Matt: Awesome, we’re very excited to hear more about your SEO story — but why don’t we start at the beginning. First off, how old are you? Where are you from? Where are you living? What’s the full story there? Julie: I’m 27, out of Orlando, Florida. I’ve been here my whole life. I love it. I’m big into the outdoors and all that — when I’m not doing the techie stuff. Matt: Born and raised in the South? Julie: I grew up in the Sarasota/Venice area, so always around the beach. I moved here when I was in fourth or fifth grade, so I was big into Disney, big into everything that Floridians are into — all that good stuff! Matt: Awesome! And what kind of education do you have? Did you go to high school, college? Julie: I’m a college dropout. I was actually a business finance major. I went to Valencia Community College, got my associate’s degree in business administration, and started my bachelor’s degree. Then I almost failed one class — and I’m the type of person that gets really demotivated if I don’t absolutely excel at something. An opportunity came up to work at an SEO agency just as I was about to fail out of that class. I decided to just drop out of school and pursue digital marketing full time. So my bachelor’s degree is still on hold! Matt: Interesting, I have a similar personality type. If I can’t be really good at something, there’s no point in it for me — I’ve quit a million things because of that, so I get that… How did you get into SEO? How did the whole thing start? Julie: I honestly stumbled into it. When I started the agency job, I was working at a movie theater, so I was basically scooping popcorn. That was my first job. I also did some babysitting on the side, which is actually how I met my boss at the agency. He recognized that I was pretty smart, needed an opportunity and just kind of scooped me up like popcorn! I started working at the agency without even knowing what the term SEO stood for. It was a total mystery to me. I thought it was just these three letters. 
I started as an 18- or 19-year-old going into this office setting, and my boss basically just said, "You're in charge of content. You're in charge of links." Initially, I was like, "Great, what does that mean?" And then I just kind of learned on the job from there, in all honesty. I didn't get any formal training or anything like that.

Matt: Interesting, so did they just give you in-house training? Was it mostly based on their SOPs and internal knowledge? Or did you take any courses or learn SEO from any other source?

Julie: This was when SEO had just stopped being sketchy. People were using spun content and SAPE links and tactics like that. That was my initial idea of what SEO meant. So, when I first started, I was managing spun content, editing it, writing some content myself and managing link orders. When I got there, we had no standard operating procedures — it was just, "Do X, Y, and Z, and don't lose a client!"

Matt: Makes sense… I mean, that's the basic plan! Let's talk a little bit more about the agency. First off, I'm curious to know why you decided to leave it?

Julie: Money and time. I'm fiercely independent. I love SEO. I stumbled into something that I really enjoy. I felt really lucky because of all that, but I was working 40-some-odd hours a week. And as you can probably tell by how I'm describing this agency, it's small. I was one of three core employees at the time, and I basically hit the ceiling. There was no room to grow, there was no opportunity to make more money, and there was really no room to learn anything else — every day you go in, you do your work, and you go home… Then I discovered affiliate SEO, probably how everybody else discovers it. I was just hoping to make a little passive income. I was good at SEO, and I wanted the free time and the money!

Matt: I get that 100%. And you can throw this question right back at me if you don't feel comfortable with it, but what was your monthly or annual salary at that agency?

Julie: I don't remember what I started at. It was definitely around minimum wage because I had no experience — so it was probably eight or nine bucks an hour, whatever minimum wage was at the time. And I maxed out at about $40k a year.

Matt: And how many clients did you manage for $40k a year?

Julie: The most that I managed at one time was somewhere around 70 or 80. I personally worked with hundreds of accounts when I worked there, but at the peak, I was managing upwards of 80 accounts at once.

Matt: That's insane! I mean, that's an awesome ROI for the agency, but how about your stress levels? How were you able to handle 80 clients at a time?

Julie: It was really stressful! I wasn't always honest about what I could handle, so I would just do whatever I could. I was basically the brains of the operation. I had people to write content. I had people that would help me put all the pieces into place. So, you could kind of think of me as more of a conductor… I built out the plans, and then somebody else would implement them. That's the only way it was possible… But it was definitely stressful to be expected to answer questions for 80 different accounts. Like, "Why aren't they ranking?" You can't pull up a report in a meeting. You were expected to know the answers off the top of your head. So, that was probably the most stressful part.

Matt: Wow, that's what they call a trial by fire!

Julie: I'm grateful for that, though, because the number of websites I manage now is just chump change in comparison.

Matt: That brings me to my next question.
I'm sure you learned quite a few skills on the job that carried over to your affiliate career. Can you touch upon that a little bit?

Julie: I mean, SEO, for sure. I had 80 clients to play with. In the beginning, it wasn't that many, but I did have clients to play with, and it wasn't like they were my clients. If I lost them, honestly, I still had a job. So in that sense, I had room to experiment — it was that kind of environment.

Matt: Got it. When you were considering making the jump from agency to affiliate SEO, what fears or thought processes were going on in your mind?

Julie: Everything that you can think of! I have generalized anxiety — I overthink everything. And jumping ship from a comfortable position in an industry that I love was really scary. Now, I always knew that I would have a job there because, as I said, it was a small company. I actually had to give them six months' notice to leave! So I wasn't worried that I wouldn't be able to return to the agency… I was just afraid that I'd have to go back with my tail between my legs — having to…

45.99% Earnings Increase in 5 Months for a Digital Infoproduct [SEO Case Study] https://diggitymarketing.com/infoproduct-seo-case-study/ Mon, 11 May 2020 04:19:23 +0000 http://diggitymarketing.com/?p=512380

You're about to get the strategy behind one of the most challenging SEO campaigns my SEO agency has ever run. Why was it so challenging? Three reasons: First, the niche is massively competitive: a make-money-online infoproduct in the financial niche. Nuff said. Second, we only had 5 months to pull this off. Third, just like any other client, they were extremely hungry for results and demanded quality work. In the case study below, you're going to learn the technical playbook, the onsite content strategy, and the link building techniques we carried out to get this 45.99% revenue growth win for this infoproduct business.

The Case Study

Our client takes advantage of the wide reach of the interwebs to teach his students how to earn money trading online. We're talking currencies, forex, stock markets, crypto, etc. The business' revenue is generated solely through the sale of digital download products – in this case, trading guides in ebook format and video trading courses. When the owner of this profitable business (which had already built some authority in the niche) approached The Search Initiative (TSI) about helping to grow their organic reach and find new students, we were excited to take on the challenge in one of the most competitive spaces there is. To accomplish this, the game plan was to focus hard on a quick-win strategy while setting the stage for long-term gains post-campaign. Our strategists were certain that the value we could provide would have a considerable impact on his business' bottom line.

How? Because… over the course of the campaign, our technically-focused SEO strategies grew organic traffic by 23.46%. But what did the most for the client's business was the 45.99% increase in the number of conversions, comparing the first and last months of the campaign. Sales went up from just over 2,100 a month to 3,095 – this really bumped their monetization. And we did it in time. These gains were achieved within only 5 months of the client signing with TSI and our team starting the campaign. Here's how we did it…

The SEO Playbook for Infoproduct Websites

Phase 1: A Comprehensive Technical Audit

I've said this in every TSI case study we've published so far… and I simply cannot emphasize it enough: before you begin any campaign, always start with a full technical audit. Starting with…

Page Speed

First, our technical SEO strategists started at the bottom of the client's tech stack… and you should too. This means digging into the web server's configuration and running a series of tests to measure the site's speed. It lets you ensure that the performance of the web server itself isn't causing a penalty or disadvantage on either desktop or mobile connections. So, what tests do we run?

PageSpeed Insights (PSI) – this should be everyone's go-to tool and shouldn't need an explanation.

GTmetrix – it's good to cross-check PSI's results, therefore we use at least one other tool. In reality, we use GTmetrix together with Dareboost, Uptrends, and Webpagetest.

HTTP/2 Test – this one is becoming a standard that can greatly improve your page speed, hence it's definitely worth looking into.
If you're not HTTP/2 enabled, you might want to think about changing your server or using an HTTP/2-enabled CDN. You want to see this:

Performance Test – I know it might sound like overkill, but we included this in our test suite earlier this year and use it for sites that can expect higher concurrent traffic. We're not even talking Amazon-level traffic, but say you might get a thousand users on your site at once. What will happen? Will the server handle it or go apeshit? If this test shows you a steady response time of under 80ms – you're good. But remember – the lower the response time, the better!

In cases where transfer speeds or latency are too high, we advise you (and our clients) to consider migrating to faster servers, upgrading to better hosting or, better yet, re-platforming to a CDN. Luckily, most of the time you can achieve most of the gains through WPRocket optimization, as was the case with this case study.

Your Golden WPRocket Settings

Cache → Enable caching for mobile devices
This option should always be on. It ensures that your mobile users are also served a cached version of your site.

Cache → Cache Lifespan
Set it depending on how often you update your site, but we find the sweet spot is around 2-7 days.

File Optimization → Basic Settings
Be careful with the first one – it may break things!

File Optimization → CSS Files
Again, this section is quite tricky and may break things. My guys switch these options on one by one and test that the site works fine after enabling each one. Under Fallback critical CSS, you should paste your Critical Path CSS, which you can generate using the CriticalCSS site.

File Optimization → Javascript
This section is the most likely to break things, so take extreme care enabling these options!! Depending on your theme, you might be able to defer Javascript with the below: Note that we had to use Safe Mode for jQuery as, without this, our theme stopped working. After playing with the Javascript options, make sure you test your site thoroughly, including all contact forms, sliders, checkout, and user-related functionalities.

Media → LazyLoad

Preload → Preload

Preload → Prefetch DNS Requests
The URLs here hugely depend on your theme. Here, you should paste the domains of the external resources that your site is using. Also, when you're using Cloudflare, make sure to enable the Cloudflare add-on in WPRocket. Speaking of Cloudflare – we got the final push for the site's performance by using Cloudflare as the CDN provider (the client sells products worldwide).

If you don't want to use additional plugins (which I highly recommend), below is some .htaccess code I got from our resident genius and Director of SEO, Rad Paluszak. It'll do the basic stuff like:

- GZip compression
- Deflate compression
- Expires headers
- Some cache control

Without any WordPress optimization plugins, this code, added at the top of your .htaccess file, will slightly improve your PageSpeed Insights results (see the example block below).

Internal Redirects

You know how it goes – Google says that redirects don't lose any link juice, but the PageRank formula and tests state something different (there's a scientific test, run on 41 million .it websites, showing that PageRank's damping factor may vary). Whichever it is, let's take all necessary precautions in case there is a damping factor and redirects drop a percentage of their link juice. As we investigated the configuration of the server, we discovered some misapplied internal redirects, which were very easily fixed but would have a considerable effect on SEO performance – a quick win.
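A standard Apache block that covers those four items looks like this. Treat it as a representative reconstruction, not necessarily Rad's exact snippet, and test it on a staging copy first, since a bad .htaccess line can take a whole site down:

<IfModule mod_deflate.c>
  # Compress text-based assets before sending them
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>

<IfModule mod_expires.c>
  # Let browsers cache static files for a sensible period
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>

<IfModule mod_headers.c>
  # Basic cache-control header for static assets
  <FilesMatch "\.(jpg|jpeg|png|gif|css|js)$">
    Header set Cache-Control "public, max-age=2592000"
  </FilesMatch>
</IfModule>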
You can test redirects with a simple tool like httpstatus.io and see results for individual URLs: But that would be the long way around, right? Your best bet is to run a Sitebulb crawl, head over to the Redirects section of the crawl, and look at Internal Redirected URLs: There you will find a list of all internally redirected URLs that you should update to point at the last address in the redirect chain. You might need to re-run the crawl multiple times to find all of them. Be relentless!

Google Index Management

Everyone knows that Google crawls and indexes websites. This is the bare foundation of how the search engine works. It visits sites, crawling from one link to the next. It does this repetitively, to keep the index up to date, and incrementally, discovering new sites, content, and information. Over time, as it crawls your site, Google sees its changes, learns its structure, and gets to deeper and deeper parts of it. Google stores in its index everything it finds worth keeping; everything considered useful enough for the users and for Google itself.

However, sometimes it gets to pages that you'd not want it to keep indexed. For example, pages that accidentally create issues like duplicate or thin content, stuff kept only for logged-in visitors, etc. Google does its best to distinguish what it should and shouldn't index, but it may sometimes get it wrong. Now, this is where SEOs come into play. We want to serve Google all the content on a silver platter, so it doesn't need to algorithmically decide what to index. We clean up what's already indexed but was not supposed to be. We also prevent pages from being indexed, as well as making sure that important pages are within reach of the crawlers.

I don't see many sites that get this one right. Why? Most probably because it's an ongoing job, and site owners and SEOs just forget to perform it every month or so. On the other hand, it's also not so easy to identify index bloat. With this campaign, to ensure that Google's indexation of the site was optimal, we looked at these: Site: search, Google Search Console. In our…

Ultimate Guide to Surfer Onsite Optimization in 2024 https://diggitymarketing.com/surfer-seo-guide/ Mon, 09 Mar 2020 07:45:14 +0000 http://diggitymarketing.com/?p=511034

Do you know the best thing about on-page SEO? Control. As an experienced SEO professional with more than 10 years of experience in the field and founder of multiple 7-figure digital marketing businesses, I know firsthand the importance of on-page optimization in achieving higher search engine rankings. Even with a small budget, you can see real results with on-page optimization. You won't get the #1 spot overnight, but you can still rise in the ranks with fewer backlinks than your competitors. There are two ways to deal with on-page optimization: you can do it on your own and build great content from scratch, or you can use tools that speed up your optimization work. Like we do. In this guide, you'll learn more about Surfer, an SEO tool that specializes in on-page optimization. You'll also get a step-by-step guide on how to use the tool to create better content, fill content gaps, and rank higher for your chosen keywords.

Quick Summary

What Is Surfer?

Surfer is an SEO tool that analyzes why top pages are ranking well for your keyword. Based on that information, you'll be able to figure out what you need to do to create content that will help you outrank your competitors. Surfer helps in two ways:

- Creating/outsourcing new optimized content
- Optimizing existing pages

The Surfer Content Editor analyzes your content structure, keyword usage/density, phrasing, and more, comparing them to high-ranking results for the same keywords. Then it provides you with guidelines on how to build content that has the right structure and wording to show up on page 1. There's also the Keyword Analyzer, with Audit and Terms to Use features that help you optimize existing pages. Rounding out the Surfer SEO toolkit are the Keyword Research and Common Backlinks features; I'll discuss all of these in full detail later.

Surfer Recommendations in Action

How does this tool help you optimize your content? And what can you expect from Surfer? Let's take one of my own pages as an example. I have an SEO coaching landing page that was ranked #1 forever. All of a sudden, it dropped to #2. After plugging it into Surfer, I found out that my landing page content was too long… and thus I cut it in half. The Terms to Use feature also let me know that my word usage was off. Tweaking the landing page boosted it back to #1, and now it's at a similar length to other high-performing pages.

Here's another example: in November, I turned my Affiliate Networks page into a blog post and adjusted the densities of relevant phrases based on Surfer's recommendations. The next day, I checked my keywords and saw this: My page jumped to the top three after my tweaks.

Similarly, Matthew Woodward had an extremely comprehensive review of SEMrush that clocked in at a whopping 26,000 words and was ranked #7. Surfer told him to remove 22,000 words… almost 85% of his content. While it sounds counterintuitive to reduce a long-form blog post to a "regular-sized" one, his review jumped to the #1 spot… the next day. These are three examples of Surfer giving you insights on pages that are currently doing well with their content, so that you can use the same rewarding practices.
Correlation SEO in On-Page Optimization

Now that you know what Surfer can do, you're most probably thinking, "How does Surfer know which ranking factors are the most crucial for SEO?" Unfortunately, Google and other search engines aren't transparent about their algorithms. Enter correlational SEO. Correlation SEO analyzes various ranking factors in order to determine which ones have the biggest impact on rankings. Surfer's data comes from reverse engineering the search engine results page (SERP): it looks at what top-performing pages are doing that you aren't. Instead of giving you vague advice ("long content is better than short content") or ballpark figures ("aim for 1,000-2,000 words per blog post"), Surfer provides recommendations that are based on pages that already rank for your target keywords.

And this extends to more than just word count. Surfer also looks at what kinds of pages rank best (e.g. long-form vs. quick answers), what kind of media they contain (e.g. graphics, lists, etc.), what topics they cover, and what words and phrases are most commonly used. Surfer wasn't the first correlational SEO tool to hit the market. Cora and Page Optimizer Pro came earlier and are both exceptional tools as well.

One of the major criticisms of correlation SEO is that correlation doesn't mean causation: just because a competitor is ranking while using certain practices, it doesn't mean that those practices are the reason they're ranking. But by optimizing your content so that it's similar to (but higher-quality than) the content that Google ranks at the top, you are more likely to take the top spot. Check out this video to learn more about Google SEO ranking factors. The trick is to know which pages to compare yourself to, so that you aren't introducing the wrong kind of change.

Choosing the Proper Competitors for Your Work with Surfer

Although it sounds sensible to look at the top ten results for your analysis, you'll end up getting a lot of imprecise data and ineffective recommendations. John from Freedom Bound Business found that out the hard way. When he didn't pay much attention to picking the right competitors, his page dropped from rank 25 to rank 41. When he qualified competitors correctly, his affiliate review got bumped up from the second results page to the first. Source: https://www.freedomboundbusiness.com/surfer-seo-review/

What does this tell you? There is no point in comparing oranges to apples, and the same rule applies to competitor analysis. To get the most out of Surfer's correlational SEO tools, look at pages that are similar to yours, and don't compare yourself to websites that are not. Here's a quick guide to choosing the right competitors:

- Don't compare yourself to high-authority sites like Wikipedia or Amazon (unless you are a site of that level, of course).
- Find websites, pages, or competitors that are within the same niche or have the same format (e.g. review sites, blogs, etc.).
- Avoid listings and directories while optimizing for local SEO.
- Look at the word count of top pages and exclude outliers (see the short script below).

Once you do the above, your data will be much more accurate.

Use Case No. 1: Building High-Quality Content From Scratch

Now that you know more about correlational SEO and Surfer, it's time to put your knowledge to the test. Create a new page for your target keyword using Surfer's Content Editor tool.

Automate Your Content Brief

Content Editor lets you create guidelines for your copywriter that include all your requirements, like keywords, topics, and optimum word count.
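Before getting into the brief workflow, that word-count outlier check takes only a few lines of Python to sanity-check (the competitor counts are made up):

from statistics import mean, stdev

# Hypothetical word counts of the top 10 results for your keyword.
counts = [1850, 2100, 1900, 2300, 9800, 2050, 400, 2200, 1950, 2150]

avg, sd = mean(counts), stdev(counts)

# Drop pages more than 1.5 standard deviations from the mean.
kept = [c for c in counts if abs(c - avg) <= 1.5 * sd]

print(f"All pages: mean {avg:.0f} words")
print(f"Outliers removed {sorted(set(counts) - set(kept))}: mean {mean(kept):.0f} words")

On this made-up data, the 9,800-word monster gets excluded and the target length drops from ~2,700 to ~1,900 words, which is a far more honest benchmark.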
I'll show you how to create comprehensive briefs that outline your requirements and make it easy for your writers to understand them. Preparing a good brief the traditional way is a lot of work. You have to manually research your competitors, extract some basic data about keywords, and follow good SEO practices — things that can take a lot of time. Here's an alternative way:

1. Type Your Keyword and Location

When working with Surfer, you always start with a keyword and location. In this case, our target keyword is "cordless circular saw" and our location is the United States. You can also choose to turn on NLP Analysis for more phrase and word suggestions from the Google API (more on this later). Once the analysis is done, you can find your query in the history log below the input. Open it to access a customization panel.

2. Choose Pages to Compare Against

The customization panel has five sections: pages to include, content structure, terms to use, topics and questions to answer, and notes. Let's start with the "Pages to include" section. By default, Surfer checks the top five pages. These top five pages are your benchmarks. Pick URLs that are organic competition for your page. Exclude pages that rank high because of their extremely high authority, pages for different business models, and pages that target a different search intent. Also exclude word count outliers – pages with word counts that are way shorter or longer than the others. Basically, everything I already told you about selecting comparisons. Check out this example for an affiliate review:

3. Let Surfer Determine the Word Count

Surfer automatically recommends a word count based on your chosen competitors, but you can also customize it if you prefer. However, if you chose your competitors wisely, there shouldn't be much reason to adjust the number. Content length is critical — Surfer calculates phrase and keyword density based on it, so be cautious when modifying it! After you save your changes, the average length will appear in the requirements section.

4. Incorporate the Suggested Words and Phrases

In addition to word count, Surfer also checks the top-performing pages for words and phrases relevant to your page. Surfer uses its own algorithms to reverse-engineer the top words and phrases that you should include in…

Case Study: Should you Put Display Ads on Affiliate Websites? https://diggitymarketing.com/display-ads-case-study/ Mon, 06 Jan 2020 05:35:29 +0000 http://diggitymarketing.com/?p=10452

You like money, right? Do you like cashing in a second time on websites and content you've already developed? I sure do. That's why I designed a case study to determine whether it's possible to turn a healthy affiliate site into a better earner by monetizing it through display ads. I get it, you're immediately skeptical, right? Of course you are. Multiple income streams are always awesome, but why would you ever monetize a site in a way that might get in the way of how it pays you now? Most affiliate marketers shun ads because their sites make money by drawing people directly to affiliate links, not ads. People on the affiliate side rightfully ask:

- Won't ads distract people away from the links to my affiliate products?
- Won't I lose the precious trust of my readers?
- Will my site load slower?

Those concerns are exactly what I set out to test in this latest case study. In the following article, you're going to learn what effect the introduction of ads had on load time, traffic, earnings, and other site growth factors. Finally, you'll walk away with a definitive answer on whether display ads are a valuable addition to your affiliate content sites. For those who prefer video content, here's a video summary of what you're about to read below. For those who like the details, the numbers, and more, carry on… First, a quick overview of what display ads are and how they fit into the affiliate model.

What are Display Ads?

Though you likely filter out a lot of the display ads you see every day, they are far too common to ignore completely. Google Ads remains the largest ad platform serving the internet and has only gotten better at delivering targeted ads wherever you go. You'll see them on most sites. As an off-hand example, here's one that's been following me around. Unless you aggressively block scripts with a tool like Adblock, you've seen them, and you may have even been tempted to click.

If you haven't used display ads on your site before, the setup is pretty simple. You apply to an ad network, meet the qualifications, and then follow their custom instructions on how to set up your website to display their ads. It's hard to be more specific because the qualifications vary from network to network. Most will require a minimum level of traffic. AdThrive requires a minimum of 100,000 monthly page views. Smaller networks like Mediavine require 25,000 sessions a month. They are likely to have content restrictions when it comes to adult topics and imagery. As you'd expect, the best networks have demanding standards. If you still need more background on display ads, this short guide can fill in some of the blanks. Now that we're all caught up, here's how my case study rolled out.

The Case Study

The study involved a single affiliate website. After some consideration, I chose a website that was already moderately successful with affiliate offers (earning roughly $8,000 per month). I had less-valuable sites to choose from, but I chose this one because I wanted a site that had a decent amount of traffic. Good traffic means sudden trends will stand out from the static. I wanted statistically significant behavior to analyze when the ads appeared. My plan was to place the ads across 37 informational content pieces on the site. I monetized the informational content pieces with email opt-ins to rewards like ebooks.
The ROI on an email address is nowhere near as high as on the links to affiliate products that I placed on other pages. So even if the worst happened — even if the ads ate all the clicks, turned away the readers, or destroyed my site speed — it would still only affect these low-ROI informational pages. The idea that ads would only be served on certain pages generated some major resistance from ad networks (more on that in just a bit), but I was able to limit ads to those pages in the end by finding the right ad partner.

Choosing the Ad Network for the Experiment

Two networks came more highly recommended than the others: Ezoic and Mediavine. However, neither one was able to help with the experiment. Both demanded that I allow ads on all pages or none of them. That wasn't something I was willing to do, given the risks stated above. I've had people come to me and say that both these networks allow pick-and-choose options, but that's not my experience. Actual conversation with Mediavine's rep: I really wasn't enthusiastic about installing the sitewide code updates both platforms require. Once again, I'm trying to protect the site overall, and some ad platforms are known to slow down websites.

Ultimately, I went with a smaller and more flexible network called Newor Media (invitation link). This one came highly recommended by some friends who had longtime ad experience. Newor claimed that their platform does not impact speed (we will cover how that went in the data). They let me pick and choose the pages that served ads, and they worked with me to determine how the ads would appear on those pages. That resulted in the page layout changes you can see in the next section.

How Page Layout Changed

Layout 1: The Original

This is how the page looks without ads. It's a layout designed to serve informational content. Just past the first section, my inline email opt-ins are distributed discreetly throughout the page. One was placed just past the introduction, and another every few hundred words. There's also an email opt-in on the sidebar, a little further down the page. The inline opt-ins are just written calls-to-action, while the sidebar features a full cover image of the ebook that signups receive. The page is built to serve email opt-ins. Before Newor placed the ads, there was nothing that distracted from that purpose.

Layout 2: After the Ads

This is the new template based on the best practices I got from Newor. I know what you're thinking: an ad before the title? Yeah, I know. Newor insisted that people wouldn't be annoyed and that an early ad would set the tone for further ads down the page. I wanted to keep things simple for the sake of the experiment, so I followed suit. Also, I should say that the page layout isn't as busy as it looks. We spaced out the opt-ins and ads so that you never see more than one per full page scroll. In addition to some small horizontal banners spaced through the content, I also approved and set up two larger banners in the sidebar. Now the ads were ready to go, and the study was about to begin. To make sure it meant something, I tracked as much data as I could.

What Was Measured?

I collected data on several website factors that I considered essential to my affiliate site's revenue and performance. To understand how deep the impact was when ads showed up, I needed to know…

- Email subscriber rate impact (Would my conversions suffer?)
- Speed performance (Did the website slow down?)
- Rankings (Did our pages start dropping?)
- Traffic (Are fewer people coming through?)
- Time-on-page (Are they spending less time on the page?)
- Earnings from ads (Was it all worth it?)

Some of these factors have less priority than others. Speed is one of the highest priorities.

The Results

Speed

To measure this, we performed a cascade of requests for each of the 37 test pages using Pingdom. That allowed us to test from both a San Francisco and a Washington D.C. server and see how the load time changed.

Full Data for Showing Off

Actually Useful Summary:

As you can see, there isn't much evidence of a problem here. Sure, there's a dip, but it's matched by a dramatic improvement from the other coast. If I saw a deviation of, say, 3 seconds, I would be panicking. But it's nothing close to that. These results are firmly in "who cares" territory. Next, I needed to track how search engines were responding to the changes.

Rankings: CORRUPTED

Well, this is awkward. It turns out, I forgot to tell Google that I was running an experiment, and they decided to drop the BERT algorithm update right in the middle of it. This event colored all of my tracking data, but it isn't all bad news. BERT loved this site. It gave my site such a big hug that I can't separate what benefits came from the update and what came from our experiment. In all honesty, I didn't think that adding ads would affect rankings much unless combined with a lot of other downward trends. I saw my first downward trend while tracking the time visitors spent on-page.

Results: Time on Page

The results I was tracking were measured 7 and 14 days after I introduced ads to the site. Well, this doesn't seem good. People are spending 20%…
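A note on the speed numbers above: if you don't have Pingdom, a rough before/after comparison can be scripted with plain request timing. This is a sketch only; it measures server response, not full page rendering, and the URLs are placeholders.

import time
import requests
from statistics import mean

PAGES = ["https://example.com/info-post-1", "https://example.com/info-post-2"]

def avg_response_seconds(urls, rounds=5):
    """Average wall-clock time to fetch each URL several times."""
    times = []
    for url in urls:
        for _ in range(rounds):
            start = time.perf_counter()
            requests.get(url, timeout=30)
            times.append(time.perf_counter() - start)
    return mean(times)

print(f"Average response time: {avg_response_seconds(PAGES):.2f}s")

Run it once before enabling ads and once after, then compare the two averages.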

E-Commerce SEO Case Study: How we 4x'd Traffic and Doubled Revenue https://diggitymarketing.com/e-commerce-case-study/ https://diggitymarketing.com/e-commerce-case-study/#comments Mon, 02 Sep 2019 08:41:18 +0000 http://diggitymarketing.com/?p=9815

Whether you're an e-commerce manager or an SEO specialist, you've invested a considerable amount of time and energy into working out the best-practice approach for tackling organic search for online stores.

An E-Commerce SEO Strategy Walk-Through

In this case study, I'll be showing you how my agency, The Search Initiative, was able to double revenue by building a custom strategy for one of our e-commerce clients, who operates within a small B2B furniture niche. My goal with this case study is to introduce you to a wide range of new ideas that will help you expand and improve your e-commerce SEO game and better serve your customers. You'll learn the strategies we used to improve UX, technical stability, onsite optimization, content, and of course backlinks. The approach that I will detail saw our e-commerce client grow their traffic by a massive 417% in 8 months. It also earned them $48k in consistent additional monthly revenue. This took them from generating $43k a month to $91k a month – a 112% increase in overall revenue.

The Challenge

Our client is in the B2B furniture and equipment business, and they offer their products only within specific locations in the UK. As well as offering their products for sale to clients, they also offer them for hire. The client came to us with a solid foundation. They had an existing e-commerce business, a solid website, and a great brand. However, when setting up their company, SEO hadn't been a top priority.

Establishing E-commerce E-A-T (Expertise-Authority-Trust) & Earning Backlinks

If you have a high-quality site and a keen desire to establish your brand (like our client does), your approach needs to be particularly focused on sustainable, long-term growth. You need to create quality content that represents the brand well and earns backlinks naturally. In addition, focus on signalling trust in the online store and the brand by demonstrating transparency and authority. We'll get to this later. Here's how we did it…

Step 1 – E-commerce User Experience

To enjoy the benefits of some quick wins, first focus on the low-hanging fruit.

User Experience

The client came to us with robust branding already established and a professional-looking website, but we were able to identify a few small tweaks that created a significantly better experience for potential customers.

Visual Changes

Optimize visitor experience by adjusting color contrast (here are a couple of great tools for choosing brand colors and color contrast), adjusting the placement and selection of images, and adding zooming and scaling to product-page images to further improve user experience and increase the likelihood of generating a conversion.

Mobile Optimization

The majority of internet traffic now originates from mobile devices, so local and mobile optimization are now crucial for small businesses. Make these small changes to your site that make a big difference to those viewing on mobile (simple markup examples follow this list):

- Making phone numbers clickable
- Making email addresses clickable
- Increasing the font size to a minimum of 16px for mobile users, as you can see in the screenshot below

These small tweaks contributed towards significantly increased conversions on mobile.
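Those first two tweaks are one-line markup changes. A generic example (not the client's actual markup):

<!-- Tapping these opens the phone dialer / mail app on mobile -->
<a href="tel:+441234567890">01234 567890</a>
<a href="mailto:sales@example.com">sales@example.com</a>

<style>
  /* Keep body copy readable on small screens */
  body { font-size: 16px; }
</style>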
Step 2 – Technical Auditing The foundation of any SEO strategy is technical optimization of the website, and since we were dealing with an e-commerce site with many functions and pages, there were plenty of opportunities to identify and resolve technical problems. They are as follows… Google Index Management This included removing all traces of their old website from the Google index, removing duplication in their category pages, managing index bloat, adding their XML sitemap to the robots.txt, and removing now-defunct meta keywords from their site. For example, the client’s login pages were indexed. In some cases, this type of unnecessary indexing can cause more valuable pages to be pushed out of the search results, or skipped over in a routine crawl, thus diluting your message. HTTP Pages and URL Parameters We also found HTTP pages and URL parameters in the index. URL parameters are parameters whose value is set dynamically in a page’s URL. For example, you may have a search bar on your website where customers can search your catalog. Whenever customers do an internal search, new URL parameters will be created, which ends up bloating the index with a bunch of URLs like: website.com/category/search?pink+scarf In order to make it clear to Google’s search algorithm what the different URL parameters do, specify them in Google Search Console. Cleaning Up Legacy Strategies Next, we looked at any technical issues caused by legacy strategies and began to clean them up. One example was that the site included meta keywords on its pages, which have been considered defunct since Google confirmed that these self-defined keywords hold no weight in its algorithm. Worse, competitors could look at your meta keywords and find a perfect list of the keywords and niches you’re targeting. We then looked at how the client’s CMS might be causing issues without them even knowing it. Managing Magento 2 Our client’s site is built on the popular Magento 2 e-commerce platform, which is notorious for not having a robots.txt and sitemap.xml configured out of the box. We created the sitemap ourselves using the Screaming Frog web crawler, added it to robots.txt, and submitted it to Google Search Console, helping Google’s search algorithm better understand the layout of our client’s site. Finally, we dealt with a considerable site-wide issue. The site used canonical tags that were meant to be self-referencing, but actually canonicalized to different pages. This is a problem because it confuses Google’s crawlers about which version of a page should rank, making it a mess when trying to rank. We cleaned it all up, so that Google knew exactly which pages should rank. (A minimal sketch of how you might spot such canonical mismatches on your own site follows below.)
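Tooling aside, here’s a minimal Python sketch of that kind of canonical check, assuming the pages are publicly fetchable; the example URL is a placeholder for your own crawl list, and the trailing-slash normalization is a deliberate simplification.

```python
# Minimal sketch: flag pages whose canonical tag does not point to the page itself.
# The URL list is a placeholder you would replace with your own crawl export.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def canonical_mismatches(urls):
    problems = []
    for url in urls:
        resp = requests.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, "html.parser")
        tag = soup.find("link", rel="canonical")
        if tag is None:
            problems.append((url, "missing canonical"))
            continue
        canonical = urljoin(url, tag.get("href", ""))
        # Normalize trailing slashes so /page and /page/ compare equal.
        if canonical.rstrip("/") != url.rstrip("/"):
            problems.append((url, f"canonicalized to {canonical}"))
    return problems

if __name__ == "__main__":
    for url, issue in canonical_mismatches(["https://example.com/category/chairs/"]):
        print(url, "->", issue)
```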
Step 3 – Internal Link Building Once you have done a technical audit, earned some quick wins and solved some user experience issues, start to think about improving the internal link structure. Adding Internal Links To Existing Content We quickly noticed that while the client did have a blog on their domain, there was very little content on it and much of it was out of date. Also, there weren’t many links between their blog and their category and product pages… a huge opportunity for spreading link juice and establishing topical relevance. Our plan was to create more high-quality blog content and expand its scope, allowing us to build more internal links to relevant product and category pages. We drew up a content strategy that involved producing a consistent number of new content pieces each month and went back through each old blog post, updating them with relevant links to product and category pages. We’ll get to the content plan in more detail later, but for now, let’s really dig into internal linking. E-commerce Topical Clustering Create “topical clusters”, which can be thought of as groups of pages that talk about different elements of the same key topic. For example, “protein powder” might be the topical cluster. It would be made up of a cornerstone article that you hope to rank for the keyword “protein powder”, as well as several other articles talking about sub-topics of “protein powder”. Some examples could be “How to Make Pancakes from Protein Powder”, “Can Protein Powder Help You Lose Weight?” or “10 Side Effects of Synthetic Protein”. You would then create a content piece for each of these sub-topics and have each link to the cornerstone article using anchor text close to “protein powder”. Using this technique, you’re able to pass value from the smaller articles to the main piece and have a better chance of ranking the main piece for “protein powder” in Google Search. From these cornerstone articles, we were then able to link back to category and product pages, increasing their perceived authority too. Step 4 – Content Strategy Before you can implement a solid external backlink building strategy, you need to create a bedrock of content to support your outreach. I suggest giving your writers the following guidelines for creating content. Evergreen, Algorithmically Optimized Content Focus on evergreen content, preferably creating linkable assets such as infographics, slideshows or documents containing industry insights. An example of an evergreen topic would be “why ergonomic chairs are good for your back”. Conversely, “the best chairs in [year]” would not be evergreen, as it will obviously lose its relevance at the end of the year. In the same line of thought, avoid using dates in the page title, headings or URL. Look at the people ranking on page 1. Ask yourself: How many words did they write? Find the average and add 20% more (a quick sketch of this calculation follows below). What sub-topics did they cover? When discussing “How to lose belly fat”, you’ll see that it’s necessary to talk about “avoiding trans fats”. Do the same. What kind of layout are they going for? Are they presenting in tables? Do the same. And don’t forget: write in an easy-to-read manner, free of grammar mistakes. E-A-T and E-commerce Content Create content that references your products and services so that you can funnel users to your…
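As a rough illustration of the word-count guideline above, here’s a tiny Python sketch; the page-1 counts are hypothetical numbers you’d replace with figures from your own SERP analysis.

```python
# Minimal sketch: estimate a target word count from the top-ranking pages
# (average of page 1, plus 20%), as suggested above. The counts below are
# placeholders, not real SERP data.
def target_word_count(competitor_word_counts, uplift=0.20):
    average = sum(competitor_word_counts) / len(competitor_word_counts)
    return round(average * (1 + uplift))

page_one_counts = [1850, 2200, 1400, 1975, 2600]  # hypothetical page-1 results
print(target_word_count(page_one_counts))  # -> 2406
```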

What Does it Take to Rank in the Most Difficult Affiliate SEO Niches? https://diggitymarketing.com/difficult-affiliate-niches/ Mon, 01 Jul 2019 08:47:21 +0000 http://diggitymarketing.com/?p=8296 Ever wonder what it’s like to compete in some of the most difficult affiliate SEO niches? As someone who has built and sold multiple affiliate websites for a healthy profit, I understand the ins and outs of this competitive space. To give you an inside look, I sat down with 6 of the world’s top affiliate SEO experts to discuss what it takes to rank in various competitive affiliate marketing niches, from CBD to Pharma. You’re about to learn about the potential earnings of affiliate marketers, and the budget, team requirements, content, and backlink plans involved in each niche. And if you’d like to learn which SEO niches are perfect for AI, watch this video. Quick Summary Pharma – James Upjohn, Serpify How much monthly recurring revenue can someone expect to make, ranking in the top Google spots in this lucrative niche? This really depends on the number of products and the route to market. Affiliate marketers will earn smaller commissions, and drop shippers significantly more. We’ve done SEO rankings for a pharma client that was having their own pills produced in India, and the profit was huge. Their revenue grew 4x within 3 months. What kind of budget does it take to compete in this niche? Surprisingly, not a huge one! So many SEOs that use PBNs are tackling local SEO keywords! There are probably 10 SEOs trying to rank-and-rent lead gen / local SEO for “plumbing Miami”, but hardly any PBN specialists ranking for terms like “Viagra”! How big of a team (if any) do you need, as an affiliate marketer, to tackle this niche? When we were ranking in this niche we were a team of two. We did the SEO to make the pharma affiliate sites look like legit local businesses – address, phone, map, schema etc. – and just a few PBNs were needed. Is there a specific content plan you’d recommend? We didn’t focus too much on supporting content. We probably had 2–3 blog posts with internal links to our product pages. What kind of backlinks would you recommend for this niche? PBNs, 100%. I had this exact conversation with a UK SEO living in Australia. I told them to charge $250 per month per keyword and ONLY send PBN links. They came back to me a month later and said they could not believe how much PBNs moved the needle in pharma. Is negative SEO prevalent in this niche? What kinds? How do you deal with it? Not so much negative SEO, but a LOT of hacked sites ranking – expect to go up against SEOs with hundreds of interlinked sites and tier 2/3 Xrumer-type links. You are competing against the blackest of black hat SEOs in this niche, but just do what you know works in local SEO (like these 9 local SEO solutions) and you’ll do perfectly fine. Any parting words for those considering getting into this niche? Be mindful of the laws. My client got shut down by the FTC and lost the whole business overnight. Some of the pharma products were banned and he shouldn’t have been selling them into the USA. Stick to legit products, and maybe partner with a reputable pharma company on well-known pharma offers. This is one of the best affiliate marketing niches to get into that still seems wide open. CBD – Vithurs (V for short), Rankfluence How much monthly recurring revenue can someone expect to make, ranking in the top Google spots in this niche?
There are a handful of affiliate sites in the CBD niche that are absolutely crushing it right now. CBD is definitely a very profitable niche, and search volumes continue to climb. Let’s talk numbers. If you were ranking in the #1 position for all the main search terms – especially the keywords with high buyer intent (“product + review” keywords for ALL major brands, “best” keywords, “where to buy” keywords, etc.) – there is no reason why you can’t do at least $1 million/month in commissions. The lack of current regulation makes it an enticing opportunity for those who want to explore the space (whether as an affiliate marketer or as a direct supplier of CBD products). Although there is currently somewhat of a monopoly in this industry (more on this later), there is still a viable opportunity for newer, smaller sites. If they do some solid keyword research and rank for low-competition long-tail keywords, they will be able to grab a fair share of search traffic. It’s one of those niches where there is plenty of money to be made, even if you’re not ranking at #1. What kind of budget does it take to compete in this niche? Your biggest expense in this niche will be link building (followed by content). Taking into account the average link velocity (the number of new links being built on a weekly/monthly basis) of the top sites in the niche and the type of links being built (regular link placements? guest posts? editorials? hacked links?), you should be able to roughly calculate how much a custom-fit link building campaign will cost you (a quick sketch of this math follows at the end of this interview). As a general guideline, you’d probably need several thousand ($x,xxx) per month (with at least a 9-month commitment) at the VERY minimum to compete in the CBD niche. The sites at the forefront of the industry are heavily investing in links, with some easily spending high 5 figures per month just to maintain their existing link velocity. This is the sort of budget you’re looking at if you want to make a real impact in this niche. How big of a team (if any) do you need to tackle this niche? If your SEO strategy is primarily based around link building (such as outreach), then having a large team can prove beneficial, as you can delegate individual tasks to certain members. On the other hand, you don’t really NEED an SEO team in order to tackle the CBD niche. Would it help? Probably. But with the SEO industry growing fast every year, there are thousands of choices on the market for link building services and quality vendors. In my case, I have a team of 3 who handle my outreach activities (they land anywhere from 50–150 links per month per site) and I outsource all my content. In some cases (such as when I need double the links but don’t have the immediate resources), I’ll also outsource the link building. Is there a specific content plan you’d recommend? Your SEO content strategy will depend entirely on what exactly your affiliate site is about. With my CBD affiliate sites, I try to review every single popular brand. Since the CBD industry is growing at a quick rate, you’ll always find new companies setting up shop, and most of them will have an affiliate program you can join. If you hit a roadblock, this is where competitor analysis comes in handy. More than likely, your competition is ranking better than you for a reason. Clearly they’re doing something right.
Why waste time trying to figure out what content will work when you already have the answers in front of you? Tools like Surfer and Ahrefs make spying on your competition’s SEO efforts easy. If you’re running out of ideas or need help identifying content gaps and topics, just audit your SEO competition and look for opportunities. What kind of backlinks would you recommend for this niche? You’re going to need the very best. The backlinks I’d recommend for this niche are guest posts, link placements (contextual), and PBN links. Some of you might have expected a huge list of backlink types, but simplicity is the ultimate sophistication. This applies to every niche: the backlinks you build should ALWAYS have some level of relevance. Acquiring CBD-relevant links isn’t easy, though. For example, when doing outreach, you’ll notice that some webmasters aren’t comfortable linking to CBD-related properties. This makes link building in this niche challenging, but you know what they say about challenges… Is negative SEO prevalent in this niche? What kinds? How do you deal with it? I believe there is an element of negative SEO and unethical behavior in every profitable niche. It’s only to be expected, since you’ll always have haters envious of your success. Any parting words for those considering getting into this niche? There is a lot of search volume and money to be made in the CBD niche – just like there is money to be made in payday, finance, adult, and casino. If you have the budget, mindset, and persistence, I’d suggest dipping your toes into this niche and giving it a go. If you’re considering entering this niche, I’d suggest properly doing your research and due diligence. I’m guilty of this myself, but it’s important not to let your excitement cloud your judgment. If an affiliate…
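A toy version of the budget math described above might look like the following Python sketch; both inputs are assumptions you’d replace with your own competitor research and vendor quotes – the interview doesn’t prescribe these numbers.

```python
# Minimal sketch of the budget estimate: competitors' link velocity times an
# assumed cost per placed link gives a rough monthly link building spend.
def monthly_link_budget(new_ref_domains_per_month, cost_per_link):
    return new_ref_domains_per_month * cost_per_link

competitor_velocity = 40  # hypothetical: new referring domains/month on page 1
assumed_cost = 250        # hypothetical blended cost per placed link (USD)
print(f"${monthly_link_budget(competitor_velocity, assumed_cost):,}/month")  # -> $10,000/month
```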

How to Prepare Your Website for a Google Algorithm Update [Case Study] https://diggitymarketing.com/algorithm-update-case-study/ https://diggitymarketing.com/algorithm-update-case-study/#comments Mon, 13 May 2019 09:39:30 +0000 http://diggitymarketing.com/?p=8013 I hope that you’ve never had to go through the pain of being hit by an algorithmic update. You wake up one morning, your traffic is decimated, and your rank tracker is littered with red arrows. Algorithmic penalties are not a subject I like to trivialize; that’s why the case study I’m about to share with you is different from most you’ve read before. This case study is a testament to the faith and hard work of my agency, The Search Initiative, in light of a huge shift in the SEO landscape. Unfortunately, with core algorithmic updates you can’t simply change a few things and expect an immediate ranking recovery. The best you can do is prepare for the next update round. If you’ve done all the right things, you can experience gains like you’ve never seen before. Even if you’ve never been hit with an algorithmic penalty, you should care about these updates. Doing the right things and staying one step ahead can put your site in position for huge gains during an algorithm rollout. So what are “the right things”? What do you need to do to your website to set it up for these types of ranking increases when the algorithms shift? This case study from my agency The Search Initiative will show you. The Challenge: “Medic Algorithm” Devaluation I want to start this case study by taking you back to its origins. There was a big algorithm update on the 1st of August 2018. A lot of SEOs called it the “Medic Update” because it targeted a huge chunk of sites related to health and medicine. https://www.seroundtable.com/google-medic-update-26177.html What Does an Algorithm Update Look Like? Let’s start with a few facts. Fact #1: Google is constantly running search experiments. To quote Google from their official mission page: “In 2018, we ran over 654,680 experiments, with trained external Search Raters and live tests, resulting in more than 3234 improvements to Search.” Here are the official numbers relating to the search experiments they ran that year: 595,429 Search quality tests – this is the number of tests they designed to run in the search engines. Some of them were only conceptual and were proven algorithmically ineffective, so they never made it to the next testing stages. 44,155 Side-by-side experiments – this is how many tests they ran through their Search Quality Raters. The SQR team looks at the search results of the old and new algorithms side by side. Their main job is to assess the quality of the results received, which, in turn, evaluates the algorithm change. Some changes are reverted at this stage; others make it through to the live traffic experiments. 15,096 Live traffic experiments – at this stage, Google releases the algorithm change to the public search results and assesses how the broader audience perceives it, most likely through A/B testing. Again, there will be some rollbacks, and the rest will stay in the algorithm. 3,234 Launches – all the changes that they rolled out. Fact #2: Google releases algorithm improvements every day and core updates several times a year! Bearing in mind everything said above, Google releases algo improvements basically every day.
Do the math… They’ve also confirmed that they roll out core quality updates several times per year. When you suspect something is going on, you can confirm it by simply jumping over to your favorite SERP sensor to check the commotion: https://www.semrush.com/sensor/ During this period, rankings typically fluctuate and eventually settle. A lot of SEOs (myself included) believe that during the heavy-fluctuation stage, Google is making adjustments to the changes they’ve just rolled out. It’s like cooking a soup: first you add all the ingredients, toss in some spices, and let it cook for some time. Then you taste it and add more salt, pepper or whatever else is needed to make it good. Finally, you settle on the taste you like. (I’ve never actually cooked soup other than ramen, so hopefully this analogy makes sense.) Fact #3: There will initially be more noise than signal. Once there is an algo update, especially an officially confirmed one, many budding SEOs will kick into overdrive writing blog posts with theories about what particular changes have been made. Honestly, it’s best to let things settle before theorizing. One strength we have as website owners is that there are lots of us – and the data collected by webmasters on forums and on Twitter is sometimes enough to give an indication of what changes you could possibly make to your sites. However, this is not usually the case, and when it is, it’s usually difficult to tell if what the webmasters are signaling is actually correct. Keep an eye on those you trust to give good advice. That said… at my agency, we always gather a lot of data and evidence first, before jumping to any conclusions… and you should do the same. Very shortly, we’ll be getting to that data. The Question: Algorithmic Penalty or Devaluation? When things go wrong for you during an algorithmic update, a lot of SEOs would call it an “algorithmic penalty”. At The Search Initiative, we DO NOT AGREE with this definition! In fact, what it really is, is a shift in what the search engine is doing at the core level. To put it in very simple terms: Algorithmic Penalty – invoked when you’ve been doing something against Google’s terms for quite some time, but it wasn’t enough to trigger it until now. It’s applied as a punishment. Algorithmic Devaluation – usually accompanies a quality update or a broad algorithm change. It works at the core level and can influence your rankings over a longer period of time. It’s applied as a result of a broader shift in quality assessment. Anyway, call it what you want – a core algo update hitting you means that Google has devalued your site in terms of quality factors. An algorithmic shift affecting your site should not be called a penalty. It should be viewed as a devaluation. You were not targeted; rather, a bunch of factors changed, and every site not in compliance with these new factors will be devalued in the same way. The good thing about all this: once you identify those factors and take action on them, you’ll be in a great position to actually benefit from the next update. How to Know You’ve Been Hit by an Algo Update? In some cases, a sudden drop in traffic will make things obvious, such as with the particular site I’d like to look at more specifically. But we’ll get to that in a second.
Generally speaking, if your traffic plummets from one day to the next, you should look at algorithm monitoring tools (like the ones below), and check Facebook groups and Twitter. (A quick way to flag such a drop programmatically is sketched at the end of this section.) Google Algorithm Change Monitors: https://www.semrush.com/sensor/ https://moz.com/mozcast/ https://algoroo.com/ https://www.rankranger.com/google-algorithm-updates https://cognitiveseo.com/signals/ Useful Facebook Groups: The Lab, Ahrefs Insider, Inside Search. Useful Twitter Accounts to Follow: Cyrus Shepard, Glenn Gabe, Marie Haynes. The Patient: Our Client’s Site The client came on board as a reaction to how they were affected by the August update. They joined TSI towards the end of October. This was the ‘August 2018 Update’ we were talking about – and still no one is 100% certain of its specifics. However, we have some strong observations. 😉 Type of Site and Niche Now, let’s meet our patient. The website is an authority-sized affiliate site with around 700 pages indexed. Its niche is health, diet and weight loss supplements. The Symptoms As the industry was still bickering, there were no obvious ‘quick fixes’ to this problem. In truth, there will likely never again be any ‘quick fixes’ for broad algo updates. All we had to work with was this: the number of users visiting the site dropped by 45% in July–August. Looking at October, when we were running all our analyses and creating the action plan, the organic traffic looked even more pessimistic. With the niche, site and timeline evidence, we could easily conclude what follows: a 100% match with the “Medic” update. How We Recovered it – What are the “right things”? To contextualize our decision making on this project, this is a rundown of what we know and what we knew then: What we knew then It seemed as though many of the affected sites were in the health and medical niches (hence, the “Medic” update). Sites across the web had experienced a severe downturn in rankings. Rankings were affected from page one down. (This was surprising – most previous updates had less of an impact on page 1.) A lot of big sites with enormous authority and very high-quality content were also devalued. We speculated that this might suggest a mistake on Google’s part… What we know now ‘The August Update’ affected…
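As a rough first tripwire for that kind of overnight drop, here’s a minimal Python sketch; the CSV layout, column names and 40% threshold are all assumptions, and a flagged day is a prompt to check the monitors above, not a diagnosis.

```python
# Minimal sketch: flag days where organic sessions fall far below the prior
# 7-day average, from a daily sessions export (e.g. a Google Analytics CSV).
import pandas as pd

def flag_drops(csv_path, threshold=0.40):
    df = pd.read_csv(csv_path, parse_dates=["date"]).sort_values("date")
    baseline = df["sessions"].rolling(window=7).mean().shift(1)  # prior 7-day mean
    df["drop_vs_baseline"] = 1 - df["sessions"] / baseline
    return df[df["drop_vs_baseline"] > threshold][["date", "sessions", "drop_vs_baseline"]]

# Usage: print(flag_drops("organic_sessions.csv"))
```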

Case Study: A 4.5x Organic Traffic Increase Using (What?) Page Rank https://diggitymarketing.com/page-rank-case-study/ https://diggitymarketing.com/page-rank-case-study/#comments Mon, 04 Mar 2019 09:55:34 +0000 http://diggitymarketing.com/?p=7421 Introduction I’ve been a director at The Search Initiative for a while now. We’ve had some crazy results for a whole bunch of clients – all in very different niches. I’m going to share with you how my team and I took a low-authority website and put our foot on the gas to get it moving – fast! I truly believe there was a key ingredient that accelerated this website, but I’m going to share the whole process from the start. Why? Every website is different, so you need to figure out what your site needs. You need to go through the process. Here’s a sneak peek of the growth. Now learn how we did it… Initial Analysis Since starting TSI’s organic SEO services, I’ve realized that working with my own sites is hugely different from working with clients, especially if the website has weak foundations. I know how I want my money sites to look, so I build them with rigorous attention to detail. But if you take a website that’s been developed without a certain level of SEO knowledge, there’s normally quite a lot of on-site and off-site work to fix. Here’s how my team broke down the initial analysis: Keyword Research, On-Site Audit, Backlink Audit, Competitor Analysis. Keyword Research My team tackled keyword research with two main workflows: one is used to monitor the health of a website, and the other is for content gap analysis. When we’re looking to track keywords for a website, we want to track some of the core terms, but also terms that are having problems. If a term is suffering from keyword cannibalization that we’re trying to fix, it’s worth tracking daily until it’s resolved. Since this client needed a huge content strategy, we did both a health check and an initial content gap analysis. This approach included breaking down all keywords for the industry into topics of relevant terms. In total, this process took over 20 hours and included thousands of keywords chunked into neat topics. This work later helped with choosing page titles, headings and content. Here’s an example of how we did it: Step 1. Search Broad Keywords Step 2. Review Parent Topics Step 3. Find Competitors for Parent Topics Step 4. Reverse Engineer Competitors’ Keywords Step 5. Exclude Outdated Keywords There is also the option to export all of these keywords into Excel documents and filter them that way. But most of the time, a lot of the top keywords are fairly similar. Here’s an example for the “best dog food” term: best dog food, best dog foods, healthiest dog food, what is the best dog food, top rated dog food, best food for dogs. While each keyword is unique, they all follow a single intent: the users are interested in finding out what the best dog foods on the market are. (A rough way to group such variants programmatically is sketched below.)
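Here’s a minimal Python sketch of grouping such variants by normalized token sets; real intent grouping usually leans on SERP overlap, so treat this lexical version as a rough stand-in.

```python
# Minimal sketch: group keyword variants that share one intent, like the
# "best dog food" list above, by comparing normalized token sets.
def normalize(keyword):
    stop = {"what", "is", "the", "for", "a"}
    # Crude singularization: strip a trailing "s" (good enough for a sketch).
    tokens = {t.rstrip("s") for t in keyword.lower().split() if t not in stop}
    return frozenset(tokens)

keywords = ["best dog food", "best dog foods", "what is the best dog food",
            "healthiest dog food", "top rated dog food", "best food for dogs"]

groups = {}
for kw in keywords:
    groups.setdefault(normalize(kw), []).append(kw)

for variants in groups.values():
    print(variants)
# Four of the six variants collapse into one "best dog food" intent group.
```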
On-Site Audit Finding all the technical and content issues with a website requires a full on-site audit. However, while big reports are easy on the eyes, it’s the small changes that make the difference. We audited the website and found a whole bunch of technical issues, from missing breadcrumbs and poor internal link structures to bad-quality anchor text and unoptimized titles. A full on-site audit tutorial is too big for this post (perhaps coming soon), but here are some quick tips: Screaming Frog – A cheap way to regularly crawl your website. There are lots of ways to find errors, redirects, and missing metadata. You can also use a custom search to find all references to your keywords. Sitebulb – This tool is more expensive and charges a monthly recurring fee. However, it gives you lots of extra data that would be impossible to spot manually and hard to get with Screaming Frog. An example would be empty hyperlink references. Site Search – By using Google’s site search (site:domain.com) and operators, you can find hundreds of issues with index management, outdated page titles, and multiple pages targeting the same keyword. There are a lot of quick wins here. Page Titles – If you wrote your page titles 1–2 years ago, you may find that they’re outdated now. A quick site search with “intitle:2018” will find all your content that is either not updated or not yet crawled by Google. Internal Links – A major way to pass relevance signals and authority to your core pages is through internal links. Make sure that your pages are well interlinked and that you’re not using low-quality anchors from your power pages, such as “click here” or “more information”. We focused on fixing around 5 issues at a time, varying from small changes like improving accessibility to bigger changes like introducing breadcrumbs for a custom-built website. Backlink Audit The website had a relatively small backlink profile, which meant it lacked authority, relevance signals and entry points for crawling. It also meant that a full in-depth link analysis was unnecessary for this campaign. In this instance, the initial check revealed there was nothing to be concerned about, so we moved on to technical implementation as soon as possible. Had the website experienced problems with its link profile, we would have done a full backlink audit to try and recover it. Here’s what to look out for: Link Distribution – Pointing too many links toward internal pages instead of your homepage can cause lots of issues, so make sure that you’re not overdoing it. Anchor Text Analysis – Exact match, partial match and topical anchors are a great way to pass third-party relevance signals. Too many and you’ll be caught out over-optimizing, but too few and you won’t be competitive. Read more about anchor optimization (and see the ratio sketch at the end of this audit section). Referring IP Analysis – There are a finite number of IPv4 addresses, so this isn’t often a big cause for concern. However, it’s worth making sure that you haven’t got too many links from the same IP address. Autonomous System Numbers – Since a server can be assigned any number of IP addresses, these systems often include an ASN. This is another way that Google could flag large numbers of websites from the same origin. My team did a case study on how to remove an algorithmic penalty; a lot of these audits come included in any penalty removal campaign. Competitor Analysis The difference between a search analyst and a data scientist is how you approach the search engines. An analyst is focused on reviewing the SERPs and finding what works best today, while a data scientist wants to understand how things work. We built our team to include both, since competitor analysis requires a keen eye for reviewing the SERPs and algorithm analysis requires solid data scientists. If you want to do SEO at a high level, you’ve got to constantly review competitors using various analysis tools. You will notice that tons of best practices get ignored in the top positions; the devil is in the details. In this instance, we found that both more content and more links would be required for long-term success.
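For the anchor text analysis above, a minimal sketch might bucket anchors and report ratios like this; the buckets, brand, keyword and example anchors are all placeholders, and no target percentages are implied.

```python
# Minimal sketch: classify backlink anchors as exact/partial/branded/generic
# and print the ratios. Real anchors would come from an Ahrefs-style export.
from collections import Counter

def classify(anchor, target_kw="dog food", brand="pupchow"):
    a = anchor.lower()
    if a == target_kw:
        return "exact"
    if target_kw in a:
        return "partial"
    if brand in a:
        return "branded"
    return "generic/other"

anchors = ["dog food", "best dog food brands", "PupChow", "click here",
           "pupchow.com", "this guide", "dog food"]
counts = Counter(classify(a) for a in anchors)
total = sum(counts.values())
for bucket, n in counts.items():
    print(f"{bucket}: {n/total:.0%}")
```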
Content Strategy Building any long-term authority website in a competitive industry requires both an authoritative link profile and a content plan. My team reviewed their existing content, looked at how other websites in their industry were helping users, and then addressed these four cornerstones: User Intent – before we did anything, we wanted to nail the user intent on every page. This research meant that we identified three pillars of content for their site. We’ll get into this in further detail below. Service Pages – these pages were dedicated to explaining what service was offered, how to get in contact, and what was included with that offering. Blog Content – these posts were dedicated to providing non-commercial, informative content that was interesting to the reader. Resource Center – this section was dedicated to giving basic information about topics in their industry. Instead of pointing all our authority-content links at Wikipedia, we wanted to use internal links. Here’s a little bit about each section and our strategy for them: User Intent The biggest mistake I see all the time is the simplest thing to check: what types of content is Google ranking in the top 10 positions? If you’re serving 10,000 words of content in a huge blog post, but Google is only interested in serving service pages with 50 words of content, you’ve missed the point. Another common problem we find at The Search Initiative is including too much content in a single post when your competitors have several shorter posts. One of the main attractions in Thailand is yoga retreats. If you’re searching for this (yoga retreats) in America, you’re expecting to find destinations. Let’s take a look: the first position is a site called Yoga Journal, and it includes almost no content aside from images and headings. That’s exactly what the users were looking for. There are other websites offering a similar service that can help you make bookings. While others…

How to Build a Scalable White Hat Agency [Case Study from TSI] https://diggitymarketing.com/scalable-white-hat/ https://diggitymarketing.com/scalable-white-hat/#comments Wed, 14 Nov 2018 08:28:58 +0000 http://diggitymarketing.com/?p=6654 Introduction A few years ago, my agency The Search Initiative (TSI) decided to remove PBNs as part of our link building strategy and move over completely to a white hat link building model. While PBNs only made up a small portion of our strategy, we knew this would be a fairly big undertaking. It would involve redesigning the overall process and our approach to campaigns. Every decision we make at the agency is data-driven. We constantly test, experiment and analyze. Our aim is to always put our clients first and give them the best ROI for their campaign. We needed a sustainable and powerful link building strategy that is not only extremely safe but also moves the needle. In this article, you’ll learn our entire outreach strategy. We’ll discuss the role content quality plays in an effective outreach campaign, how to research and create strong content pieces, how to prospect link opportunities, and how to pitch with a high success rate. Bonus: I’ve brought in Rad Paluszak (TSI Director and Chiang Mai SEO Conference 2018 speaker) to chime in with some knowledge bombs. Overall Goals and Strategy During our early brainstorming sessions, we knew that to fully transition from grey to white we needed to base the entire system on high-quality content. Why is high-quality content so important for outreach? Bloggers, website owners and journalists wised up a long time ago to the importance of links in SEO. They fully understand how important links are to us and the value they hold. Many prospects are extremely proud of their websites and extremely picky about what content is placed on their own personal soapbox. Just look at how few guest posts I’ve allowed on my site. Outreach link building is an exchange. You provide something that the website’s readers will love and that adds true value to the website; in exchange, you attain that all-important link. My advice: create a systemized approach that focuses on quality. Focus on: Content that is easy to understand Content that teaches Content that is easy to share Content that impresses But how do you create a content marketing machine that can produce this kind of content? We assembled a team who had zero SEO knowledge but were bright and creative – the main skills we were looking for. You want your content and outreach team members to be right-brained, creative types. I’m a left-brained robot, so I’m much better at stuff like link prospecting, which you’ll learn more about later. If you choose to train a new outreach team from scratch, you avoid the shortcut mentality that many trained SEOs have. Think about it: you have your biases about how things work, and you’re likely going to carry them from project to project and job to job. Essentially, you want a team of content marketers, not SEOs. This is easier to do when working with a blank slate. One of the key skills we were looking for was an eye for copy. Thus, we put out job adverts in copywriting groups, looking for people who had a background in copywriting but were interested in being trained up in online marketing. Cult of Copy is a great Facebook group to find niche-specific copywriters. Tip for you: The group has very strict rules from the very beginning.
To join the group you have to confirm that you understand the following: Only posts requesting or offering a copywriting job are allowed. Only comments accepting a copy job or giving a testimonial are allowed. No public complaints or moaning – these kinds of things should go through the admins. Fail to adhere to the above and you’re banned! It’s good to keep everything in order! Do you think it impacts the group’s activity? Not at all! – Rad Paluszak, Director of SEO at The Search Initiative Content Research There are endless tools and processes for content research available. To cut through the bullshit, keep things as simple as possible, stick to the 80/20, and focus your efforts on a few core resources: Reddit, Quora, MetaFilter, and Google Search. The reason this small set of tools is recommended is that communities like Reddit, Quora and MetaFilter have done a great job of keeping spam to a minimum while encouraging active participation. You can, without too much effort, find the industry pain points, the subjects people care about, and the themes that recur time and time again. For a content marketer, this is phenomenal. Reddit If you are not using Reddit as part of your content research, then you are seriously missing out on an awesome trick. The really great thing about Reddit is that spam and manipulation are very rarely tolerated. And the users are usually painfully honest, to say the least. Search for variations of topics to find out what questions people are asking and which specific questions receive lots of upvotes and replies. The key here is to create a documented record of the topics that had the most engagement, as this is a great indicator that the content is popular. At this early stage, you should aim to collect as much information as possible on anything and everything in your target niche that has a high level of upvotes and comments. To do this, simply head to Reddit and search for your market-defining keywords in the search bar. Next, select ‘Communities and users’ to see which subreddits are in your niche. Look for the subreddits with the highest number of subscribers and ensure that they are active communities. Sort through the subreddits one by one by setting the results to Sort > Top > Of All Time, to get the pieces of content and content themes that have had the biggest impact (a small script that pulls the same view is sketched below). Record all the data in a spreadsheet – you can create a copy of the spreadsheet or have it already integrated. Tip for you: If you’re not familiar with Google Docs and would prefer to use Microsoft Excel or LibreOffice Calc, you can download a copy of the shared spreadsheet. It’s also very easy to create your own copy of the spreadsheet on your Google Drive – as mentioned above, please use this method instead of requesting access to our file. – Rad Paluszak, Director of SEO at The Search Initiative
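A small script can pull the same “top of all time” view through Reddit’s public JSON endpoint, as sketched below; this assumes the endpoint remains open to scripted requests (it is rate-limited, hence the descriptive User-Agent), and the subreddit name is just an example.

```python
# Minimal sketch: fetch a subreddit's all-time top posts, mirroring the manual
# Sort > Top > Of All Time step described above.
import requests

def top_posts(subreddit, limit=10):
    url = f"https://www.reddit.com/r/{subreddit}/top.json"
    resp = requests.get(url, params={"t": "all", "limit": limit},
                        headers={"User-Agent": "content-research-sketch/0.1"},
                        timeout=10)
    resp.raise_for_status()
    for child in resp.json()["data"]["children"]:
        post = child["data"]
        # Upvotes and comment counts are the engagement signals to record.
        print(post["score"], post["num_comments"], post["title"])

top_posts("bigseo")  # example subreddit
```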
Quora and MetaFilter Quora and MetaFilter are both heavily used question-and-answer websites. They allow you to search for the themes and mini-topics within your niche and look for the specific questions people are struggling with. This is very important, as you’ll see later. It also allows you to look for further content ideas within the answers that are provided. Again, we are looking for questions that have received a large amount of engagement. In the example from Quora above, we can see that both questions have a high answer rate (133 and 99 answers). We collect all the relevant questions in a spreadsheet, which includes: the question, the search used, how many answers the question has, and any relevant comments. Tip for you: If you’re only looking for a bunch of questions that people ask around your niche, my personal favorite is AnswerThePublic.com. Type in ‘link building’ and allow it to do its magic: you can easily find a lot of the common questions users are searching for. It’s great inspiration for writing useful answers and a way to potentially attack those sweet “Position 0” answer boxes. – Rad Paluszak, Director of SEO at The Search Initiative Google Search Look at high-authority sites to reverse engineer successful campaigns. This allows you to hedge your bets with outreach, because you’re following in the footsteps of someone else. Let’s see what kind of SEO-related infographics the Huffington Post is sharing. A simple search string returns 88 results (a sketch for composing such search strings follows below). If we take one of those results, we can see that this infographic campaign has earned links from 24 referring domains. These can simply be exported and saved, ready for quality-control checks and outreach. As these infographics have been featured on a site such as The Huffington Post, it is likely that they have also been featured on other authority sites. From this research, we can see what kinds of themes and topics get featured on high-authority sites. Record everything and add it to the research spreadsheet. Content Strategy Our content strategy was based on a few specific content types: High-quality guest posts Infographics Embeddable interactive content Creative content pieces Guest Posts Guest posts are easy to scale and are a great way to ensure that you have a baseline of high-quality links. There are ways to do them right and ways to do them wrong, so make sure you know the difference or get them from a place you can trust. Infographics Infographics have been done to…
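Composing those operator strings can be scripted trivially; in the sketch below, the site and topics are placeholders (the article doesn’t disclose its exact search string) and the output is meant to be pasted into Google by hand.

```python
# Minimal sketch: build Google search-operator strings for prospecting, like
# the Huffington Post infographic example above.
def prospecting_queries(site, topics, content_type="infographic"):
    return [f'site:{site} {content_type} "{topic}"' for topic in topics]

for q in prospecting_queries("huffpost.com", ["SEO", "link building"]):
    print(q)
# -> site:huffpost.com infographic "SEO"
# -> site:huffpost.com infographic "link building"
```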

How to 91x Website Traffic – A Case Study Blueprint for 2024 https://diggitymarketing.com/algorithmic-penalty-recovery-case-study/ https://diggitymarketing.com/algorithmic-penalty-recovery-case-study/#comments Mon, 06 Aug 2018 08:38:05 +0000 http://diggitymarketing.com/?p=5059 Every once in a while you run an SEO campaign that changes the way you do everything. The lessons you learn, the challenges you face, and the results you achieve inspire you to rewrite your whole SEO gameplan. This is the story of one of those SEO campaigns. As you might already know, I’m a director of a very talented SEO agency called The Search Initiative (TSI). Since coming on, we’ve encountered many wins, and this case study is one of them. In a few months, we lifted the client’s algorithmic penalty and increased traffic by 9,109%. You’re about to learn the exact steps we took to achieve this. You’ll learn: A detailed onsite, offsite, and technical SEO audit process How to repair algorithmic penalty problems A safe link building strategy Conversion rate optimization strategies for fast growth Fair warning: the strategies detailed ahead are intense, but worth it. Here’s the success one reader found after following this case study: Case Study: From 1,036 to 95,411 Organic Visitors Per Month This is the story of a campaign for a social media marketing website. Our client monetizes their website by selling monthly subscriptions for achieving better social proof on Facebook, Instagram, and other social networks. If you’ve ever been in this niche before, you’d know it’s not an easy one. It’s one of the hardest niches there is. The Challenge The client joined The Search Initiative with a heavy algorithmic penalty. Traffic had decreased significantly, to almost 1/10th of its previous volume. If you’ve ever had an algorithmic penalty before, you can directly connect with the frustration and annoyance of such a disaster. The main challenge was to determine what type of penalty hit the site and to take action on getting it lifted. General Approach We started by thoroughly analyzing the data, based on the tools available to us and the details provided by the client. The initial analysis included looking into: Google Analytics Google Search Console Keyword tracker (Agency Analytics) SEMrush Ahrefs Cloudflare Server settings Previous link building reports and audits Once we determined the most probable cause of the penalty, we put together a plan of action. We created a comprehensive onsite, offsite and technical audit before building overall domain authority through our own link building strategies and traditional outreach to relevant blogs and sites. How We Did It The Dynamic Start: Backlink Review The domain’s link profile included a lot of spammy, low-value domains. Since a previous automated backlink audit (most probably done using Link Research Tools) had been performed before the client joined our agency, we started by reviewing its results. At TSI we know that when it comes to potential link penalties, especially algorithmic ones, we have to be very thorough with the link reviews. To start the analysis, we downloaded all the link data from the following sources: Google Search Console – it’s a real no-brainer to include all the links that Google definitely has in its database. However, according to this Google Webmaster Help page, you have to remember that GSC presents only a sample of links, not all of them. Ahrefs – our go-to, and the best 3rd party tool when it comes to links.
Their database is an absolute beast and the freshness of the data is outstanding. To gather all the link data, go to Ahrefs, type in your domain and select Backlinks. Now you’re good to export it to an Excel file. By the way, make sure you select the Full Export option; otherwise, you’ll be exporting only the first 1,000 rows with the Quick Export. Majestic – even though their crawler might not be as complete as Ahrefs’, you still want as many link sources as possible for your audit. With Majestic, you’ll have to type in your domain → select “Root Domain” → Export Data. Because of link memory (AKA ghost links – links that are deleted, but that Google still “remembers”), we export the data from both the Fresh and Historic indexes. Also, be sure to set the tool to “Show deleted backlinks”. Moz and SEMrush – similarly to Majestic, with these two we just want to gather as many links as possible to complement the database, in case Ahrefs missed some. How to get link data from Moz Open Site Explorer: Your site → Inbound Links → Link State: All links → Export CSV. How to get link data from SEMrush: Your Site → Backlink Analytics → Backlinks → Export. Make sure to select the “All links” option. We had all the data now, so it was time to clean it up a bit. There’s no real secret to using Excel or Google Sheets, so I’ll just list what you’ll have to do with all the link data prior to analyzing it: Dump all Ahrefs data into a spreadsheet. (If you’re wondering why we start with Ahrefs, it’s explained in step 4.) Add unique links from GSC into the same spreadsheet. Add unique links from all other sources to the same spreadsheet. Get Ahrefs UR/DR and Traffic metrics for all the links (the Ahrefs data will already have these metrics, so you’re saving time and Ahrefs credits). Spreadsheet ready! With the spreadsheet, we started the very laborious process of reviewing all the links. We classify them into 3 categories: Safe – these are good quality links. Neutral – these are links that are somewhat suspicious and that Google might not like that much, although they’re quite unlikely to be flagged as harmful. We always highlight these in case we have to re-run the link audit (for example, if the penalty does not get lifted). Toxic – all the spammy and harmful stuff you’d rather stay away from. Some of the main criteria we’re always checking: Does it look spammy/dodgy AF? Does it link out to many sites? Does the content make sense? What is the link type (e.g. comment spam or sitewide sidebar links would be marked as toxic)? Is the link relevant to your site? Is the link visible? Does the page have any traffic or rank for any keywords? Ahrefs’ data helps here. Is the page/site authoritative? Ahrefs’ DR helps here. What’s the anchor text? If you have an unnatural ratio, then it might be necessary to disavow some links with targeted anchor texts. Is the link follow or nofollow? No point disavowing nofollow links, right? Is it a legit link or one of those scraping/statistics tools? Is it a link from a porn site? These are only desirable in specific cases – for example, if you’re a porn site. Otherwise, it’s disavow time. If it’s likely that the whole domain is spammy, we disavow the entire domain using the “domain:” directive instead of just a single URL. Here’s a sneak peek at how the audit document looked once we finished reviewing all the links. Then we compared the results of our audit against the current disavow file and uploaded a shiny new one to Google Search Console. We disavowed 123 domains and 69 URLs. (A small sketch of this merge-and-disavow workflow follows below.)
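A minimal sketch of that merge-and-disavow workflow might look like the following; the export filenames and the shared column name are assumptions, and the toxic list stands in for the manual Safe/Neutral/Toxic review described above – only the “domain:” line format follows Google’s disavow file convention.

```python
# Minimal sketch: combine link exports, dedupe by URL, and write domains
# judged toxic (by hand, per the criteria above) into a disavow file.
import pandas as pd
from urllib.parse import urlparse

exports = ["ahrefs.csv", "gsc.csv", "majestic.csv", "moz.csv", "semrush.csv"]
links = pd.concat(
    [pd.read_csv(f)[["source_url"]] for f in exports]  # assumed common column
).drop_duplicates("source_url")
links["domain"] = links["source_url"].map(lambda u: urlparse(u).netloc)

toxic_domains = {"spammy-example.xyz", "linkfarm-example.net"}  # from manual review

with open("disavow.txt", "w") as f:
    f.write("# Disavow file generated from manual link audit\n")
    for d in sorted(toxic_domains):
        f.write(f"domain:{d}\n")
```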
We also used our in-house, proprietary tool to speed up the indexing of all the disavowed links – something quite similar to Link Detox Boost, but done through our own tool. Crucial Stage 2: The Onsite Audit The next step was a full, comprehensive onsite audit. We reviewed the site and created an in-depth 30-page document addressing many onsite issues. Below is a list of elements covered in the audit: Technical SEO Website Penalties First, we confirmed what the client had told us and established what kind of penalty we were dealing with. It has to be emphasized that there were no manual actions reported in GSC, so we were dealing with a potential algorithmic penalty. We searched Google for the brand name and did a “site:” operator search. If you were once able to find your brand name ranking number 1 in Google (or at least among your other profiles, e.g. social media accounts, on the first page) and it’s no longer there, you know you’re in trouble. Basically, if Google devalues or de-ranks you for your own brand, that is a very strong indicator that you’ve been hit with a penalty. With the site: operator search it’s a bit more tricky. However, as a rule of thumb, you could expect your homepage to show as the first result returned for a simple “site:domain.com” query in Google. Another way of confirming content devaluation is to copy and search for a fragment of the text on your core pages. In the example below, I do a Google search for 2 sentences from one of my articles (right-click to bring up a search of the text you highlight). As you can see, Google finds it on my page and shows it as the first result. If that were not the case and Google did not show my page first, or at all, it would be a very strong indication that the article page or…

The Story of 10Beasts.com – An Uncensored Interview with Luqman Khan https://diggitymarketing.com/interview-with-luqman-khan-of-10beasts/ https://diggitymarketing.com/interview-with-luqman-khan-of-10beasts/#comments Mon, 08 Jan 2018 08:00:29 +0000 http://diggitymarketing.com/?p=4032 About a year ago, I was introduced to a site called 10beasts.com. At the time, it was an 8-page affiliate website in the technology niche that quickly busted out of the sandbox into six-figure profitability within 8 months. In December 2016, it broke $80k. This site became incredibly popular when it was featured on Glen Allsopp’s Gaps.com. I mean, how often does someone go public with an affiliate website of this level? Fast forward one year… 10Beasts grew in size and earnings and was flipped for over half a million dollars. And then the unspeakable happened: it got penalized with an unnatural links manual action in Google Search Console. And guess what? The penalty was lifted in 5 days. Meet Luqman Khan. Luqman is the creator, builder, and recoverer of 10beasts. In this no-holds-barred interview, Luqman discusses the entire story of 10beasts: how he got it ranked, how he sold it, and how he recovered it. In this interview we get into: The story of Luqman Keyword research Content planning Onsite optimization Backlink strategy Social signals The huge flip for $500k+ …and the miraculous 5-day recovery Resources: Tools: Keyword Finder (keyword research), CrazyEgg (heatmaps), Monitor Backlinks. Services: AddMeFast (social signals), Upwork, Fiverr (character images), AllTop.com, Empire Flippers. Guides: An SEO’s Guide to Flipping. Blogs: NichePie, Backlinko, Cloud Living, Gaps.com, NeilPatel.com. Transcript Matt: Hey, Luqman. How’s it going, man? Thanks so much for coming on. Luqman: Hey, nice to meet you, Matt. You’re absolutely welcome and thanks for inviting me for this interview. Matt: Absolutely. For the people that are watching that don’t know who you are, can you give us a brief introduction – what’s your name, how old are you, where are you from? Luqman: Well, my full name is Mohammed Luqman Khan and I’m from Lahore, Pakistan. Lahore is actually the second biggest city of Pakistan. I was born here and I’ve lived here since I was born, and I have been to England, Turkey, Egypt, and a few other countries. For now, I am living in England as a computer science student at the University of Manchester. Matt: Oh, great, awesome, so you’re well-traveled. Tell me a little bit more about the home city you grew up in. Lahore, right? Luqman: Yeah. Matt: Is that a city where people are doing what you’re doing, working online? What’s it like where you’re from? Luqman: Well, Pakistan is actually the second biggest country for freelance work – freelance jobs are very common here in Pakistan. And what I’m doing here, a lot of people are doing. Actually, I’m inspired by a Pakistani guy called Salman Baig. He’s from another city called Peshawar, on the north side of Pakistan. So, yeah, that’s all. Matt: Yeah, cool. And what do your parents think about what you’re doing? Luqman: Well, my parents really don’t like what I’m doing. They want me to really work. They ask me what I do and I really try to explain it to them. They don’t know what you can do on the internet.
They want me to get a physical job; they want to see me doing something, because people think that I’m just a lazy guy who’s sitting at home all the time doing nothing. But yeah, that’s actually what’s going on here. Matt: It’s not one of the familiar ones – lawyer, doctor – so it’s garbage. I get that too. So tell me a bit about your background. You said you were getting a degree in computer science, right? Luqman: Yeah. Matt: Okay. And are you working on your bachelor’s or master’s? Luqman: I’m doing my bachelor’s yet. Matt: Okay. How are you doing there? Luqman: It’s not that great. It has nothing to do with my career, so I’m actually just doing it to get a degree to satisfy my parents, that’s all. Matt: I hope your parents don’t watch this, and if they do, I apologize for instigating this guy. Okay, cool. Have you ever had a jobby job? Have you ever worked for someone else? Luqman: I had a job in a call center. It was in the sales department for some kind of product – I think a security installation product in Canada, but the call center was here in Pakistan. Matt: That sounds fun. Luqman: Yeah. I had it back in 2011 maybe. I don’t really remember. Matt: Okay, all right. Luqman: I only worked for like one month. My back was already completely trashed from sitting on a chair for eight hours continuously. Matt: Yeah, I can agree. Luqman: It’s like… Matt: Mm-hmm (affirmative). And so, when did you get into SEO and how did that happen? Luqman: When I was a freelancer… I figured out online earning from an ad. It was a PTC website. I think smartphones were newly introduced back then and I was looking for a smartphone on GSMArena.com, and there was an ad – “earn by clicking” – it was a PTC website. I hope you know about PTC websites. Matt: Mm-hmm (affirmative). Luqman: There you click on an ad and you get a few cents, and things like that. And that was a scam website, but I ended up with the idea that online earning is quite a possible thing. So I started researching. I learned HTML, CSS, and WordPress. By the passage of time, I started to work on upwork.com and fiverr.com. And I had a project on Fiverr – a client who had a website, I think an Amazon Associates website. That’s how I found out about Amazon Associates, and in time I found out about search engine marketing and how you can get visitors to your website. That’s how I ended up on backlinko.com and cloudliving.com. And I saw that guy, Salman Baig, whom I told you about earlier, from Peshawar. He was posting things on his Facebook wall, so it was good. Matt: Okay, so you were doing some online freelance work. You started working for a website. And you’re like, “Okay, if he’s paying me this much, how much is he making?” Luqman: Yeah. Matt: Then you went down the rabbit hole, I’m guessing. Luqman: Exactly. Matt: And where have you learned from in the meantime? Do you read blogs? Luqman: Yeah, the main learning source is backlinko.com for branding. And a few other Facebook pages – Facebook groups, sorry – and Neil Patel. You know, neilpatel.com, and Quick Sprout also.
These famous blogs are really helpful.
Matt: Awesome. And how long ago was this, when you first started getting into SEO?
Luqman: I think in 2013 or '14. Maybe… I don't really remember.
Matt: So, at most, like four, four-and-a-half years ago.
Luqman: Yes.
Matt: And I would definitely say you classify as what I would call a very successful SEO. I'd say you're probably in the 1%, considering what you've done with 10Beasts. How does that sound to you?
Luqman: Oh, thank you.
Matt: Like, how does that make you feel?
Luqman: That sounds great. That sounds really great, man.
Matt: I'm not just saying that because it's coming from me… but you were not an SEO four years ago, and now I'd say you're in the 1%. You're awesome.
Luqman: I really do feel awesome, actually.
Matt: That's good, that's good. You deserve it. You did a lot of hard work, and I'm excited to talk about that site, but not quite yet. On the way to where you are now, did you ever face any setbacks or big issues, roadblocks that got in your way?
Luqman: The biggest issue I faced was dropping out of college in November 2015. I had a problem with my ex's boyfriend, and it turned into a really rough fight, a physical fight.
Matt: Okay.
Luqman: That guy actually brought a few guys from outside the college to beat me up, people who weren't students. So the students of the college found out that outsiders had come to beat up one of their own, and the fight turned into a big scenario: there were more than 50 to 60 students fighting on the hockey ground.
Matt: Oh, my goodness.
Luqman: And it really turned bad. They suspended more than 16 students, including me and that other guy. I was suspended for five years. I cannot [inaudible 00:07:47].
Matt: Wow. So that probably affected not just your school life; it probably affected every aspect of your life, including the relationship with your parents.
Luqman: Yeah, exactly. The relationship with my parents, my family, my teachers. It was really bad.
Matt: How did you bounce back from that?
Luqman: I flew to England.

Top 10 Most Common SEO Problems https://diggitymarketing.com/top-10-most-common-seo-problems/ https://diggitymarketing.com/top-10-most-common-seo-problems/#comments Mon, 02 Oct 2017 09:43:36 +0000 http://diggitymarketing.com/?p=3155

Since 2017, I've been involved as a director for a very talented agency called The Search Initiative (TSI). The reason I teamed up with these guys was very simple: they were testers. Like me, they rely only on experience, data, and test results for their ranking strategies. It was a match made in heaven.

Since then, we've had the pleasure of onboarding hundreds of new clients, offering a wide range of organic SEO consulting services. Many of these new partners have seen huge growth, others have had penalties removed, and all have clear roadmaps for how to grow in the future. I wanted to share with you the top 10 SEO problems we typically encounter when onboarding new clients. I hope that by sharing some of these common mistakes, you can use this knowledge to your advantage and make some serious improvements to your rankings.

The 10 Most Common SEO Issues

Many of you will be familiar with the inverted pyramid writing style, where the most newsworthy content is at the top and the least is at the bottom. I've tried to follow this structure; however, none of the points below are to be slept on. They're all major issues that commonly appear on even the best sites. If you want to get the most out of this article, check out every point here. As you know, there are no shortcuts when it comes to SEO.

1. Index Management
2. Localization
3. Keyword Cannibalization
4. Over-Optimized Anchor Text
5. Poor Linking Strategy
6. Low-Quality Affiliate Content
7. User Performance Metrics
8. Titles & Meta Descriptions
9. Internal Redirects
10. Low-Quality Pillow Links

And if you're wondering which SEO mistake is ruining you on Google, watch this video.

1. Index Management Problems

The first and most common issue we see is accidental devaluation of a website because of indexing issues. It stems from a common misunderstanding about how Google actually works. (More on this in a bit…) Most people think that if they build links and noindex junk pages, they're fine. However, it's not that simple, and I'm about to show you a real example.

Below you will find a screengrab from Screaming Frog. This is from a crawl of an eCommerce website that had a lot of onsite issues that needed to be fixed. It's quite hard to see, but you may notice I have highlighted the number of HTML pages that are filtered. It's a whopping 32,064 pages and, yes, it took us a long time to crawl. None of the 32,064 pages found in this crawl included a noindex tag, which means (in theory) Google should be able to crawl and index all of them.

So, let's check this against our numbers in Google Search Console: there we see 14,823 pages indexed. While this is a large volume, it's still less than 50% of the pages that were found with Screaming Frog. This is the first sign that something is seriously wrong, but the next screenshot shows the extent of how badly our client had been stung by Panda's low-quality algorithm. We use the "site:domain.com" operator to pull up the number of indexed pages: despite the website having 32,064 crawlable, indexable pages, and despite Search Console reporting 14,823 of them indexed, only 664 have made it into the actual index. This site search shows us that Google has highly devalued most of the website.
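If you want to quantify that gap on your own site, here's a minimal sketch in Python that compares a Screaming Frog crawl export against the pages Google reports as indexed. The file names and column headers here are assumptions; adapt them to whatever your own exports actually contain.

```python
import csv

# Hypothetical exports: a Screaming Frog "Internal > HTML" CSV and a
# Search Console indexed-pages CSV. Adjust paths and column names to yours.
CRAWL_EXPORT = "internal_html.csv"   # assumed URL column: "Address"
INDEXED_EXPORT = "gsc_indexed.csv"   # assumed URL column: "URL"

def load_urls(path, column):
    """Read one column of URLs from a CSV export into a set."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip() for row in csv.DictReader(f) if row.get(column)}

crawlable = load_urls(CRAWL_EXPORT, "Address")
indexed = load_urls(INDEXED_EXPORT, "URL")
not_indexed = crawlable - indexed

print(f"Crawlable pages: {len(crawlable)}")
print(f"Indexed pages:   {len(indexed)}")
print(f"Crawlable but not indexed: {len(not_indexed)} "
      f"({len(not_indexed) / max(len(crawlable), 1):.1%} of the crawl)")

# Eyeball the worst offenders first, e.g. thin filter/parameter pages.
for url in sorted(not_indexed)[:25]:
    print(url)
```

The set difference is the candidate list for noindexing or removal; anything that shows up there in bulk (faceted navigation, session parameters, tag archives) is the usual suspect.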
It is a crawling nightmare. So, the question is: how can you fix this? Thankfully, for most people the answer is quite simple. Start by performing a site:domain.com search and auditing Google's index of your site. If you go to the final page and you're greeted with the message below, you have work to do.

Take a hard look at which pages shouldn't be indexed and start proactively removing them from your crawl budget. The catch is that even after you add a noindex tag to a page, it remains indexed until Google recrawls it. Some people block these pages in robots.txt to save crawl budget, which is a good idea, but only after the pages have been removed. For the rest of us, there's the URL Removal Tool. To learn more about how to deal with crawl and indexing issues, check out this guide.

2. Localization Issues

The second most common issue we see is with clients that have multiple languages. While it's great to have international coverage and provide foreign users with localized text, it's a nightmare for Panda penalties if not set up correctly. Many people are familiar with the URL structure you should use for localized text, but many forget to set up hreflang on their website. My buddy Tom talked about it in this interview I did with him. If you are looking to set up hreflang codes, I suggest you use this website to get the right language and country code every time.

Below is an example from an eCommerce client. Whereas the previous client had issues with index management, this time the problem is caused by hreflang, plus one more thing that goes unnoticed. While the client has successfully included hreflang in their source code, they had not included both the language and the location code. The one time they tried to do this, with en-GB, the page no longer exists and redirects to their sitemap. On top of that, their hreflang tags cover just 50% of the languages their website operates in. This has created an enormous amount of duplication to be indexed.

However, there's still one more thing that was missed. Each page has the Open Graph locale set to en_US, including the pages that aren't in English. While this setting isn't as clear-cut a signal as hreflang, it does give Google information about locale, and therefore creates confusion. If your website has a similar issue, we advise you to make the locale dynamic so it matches the current language, as in the sketch below. For more help with locality and its importance, check out this page on local seo solutions.
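To make that concrete, here's a minimal sketch of generating complete hreflang pairs (language plus region) and a matching, dynamic og:locale for each page. The URLs and language set are invented for illustration.

```python
# Hypothetical alternates for one page. hreflang wants an ISO 639-1 language
# code plus, optionally, an ISO 3166-1 Alpha 2 region: "en-gb", not just "en".
ALTERNATES = {
    "en-gb": "https://example.com/en-gb/widgets/",
    "en-us": "https://example.com/en-us/widgets/",
    "de-de": "https://example.com/de-de/widgets/",
}
X_DEFAULT = "https://example.com/widgets/"

def hreflang_tags(alternates, x_default):
    """Emit one <link> per locale (including a self-reference) plus x-default."""
    tags = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
            for code, url in sorted(alternates.items())]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return tags

def og_locale(code):
    """Open Graph uses underscore format (en_GB), derived from the page's own locale."""
    lang, _, region = code.partition("-")
    return f'<meta property="og:locale" content="{lang}_{region.upper()}" />'

current = "de-de"  # locale of the page currently being rendered
for tag in hreflang_tags(ALTERNATES, X_DEFAULT):
    print(tag)
print(og_locale(current))  # matches the page language instead of a hardcoded en_US
```

Every language version should carry the full set of tags, including a self-referencing one; that is exactly the part the client above had half-implemented.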
Google doesn’t like it. The first step is to learn to diagnose the culprit pages, because if you cannot find cannibalization – how can you fix what you can’t see? At The Search Initiative we have a few ways to find cannibalization but here’s the easiest and most effective. Use Keyword Tracking Tools One of the benefits a client gets from working with TSI is that we track keywords up to twice daily with Agency Analytics, one of our partners. The tool includes an overview of the site’s overall keyword performance, such as below: Aside from showing us an overview of how this client has performed over the past 7 days, we can use this to track each keyword’s performance independently too: In this photo, you might notice that there had been a jump from the 3rd page to the 1st page for their target term after implementing some of our onsite advice. However, more importantly, you will also be able to see that the Google URL had started flipping between their category page and their homepage. This is an obvious sign of cannibalization and once noticed we jumped into action to fix the problem. To learn more about keyword cannibalization, I have a master guide here. 4. Over-Optimized Anchor Text There was a significant update in October 2016 as Penguin 4.0 rolled out. Penguin 4.0 was an update that changed how Google perceives and interacts with links.  I even wrote an article for Ahrefs covering how it affected anchor text optimization. As part of our auditing process for each new client, we analyze your existing anchor text and break the types down into the below values: Branded – an anchor text that includes your brand name or a slight variation, for example: ‘thesearchinitiative’, ‘visit thesearchinitiative’, or ‘TSI’. Generic – an anchor that uses a generic term but does not include branding, for example:‘here’, ‘read more’, Read More Read More

SEO Spotlight: Daryl Rosser, Lior Ohayon, Joseph Elshazly https://diggitymarketing.com/seo-spotlight-episode-2/ https://diggitymarketing.com/seo-spotlight-episode-2/#comments Tue, 09 Feb 2016 13:00:49 +0000 http://diggitymarketing.com/?p=1021

Welcome back to Episode 2 of the SEO Spotlight series. In the last episode, we visited three SEOs who think outside the box to tackle their problems. Being a person who appreciates creativity, I'll be keeping up the same theme here by interviewing three more SEOs who have made great progress in the areas of affiliate SEO, PBNs, and client scaling. You'll learn their thought processes and find out how you can apply the same ideas to your own SEO projects.

Daryl Rosser – Breaking Down Problems into Bite-Size Pieces

Daryl has got to be one of the most likeable guys in SEO. He's got an incredible head for business, especially for such a young fella, and a sound technical understanding as well. Plus, he's fun to party with. His technique for deconstructing high-competition affiliate SEO is bound to get some ideas popping off in your head.

Name: Daryl Rosser
Age: 22
Location: Northamptonshire, England

Matt: When did you get into SEO and how?

Daryl: 2.5 years ago, when I was seriously struggling to make money. A local business had contacted me; they said they wanted SEO and heard I could help. I had no idea about SEO, but I needed money, so I read up on some SEO buzzwords, met them, and later ended up closing them as a client. After their first payment came through, I set up some backlinks, got them ranked #1 in 3 weeks, and had an "oh shit" moment.

Matt: Let's talk more about how you make affiliate SEO ridiculously easy.

Daryl: Why is it that most people prefer local SEO to affiliate SEO? It's easier. When people think of affiliate SEO, they think of competing with all the other affiliates. That requires a higher budget, better knowledge of ranking, and most likely more time. This technique is how I approach affiliate SEO in a way where there is barely any competition, you don't need a lot of PBN links, and the conversion rate is ridiculous.

Matt: How exactly do you do it? Spill the beans.

Daryl: I don't sell products as an affiliate. That's a great model, and it works, but it takes a lot more traffic. I specifically seek out national firms that have pay-per-lead and/or pay-per-call affiliate offers. Firstly, this makes it a lot easier to generate a commission, and some of these can pay upwards of $100, depending on the niche. (Lingo clarification: pay-per-lead means they will pay for an enquiry; pay-per-call means they will pay for a call.)

But the smart part of this strategy is that I treat them as local clients. Rather than try to compete on the national keywords, I can pick out a few towns or cities, rank my site top in them, and reap the rewards. This leaves you competing with local businesses, not other affiliates. The search volume is low, but you can rank multiple sites more easily, and the search traffic is highly qualified. Say you work with a national plumbing company. If someone searches "plumbers in [your city]", how qualified do you think they are as someone who wants a plumbing company? Then all you need to do is drive an enquiry or call through their website, and you get paid. It's like local lead gen, but it's through affiliate networks, so no client management.
The next question people ask me when I share this is, "Where do I get the offers?" All I can say is that you should be on every reputable network, and you should be looking out for them. Here is a list of 20 reputable CPA networks: https://mthink.com/top-cpa-networks-2015/. You can specifically ask your affiliate managers for these types of offers and they'll be able to help.

Matt: Very clever. What's your biggest win from using this technique?

Daryl: Last month one of my websites had 1,342 unique visitors. The revenue (entirely from SEO) was $2,600. That is around $1.94 per visitor to my website. Compare that to some affiliates that have over 1,000 visitors daily with lower revenues. This is a solid model.

Matt: Any advice for others in this SEO game?

Daryl: My advice isn't related to SEO so much as it's related to business. One of the biggest mistakes I've made in the past, and I see others make, is to focus on things in the wrong order. They do the right work, but in the wrong order. Take client SEO as an example. People invest thousands of dollars and several hours into building up a PBN before they even have a client. How does that help you? It doesn't. Evaluate your goals and ask yourself what the one thing is that you should be focusing on right now to get closer to them. If you're just starting out with client SEO, it's probably prospecting or outreach. Once you've done that, then you can build a PBN or rent links. As you expand your business you get better at this, but then it becomes: are you spending all your time managing and maintaining what you have, or are you focusing on growth? It's easy to get complacent at a certain stage and just manage things. Don't. Keep focusing on growth and working your ass off towards it.

Be sure to check out Daryl's Lionzeal.com, a blog and community (of over 2,000 members) committed to scaling SEO businesses.

Lior Ohayon – No Software Tool Exists? Create It Yourself.

We all have tasks that repeatedly consume a large amount of time, tasks that would be better resolved by software, if it existed. Lior took matters into his own hands and developed a tool for speeding up the PBN auditing process.

Name: Lior Ohayon
Age: 23
Location: Toronto, Canada

Matt: When did you get into SEO and how?

Lior: I got into the SEO world by accident in 2013. I was a typical college dropout looking for ways to make money online. Everything I would read about starting a business boiled down to getting traffic to websites, and SEO was one of the main ways. I consumed a lot of content about affiliate websites and link building, mainly through Pat Flynn, and played around with my own sites a little. Then I realized that instead of starting a bunch of websites, I could just offer these new skills to existing business owners and help them grow their traffic. I stumbled into client SEO by accident!

Matt: Let's talk more about the software you created. What problem does it solve?

Lior: When you are buying an expired domain for your PBN at auction or from a broker, you need to do a lot of research using third-party tools to see if it is a powerful, clean domain. There can be dozens of metrics to research, and it gets very time-consuming and a headache to manage all of the tools in your browser (multiple tabs, logins, etc.). I got frustrated with this whole process, so I created ScopeReveal.
It's a free tool that aims to eliminate that problem by streamlining all the tools into one convenient dashboard, as well as providing our own metrics from an algorithm that combines many of the imported ones.

Matt: Many SEOs, including myself, have considered having software created. What was the development process like?

Lior: First I decided which metrics are the most important and figured out how to use their APIs to capture that data in my own software. Next, I found my developer on Elance (now Upwork). He was one of the first to respond, and he even built a prototype of the software, which he included in his initial proposal. That blew me away, and his English was perfect, so it was the right fit all around. Some tips I have for hiring on these sites: include a secret code they should put in their proposal so you know they read it, and require a small test to be done (it can be a paid test – well worth it). Then you will know their communication skills, skill level, and ability to deliver on schedule. It also helps to promise future work on other projects if the first one goes well. This will lower rates if you always choose fixed price like me.

On a side note, do research before making any SEO software that makes your life easier. It could already exist, or it could be cheaper to have a virtual assistant do the same tasks. You really have to calculate how much time or money it will save you in the long run, including upkeep and all.

Matt: How much time savings can one expect using ScopeReveal?

Lior: I typically buy PBN domains from brokers. Once I get the URL from them, I would have to open Moz, Majestic, Google, Ahrefs, the Archive.org Wayback Machine, and WhoIs. That's maybe 5 minutes in total. Then adding the domain to each one and analyzing the data is another 15 minutes or so.
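To give a feel for where the time savings come from, here's a rough sketch of the "one dashboard" idea: fetch every metric source concurrently instead of tab by tab. The fetchers below are stubs; the real tools (Moz, Majestic, Ahrefs, the Wayback Machine) each have their own APIs, auth, and rate limits, which I'm deliberately not reproducing here.

```python
from concurrent.futures import ThreadPoolExecutor

# Stub fetchers: each would wrap one tool's real API behind the same interface.
def fetch_majestic(domain): return {"TF": None, "CF": None, "ref_domains": None}
def fetch_moz(domain):      return {"DA": None, "spam_score": None}
def fetch_wayback(domain):  return {"first_snapshot": None, "history_clean": None}

FETCHERS = {
    "majestic": fetch_majestic,
    "moz": fetch_moz,
    "wayback": fetch_wayback,
}

def audit_domain(domain):
    """Run all metric lookups in parallel and merge them into one report."""
    with ThreadPoolExecutor(max_workers=len(FETCHERS)) as pool:
        futures = {name: pool.submit(fn, domain) for name, fn in FETCHERS.items()}
        return {name: fut.result() for name, fut in futures.items()}

print(audit_domain("example-expired-domain.com"))
```

The point of the design is that the slowest single lookup, not the sum of all of them, sets the wall-clock time per domain, which is how 20 minutes of tab-juggling collapses into one request.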

SEO Spotlight: Jacob Kettner, Daniel Moscovitch, Michael Landau-Spiers https://diggitymarketing.com/seo-spotlight-episode-1/ https://diggitymarketing.com/seo-spotlight-episode-1/#comments Wed, 20 Jan 2016 15:00:47 +0000 http://diggitymarketing.com/?p=1007

I was a strange child. I always seemed to do things a bit differently than the other kids, and the habit stuck. I use four fingers to hold a pen. I'm right-handed but left-legged. I never wear underwear. You should see the process I use to tie my shoes.

Needless to say, I've always been fascinated by the various ways we can solve the same problems. In my own SEO business, I spend the majority of my time testing. What I've found is that there is no cookie-cutter blueprint for success in SEO. It always depends on your particular situation, skill set, and resources. I appreciate SEOs who can think outside the box; clever folk who use their intuition and do things a bit differently, allowing them to get some amazing results. I'd like to introduce you to three of them.

Jacob Kettner – Using Subdomains for Risk-Free Authority Building

I met Jacob a few months ago. He's a really bright SEO and a perfectionist in his work. I helped him audit one of his sites and found nothing wrong with his SEO plan. That happens pretty much never. The subdomain technique he uses is a clever way to build up authority while at the same time reducing penalty risk.

Name: Jacob Kettner
Age: 26
Location: Winnipeg, Canada

Matt: When did you get into SEO and how?

Jacob: I got into internet marketing at the age of around 16. I tried a lot of "loophole" tactics off Digital Point and Warrior Forum and made enough cash to keep me in the game. I ran a business dropshipping customized products on eBay for years. Despite my many years of internet marketing experience, I had virtually no knowledge of SEO and hired 3 different SEO firms before deciding to learn how to do it myself. Eventually I sold off my site and used the capital to start a client SEO business in the fall of 2014. In the following year I learned a lot and scaled from nothing to a low six-figure-per-year business.

Matt: Tell us about the subdomain authority-boosting technique.

Jacob: Juicing up subdomains helps increase DA quickly without risking your money site. The theory comes from looking at Web 2.0 sites like WordPress, Tumblr, etc. It's a well-known SEO concept that subdomains on these strong properties will rank more quickly and have a higher tolerance for abuse than a fresh domain.

Hold on a second, though. How did these sites get so powerful? Was it from high-quality content and powerful links pointing to the main domain? Let's look at the Majestic stats. The URL metrics for https://www.tumblr.com/ are TF 88, CF 83, RD 27,112, and 4,337,075 backlinks. So yes, I'm not denying that this is a very strong site. But look at the Majestic metrics for the root domain tumblr.com: TF 92, CF 93, RD 2,061,256, and 30,957,773,997 backlinks. To be clear, this site has over 30 BILLION backlinks going to it. Although Tumblr's main URL holds a lot of power in its own right, only about 1.3% of the domain's referring domains (27,112 of 2,061,256) point at the main page; the bulk of the referring links go to its subdomains. My conclusion is that this is a two-way street: Tumblr's power helps parasite subdomains rank, but the subdomains also give DA back to the main domain.

Matt: Practically, how do you juice up your subdomains?
Jacob: Here are a couple of models that I've used:

1. Create a subdomain EMD, build an auto video blog on it, and leave it alone (maybe send some social signals).
2. Create a subdomain EMD and send high-authority links at it (this is a good place to test Fiverr gigs). Shitty PBN links will work fine here as well. No need to waste the good stuff on this.

I'm not advocating sending thousands of GSA links at a money-site subdomain, but you can definitely afford to be a little more aggressive than you would be with the main domain. Keep in mind the purpose of the subdomain isn't necessarily to rank. It's just to add authority to the overall domain. I've also done this with auto video blogs and no links, just to increase page count and fresh content on the domain. (Note: I don't link from these subdomains to my main site.)

Matt: What's the biggest accomplishment you've been able to attain using this technique?

Jacob: I've ranked for a competitive local term with this method and just three PBN links.

Matt: Nicely done. Got any advice for other SEOs, newcomers and experts alike?

Jacob: Take action and test stuff. Foundations and principles last forever. Tactics and loopholes die as soon as they go public. If you like tactics and loopholes, test stuff yourself and use it before it gets ruined. I'm sharing this one because it's particularly hard for Google to address, given the parallel between this and major Web 2.0 sites.

To learn more about Jacob and his case studies, be sure to check out his agency at First Rank.

Daniel Moscovitch – Slick Client-Getting Tactics

If you're doing client SEO, you've likely run into the same problem everyone else does: most businesses don't trust SEOs. Daniel solves this problem by simply rewording his sales copy and avoiding the conversations that would come up in the typical SEO sales pitch.

Name: Daniel Moscovitch
Age: 30
Location: Tel Aviv, Israel

Matt: When did you get into SEO and how?

Daniel: 2.5 years ago (just after the first Penguin update). I started off as a whitehat link builder at a really awesome online marketing agency here in Tel Aviv and just exploded from there.

Matt: Tell us about your client-getting technique.

Daniel: Let's face it: SEO has a bad name. This is totally understandable given the sheer number of businesses out there who have been screwed by so-called SEO "experts". These 'experts' often use outdated tactics, undercharge, and under-deliver on their promises, and businesses are fed up (and I DON'T BLAME THEM!). That is why I have had success lately by taking the focus away from SEO, a 'dirty' word, and talking instead about online visibility, creating an online brand, and making websites more powerful and more Google-friendly. The word "SEO" has been somewhat tainted, and it's often really hard to understand what it actually covers. That is why I always like to start describing my services with phrases like "improving your site's power and authority", "increasing website visibility", "connecting you with more customers online", and "increased rankings". Of course, there is no hiding that what we do is SEO, but this way I don't scare prospects off right away and can slowly develop a relationship that shows the value I provide, as opposed to selling them on SEO right off the bat. I also like to make sure that the focus is NOT on what WE will do, but rather on what THEY will gain from working with us.
Most business owners want to stop worrying about having to actively search for clients, and prefer to stick to what they know and love. On every proposal we send out, we make sure it includes that sentiment. On top of that, we have recently experimented with not locking clients into any long-term obligation, but rather starting off with a small proof of concept. So far, 100% of the clients who have started with this have gone on to continue with us long term, and for more money. SEO can be difficult enough as it is. When you add selling it to potential clients, you have to make it as easy and beneficial to them as possible. That is why it is important to focus on what THEY have to gain, start small, gain their trust, and go from there.

Matt: In ROI terms, what has this change in marketing achieved for your agency?

Daniel: In the past 2 months alone, we have increased our monthly earnings by $5K just by switching the focus of our pitch and changing our proposals around.

Matt: Any newbie advice for upcoming SEOs?

Daniel: Never get discouraged, never stop learning, and embrace the confusion!

To learn more about Daniel and his agency, check them out at MoreHotLeads.com.

Michael Landau-Spiers – Smart Keyword Research to Break into a Huge Affiliate Niche

Michael is my favorite SEO success story of last year. With only a year's experience in SEO, he found a way to break into one of the most competitive affiliate niches (I'm in it myself). Instead of going for the super-high-volume keywords like most SEOs would (including myself), he broke down the niche and found keywords that were easily attainable, yet still very profitable. When the timing was right, he cashed out on a huge flip.

Name: Michael Landau-Spiers
Age: 21
Location: United Kingdom

Matt: When did you get into SEO and how?

Reduce Sandbox Length by Timing your Backlinks https://diggitymarketing.com/reduce-sandbox-length-by-timing-your-backlinks/ https://diggitymarketing.com/reduce-sandbox-length-by-timing-your-backlinks/#comments Sun, 01 Mar 2015 08:43:49 +0000 http://diggitymarketing.com/?p=503

What is Sandboxing?

Definition: when you build links to a site and see little to no ranking benefit.

Back around mid-2014, we started to see a sandboxing effect on newly registered domains. Before that, you could register a new domain, build some backlinks to it, and within 5 days it could potentially be on page 1. Nowadays, SEOs are reporting sandbox periods of 3-6 months. People have been discussing various ways to beat the sandbox and reduce the time it takes to rank. In general, the most widespread theory is that you want to create a viral effect. Here's my recipe for sandbox reduction, and the results I've gotten by using it:

Social Fortress – For most "real" businesses, a website's first links are usually from social websites. Once their site is made, a business naturally goes out to brand itself on Facebook, Twitter, G+, etc. This social fortress should constitute the first links you create to your website.
Business Citations – After social profiles are created, a typical business wants to make sure potential customers can find them in local and industry-specific directories. Leverage directories like these to build free but natural links to your homepage.
Social Signals – After that, create a social signal campaign to drip out 50-200 Facebook shares, tweets, etc. over the next 30 days (see the sketch after this list). This is where the virality comes from. To generate signals, I use SEO Butler's 200 signals package, namely for their ease of use, delivery of quality signals, and repeatedly good results. Use coupon code "DIGGITY10" for 10% off.
Guest Posts – Now that the stage is set, it's a safe time to build links. I stick to quality guest posts from websites with traffic in this beginning phase. Why? Because in the natural course of the internet, most new links are created on brand-new pages, similar to how guest posts are created. The name of the game is to look as natural as possible.
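As a toy example of the drip mechanics, here's a sketch that spreads a batch of signals unevenly across 30 days, leaving some zero-signal days so the pattern doesn't look machine-perfect. The only parameters taken from the recipe above are the 50-200 signal range and the 30-day window; everything else is illustrative.

```python
import random

def drip_schedule(total_signals=120, days=30, seed=None):
    """Randomly allocate signals across days; an uneven spread looks more organic."""
    rng = random.Random(seed)
    schedule = [0] * days
    for _ in range(total_signals):
        schedule[rng.randrange(days)] += 1
    return schedule

plan = drip_schedule(total_signals=120, days=30, seed=42)
for day, count in enumerate(plan, start=1):
    if count:
        print(f"Day {day:2d}: {count} signals")
print(f"Total: {sum(plan)}")
```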
Results

Here are the results of two separate tests running the sequence listed above. Of course, there are various techniques people use to skate around the sandbox, but this is what I use, and it seems to be working quite well and quite consistently. Give it a shot and share your results as well.

SEO Testing Results: CTR & Social Shares in 2015 https://diggitymarketing.com/seo-testing-results-ctr-social-shares-in-2015/ https://diggitymarketing.com/seo-testing-results-ctr-social-shares-in-2015/#comments Tue, 06 Jan 2015 04:18:17 +0000 http://diggitymarketing.com/?p=332

There's a ton of buzz going around about what the next face of SEO is going to look like. Those who stay ahead of the curve are going to reap the benefits. In case you're not up to speed, here's a summary of a few SEO techniques being hyped up right now, and my testing results on each.

Testing Results: Social Signals

Tweets, shares, G+'s, Facebook likes, pins, and bookmarks. There's been a ton of talk about social signals over the past 3 months. People have come forth saying that they're ranking local-search YouTube videos solely on social signals. The main debate seems to be over how the social signals are obtained; namely, whether they should come from real accounts or automatically generated accounts. Here's the test scenario I set up, comparing three different groups:

A. FCS Networker – Auto-generates accounts on many social bookmarking platforms and Twitter, and posts to them. No limit on the number of accounts created. (5 test cases)
B. SocialAdr + Link Collider – Uses a credit system to encourage people to post on each other's real accounts (with the option to simply buy credits). SocialAdr focuses on bookmarks, while Link Collider provides signals from G+, FB, Twitter, Pinterest, etc. (5 test cases)
C. Control Group – No social signals. (3 test cases)

Signals were sent to sets A & B in week 2 and were dripped out over the course of 6 additional weeks. All sites had zero social signals before the test began, and each site received the same number of signals.

Results: There was a negligible difference between approaches A and B. Whether social signals come from brand-new auto-generated accounts or old accounts, G doesn't seem to notice. I've confirmed this with many others, including an actual developer of a social signal service similar to approach B. As of now, it seems a signal is just a signal. I actually expected a better result from B because it includes more "important" social platforms (FB, G+) than A. B did, however, achieve a faster ranking increase, most likely because its accounts were already indexed and ready to be crawled. Groups A and B showed a slight positive increase (3.6-3.7%) in results compared to the control group C; the sketch below shows how a figure like that gets tabulated.

Summary: Moz recently said that they believe social signals only contribute about 5% as a ranking factor. I agree. In low-competition niches, this 5% can make a big difference because it's you against other sites with zero SEO; anything you do will yield a positive result. In high-competition niches, my analysis shows me that social signals will give you a boost, but more importantly, social signals are absolutely needed to justify the backlinks you're getting. A site with 20 high-authority backlinks and zero social signals simply doesn't make sense to Google, and it shouldn't. This is further backed up by some real money sites of mine that died with no social signals, while others in the same niche that had social signals survived. If you're in a remotely competitive niche, then you need social.
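For transparency on how an uplift-over-control number comes out of a test like this, here's a sketch of the arithmetic. The actual test data isn't published, so the rank pairs below are placeholders purely to show the computation.

```python
# Dummy (before, after) ranks per test case; lower rank = better position.
groups = {
    "A_fcs":     [(28, 21), (45, 37), (60, 52), (33, 29), (51, 44)],
    "B_social":  [(30, 22), (48, 39), (55, 48), (37, 31), (62, 53)],
    "C_control": [(32, 30), (47, 45), (58, 57)],
}

def avg_improvement(cases):
    """Mean relative rank improvement across a group's test cases."""
    return sum((before - after) / before for before, after in cases) / len(cases)

control = avg_improvement(groups["C_control"])
for name, cases in groups.items():
    imp = avg_improvement(cases)
    print(f"{name:10s} improvement {imp:6.1%}  vs control {imp - control:+.1%}")
```

The control group matters here: domains drift in the SERPs on their own, so the treatment groups only get credit for movement beyond what the untouched sites showed.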
Recommendation: At this point in time, FCS Networker (click for best price) is my recommendation. It's cheap ($27/month), completely automated, and you can run it on an unlimited number of domains and send an unlimited number of signals. The smallest package for SocialAdr and Link Collider together costs about $30/month but only supports 1-2 URLs with minimal amounts of shares. Since they both yield the same result, I'd rather go with FCS, which is cheaper, easier to use, and scalable. Will FCS be a solution for years to come? Probably not. As we've seen with backlinks, the quality-control noose will tighten, and eventually we'll need aged accounts with followers/fans and non-spun content. But that's not going to happen anytime soon.

Click-Through Rate (CTR)

For years now, CTR has been known to be a ranking factor in Google's algorithm. When a search is made, the sites that are clicked on most often (i.e. have the highest CTR) are deemed more relevant, and a ranking boost is theoretically applied. I tested CTR with an upcoming service that outsources searches to individual users across the US (and other countries). Software is installed on their computers to ensure they make a search, find your site, spend time on your page, and randomly browse other pages (reducing bounce rate).

Results: I have to say I was pretty excited for this one to work out. The logic behind it makes sense and the solution's implementation seems sound. Over 6 test cases spread across brand, EMD, aged, non-aged, competitive national searches, and local low-competition searches, the results were flat. That being said, I've talked with others in my circle of SEOs, and people are reporting positive results, but only if their original rank was already on the first page. In all fairness, my test cases all had page 2-7 ranks. I'll get back to you later with my tests on first-page rankers.

Summary: Right now, I won't recommend this, because I personally haven't seen a positive result so far. There are potential safety issues, especially if G catches onto the IPs of the searchers. However, climbing 1-3 spots on the first page for a minimal price, while being completely automated, would be a no-brainer. I'm waiting to see it for myself, though.

PBN Sites Are Still Killing It

With all the testing I've been doing, I'm still finding that the main tool in your SEO arsenal remains PBNs. There simply doesn't exist a more effective, faster way to rank a site. Here are just a couple of examples of it kicking ass a few weeks ago (both results are from a single PBN link pointing to a real customer money site).
