Client SEO
https://diggitymarketing.com

45.99% Earnings Increase in 5 Months for a Digital Infoproduct [SEO Case Study]
https://diggitymarketing.com/infoproduct-seo-case-study/
Mon, 11 May 2020

You're about to get the strategy behind one of the most challenging SEO campaigns my SEO agency has ever run.

Why was it so challenging? Three reasons:

First, the niche is massively competitive: a make-money-online infoproduct in the financial niche. Nuff said.

Second, we only had 5 months to pull this off.

Third, just like any other client, they were extremely hungry for results and demanded quality work.

In the case study below, you're going to learn the technical playbook, the onsite content strategy, and the link building techniques we carried out to get this 45.99% revenue growth win for this infoproduct business.

The Case Study

Our client takes advantage of the wide reach of the internet to teach his students how to earn money trading online. We're talking currencies, forex, stock markets, crypto, etc. The business' revenue is generated solely through the sale of digital download products – in this case, trading guides in ebook format and video trading courses.

When the owner of this profitable business (which had already built some authority in the niche) approached The Search Initiative (TSI) about helping to grow their organic reach and find new students, we were excited to take on the challenge in one of the most competitive spaces there is.

To accomplish this, the game plan was to focus hard on a quick-win strategy, while setting the stage for long-term gains post-campaign. Our strategists were certain that the value we could provide would have a considerable impact on his business' bottom line.

How? Because…

Over the course of the campaign, our technically focused SEO strategies grew organic traffic by 23.46%. But what did the best job for the client's business was the 45.99% increase in the number of conversions, comparing the first and last months of the campaign. Sales went up from just over 2,100 a month to 3,095 – this really bumped their monetization.

And we did it in time. These gains were achieved within only 5 months of the client signing with TSI and our team starting the campaign.

Here's how we did it…

The SEO Playbook for Infoproduct Websites

Phase 1: A Comprehensive Technical Audit

I've said this in every TSI case study we've published so far… and I simply cannot emphasize it enough: before you begin any campaign, always start with a full technical audit.

Starting with…

Page Speed

First, our technical SEO strategists started at the bottom of the client's tech stack… and you should too. This starts with digging into the web server's configuration and running a series of tests to measure the site's speed. This ensures that the performance of the web server itself isn't causing a penalty or disadvantage on either desktop or mobile connections.

So, what tests do we run?

PageSpeed Insights (PSI) – this should be everyone's go-to tool and shouldn't need an explanation.

GTmetrix – it's good to cross-check PSI's results, therefore we use at least one other tool. In reality, we use GTmetrix together with Dareboost, Uptrends, and Webpagetest.
HTTP/2 Test – HTTP/2 is becoming a standard and can greatly improve your page speed, so it's definitely worth looking into. If you're not HTTP/2 enabled, you might want to think about changing your server or using an enabled CDN.

Performance Test – I know it might sound like overkill, but we included this in our test suite earlier this year and use it for sites that can expect higher concurrent traffic. We're not even talking Amazon-level traffic, but say you might get a thousand users on your site at once. What will happen? Will the server handle it or go apeshit? If this test shows you a steady response time of under 80ms – you're good. But remember – the lower the response time, the better!

In cases where transfer speeds or latency are too high, we advise you (and our clients) to consider migrating to faster servers, upgrading to better hosting or, better yet, re-platforming to a CDN. Luckily, most of the time you can achieve most of the gains through WPRocket optimization, as was the case here.

Your Golden WPRocket Settings

Cache → Enable caching for mobile devices
This option should always be on. It ensures that your mobile users are also served a cached version of your site.

Cache → Cache Lifespan
Set it depending on how often you update your site; we find a sweet spot at around 2-7 days.

File Optimization → Basic Settings
Be careful with the first option – it may break things!

File Optimization → CSS Files
Again, this section is quite tricky and it may break things. My guys switch the options on one by one and test that the site still works after enabling each one. Under Fallback critical CSS you should paste your Critical Path CSS, which you can generate with the CriticalCSS site.

File Optimization → Javascript
This section is the most likely to break things, so take extreme care enabling these options! Depending on your theme, you might be able to defer Javascript. Note that we had to use Safe Mode for jQuery as, without it, our theme stopped working. After playing with the Javascript options, make sure you test your site thoroughly, including all contact forms, sliders, checkout, and user-related functionality.

Media → LazyLoad

Preload → Preload

Preload → Prefetch DNS Requests
The URLs here hugely depend on your theme. Paste in the domains of the external resources your site is using. Also, if you're using Cloudflare, make sure to enable the Cloudflare add-on in WPRocket.

Speaking of Cloudflare – we got the final push for the site's performance by using Cloudflare as the CDN provider (the client sells products worldwide).

If you don't want to use additional plugins (though I highly recommend them), below is a .htaccess snippet I got from our resident genius and Director of SEO, Rad Paluszak. It'll do the basic stuff like:

GZip compression
Deflate compression
Expires headers
Some cache control

Without any WordPress optimization plugins, this code, added at the top of your .htaccess file, will slightly improve your PageSpeed Insights results:
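A minimal sketch of that kind of snippet, assuming an Apache server with mod_deflate, mod_expires, and mod_headers enabled – treat it as a starting point rather than the exact file:

```apache
# Gzip/Deflate compression for text-based assets (mod_deflate covers both)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>

# Expires headers so browsers cache static files
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>

# Basic cache-control for static resources
<IfModule mod_headers.c>
  <FilesMatch "\.(jpg|jpeg|png|gif|css|js|woff2?)$">
    Header set Cache-Control "public, max-age=604800"
  </FilesMatch>
</IfModule>
```

Test your PageSpeed Insights score before and after adding it, and remove any directive your host already handles at the server level.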
Internal Redirects

You know how it goes – Google says that redirects don't lose any link juice, but the PageRank formula and various tests suggest otherwise (there's a scientific test run on 41 million .it websites showing that PageRank's damping factor may vary). Whichever it is, let's take all necessary precautions in case there is a damping factor and redirects do drop a percentage of their link juice.

As we investigated the configuration of the server, we discovered some misapplied internal redirects, which were easy to fix but would have a considerable effect on SEO performance – a quick win.

You can test redirects with a simple tool like httpstatus.io and see results for individual URLs. But checking URL by URL would be the long way around, right? Your best bet is to run a Sitebulb crawl, head over to the Redirects section of the crawl, and look at Internal Redirected URLs. There you will find a list of all internally redirected URLs, which you should update to point at the last address in the redirect chain. You might need to re-run the crawl multiple times to find all of them. Be relentless!

Google Index Management

Everyone knows that Google crawls and indexes websites. This is the bare foundation of how the search engine works. It visits sites, crawling from one link to the next. It does this repetitively to keep the index up to date, as well as incrementally, discovering new sites, content, and information.

Crawling your site over time, Google sees its changes, learns its structure, and gets to deeper and deeper parts of it. Google stores in its index everything it finds applicable to keep; everything considered useful enough for the users and Google itself. However, sometimes it gets to pages that you'd rather it didn't keep indexed – for example, pages that accidentally create issues like duplicate or thin content, or content meant only for logged-in visitors. Google does its best to distinguish what it should and shouldn't index, but it sometimes gets it wrong.

This is where SEOs come into play. We want to serve Google all the content on a silver platter, so it doesn't need to algorithmically decide what to index. We clean up what's already indexed but wasn't supposed to be. We also prevent pages from being indexed, and make sure that important pages are within reach of the crawlers.

I don't see many sites that get this one right. Why? Most probably because it's an ongoing job, and site owners and SEOs just forget to perform it every month or so. On the other hand, it's also not so easy to identify index bloat.

With this campaign, to ensure that Google's indexation of the site was optimal, we looked at these:

Site: search
Google Search Console
…
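Both checks can be backed up with a small script. A minimal sketch (Python, standard library only), assuming a local copy of a plain urlset XML sitemap and a Search Console export of indexed pages as a CSV with a 'URL' column – the file names and column name are hypothetical:

```python
import csv
import xml.etree.ElementTree as ET

# Parse the sitemap to get the URLs you *want* indexed
# (assumes a plain <urlset> sitemap, not a sitemap index)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse("sitemap.xml")  # hypothetical local copy of your sitemap
wanted = {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

# Load the URLs Google reports as indexed (hypothetical GSC export)
with open("gsc_indexed_pages.csv", newline="", encoding="utf-8") as f:
    indexed = {row["URL"].strip() for row in csv.DictReader(f)}

# Index bloat: indexed but not in the sitemap (candidates for noindex/cleanup)
for url in sorted(indexed - wanted):
    print("BLOAT?", url)

# Gaps: in the sitemap but not indexed (candidates for investigation)
for url in sorted(wanted - indexed):
    print("MISSING?", url)
```

Run it monthly; as the post says, index management is an ongoing job, not a one-off fix.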

How to Prepare Your Website for a Google Algorithm Update [Case Study]
https://diggitymarketing.com/algorithm-update-case-study/
Mon, 13 May 2019

I hope that you've never had to go through the pain of being hit by an algorithmic update. You wake up one morning, your traffic is decimated, and your rank tracker is littered with red arrows.

Algorithmic penalties are not a subject I like to trivialize. That's why the case study I am about to share with you is different from most you've read before. This case study is a testament to the faith and hard work of my agency, The Search Initiative, in light of a huge shift in the SEO landscape.

Unfortunately, with core algorithmic updates you can't simply change a few things and expect an immediate ranking recovery. The best you can do is prepare for the next update round. If you've done all the right things, you can experience gains like you've never seen before.

Even if you've never been hit with an algorithmic penalty, you should care about these updates. Doing the right things and staying one step ahead can put your site in position for huge gains during an algorithm rollout.

So what are "the right things"? What do you need to do to your website to set it up for these types of ranking increases when the algorithms shift? This case study from my agency The Search Initiative will show you.

The Challenge: "Medic Algorithm" Devaluation

I want to start this case study by taking you back to its origins. There was a big algorithm update on the 1st of August 2018. A lot of SEOs called it the "Medic Update" because it targeted a huge chunk of sites related to health and medicine.

https://www.seroundtable.com/google-medic-update-26177.html

What Does an Algorithm Update Look Like?

Let's start with a few facts.

Fact #1: Google is constantly running search experiments.

To quote Google from their official mission page: "In 2018, we ran over 654,680 experiments, with trained external Search Raters and live tests, resulting in more than 3234 improvements to Search."

Here are the official numbers relating to the search experiments they ran last year:

595,429 Search quality tests – the number of tests they designed to run in the search engines. Some of them were only conceptual and were algorithmically proven to be ineffective, so they never made it to the next testing stages.

44,155 Side-by-side experiments – how many tests they ran through their Search Quality Raters. The SQR team looks at the search results of the old and new algorithms side by side. Their main job is to assess the quality of the results received, which, in turn, evaluates the algorithm change. Some changes are reverted at this stage. Others make it through to the live traffic experiments.

15,096 Live traffic experiments – at this stage, Google releases the algorithm change to the public search results and assesses how the broader audience perceives it, most likely through A/B testing. Again, there will be some rollbacks, and the rest will stay in the algorithm.

3,234 Launches – all the changes that they rolled out.

Fact #2: Google releases algorithm improvements every day and core updates several times a year!

Bearing in mind everything said above, Google releases algo improvements basically every day.
Do the math…

They've also confirmed that they roll out core quality updates several times per year. When you suspect something is going on, you can confirm it by simply jumping over to your favorite SERP sensor to check the commotion:

https://www.semrush.com/sensor/

During this period, rankings typically fluctuate and eventually settle, like in the below screenshot.

A lot of SEOs (myself included) believe that during the heavy-fluctuation stage, Google is making adjustments to the changes they've just rolled out. It's like cooking a soup. First, you add all the ingredients, toss in some spices, and let it cook for some time. Then you taste it and add more salt, pepper or whatever else is needed to make it good. Finally, you settle on the taste you like. (I've never actually cooked soup other than ramen, so hopefully this analogy makes sense.)

Fact #3: There will initially be more noise than signal.

Once there is an algo update, especially an officially confirmed one, many budding SEOs will kick into overdrive writing blog posts with theories about what particular changes have been made. Honestly, it's best to let things settle before theorizing.

One strength we have as website owners is that there are lots of us – and the data collected by webmasters on forums and on Twitter is sometimes enough to give an indication of what changes you could possibly make to your sites. However, this is not usually the case, and when it is, it is usually difficult to tell whether what the webmasters are signaling is actually correct. Keep an eye on those you trust to give good advice.

That said… at my agency, we always gather a lot of data and evidence first, before jumping to any conclusions… and you should do the same. Very shortly, we'll be getting to that data.

The Question: Algorithmic Penalty or Devaluation?

When things go wrong for you during an algorithmic update, a lot of SEOs would call it an "algorithmic penalty". At The Search Initiative, we DO NOT AGREE with this definition!

In fact, what it really is, is a shift in what the search engine is doing at the core level. To put it in very simple terms:

Algorithmic Penalty – invoked when you've been doing something against Google's terms for quite some time, but it wasn't enough to trigger it until now. It's applied as a punishment.

Algorithmic Devaluation – usually accompanies a quality update or a broad algorithm change. It works at the core level and can influence your rankings over a longer period of time. It's applied as a result of a broader shift in quality assessment.

Anyway, call it what you want – a core algo update hitting you means that Google has devalued your site in terms of quality factors. An algorithmic shift affecting your site should not be called a penalty. It should be viewed as a devaluation. You were not targeted; a bunch of factors changed, and every single site not in compliance with the new factors is devalued in the same way.

The good thing about all this: once you identify those factors and take action on them, you'll be in a great position to actually benefit from the next update.

How to Know You've Been Hit by an Algo Update?

In some cases, a sudden drop in traffic will make things obvious, such as with this particular site that I would like to look at more specifically. But we'll get to that in a second.
Generally speaking, if your traffic plummets from one day to the next, you should look at the algorithm monitoring tools (like the ones below), and check Facebook groups and Twitter.

Google Algorithm Change Monitors:
https://www.semrush.com/sensor/
https://moz.com/mozcast/
https://algoroo.com/
https://www.rankranger.com/google-algorithm-updates
https://cognitiveseo.com/signals/

Useful Facebook Groups:
The Lab
Ahrefs Insider
Inside Search

Useful Twitter Accounts to Follow:
Cyrus Shepard
Glenn Gabe
Marie Haynes

The Patient: Our Client's Site

The client came on board as a reaction to how they were affected by the August update. They joined TSI towards the end of October. This was the 'August 2018 Update' we were talking about – and still no one is 100% certain of its specifics. However, we have some strong observations. 😉

Type of Site and Niche

Now, let's meet our patient. The website is an authority-sized affiliate site with around 700 pages indexed. Its niche is based around health, diet and weight loss supplements.

The Symptoms

As the industry was still bickering, there were no obvious 'quick fixes' to this problem. In truth, there will likely never again be any 'quick fixes' for broad algo updates. All we had to work with was this: you can see that in this particular case, the number of users visiting the site dropped by 45% in July-August. If we look at October, when we were running all our analyses and creating the action plan, the organic traffic looks even more pessimistic.

With the niche, site and timeline evidence, we could easily conclude what follows: a 100% match with the "Medic" update.

How We Recovered It – What Are the "Right Things"?

To contextualize our decision making on this project, this is a rundown of what we know now and what we knew then:

What we knew then
It seemed as if many of the affected sites were in the health and medical niches (hence, the "Medic" update).
Sites across the web had experienced a severe downturn in rankings.
Rankings were affected from page one down. (This was surprising – most of the previous updates had less of an impact on page 1.)
A lot of big sites with enormous authority and very high quality had also been devalued. We had speculated that this would suggest a mistake on Google's part…

What we know now
'The August Update' affected …
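To put a number on a suspected hit instead of eyeballing charts, compare average daily organic sessions on both sides of the update date. A minimal sketch (Python/pandas), assuming a daily sessions CSV exported from your analytics tool with 'date' and 'sessions' columns – the file and column names are hypothetical:

```python
import pandas as pd

UPDATE_DATE = "2018-08-01"  # the confirmed core update you suspect
WINDOW = 28                 # days compared on each side of the date

df = pd.read_csv("organic_sessions.csv", parse_dates=["date"])
df = df.sort_values("date").set_index("date")

# Mean daily sessions in the windows immediately before and after the update
before = df.loc[:UPDATE_DATE, "sessions"].tail(WINDOW).mean()
after = df.loc[UPDATE_DATE:, "sessions"].head(WINDOW).mean()

change = (after - before) / before * 100
print(f"Avg daily sessions {WINDOW}d before: {before:,.0f}")
print(f"Avg daily sessions {WINDOW}d after:  {after:,.0f}")
print(f"Change: {change:+.1f}%")
```

A double-digit drop that lines up with a confirmed update date, corroborated by the sensors above, is a far stronger signal than either piece of evidence alone.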

Case Study: A 4.5x Organic Traffic Increase Using (What?) Page Rank
https://diggitymarketing.com/page-rank-case-study/
Mon, 04 Mar 2019

Introduction

I've been a director at The Search Initiative for a while now. We've had some crazy results for a whole bunch of clients – all in very different niches. I'm going to share with you how my team and I took a low-authority website and put our foot on the gas to get it moving – fast!

I truly believe there was a key ingredient that accelerated this website, but I'm going to share the whole process from the start. Why? Each website is going to be different, so you need to figure out what your site needs. You need to go through the process.

Here's a sneak peek of the growth. Now learn how we did it…

Initial Analysis

Since starting TSI's organic SEO services, I've realized that working with my own sites is hugely different from working with clients, especially if the website has weak foundations. I know how I want my money sites to look, so I build them with rigorous attention to detail. But if you take a website that's been developed without a certain level of SEO knowledge, there's normally quite a lot of on-site and off-site work to fix.

Here's how my team broke down the initial analysis:

Keyword Research
On-Site Audit
Backlink Audit
Competitor Analysis

Keyword Research

My team tackled keyword research with two main workflows: one is used to monitor the health of a website, and the other is for content gap analysis.

When we're looking to track keywords for a website, we want to track some of the core terms, but also terms that are having problems. If a term is suffering from keyword cannibalization that we're trying to fix, it's worth tracking it daily until it's resolved.

Since this client needed a huge content strategy, we did both a health check and an initial content gap analysis. This approach included breaking down all keywords for that industry into topics of relevant terms. In total, this process took over 20 hours and included thousands of keywords chunked into neat topics. This work later helped with choosing page titles, headings and content.

Here's an example of how we did it:

Step 1. Search broad keywords
Step 2. Review parent topics
Step 3. Find competitors for parent topics
Step 4. Reverse engineer competitors' keywords
Step 5. Exclude outdated keywords

There is also the option to export all of these keywords into Excel documents and then filter them that way. But most of the time, a lot of the top keywords are fairly similar. Here's an example for the best dog food term:

best dog food
best dog foods
healthiest dog food
what is the best dog food
top rated dog food
best food for dogs

While each keyword is unique, they all follow a single intent: the users are interested in finding out what the best dog foods on the market are.

On-Site Audit

Finding all the technical and content issues with a website requires a full on-site audit. However, while big reports are easy on the eyes, it's small changes that make the difference. We audited the website and found a whole bunch of technical issues, from lack of breadcrumbs and poor internal link structures to bad-quality anchor text and unoptimized titles.

A full on-site audit tutorial is too big for this post (perhaps coming soon), but here are some quick tips:

Screaming Frog – A cheap way to regularly crawl your website. There are lots of ways to find errors, redirects, and missing metadata.
You can also use a custom search to find all references to your keywords.

Sitebulb – This tool is more expensive, with a monthly recurring fee. However, it gives you lots of extra data that would be impossible to spot manually and hard to get with Screaming Frog. An example would be empty hyperlink references.

Site Search – By using Google's site search (site:domain.com) and operators, you can find hundreds of issues with index management, outdated page titles, and multiple pages targeting the same keyword. There are a lot of quick wins here.

Page Titles – If you wrote your page titles 1-2 years ago, you may find that they're outdated now. A quick site search with "intitle:2018" will find all your content that is either not updated or not yet crawled by Google.

Internal Links – A major way to pass relevance signals and authority to your core pages is through internal links. Make sure that your pages are well interlinked and you're not using low-quality anchors from your power pages, such as "click here" or "more information".

We focused on fixing around 5 issues at a time, varying from small changes like improving accessibility to bigger changes like introducing breadcrumbs for a custom-built website.

Backlink Audit

The website had a relatively small backlink profile, which meant it lacked authority, relevance signals and entry points for crawling. It also meant that a full in-depth link analysis was unnecessary for this campaign. In this instance, the initial check revealed there was nothing to be concerned about, so we moved on to technical implementation as soon as possible.

Had the website experienced problems with its link profile, we would have done a full backlink audit to try and recover it. Here's what to look out for:

Link Distribution – Pointing too many links toward internal pages instead of your homepage can cause lots of issues, so make sure that you're not overdoing it.

Anchor Text Analysis – Using exact match, partial match and topical anchors is a great way to pass third-party relevance signals. Too many and you'll be caught out over-optimizing, but too few and you won't be competitive. Read more about anchor optimization – and see the scripted example after this section.

Referring IP Analysis – There is a finite number of IPv4 addresses, so this isn't often a big cause for concern. However, it's worth making sure that you haven't got too many links from the same IP address.

Autonomous System Numbers – Since a server can be assigned any number of IP addresses, these systems often include an ASN. This is another way that Google could flag large numbers of websites from the same origin.

My team did a case study on how to remove an algorithmic penalty; a lot of these audits come included in any penalty removal campaign.

Competitor Analysis

The difference between a search analyst and a data scientist is how they approach the search engines. An analyst is focused on reviewing the SERPs and finding what is working best today, while a data scientist wants to understand how things work. We built our team to include both, since competitor analysis requires a keen eye for reviewing the SERPs and algorithm analysis requires solid data scientists.

If you want to do SEO at a high level, you've got to constantly be reviewing competitors using various analysis tools. You will notice that tons of best practices get ignored in the top positions, and the devil is in the details. In this instance, we found that both more content and more links would be required for long-term success.
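As referenced in the anchor text point above, a first-pass anchor review is easy to script. A minimal sketch (Python, standard library only), assuming a backlink export CSV with an 'Anchor' column – the file name, column name, brand terms, and money terms are all hypothetical placeholders you'd swap for your own:

```python
import csv
from collections import Counter

BRAND_TERMS = ["thesearchinitiative", "tsi"]   # hypothetical brand variations
MONEY_TERMS = ["best dog food"]                # hypothetical target keywords
GENERIC = {"here", "click here", "read more", "website", "more information"}

def classify(anchor: str) -> str:
    """Bucket an anchor into the rough categories used in a link audit."""
    a = anchor.lower().strip()
    if any(b in a for b in BRAND_TERMS):
        return "branded"
    if any(m in a for m in MONEY_TERMS):
        return "exact/partial match"
    if a in GENERIC or a.startswith("http"):
        return "generic/URL"
    return "topical/other"

counts = Counter()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[classify(row["Anchor"])] += 1

total = sum(counts.values())
for label, n in counts.most_common():
    print(f"{label:20s} {n:5d}  ({n / total:.1%})")
```

If the exact/partial-match share dwarfs the branded share, that is the over-optimization signal the audit is looking for.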
Content Strategy

Building any long-term authority website in competitive industries requires both an authoritative link profile and a content plan. My team reviewed their existing content, looked at how other websites in their industry wanted to help users, and then addressed these four cornerstones:

User Intent – before we did anything, we wanted to nail the user intent on every page. This research meant that we identified three pillars of content for their site. We'll get into this in further detail below.

Service Pages – these pages were dedicated to explaining what service was offered, how to get in contact, and what was included with that offering.

Blog Content – these posts were dedicated to providing non-commercial, informative content that was interesting to the reader.

Resource Center – this section was dedicated to giving basic information about topics in their industry. Instead of pointing all our authority-content links at Wikipedia, we wanted to use internal links.

Here's a little bit about each section and our strategy for them:

User Intent

The biggest mistake I see all the time is the simplest thing to check: what types of content is Google ranking in the top 10 positions? If you're serving 10,000 words of content in a huge blog post, but Google is only interested in serving service pages with 50 words of content – you've missed the point. Another common problem we find at The Search Initiative is including too much content in a single post when your competitors have several shorter posts.

One of the main attractions in Thailand is its yoga retreats. If you're searching for this (yoga retreats) in America, you're expecting to find destinations. Let's take a look: the first position is called Yoga Journal and includes almost no content aside from images and headings. That's exactly what the users were looking for. There are other websites offering a similar service that can help you make bookings. While others …

How to Build a Scalable White Hat Agency [Case Study from TSI]
https://diggitymarketing.com/scalable-white-hat/
Wed, 14 Nov 2018

Introduction

A few years ago, my agency The Search Initiative (TSI) decided to remove PBNs as a part of our link building strategy and move over completely to a white hat link building model. While PBNs only made up a small portion of our strategy, we knew this would be a fairly big undertaking. It would involve redesigning the overall process and our approach to campaigns.

Every decision that we make at the agency is data-driven. We constantly test, experiment and analyze. Our aim is to always put our clients first and to give them the best ROI for their campaign. We needed a sustainable and powerful link building strategy that is not only extremely safe but also moves the needle.

In this article, you'll learn our entire outreach strategy. We'll discuss the role content quality plays in an effective outreach campaign, how to research and create quality pieces, how to prospect link opportunities, and how to pitch with a high success rate.

Bonus: I've brought in Rad Paluszak (TSI Director and Chiang Mai SEO Conference 2018 speaker) to chime in with some knowledge bombs.

Overall Goals and Strategy

During our early brainstorming sessions, we knew that to fully transition from grey to white we needed to base the entire system on high-quality content.

Why is high-quality content so important for outreach? Bloggers, website owners and journalists wised up a long time ago to the importance of links in SEO. They fully understand how important links are to us and the value that they hold. Many prospects are extremely proud of their websites and extremely picky about what content is placed on their own personal soapbox. Just look at how few guest posts I've allowed on my site.

Outreach link building is an exchange. You provide something that the website's readers will love and that adds true value to the website, and in exchange, you attain that all-important link.

My advice: create a systemized approach that focuses on quality. Focus on:

Content that is easy to understand
Content that teaches
Content that is easy to share
Content that impresses

But how do you create a content marketing machine that can produce this kind of content? We assembled a team who had zero SEO knowledge but were bright and creative – the main skills we were looking for. You want your content and outreach team members to be right-brained, creative types. I'm a left-brained robot, so I'm much better at stuff like link prospecting, which you'll learn more about later.

If you choose to train a new outreach team from scratch, you avoid the shortcut mentality that many trained SEOs have. Think about it: you have your biases about how things work, and you're likely going to carry them around with you from project to project and job to job. Essentially, you want a team of content marketers, not SEOs. This is easier to do when working with a blank slate.

One of the key skills we were looking for was an eye for copy. Thus, we put out job adverts in copywriting groups, looking for people who had a background in copywriting but were interested in being trained up in online marketing. Cult of Copy is a great Facebook group for finding niche-specific copywriters.

Tip for you: The group has very strict rules from the very beginning.
To join the group you have to confirm that you understand the following:

Only posts requesting or offering a copywriting job are allowed.
Only comments accepting a copy job or giving a testimonial are allowed.
No public complaints or moaning – these kinds of things should go through the admins.

Don't adhere to the above and you're banned! It's good to keep everything in order. Do you think it impacts the group's activity? Not at all! Have a look at the stats.

– Rad Paluszak, Director of SEO at The Search Initiative

Content Research

There are endless tools and processes for content research available. To cut through the bullshit, keep things as simple as possible, stick to the 80/20, and focus your efforts on a few core resources:

Reddit
Quora
Meta Filter
Google Search

The reason this small set of tools is recommended is that communities like Reddit, Quora and Meta Filter have done a great job of keeping spam to a minimum while encouraging active participation. You can, without too much effort, find the industry pain points, the subjects people care about, and the themes that recur time and time again. For a content marketer, this is phenomenal.

Reddit

If you are not using Reddit as part of your content research, then you are seriously missing out on an awesome trick. The really great thing about Reddit is that spam and manipulation are very rarely tolerated. And the users are usually painfully honest, to say the least.

Search for variations of topics to find out what questions people are asking and which specific questions receive lots of upvotes and replies. The key here is to create a documented record of the topics that had the most engagement, as this is a great indicator that the content is popular. At this early stage, you should aim to collect as much information as possible on anything and everything in your target niche that has a high level of upvotes and comments.

To do this, simply head to Reddit and search for your market-defining keywords within the search bar. Next, select 'Communities and users' to see which subreddits are in your niche. Look for the subreddits with the highest number of subscribers and ensure that they are active communities. Sort through the subreddits one by one by setting the results to Sort > Top > Of All Time, to get the pieces of content and content themes that have had the biggest impact. Record all the data in a spreadsheet – you can create a copy of our shared spreadsheet or have one already integrated.

Tip for you: If you're not familiar with Google Docs and would prefer to use Microsoft Excel or LibreOffice Calc, you can download a copy of the shared spreadsheet. It's also very easy to create your own copy of the spreadsheet on your Google Drive – as mentioned above, please use this method instead of requesting access to our file.

– Rad Paluszak, Director of SEO at The Search Initiative

Quora and Meta Filter

Quora and Meta Filter are both heavily used question-and-answer websites. They allow you to search for the themes and mini-topics within your niche and look for the specific questions that people are struggling with. This is very important, as you'll see later. It also allows you to look for further content ideas within the answers that are provided. Again, we are looking for questions that have received a large amount of engagement. In the example from Quora above, we can see that both of these questions have a high answer rate (133 and 99 answers).
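Part of this collection grind can be scripted. A minimal sketch (Python) using Reddit's public JSON endpoints – the subreddit, output file, and User-Agent string are hypothetical, and heavy use may be rate-limited, so treat this as a research aid rather than a robust client:

```python
import csv
import requests

HEADERS = {"User-Agent": "content-research-sketch/0.1"}  # set your own descriptive UA
SUBREDDIT = "dogs"  # hypothetical niche subreddit found via the search step above
URL = f"https://www.reddit.com/r/{SUBREDDIT}/top.json"

# Pull the all-time top posts: high upvotes and comments = proven topic demand
resp = requests.get(URL, params={"t": "all", "limit": 50},
                    headers=HEADERS, timeout=10)
resp.raise_for_status()
posts = [child["data"] for child in resp.json()["data"]["children"]]

# Write the engagement data straight into the research spreadsheet format
with open("reddit_research.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["title", "score", "num_comments", "permalink"])
    for p in sorted(posts, key=lambda x: x["num_comments"], reverse=True):
        writer.writerow([p["title"], p["score"], p["num_comments"],
                         "https://www.reddit.com" + p["permalink"]])
```

The output drops straight into the same spreadsheet workflow described above, one row per proven topic.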
We collect all the relevant questions in a spreadsheet which includes:

The question
The search used
How many answers the question has
Any relevant comments

Tip for you: If you're only looking for a bunch of questions that people ask around your niche, my personal favorite is AnswerThePublic.com. Let's type 'link building' and allow it to do its magic: you can easily find a lot of common questions users are searching for. It's great inspiration for writing useful answers and a way to potentially attack these sweet "Position 0" answer boxes.

– Rad Paluszak, Director of SEO at The Search Initiative

Google Search

Look at high-authority sites to reverse engineer successful campaigns. This allows you to hedge your bets with outreach because you're following in the footsteps of someone else. Let's see what kind of SEO-related infographics the Huffington Post is sharing. Using a simple search string returns 88 results. If we take one of those results, we can see that this infographic campaign has earned links from 24 referring domains. These can simply be exported and saved, ready for quality control checks and outreach.

As these infographics have been featured on a site such as The Huffington Post, it is likely that they have also been featured on other authority sites. From this research, we can see what kind of themes and topics get featured on high-authority sites. Record everything and add it to the research spreadsheet.

Content Strategy

Our content strategy was based on a few specific content types:

High-quality guest posts
Infographics
Embeddable interactive content
Creative content pieces

Guest Posts

Guest posts are easy to scale and are a great way to ensure that you have a baseline of high-quality links. There are ways to do them right and ways to do them wrong, so make sure you know the difference or get them from a place you can trust.

Infographics

Infographics have been done to …

Infographic: Affiliate SEO vs Client SEO
https://diggitymarketing.com/infographic-affiliate-vs-client-seo/
Wed, 03 Oct 2018

Looks like we're starting a tradition…

My link building service, Authority Builders, ran another contest in The Lab Facebook group. Over 300 people responded to the age-old question:

Affiliate SEO vs Client SEO: Which One is Better?

Since the cool thing to do these days is put data in infographic format, here are their responses, all wrapped up for you.

Want to add your own feedback to this battle? See you in the comments section.

Share this Image On Your Site:

<p><strong>Please include attribution to DiggityMarketing.com with this graphic.</strong></p>
<p><a href="https://diggitymarketing.com/infographic-affiliate-vs-client-seo"><img src="https://diggitymarketing.com/wp-content/uploads/2018/10/Infographic-Affiliate-vs-Client-SEO.jpg" alt="Affiliate vs Client SEO Infographic" width="700" border="0" /></a></p>

Shout out goes out to Win Bound Digital for creating this sexy infographic.

How to 91x Website Traffic – A Case Study Blueprint for 2024
https://diggitymarketing.com/algorithmic-penalty-recovery-case-study/
Mon, 06 Aug 2018

Every once in a while you run an SEO campaign that changes the way you do everything. The lessons you learn, the challenges you face, and the results you achieve inspire you to rewrite your whole SEO gameplan. This is the story of one of those SEO campaigns.

As you might already know, I'm a director of a very talented SEO agency called The Search Initiative (TSI). Since coming on, we've encountered many wins, and this case study is one of them. In a few months, we lifted the client's algorithmic penalty and increased traffic by 9,109%. You're about to learn the exact steps we took to achieve this.

You'll learn:

A detailed onsite, offsite, and technical SEO audit process
How to repair algorithmic penalty problems
A safe link building strategy
Conversion rate optimization strategies for fast growth

Fair warning: the strategies detailed ahead are intense, but worth it. Here's the success one reader found after following this case study:

Case Study: From 1,036 to 95,411 Organic Visitors Per Month

This is the story of a campaign for a social media marketing website. Our client monetizes their website by selling monthly subscriptions for achieving better social proof on Facebook, Instagram, and other social networks. If you've ever been in this niche before, you'd know it's not an easy one. It's one of the hardest niches there is.

The Challenge

The client joined The Search Initiative with a heavy algorithmic penalty. Traffic at the time had decreased significantly, to almost 1/10th of the previous volume. If you've ever had an algorithmic penalty before, you can directly connect with the frustration and annoyance of such a disaster. The main challenge was to determine what type of penalty hit the site and to take action on getting it lifted.

General Approach

We started by thoroughly analyzing the data, based on the tools available to us and the details provided by the client. The initial analysis included looking into:

Google Analytics
Google Search Console
Keyword tracker (Agency Analytics)
SEMrush
Ahrefs
Cloudflare
Server settings
Previous link building reports and audits

Once we determined the most probable cause of the penalty, we put together a plan of action. We created a comprehensive onsite, offsite and technical audit before building up the overall domain authority through our own link building strategies and traditional outreach to relevant blogs and sites.

How We Did It

The Dynamic Start: Backlink Review

The link profile of the domain included a lot of spammy, low-value domains. Since a previous automated backlink audit (most probably done using Link Research Tools) had been performed before the client joined our agency, we started by reviewing its results. At TSI we know that when it comes to potential link penalties, especially algorithmic ones, we have to be very thorough with the link reviews.

To start the analysis, we downloaded all the link data from the following sources:

Google Search Console – it's a real no-brainer to include all the links that Google definitely has in their database. However, according to Google's Webmaster Help page, you have to remember that GSC presents only a sample of links, not all of them.

Ahrefs – it is our go-to and best 3rd-party tool when it comes to links.
Their database is an absolute beast, and the freshness of the data is also outstanding. To gather all link data, go to Ahrefs, type in your domain and select Backlinks. Now you're good to Export it to an Excel file. By the way, make sure you select the Full Export option; otherwise, you'll be exporting only the first 1,000 rows with the Quick Export.

Majestic – even though their crawler might not be as complete as Ahrefs', you still want to have as many link sources as possible for your audit. With Majestic, you'll have to type in your domain → select "Root Domain" → Export Data. Because of link memory (AKA ghost links – links that are deleted, but that Google still "remembers"), we export the data from both the Fresh and Historic indexes. Also, make sure to set the tool to "Show deleted backlinks".

Moz and SEMrush – similarly to Majestic, with these two we just want to have as many links as possible to complement the database, in case Ahrefs missed some.

How to get link data in Moz Open Site Explorer: Your site → Inbound Links → Link State: All links → Export CSV.

How to get link data in SEMrush: Your Site → Backlink Analytics → Backlinks → Export. Make sure to select the "All links" option.

We had all the data now, so it was time to clean it up a bit. There's no real secret to using Excel or Google Sheets, so I'll just list what you'll have to do with all the link data prior to analyzing it:

1. Dump all Ahrefs data into a spreadsheet. (If you're wondering why we start with Ahrefs, it's explained in step 4.)
2. Add unique links from GSC to the same spreadsheet.
3. Add unique links from all other sources to the same spreadsheet.
4. Get Ahrefs UR/DR and Traffic metrics for all the links (the Ahrefs data will already have these metrics, so you're saving time and Ahrefs credits).
5. Spreadsheet ready!

With the spreadsheet, we started the very laborious process of reviewing all the links. We classify them into 3 categories:

Safe – these are good quality links.

Neutral – these are links that are somehow suspicious, and Google might not like them that much – although they're quite unlikely to be flagged as harmful. We always highlight these in case we need to re-run the link audit (for example, if the penalty did not get lifted).

Toxic – all the spammy and harmful stuff you'd rather stay away from.

Some of the main criteria we're always checking:

Does it look spammy/dodgy AF?
Does it link out to many sites?
Does the content make sense?
What is the link type (e.g. comment spam or sitewide sidebar links would be marked as toxic)?
Is the link relevant to your site?
Is the link visible?
Does it have any traffic or rank for any keywords? Ahrefs' data helps here.
Is the page/site authoritative? Ahrefs' DR helps here.
What's the anchor text? If you have an unnatural ratio, it might be necessary to disavow some links with targeted anchor texts.
Is the link follow or nofollow? No point disavowing nofollow links, right?
Is it a legit link or one of these scraping/statistical tools?
Is it a link from a porn site? These are only desirable in specific cases – for example, if you're a porn site. Otherwise, it's disavow time.

If it is likely that the whole domain is spammy, we disavow the entire domain using the "domain:" directive instead of just a single URL.

Here's a sneak peek of how the audit document looked once we finished reviewing all the links. Then we compared the results of our audit against the current disavow file and uploaded a shiny new one to Google Search Console. We disavowed 123 domains and 69 URLs.
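Once the audit sheet is finished, turning it into a disavow file is easily scripted. A minimal sketch (Python, standard library only), assuming the audit was exported as a CSV with 'url', 'verdict', and 'disavow_scope' columns – all file and column names are hypothetical:

```python
import csv
from urllib.parse import urlparse

domains, urls = set(), set()

# Audit export: one row per reviewed link, verdict in {safe, neutral, toxic}
with open("link_audit.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["verdict"].strip().lower() != "toxic":
            continue
        if (row.get("disavow_scope") or "").strip().lower() == "domain":
            domains.add(urlparse(row["url"]).hostname)
        else:
            urls.add(row["url"].strip())

# Google's disavow format: one entry per line, whole domains via "domain:",
# comment lines starting with "#"
with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Disavow file generated from link audit\n")
    for d in sorted(domains):
        out.write(f"domain:{d}\n")
    for u in sorted(urls):
        if urlparse(u).hostname not in domains:  # skip URLs a domain entry covers
            out.write(u + "\n")
```

Generating the file from the sheet keeps the audit as the single source of truth, so a re-run of the review produces a fresh disavow file with no copy-paste errors.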
Additionally, we used our in-house, proprietary tool to speed up the indexing of all the disavowed links – something quite similar to Link Detox Boost, but done through our own tool. Here's a little screenshot from our tool:

Crucial Stage 2: The Onsite Audit

The next step was a full, comprehensive onsite audit. We reviewed the site and created an in-depth 30-page document addressing many onsite issues. Below is a list of the elements covered in the audit.

Technical SEO

Website Penalties

First, we confirmed what the client had told us and established what kind of penalty we were dealing with. It has to be emphasized that there were no manual actions reported in GSC, so we were dealing with a potential algorithmic penalty. We searched Google for the brand name and did a "site:" operator search.

If you were once able to find your brand name ranking number 1 in Google (or at least among your other profiles, e.g. social media accounts, on the first page) and it's no longer there, you know you're in trouble. Basically, if Google devalues or de-ranks you for your own brand, this is a very strong indicator that you've been hit with a penalty.

With the site: operator search it's a bit more tricky. However, as a rule of thumb, you could expect your homepage to show as the first result returned for a simple query: "site:domain.com".

Another way of confirming content devaluation is to copy and search for a fragment of the text on your core pages. In the example below, I do a Google search for 2 sentences from one of my articles (right-click to bring up a search of the text you highlight). As you can see below, Google finds it on my page and shows it as the first result.

If that were not the case and Google did not show me first, or at all, then it would be a very strong indication that the article page or …

Top 10 Most Common SEO Problems
https://diggitymarketing.com/top-10-most-common-seo-problems/
Mon, 02 Oct 2017

Since 2017, I've been involved as a director for a very talented agency called The Search Initiative (TSI). The reason I teamed up with these guys was very simple: they were testers. Like me, they only rely on experience, data, and test results for their ranking strategies. It was a match made in heaven.

Since then, we've had the pleasure of onboarding hundreds of new clients and offering them a wide range of organic SEO consulting services. Many of these new partners have seen huge growth, others have had penalties removed, and all have clear roadmaps on how to grow in the future.

I wanted to share with you the top 10 SEO problems we typically encounter when onboarding new clients. I hope that by sharing some of these common mistakes, you can use this knowledge to your advantage and make some serious improvements to your rankings.

The 10 Most Common SEO Issues

Many of you will be familiar with the inverted pyramid writing style, where the most newsworthy content is at the top and the least is at the bottom. I've tried to follow this structure; however, the points below are not to be slept on. They're all major issues that commonly appear on even the best sites. If you want to get the most out of this article, check out every point here. As you know, there are no shortcuts when it comes to SEO.

1. Index Management
2. Localization
3. Keyword Cannibalization
4. Over-Optimized Anchor Text
5. Poor Linking Strategy
6. Low-Quality Affiliate Content
7. User Performance Metrics
8. Titles & Meta Descriptions
9. Internal Redirects
10. Low-Quality Pillow Links

And if you're wondering which SEO mistake is ruining you on Google, watch this video.

1. Index Management Problems

The first and most common issue that we see is accidental devaluation of the website because of indexing issues. It stems from a common misunderstanding about how Google actually works. (More on this in a bit…)

Most people think that if they build links and noindex junk pages, they're fine. However, it's not that simple – and I'm about to show you a real example. Below you will find a screengrab from Screaming Frog. This is from a crawl of an eCommerce website that had a lot of onsite issues that needed to be fixed.

It's quite hard to see, but you may notice I have highlighted the number of HTML pages that are filtered. It's a whopping 32,064 pages and, yes, it took us a long time to crawl. None of the 32,064 pages found in this crawl included a noindex tag, which means (in theory) Google should be able to crawl and index these pages.

So, let's check this against our numbers in Google Search Console: when we check in Webmaster Tools, we see 14,823 pages indexed. While this is a large volume, it's still less than 50% of the pages that were found with Screaming Frog. This is the first sign that something is seriously wrong, but the next screenshot shows the extent of how badly our client had been stung by Panda's low-quality algorithm. We use the "site:domain.com" operator to pull up the number of indexed pages:

Despite the website having 32,064 crawlable, indexable pages, and despite Google having indexed 14,823 according to Search Console, only 664 have made it into the actual index. This site search shows us that Google has highly devalued most of the website. It is a crawling nightmare.
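That three-way gap (crawlable vs. reported indexed vs. actually returned) is worth checking on every site. A minimal sketch of the first two comparisons (Python/pandas), assuming a Screaming Frog internal-HTML export and a Search Console coverage export – the file and column names are hypothetical:

```python
import pandas as pd

# Screaming Frog export: all crawlable HTML pages (hypothetical file/columns)
crawl = pd.read_csv("internal_html.csv")
crawlable = set(crawl.loc[crawl["Indexability"] == "Indexable", "Address"])

# GSC coverage export: what Google reports as indexed (hypothetical file/columns)
gsc = pd.read_csv("gsc_coverage.csv")
indexed = set(gsc["URL"])

print(f"Crawlable, indexable pages:     {len(crawlable):,}")
print(f"Pages Google reports indexed:   {len(indexed):,}")
print(f"Crawlable but not indexed:      {len(crawlable - indexed):,}")

# The site: operator count still has to be eyeballed in Google itself; if it
# sits far below both numbers above, the site is likely being devalued.
```

Big gaps between these numbers are the symptom; the fix described next is what closes them.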
So, the question is, how can you fix this? Thankfully, the answer for most people is quite simple. Start by performing a site:domain.com search and auditing Google's index of your site. If you go to the final page and you're greeted with the below message, you have work to do.

Take a hard look at which pages shouldn't be indexed and start proactively removing them from your crawl budget. The problem with Google is that even after you add a noindex tag to your pages, they remain indexed until Google recrawls them. Some people add robots.txt rules to block these pages and save crawl budget – which is a good idea, but only after the pages are removed. For the rest of us, we're going to need to use the URL Removal Tool. To learn more about how to deal with crawl and indexing issues, check out this guide.

2. Localization Issues

The second most common issue we see is when clients have multiple languages. While it's great to have international coverage and provide foreign users with localized text, it's a nightmare for Panda penalties if not set up correctly. Many people are familiar with the URL structure that you should use for localized text, but many forget to set up hreflang on their website. My buddy Tom talked about it in this interview I did with him. If you are looking to set up hreflang codes, I suggest you use this website to get the right country and language code every time.

Below is an example from an eCommerce client. Whereas the previous client had issues with index management, this time it's caused by hreflang – and one more thing that goes unnoticed… While the client has successfully included hreflang in their source code, they had not included both the location and language code. The one time they try to do this, with en-GB, the page no longer exists and redirects to their sitemap. What's more, this covers just 50% of the languages their website operates in. This has created an enormous amount of duplication to be indexed.

However, there's still one more thing that was missed. Each page has the Open Graph locale set to en_US – including the pages that aren't in English. While this setting isn't as clear-cut as hreflang, it does provide Google with information on locale, and therefore creates confusion. If your website has a similar issue, we advise you to make the locale dynamic to match the current language. For more help with locality and its importance, check out this page on local SEO solutions.

Client Testimonial

"The guys at TSI make SEO look easy! We were a completely new website planning to operate in arguably the most competitive online marketplace, which made the task ahead extremely difficult as we were going up against many well-known global businesses – which also resulted in many different SEO agencies being reluctant to work with us. TSI wasted no time in implementing their campaign and within 4 months our website was ranking on page 1 for some of our most profitable keywords. The guys at TSI are constantly keeping me updated and send me monthly reports on my campaign performance and keyword tracking. I would recommend their services in a heartbeat!" – Jon H

3. Keyword Cannibalization

This is a surprisingly common issue for most websites that we encounter. Despite the large amount of resources available online to help with cannibalization, you would be surprised how many people still suffer from it. Never heard of it? Quite simply, it's when you have multiple pages on your site competing for the same keywords. And guess what? Google doesn't like it.
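Before the diagnosis walkthrough below, here's what the telltale symptom – a query whose ranking URL keeps flipping between two pages – looks like as a scripted check. A minimal sketch (Python/pandas), assuming a rank-tracker export with 'date', 'keyword', and 'ranking_url' columns – the file and column names are hypothetical:

```python
import pandas as pd

# Rank tracker export: one row per keyword per day (hypothetical layout)
df = pd.read_csv("rank_tracking.csv", parse_dates=["date"])

flips = (
    df.sort_values("date")
      .groupby("keyword")["ranking_url"]
      .agg(urls="nunique",
           changes=lambda s: (s != s.shift()).sum() - 1)  # count URL swaps
)

# Multiple ranking URLs plus frequent swaps = likely cannibalization
suspects = flips[(flips["urls"] > 1) & (flips["changes"] >= 3)]
print(suspects.sort_values("changes", ascending=False))
```

Anything this flags is worth reviewing by hand in your tracker, which is exactly the workflow described next.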
Client Testimonial

"The guys at TSI make SEO look easy! We were a completely new website planning to operate in arguably the most competitive online marketplace, which made the task ahead extremely difficult, as we were going up against many well-known global businesses – something that also made many SEO agencies reluctant to work with us. TSI wasted no time in implementing their campaign, and within 4 months our website was ranking on page 1 for some of our most profitable keywords. The guys at TSI are constantly keeping me updated and send me monthly reports on my campaign performance and keyword tracking. I would recommend their services in a heartbeat!" – Jon H

3. Keyword Cannibalization

This is a surprisingly common issue among the websites we encounter. Despite the large amount of resources available online to help with cannibalization, you'd be surprised how many sites still suffer from it. Never heard of it? Quite simply, it's when you have multiple pages on your site competing for the same keywords. And guess what? Google doesn't like it.

The first step is learning to diagnose the culprit pages – because if you can't find the cannibalization, how can you fix what you can't see? At The Search Initiative we have a few ways to find cannibalization, but here's the easiest and most effective.

Use Keyword Tracking Tools

One of the benefits a client gets from working with TSI is that we track keywords up to twice daily with Agency Analytics, one of our partners. The tool includes an overview of the site's overall keyword performance, such as below:

Aside from showing us how this client has performed over the past 7 days, we can use it to track each keyword's performance independently too:

In this photo, you might notice there was a jump from the 3rd page to the 1st page for their target term after implementing some of our onsite advice. More importantly, though, you can also see that the ranking URL had started flipping between their category page and their homepage. This is an obvious sign of cannibalization, and once we noticed it, we jumped into action to fix the problem. To learn more about keyword cannibalization, I have a master guide here.

4. Over-Optimized Anchor Text

There was a significant update in October 2016 when Penguin 4.0 rolled out. Penguin 4.0 changed how Google perceives and interacts with links. I even wrote an article for Ahrefs covering how it affected anchor text optimization. As part of our auditing process for each new client, we analyze your existing anchor text and break the types down into the values below:

Branded – an anchor text that includes your brand name or a slight variation, for example: 'thesearchinitiative', 'visit thesearchinitiative', or 'TSI'.

Generic – an anchor that uses a generic term but does not include branding, for example: 'here' or 'read more'.
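The feed excerpt cuts off before the rest of the taxonomy, but the first pass of this kind of breakdown is easy to script. Here's a minimal sketch covering just the two categories defined above – the brand and generic term lists are hypothetical placeholders you'd swap for your own:

```python
# Hypothetical term lists - replace with your own brand variations.
BRAND_TERMS = {"thesearchinitiative", "tsi"}
GENERIC_TERMS = {"here", "click here", "read more", "this site", "website"}

def classify_anchor(anchor: str) -> str:
    """First-pass bucketing: branded, generic, or flagged for manual review."""
    text = anchor.lower().strip()
    if any(term in text for term in BRAND_TERMS):
        return "branded"
    if text in GENERIC_TERMS:
        return "generic"
    return "review"  # keyword-rich, naked-URL, and other anchor types land here

for anchor in ["visit thesearchinitiative", "read more", "best seo agency"]:
    print(f"{anchor!r} -> {classify_anchor(anchor)}")
```

Run it over an Ahrefs or Majestic anchor export and the "review" bucket is where over-optimization hides.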

SEO Spotlight: Daryl Rosser, Lior Ohayon, Joseph Elshazly https://diggitymarketing.com/seo-spotlight-episode-2/ https://diggitymarketing.com/seo-spotlight-episode-2/#comments Tue, 09 Feb 2016 13:00:49 +0000 http://diggitymarketing.com/?p=1021

Welcome back to Episode 2 of the SEO Spotlight series. In the last episode, we visited three SEOs who think outside the box to tackle their problems. Being a person who appreciates creativity, I'll be keeping up the same theme here by interviewing three more SEOs who have made great progress in the areas of affiliate SEO, PBNs, and client scaling. You'll learn their thought processes and find out how you can apply the same ideas to your own SEO projects.

Daryl Rosser – Breaking Down Problems into Bite-Size Pieces

Daryl has got to be one of the most likeable guys in SEO. He's got an incredible head for business, especially for such a young fella, and a sound technical understanding as well. Plus, he's fun to party with. His technique for deconstructing high-competition affiliate SEO is bound to get some ideas popping off in your head.

Name: Daryl Rosser
Age: 22
Location: Northamptonshire, England

Matt: When did you get into SEO and how?

Daryl: 2.5 years ago, when I was seriously struggling to make money. A local business had contacted me; they said they wanted SEO and had heard I could help. I had no idea about SEO, but I needed money, so I read up on some SEO buzzwords before meeting them and later ended up closing them as a client. After their first payment came through, I set up some backlinks, got them ranked #1 in 3 weeks, and had an "oh shit" moment.

Matt: Let's talk more about how you make affiliate SEO ridiculously easy.

Daryl: Why is it that most people prefer local SEO to affiliate SEO? It's easier. When people think of affiliate SEO, they think of competing with all the other affiliates. That requires a higher budget, better knowledge of ranking, and most likely more time. This technique is how I approach affiliate SEO where there's barely any competition, you don't need a lot of PBN links, and the conversion rate is ridiculous.

Matt: How exactly do you do it? Spill the beans.

Daryl: I don't sell products as an affiliate. That's a great model, and it works, but it takes a lot more traffic. I specifically seek out national firms that have pay-per-lead and/or pay-per-call affiliate offers. Firstly, this makes it a lot easier to generate a commission, and some of these can pay upwards of $100, depending on the niche. (Lingo clarification: pay-per-lead means they pay for an enquiry; pay-per-call means they pay for a call.)

But the smart part of this strategy is that I treat them like local clients. Rather than compete on the national keywords, I pick out a few towns or cities, rank my site top in them, and reap the rewards. This leaves you competing with local businesses, not other affiliates. The search volume is low, but you can rank multiple sites more easily, and the search traffic is highly qualified. Say you work with a national plumbing company. If someone searches "plumbers in [your city]", how qualified do you think they are to want a plumbing company? Then all you need to do is drive an enquiry or call through their website, and you get paid. It's like local lead gen, but through affiliate networks, so there's no client management.
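Mechanically, the targeting step Daryl describes is a permutation job: cross the offer's service terms with the towns you want to own, then sanity-check the winners in a keyword tool. A minimal sketch with made-up inputs:

```python
# Hypothetical inputs: service terms from the offer, cities you could realistically win.
services = ["plumber", "emergency plumber", "boiler repair"]
cities = ["northampton", "milton keynes", "bedford"]
patterns = ["{service} in {city}", "{city} {service}", "{service} near {city}"]

# Generate every service/city/pattern combination as a candidate keyword.
keywords = sorted(
    p.format(service=s, city=c)
    for s in services
    for c in cities
    for p in patterns
)

# Feed these into your keyword tool to check volume before building anything.
for kw in keywords:
    print(kw)
```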
The next question people ask me when I share this is, "Where do I get the offers?" All I can say is that you should be on every reputable network, and you should be looking out for them. Here is a list of 20 reputable CPA networks: https://mthink.com/top-cpa-networks-2015/. You can also specifically ask your affiliate managers for these types of offers and they'll be able to help.

Matt: Very clever. What's your biggest win from using this technique?

Daryl: Last month one of my websites had 1,342 unique visitors. The revenue (entirely from SEO) was $2,600. That's around $1.94 per visitor. Compare that to some affiliates that get over 1,000 visitors daily with lower revenues. This is a solid model.

Matt: Any advice to others in this SEO game?

Daryl: My advice isn't related to SEO so much as it's related to business. One of the biggest mistakes I've made in the past, and that I see others make, is focusing on things in the wrong order. They do the right work, but in the wrong order. Take client SEO as an example. People invest thousands of dollars and hours into building up a PBN before they even have a client. How does that help you? It doesn't. Evaluate your goals and ask yourself what the one thing is that you should be focusing on right now to get closer to them. If you're just starting out with client SEO, it's probably prospecting or outreach. Once you've done that, then you can build a PBN or rent links.

As you expand your business you get better at this, but then the question becomes: are you spending all your time managing and maintaining what you have, or are you focusing on growth? It's easy to get complacent at a certain stage and just manage things. Don't – keep focusing on growth and working your ass off towards it.

Be sure to check out Daryl's Lionzeal.com, a blog and community (of over 2,000 members) committed to scaling SEO businesses.

Lior Ohayon – No Software Tool Exists? Create It Yourself.

We all have tasks that repeatedly consume a large amount of time – tasks that would be better solved by software, if it existed. Lior took matters into his own hands and developed a tool for speeding up the PBN auditing process.

Name: Lior Ohayon
Age: 23
Location: Toronto, Canada

Matt: When did you get into SEO and how?

Lior: I got into the SEO world by accident in 2013. I was a typical college dropout looking for ways to make money online. Everything I read about starting a business boiled down to getting traffic to websites, and SEO was one of the main ways to do it. I consumed a lot of content about affiliate websites and link building, mainly through Pat Flynn, and played around with my own sites a little. Then I realized that instead of starting a bunch of websites, I could offer these new skills to existing business owners and help them grow their traffic. I stumbled into client SEO by accident!

Matt: Let's talk more about the software you created. What problem does it solve?

Lior: When you're buying an expired domain for your PBN at auction or from a broker, you need to do a lot of research using third-party tools to see if it's a powerful, clean domain. There can be dozens of metrics to research, and it gets very time-consuming – a headache to manage all of those tools in your browser (multiple tabs, logins, etc.). I got frustrated with this whole process, so I created ScopeReveal.
It's a free tool that aims to eliminate that problem by streamlining all of those tools into one convenient dashboard, as well as providing our own metrics from an algorithm that combines many of the imported ones.

Matt: Many SEOs, including myself, have considered having software created. What was the development process like?

Lior: First I decided which metrics are the most important and figured out how to use the providers' APIs to capture that data in my own software. Next, I found my developer on Elance (now Upwork). He was one of the first to respond, and he even built a prototype of the software, which he included in his initial proposal. That blew me away, and his English was perfect, so it was the right fit all around.

Some tips I have for hiring on these sites: include a secret code they should put in their proposal so you know they actually read it, and require a small test to be done (it can be a paid test – well worth it). That way you'll know their communication skills, skill level, and ability to deliver on schedule. It also helps to promise future work on other projects if the first one goes well; this will lower rates if, like me, you always choose fixed-price.

On a side note, do your research before building any SEO software to make your life easier. It could already exist, or it could be cheaper to have a virtual assistant do the same tasks. You really have to calculate how much time or money it will save you in the long run, upkeep included.

Matt: How much time savings can one expect using ScopeReveal?

Lior: I typically buy PBN domains from brokers. Once I get the URL from them, I would have to open Moz, Majestic, Google, Ahrefs, the Wayback Machine, and WhoIs – that's maybe 5 minutes in total. Then adding the domain to each one and analyzing the data takes another 15 minutes or so.
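I won't reproduce the real Moz, Majestic, or Ahrefs API calls here – each has its own authentication and endpoints – but the shape of the time-saver is a simple fan-out-and-merge. In this minimal sketch, the fetch functions are hypothetical placeholders you'd wire up to each vendor's actual API:

```python
import concurrent.futures

# Hypothetical fetchers - connect each one to the vendor's real API and auth.
def fetch_majestic(domain: str) -> dict:
    return {"TF": 25, "CF": 30}           # placeholder response

def fetch_moz(domain: str) -> dict:
    return {"DA": 32}                     # placeholder response

def fetch_wayback(domain: str) -> dict:
    return {"first_snapshot": "2009-04"}  # placeholder response

PROVIDERS = (fetch_majestic, fetch_moz, fetch_wayback)

def audit(domain: str) -> dict:
    """Query every provider in parallel and merge the results into one row."""
    row = {"domain": domain}
    with concurrent.futures.ThreadPoolExecutor() as pool:
        for result in pool.map(lambda fn: fn(domain), PROVIDERS):
            row.update(result)
    return row

print(audit("example-expired-domain.com"))
```

The merge is the whole trick: one dashboard row per domain instead of six browser tabs per domain.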

SEO Spotlight: Jacob Kettner, Daniel Moscovitch, Michael Landau-Spiers https://diggitymarketing.com/seo-spotlight-episode-1/ https://diggitymarketing.com/seo-spotlight-episode-1/#comments Wed, 20 Jan 2016 15:00:47 +0000 http://diggitymarketing.com/?p=1007

I was a strange child. I always seemed to do things a bit differently than the other kids. The habit stuck. I use four fingers to hold a pen. I'm right-handed but left-legged. I never wear underwear. You should see the process I use to tie my shoes.

Needless to say, I've always been fascinated by the various ways we can solve the same problems. In my own SEO business, I spend the majority of my time testing. What I've found is that there is no cookie-cutter blueprint for success in SEO. It always depends on your particular situation, skill set, and resources. I appreciate SEOs who can think outside the box – clever folk who use their intuition and do things a bit differently, allowing them to get some amazing results. I'd like to introduce you to three of them.

Jacob Kettner – Using Subdomains for Risk-Free Authority Building

I met Jacob a few months ago. He's a really bright SEO and a perfectionist in his work. I helped him audit one of his sites and found nothing wrong with his SEO plan. That happens pretty much never. The subdomain technique he uses is a clever way to build up authority while at the same time reducing penalty risk.

Name: Jacob Kettner
Age: 26
Location: Winnipeg, Canada

Matt: When did you get into SEO and how?

Jacob: I got into internet marketing at around age 16. I tried a lot of "loophole" tactics off Digital Point and Warrior Forum and made enough cash to keep me in the game. I ran a business dropshipping customized products on eBay for years. Despite my many years of internet marketing experience, I had virtually no knowledge of SEO, and I hired 3 different SEO firms before deciding to learn how to do it myself. Eventually I sold off my site and used the capital to start a client SEO business in the fall of 2014. In the following year I learned a lot and scaled from nothing to a low six-figure-per-year business.

Matt: Tell us about the subdomain authority-boosting technique.

Jacob: Juicing up subdomains helps increase DA quickly without risking your money site. The theory comes from looking at Web 2.0 sites like WordPress, Tumblr, etc. It's a well-known SEO concept that subdomains on these strong properties will rank more quickly and have a higher tolerance for abuse than a fresh domain.

Hold on a second, though… how did these sites get so powerful? Was it from high-quality content and powerful links pointing at the main domain? Let's look at the Majestic stats. The URL metrics for https://www.tumblr.com/ are TF 88, CF 83, RD 27,112, and 4,337,075 backlinks. So yes, I'm not denying that this is a very strong site. But now look at the Majestic metrics for the root domain tumblr.com: TF 92, CF 93, RD 2,061,256, and 30,957,773,997 backlinks. To be clear, this site has over 30 BILLION backlinks going to it.

Although Tumblr's main URL holds a lot of power in and of itself, only about 1.3% of the referring domains pointing at the site go to the main page. The bulk of the referring links go to its subdomains. My conclusion is that this is a two-way street: Tumblr's power helps parasite subdomains rank, but the subdomains also give DA back to the main domain.

Matt: Practically, how do you juice up your subdomains?
Jacob: Here are a couple of models I've used:

Create a subdomain EMD, then build an auto video blog and leave it alone (maybe send some social signals).

Create a subdomain EMD and send high-authority links at it (this is a good place to test Fiverr gigs). Shitty PBN links will work fine here as well – no need to waste the good stuff on this.

I'm not advocating sending thousands of GSA links at a money-site subdomain, but you can definitely afford to be a little more aggressive than you would be with the main domain. Keep in mind that the purpose of the subdomain isn't necessarily to rank – it's just to add authority to the overall domain. I've also done this with auto video blogs and no links, just to increase the page count and fresh content on the domain. (Note: I don't link from these subdomains to my main site.)

Matt: What are the biggest accomplishments you've been able to attain using this technique?

Jacob: I've ranked for a competitive local term with this method and just three PBN links.

Matt: Nicely done. Got any advice for other SEOs, newcomers and experts alike?

Jacob: Take action and test stuff. Foundations and principles last forever; tactics and loopholes die as soon as they go public. If you like tactics and loopholes, test stuff yourself and use it before it gets ruined. I'm sharing this one because it's particularly hard for Google to address, given the parallel between this and major Web 2.0 sites.

To learn more about Jacob and his case studies, be sure to check out his agency at First Rank.

Daniel Moscovitch – Slick Client-Getting Tactics

If you're doing client SEO, you've likely run into the same problem everyone else does: most businesses don't trust SEOs. Daniel solves this problem by simply rewording his sales copy and avoiding the conversations that would come up in a typical SEO sales pitch.

Name: Daniel Moscovitch
Age: 30
Location: Tel Aviv, Israel

Matt: When did you get into SEO and how?

Daniel: 2.5 years ago (just after the first Penguin update). I started off as a whitehat link builder at a really awesome online marketing agency here in Tel Aviv and just exploded from there.

Matt: Tell us about your client-getting technique.

Daniel: Let's face it: SEO has a bad name. This is totally understandable given the sheer number of businesses out there who have been screwed by so-called SEO "experts". These "experts" often use outdated tactics, undercharge, and under-deliver on their promises, and businesses are fed up (and I DON'T BLAME THEM!).

That's why I've had success as of late by taking the focus away from SEO – a "dirty" word – and talking instead about online visibility, creating an online brand, and making websites more powerful and more Google-friendly. The word "SEO" has been somewhat tainted, and it's often really hard to understand what it actually is. That's why I always like to start describing my services with phrases like "improving your site's power and authority", "increasing website visibility", "connecting you with more customers online", and "increased rankings". Of course, there's no hiding that what we do is SEO, but this way I don't scare prospects off right away, and I can slowly develop a relationship that shows the value I provide, as opposed to selling them on SEO right off the bat. I also like to make sure the focus is NOT on what WE will do, but rather on what THEY will gain from working with us.
Most business owners want to stop worrying about having to actively search for clients, and prefer to stick to what they know and love. Every proposal we send out includes that sentiment. On top of that, we've recently experimented with not locking clients into any long-term obligation, instead starting off with a small proof of concept. So far, 100% of the clients who have started this way have gone on to continue with us long term and for more money.

SEO can be difficult enough as it is. When you add selling it to potential clients, you have to make it as easy and beneficial for them as possible. That's why it's important to focus on what THEY have to gain: start small, gain their trust, and go from there.

Matt: In terms of ROI, what has this change in marketing achieved for your agency?

Daniel: In the past 2 months alone, we've increased our monthly earnings by $5K just by switching the focus of our pitch and changing our proposals around.

Matt: Any newbie advice for upcoming SEOs?

Daniel: Never get discouraged, never stop learning, and embrace the confusion!

To learn more about Daniel and his agency, check them out at MoreHotLeads.com.

Michael Landau-Spiers – Smart Keyword Research to Break into a Huge Affiliate Niche

Michael is my favorite SEO success story of last year. With only a year's experience in SEO, he found a way to break into one of the most competitive affiliate niches (I'm in it myself). Instead of going for the super-high-volume keywords like most SEOs would (myself included), he broke down the niche and found keywords that were easily attainable, yet still very profitable. When the timing was right, he cashed out on a huge flip.

Name: Michael Landau-Spiers
Age: 21
Location: United Kingdom

Matt: When did you get into SEO and how?

Michael: …
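The feed excerpt ends before Michael's answer, but the selection logic described above – attainable yet still profitable – is straightforward to express as a filter over whatever keyword data you have. A minimal sketch with entirely made-up numbers:

```python
# Hypothetical keyword data: (keyword, monthly searches, difficulty 0-100, est. $ per visitor).
keywords = [
    ("best protein powder", 40000, 78, 1.10),
    ("casein vs whey for cutting", 900, 22, 1.40),
    ("protein powder for sensitive stomachs", 1300, 18, 1.60),
]

MAX_DIFFICULTY = 35  # attainable with a modest link budget

# Keep only the attainable terms, then rank them by rough revenue potential.
opportunities = sorted(
    (kw for kw in keywords if kw[2] <= MAX_DIFFICULTY),
    key=lambda kw: kw[1] * kw[3],  # volume x earnings-per-visitor ceiling
    reverse=True,
)

for kw, volume, difficulty, epv in opportunities:
    print(f"{kw}: ~${volume * epv:,.0f}/mo potential at difficulty {difficulty}")
```

The head terms get filtered out despite their volume, which is the whole point: the long tail ranks sooner and still converts.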
