Seven reasons why your rankings dropped and how to fix them (Search Engine Watch)

Do you know that feeling of triumph when your content finally hits the first page of Google and attracts significant traffic? Unfortunately, nobody is safe from a sudden drop in rankings. The trouble is that the reasons for it can vary and are often not obvious at all.

In this post, you’ll discover what could cause a sudden drop in traffic and how to fix the issue.

The tip of an iceberg

Unfortunately, there’s no one-size-fits-all solution when it comes to SEO. When you face a drop in your rankings or traffic, what you see is just the tip of the iceberg. So, get ready to check lots of possible issues before you identify the problem.

Graph on issues that cause ranking drops

Note: Percentages assigned in the above graph are derived from personal observation.

I’ve illustrated the most common reasons for a plummet. Start by checking these parameters to find out how you can recover your rankings and drive traffic back to your website.

Algorithm tests

First of all, check the SERP. What if it’s not only your website that changed its positions in search results? These sharp shifts may happen when Google tests its algorithms. In this case, you don’t even have to take any further steps, as the rankings will be restored soon.

If you track your rankings with Serpstat, you can analyze your competitors’ positions as well. This will help you understand whether the SERP has been changing a lot lately. From the moment you create a new project, the tool tracks the history of changes across the top-100 search rankings for the selected keywords. The “Storm” graph illustrates the effect of the changes that have occurred in the search results.

The "Storm" graph that illustrates the factors causing the ranking drop

On this chart, you can see that for the “cakes for dads” keyword the storm score was pretty high on 21st March. Now, let’s look at how the top-10 positions were changing on this date.

Graph showing a phrase-wise rise and drop in the SERP

The graph shows a sharp drop and rise that occurred in most of the positions. In a few days, all the rankings were back to normal again.

This example tells us that whenever you witness a significant drop in your search rankings, you should start with analyzing the whole SERP. If there’s a high storm score, all you need to do is to wait a bit.
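
Serpstat’s storm score is proprietary, but the idea behind this kind of check, measuring how much the whole SERP shuffled between two dates, can be sketched with a rough stand-in. The function and the drop-out penalty below are illustrative, not Serpstat’s actual formula:

```python
def serp_volatility(day1: dict, day2: dict) -> int:
    """Sum absolute position changes for URLs present on both days.
    URLs that enter or leave the tracked top results count as a
    full shake-up via an illustrative fixed penalty."""
    max_penalty = 30  # assumed penalty for a URL dropping out of / into the top N
    score = 0
    for url, pos in day1.items():
        score += abs(pos - day2[url]) if url in day2 else max_penalty
    for url in day2:
        if url not in day1:
            score += max_penalty
    return score

yesterday = {"a.com": 1, "b.com": 2, "c.com": 3}
today = {"b.com": 1, "a.com": 2, "d.com": 3}
print(serp_volatility(yesterday, today))  # c.com left, d.com entered, a/b swapped
```

If a score like this spikes across many of your keywords at once, the whole SERP is moving, which points at an algorithm test rather than a problem with your own site.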

In case you checked your competitors’ positions and didn’t see any movements, here’s the next step for you.

Technical issues

Technical SEO affects how search robots crawl and index your site’s content. Even if you have optimized your website technically, problems can occur every time you add or remove files or pages. So, make sure you’re aware of any technical SEO issues on your site. With Google’s URL Inspection tool, you can check how search engines see your website.

These are the main factors crucial for your rankings:

1. Server overload

If your server isn’t prepared for traffic surges, it can take your site down at any minute. To fix this problem, you can add a CDN to your website, cache your content, set up a load balancer, or move to cloud hosting.

2. Page speed

The more images, files, and pop-ups you add to your content, the longer your pages take to load. Mind that page speed isn’t only a ranking factor; it also influences user experience. To quickly check for issues, you can use Google’s PageSpeed Insights. And to speed up your website, you can:

  • Minimize HTTP requests or minify and combine files
  • Use asynchronous loading for CSS and JavaScript files
  • Defer JavaScript loading
  • Minimize time to first byte
  • Reduce server response time
  • Enable browser caching
  • Reduce image sizes
  • Use a CDN
  • Optimize CSS delivery
  • Prioritize above-the-fold content (lazy loading)
  • Reduce the number of plugins you use on your site
  • Reduce redirects and external scripts
  • Monitor mobile page speed
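
As a sketch of the first item, minifying and combining files, here is a minimal CSS bundler in Python. The regexes are a simplification of what real minifiers do, so treat it as an illustration rather than a production tool:

```python
import re

def combine_and_minify(css_sources: list) -> str:
    """Concatenate several CSS sources into one bundle and strip
    comments and extra whitespace, so the page makes one HTTP
    request instead of many."""
    bundle = "\n".join(css_sources)
    bundle = re.sub(r"/\*.*?\*/", "", bundle, flags=re.S)  # drop comments
    bundle = re.sub(r"\s+", " ", bundle)                   # collapse whitespace
    bundle = re.sub(r"\s*([{}:;,])\s*", r"\1", bundle)     # tighten around punctuation
    return bundle.strip()

print(combine_and_minify([
    "/* header */ h1 { color : red ; }",
    "p {\n  margin: 0;\n}",
]))  # -> h1{color:red;}p{margin:0;}
```

Real build tools (and server-side compression such as gzip or Brotli) go much further, but even this level of minification cuts bytes and request count.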

3. Redirections

It’s the most common cause of lost rankings. When you migrate to a new server or change the structure of your site, never forget to set up 301 redirects. Otherwise, search engines will either fail to index your new pages or even penalize your site for duplicate content.
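
To make the redirect advice concrete, here is a minimal WSGI sketch that returns 301s from an old-URL map. The paths are invented for illustration; in practice you would usually configure this in your web server or CMS rather than in application code:

```python
# Hypothetical old-path -> new-path map for a site restructure.
REDIRECTS = {
    "/old-blog/seo-tips": "/blog/seo-tips",
    "/services.html": "/services/",
}

def app(environ, start_response):
    """Tiny WSGI app: permanently redirect moved pages, serve the rest."""
    path = environ.get("PATH_INFO", "/")
    if path in REDIRECTS:
        # 301 (permanent) tells crawlers to index the new URL
        # and pass along the old page's link equity.
        start_response("301 Moved Permanently",
                       [("Location", REDIRECTS[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"page content"]
```

Run it with any WSGI server (for example `wsgiref.simple_server`) and requests for the old paths are answered with a permanent redirect instead of a 404.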

Detecting site errors can be quite difficult, especially if an error is confined to a single page. Inspecting every page manually would be time-consuming, and very costly if you’re running a business. To speed up the process of identifying such errors, you can use SEO site audit tools such as Serpstat or OnCrawl.

 

Wrong keywords

Are you using the right keywords? If you didn’t consider user intent when collecting keywords, that may be causing problems. Even if your site ranked high for these queries for some time, Google could have changed the way it understands your site’s intent.

I’ll provide two examples to illustrate the issue.

Case one

Consider the website of an Oxford summer school, “oxford-royale.co.uk”. The site contained service pages but no long-form descriptions. Once Google began to rank the website for queries with informational intent, SEO experts noticed that traffic dropped. After they added more text to the service pages, they succeeded in fixing the problem.

Case two

This case involved a flower delivery agency. While the website was ranking for transactional queries, everything was fine. Then Google decided the site better suited informational intent. To restore the site’s rankings, SEOs had to add keywords with high transactional intent, such as “order” and “buy”.

To collect the keywords that are right for your business goals, you can use KWFinder. With the tool, you can identify relevant keywords that you can easily rank for.

Screenshot of a suitable keywords' list in KWFinder

Outdated content

This paragraph doesn’t require long introductions. If your content isn’t fresh and up-to-date anymore, people won’t stay long on your site. Moreover, outdated content doesn’t attract shares and links. All these aspects may become good reasons for search engines to reduce your positions.

There’s an easy way to fix it. Update your content regularly and promote it so you don’t lose traffic. Trends keep changing, and if you’ve written a comprehensive guide on a specific topic, you don’t want it to become outdated. Instead of creating a new guide every time, update the old one with new data.

Lost links

Everybody knows your link profile is a crucial part of your site’s SEO. Website owners put a lot of effort into building quality links to new pieces of content. However, once you’ve managed to earn a large number of backlinks, you shouldn’t stop monitoring your link profile.

To discover whether your link profile has undergone any changes over the last few weeks, use Moz or Majestic. These tools will provide you with data on your lost and newly discovered links for the selected period.

Screenshot of discovered and lost linking domains in Moz

If you find out you’ve lost links from trustworthy sources, try to identify why they were removed. If the links are broken, you can always fix them. If website owners removed your links by accident (for example, when updating their websites), ask them to restore the links. If they did it intentionally, no one can stop you from building new ones.

Poor user experience

User experience is one more thing crucial for your site’s rankings. If Google started ranking your page high in search results and then noticed it didn’t meet users’ expectations, your rankings could have suffered a lot.

Search engines usually rely on metrics such as the click-through rate, time spent on your page, bounce rate, the number of visits, and more. That’s why you should remember the following rules when optimizing your site:

1. Provide relevant metadata

As metadata is used to form snippets, it should contain relevant descriptions of your content. If your snippets aren’t engaging enough, users won’t click through to your site. On the other hand, if your snippets make false promises, your bounce rate will increase.
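
One way to audit metadata at scale is to flag titles and descriptions that are likely to be truncated in snippets. The length limits below are common rules of thumb, not numbers published by Google:

```python
def audit_snippet(title: str, description: str) -> list:
    """Return warnings for metadata likely to be truncated or too thin.
    The 60/70/155 character limits are rule-of-thumb values, not official."""
    warnings = []
    if len(title) > 60:
        warnings.append("title may be truncated in the SERP")
    if not 70 <= len(description) <= 155:
        warnings.append("description is too short or may be truncated")
    return warnings

print(audit_snippet("Short title", "Too short"))
```

Running a check like this over every page’s metadata is a quick first pass before the harder work of making each snippet actually match the page’s content.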

2. Create an effective content structure

It should be easy for users to extract the necessary information. Most of your visitors pay attention to your content structure when deciding whether they’ll read the post.

Break the texts into paragraphs and denote the main ideas in the subheadings. This step will help you engage visitors looking for the answer to their very specific questions.

3. Avoid complicated design and pop-ups

The content isn’t the only thing your audience looks at. People may also decide to leave your website because of irritating colors, fonts, or pop-up ads. Provide simple design and minimize the number of annoying windows.

Competition from other websites

What if none of the steps worked? It might mean that your rankings dropped because your competitors were performing better. Monitor changes in their positions and identify the SERP leaders.

You can analyze your competitors’ strategies with Serpstat or Moz. With these tools, you can discover their backlink sources, keywords they rank for, top content, and more. This step will help you come up with ideas of how you could improve your own strategy.

Never stop tracking

You can’t predict whether your rankings will drop one day. It’s much better to notice the problem before you’ve already lost traffic and conversions. So, always keep tracking your positions and be ready to react to any changes quickly.

Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter.


Google updates Search Quality Evaluator Guidelines

On Thursday, Google updated its Search Quality Evaluator Guidelines for the first time since July 2018. The refreshed guidelines add more detailed directions regarding interstitial pages and content creator expertise, and buckets “E-A-T” (Expertise, Authoritativeness, Trustworthiness) within “Page Quality” in certain sections.

Why we should care. The Search Quality Evaluator Guidelines are what human quality raters use to evaluate websites and SERPs. They don’t directly affect rankings but their judgments are used to improve Google’s search algorithm.

The inclusion of “E-A-T” within “Page Quality” may reflect how Google wants its quality raters to approach evaluating content. The increased emphasis on interstitial pages within the Distracting Ads/SC section may also mean that webmasters and advertisers who use those techniques in an intrusive manner may see lower ratings. And, the more detailed guidance regarding content creator expertise may put questionable or lower-quality content under more scrutiny.

What’s different? Although the document has grown by two pages (to 166), the table of contents, along with the vast majority of the guidelines, has remained unchanged. Below are side-by-side comparisons of the previous guidelines (left) and the most recent version (right).

Advertisers that employ interstitial pages or ads, and app developers in particular, should verify that their ads do not limit a user’s ability to get to a page’s main content.

Google_SQE_11.0_Comparison

The addition of this paragraph explicitly mentioning content creator expertise underscores the importance of vetting the information presented in your content.

Google_SQE_EAT_Comparison


“E-A-T” is now included within “Page Quality,” under the explanation column of certain tables (particularly in sections 15 and 17).

These revisions do not greatly alter the majority of guidelines that dictate how quality raters evaluate websites, but they were meaningful enough for Google to update the document, which means that content creators, marketers and advertisers should be aware of them as well.


About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.


New visual search innovations tap human emotions and biological buying triggers (Search Engine Watch)


There’s a science behind what engages shoppers and gets them to purchase, and new visual search tech implementations promise to exploit it and reinvent ecommerce as we know it.

A shopper’s decision to buy products is influenced more by the primal areas of the brain and less by the analytical side. We humans are hard-wired to our emotions, which spring from the same area of the brain, the right side, that processes and reacts to visual stimulation. In the early days of mankind, it’s largely how our ancient ancestors survived in the wild.

Similar to the emoticons Facebook rolled out as “reactions” in 2016, our modern emotions emerge from four core feelings: happy, sad, afraid/surprised (“wow”), and angry/disgusted, according to research conducted by the Institute of Neuroscience and Psychology at the University of Glasgow.

Smart marketers can appeal to our right brains, which communicate in feelings and respond to images, to increase conversions and sales, because people tend to act based on emotions. Most of the purchase decisions people make are emotional, not practical. Retail shopping therapy is, perhaps, an offshoot of this science-based truth.

When it comes to shopping, decision-making, and conversions, another experiment, conducted by George Washington University and UCLA, found that playing to the emotional side of our brains is a far better strategy than using too many facts and figures that appeal to the decision-making areas of the brain.

The researchers found that ads that use logical persuasion (for example, “this car gets 42 miles to the gallon”) scored lower for conversions than those that “seduced” people by circumventing “consumers’ conscious awareness by depicting a fun, vague or sexy scene”.

Visual search will revolutionize ecommerce and SEO

The rise of visual search is powered, in part, by people’s desire to discover products and brands. It’s playing out now in the trend of shopping on social media channels such as Instagram and Pinterest, which is spreading fastest among millennials as the next big thing.

Yet, “creating technology that can understand images as quickly and effectively as the human mind is a huge undertaking”, wrote Adam Stetzer in a trend piece on visual search last year. “Visual identification is a natural ability made possible through a wonder of nerves, neurons and synapses. We can look at a picture, and in 13 milliseconds or less, know exactly what we’re seeing”.

Google is making rapid advancements tied to the increasingly visual nature of the search for ecommerce. For example, in early March it rolled out a new pilot program to digitally connect retailers and consumers, who can now make purchases from results of Google Image searches.

For the pilot’s launch, Google cited a figure that 50 percent of online shoppers said images of the product inspired them to purchase. Google is currently testing its “Showcase Shopping” ads on what it calls “a small percentage” of traffic with select retailers, surfacing on broad queries such as “home office ideas”, “shower tile designs”, and “abstract art”.

Certainly, the visual search trend will shape the programmatic ad industry’s future offerings. Advanced AI and computer imaging will be two core technologies that power dynamic personalization and highly customized ads that boost campaign performance tied to consumers’ visual search behaviors. For instance, they enable offering up winter jackets in a shopper’s favorite colors as fall approaches, or quickly serving up visually or stylistically complementary dining sets to match a new dining table or tablecloth search or purchase.

Adtech leaders’ R&D programs have already begun to focus on new AI-powered marketing innovations, including research and development from Facebook, Google, and Pinterest, and new strategic partnerships such as the one announced by Amazon and Snap last year.

Shoppable visual ads take off on social media platforms

The powerful combination of influencer marketing, using emotional buying triggers we’re hard-wired to respond to, and the highly visual nature of popular social channels such as Instagram and Pinterest have sparked the fast growth of shoppable ads on social media platforms.

Many industry watchers are betting that Instagram and Facebook will lead the pack here. Late last year, Salesforce predicted that Instagram will grow 3X faster than overall social-traffic boosts, citing data from Cowen & Company that 30 percent of internet users reported purchasing a product they discovered on Facebook or Instagram.

The overall trend of social media’s impact on purchase behavior is well-documented. As many as 76 percent of consumers have purchased a product they’ve seen in a brand’s social media post, per data from Curalate.

Influencer marketing driving consumers to purchase products is nothing new. For example, many kids who grew up in the 1970s, along with their parents, bought Wheaties back then based on the cereal’s “Breakfast of Champions” campaign because they were inspired to be like Bruce Jenner after his decathlon triumph at the 1976 Montreal Olympics.

The mediums have changed: we can now click on ads and have products delivered the same day, and micro-influencer campaigns can be far more granular, pinpointing targets and conserving budgets. But the psychology of why it works is the same.

New platforms such as Shopify make it easy for brands and merchants of all kinds to create engaging, highly connected sites that are helping to energize the social aspects of the web.

Large companies such as Amazon, Pinterest, and Instagram have done an excellent job of figuring out consumer sentiment, emotions, and online behaviors. We’re getting much closer to narrowing down to a “segment of one”, a trend that many retailers today are focused on in order to personalize advertising and improve the experience for consumers, so that promotional offers feel more like a personal shopper catering to them than a pushy salesperson who annoys them to the point of leaving the store.

And if Pinterest is any indication, with more than 600 million visual searches each month and image-based Pinterest ads converting at 8.5 percent, the role of visual search in capturing our attention, personalizing the advertising experience, and seducing us to buy is here to stay as ecommerce and SEO evolve around it.

Gary Burtka is Vice President of U.S. operations at RTB House, a global company that provides retargeting technology for brands worldwide. He can be found on Twitter.


Study: How to use domain authority for digital PR and content marketing (Search Engine Watch)


For the SEO community, Domain Authority is a contentious metric.

Domain Authority (DA) is defined by Moz as

“A search engine ranking score developed by Moz that predicts how well a website will rank on search engine result pages (SERPs). A Domain Authority score ranges from one to 100, with higher scores corresponding to a greater ability to rank.”

Some people say that this score does more harm than good because it distracts digital marketers from what matters. Improving your DA doesn’t mean you’re improving your rankings. Others tend to find it useful on its own as a quick way to determine the quality or trustworthiness of a site.

Here’s what I say: from a digital PR perspective, domain authority is valuable when you use it to compare sites relative to one another. In fact, DA provides value for us PRs and is incredibly useful in our work.

Think of it this way: there are more websites than ever before, about 1.5 billion, which in some ways means there is more opportunity for marketers to get their content out into the world and in front of new audiences. While most people think that journalism is dying out, an enlightening post on Recode by Rani Molla explains that “while job postings for journalists are off more than 10 percent since 2004, jobs broadly related to content have almost quadrupled.”

In other words, if outreach is executed well, there are more places than ever to get your content featured, driving traffic, broadening your audience, and improving your search ranking.

But even the most skilled PR teams can’t reach out to 1.5 billion sites. The knowledgeable ones know that you really only need one successful placement to get your content to spread like wildfire all over the Internet, earning links and gaining exposure for your brand in the process. With so many options out there, how do PR professionals know which sites to spend time targeting?

That’s where DA comes into play. When it comes to link building, content marketers know that not all backlinks and brand mentions are created equally. The value of a link or mention varies depending on the referring website. Moz’s DA score is a way for us PRs to quickly and easily assess the quality of the websites we target for our client’s content marketing campaigns.

Our team tends to bucket online publishers, blogs, and websites into three categories:

  • Top-tier
  • Mid-tier
  • Low-tier

Keep in mind, particularly with the new Moz update, that when deciding who to pitch you must take a holistic approach. While domain authority is an excellent way to quickly assess the quality of a website, a site’s DA can change at any minute due to a multitude of factors, so make sure you also take into account your goals and the site’s audience, social following, and reputation, as well as its Moz DA score. In response to a Marketing Land tweet about the new DA, Stevie Howard said it perfectly.

Screenshot of Stevie Howard's tweet in response to a Marketing Land tweet about the new DA

Top-tier sites

What constitutes a top-tier website? Can a top-tier site have a low DA? Potentially, but it’s uncommon.

When you look at the holy grail of media coverage, DA tends to align perfectly. Take, for example, the following seven major publishers that any brand or business would love to earn coverage on. The DA scores for all of these sites fall above 90. These sites all have an extremely large audience, both on-site and on social media.

List of top tier sites having a DA score of 90 and above

Our team at Fractl has an innate sense of the online publisher landscape, and the largest and most well-known content publishers out there all tend to have a domain authority above 90. This is what we consider to be the “top-tier”.

These publishers are difficult to place with because of their large audience, social following, and reputation, so for the best chance at earning organic press mentions on these sites, offer them authoritative, unique, exclusive, and newsworthy content.

Mid-tier sites

Mid-tier sites may not be the holy grail of news publishers, but they’re our bread and butter. This is where the majority of placements tend to happen. These publishers hit a sweet spot for digital PR pros—they’re not as sought-after as Buzzfeed and don’t deeply scrutinize pitches the way The New York Times does, but they have large audiences and tend to be much more responsive to content pitches.

I tend to categorize the mid-tier as publishers that fall within a DA of 66 to 89. Here are some examples of publishers that may be considered mid-tier.

List of mid-tier publishers that have a DA of 66 to 89

Low-tier sites

Don’t underestimate a low-tier site simply because of its domain authority. For example, it wasn’t long ago that the personal finance website Money-ish had a DA of 1. Launched in 2017, it was its own website before being absorbed into the larger MarketWatch domain. MarketWatch has a DA of 93, with social engagement as high as 12,294,777 in the last year. If you had ignored Money-ish because of its DA when it first started, you would have missed a chance to get your content featured on MarketWatch, as well as to build relationships with writers who are now under the MarketWatch umbrella.

There are all types of content, and most marketers can figure out which projects have “legs” and which have less appeal. Lower-tier sites are often very niche and the perfect home for content aimed at smaller, more precise audiences. They also tend to have high engagement where it matters: your target audience. Consider the site’s community. Does it have a ton of email subscribers or high comment engagement? Is it killing it on Instagram or another social network? You never know which site will become the next Money-ish, either!

List of low-tier sites with DA below 60 or 65
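
The tier cut-offs described above (a DA of 90 and up for top-tier, 66 to 89 for mid-tier, and anything below for low-tier) can be captured in a small helper for list building. The thresholds are the ones used in this article; other teams may tune them:

```python
def da_tier(domain_authority: int) -> str:
    """Bucket a Moz DA score using the tiers described in the article.
    The 66/90 cut-offs are this article's convention, not a Moz standard."""
    if domain_authority >= 90:
        return "top-tier"
    if domain_authority >= 66:
        return "mid-tier"
    return "low-tier"

# Illustrative scores, not real measurements.
for site, da in [("major-news-site", 95), ("mid-size-publisher", 82), ("niche-blog", 40)]:
    print(site, da_tier(da))
```

Used over an outreach spreadsheet, a helper like this lets you sort prospects into pitch strategies before layering in the holistic factors (audience, social following, editorial guidelines) discussed above.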

Pitching differences for each tier

There are plenty of sites that fall within different ranges of domain authority that would be an excellent fit for your content. It all just depends on your goals. In Fractl’s latest internal study, we were able to identify trends in the way journalists respond to PR professionals, based on the DA of the site they write for.

Graph on how journalists respond to PRs based on their sites DA score

Observations

  • Feedback from writers working for sites with a DA of 89 or lower was most likely to be complimentary of the content campaigns we pitched them.
  • The verbiage of their responses was also more positive on average than those from journalists working for publishers with a DA of 90 or above.

An example of the feedback we received that would be labeled as complimentary is,

“Thanks for sending this over, it fits perfectly with our audience. I scheduled a post on this study to go up tomorrow.”- Contributor, Matador Network (DA: 82)

Those of us that have been pitching mainstream publishers for a while know from experience that it’s often easier to place with websites that tend to fall in the mid to low-tier buckets. Writers at these publishers are usually open to email pitches and open to writing about outside content because such websites have less stringent editorial guidelines.

Conversely, writers at publishers that fall into our definition of “top-tier” were less positive on average than writers working for publishers with a DA of less than 90. On average, the higher the DA, the less positive the language becomes.

Why might that be? It makes perfect sense that publishers like The New York Times, CNN, TIME, and The Washington Post would be less positive. They’re likely receiving hundreds of PR pitches a day because of their popularity. If they do respond to a pitch, they want to ensure that they’re inquiring about content that would eventually meet their editorial guidelines, should they decide to cover it.

According to our study, when journalists at publishers with a DA of 90 or above do respond, they’re more likely to be asking about the methodology or source of the content.

An example of this feedback is from a staff writer at CNN.

“Thanks for sending along. I’m interested to know more about the methodology of the study.”

A response like this isn’t necessarily bad; in fact, it’s quite good. If a journalist takes the time to ask you more about the details of the content you pitched, it’s a good indication that the writer is hoping to cover it and just needs more information to ensure that any data-driven content is methodologically sound.

Conclusion

Domain authority will remain a controversial metric for SEOs, but for those of us working in digital PR, the metric provides a lot of value. Our study found a link between the DA of a site and the type of responses we received from writers at those publishers. High-DA sites were less positive on average and asked about research methodologies more than lower-tier sites. Knowing the DA of a site allows you to:

  • Improve your list building process and increase outreach efficacy
  • Customize each outreach email you send to publishers of varying DAs
  • Anticipate the level of editorial scrutiny you’re up against in terms of content types and research methodologies
  • Optimize content you create to fit the needs of your target publisher
  • Predict the outcome of a content campaign depending on where you placed the “exclusive”

Remember, just because a site has a high DA, it doesn’t mean it’s necessarily a good fit for your content. Always be sure to take a holistic approach to your list building process. Keep in mind the social engagement of the site, the topics they cover, who their audience is, their editorial guidelines, and most importantly, the goals of you or your client before reaching out to any publisher solely based on domain authority.

Domenica is a Brand Relationship Manager at Fractl. She can be found on Twitter.


Google Search adds support for FAQ and How-to structured data

At Google I/O just now, Google announced support for new structured data for FAQ and How-to markup. Yes, Google announced FAQ and How-to markup at the 2018 Google I/O event a year ago, but now Google has launched new structured markup to bring these rich search results to life.

How-to results. How-to search results in Google will show searchers step-by-step information on how to accomplish specific tasks directly in the search results. Google has published how-to documentation for your developers to use when adding the markup to your own pages and also how to add this to Google Assistant. The documentation includes information on the steps, tools, duration, and other properties you would include in your markup.
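
The steps, tools, and duration properties mentioned above live in schema.org’s HowTo vocabulary. Here is a minimal sketch built in Python; the example task and its steps are invented, and Google’s documentation has the full list of required and recommended properties:

```python
import json

# Minimal HowTo structured data. Property names follow schema.org;
# the bike-tire task is an invented example.
howto = {
    "@context": "https://schema.org",
    "@type": "HowTo",
    "name": "How to change a bike tire",
    "totalTime": "PT15M",  # ISO 8601 duration: 15 minutes
    "tool": [{"@type": "HowToTool", "name": "Tire lever"}],
    "step": [
        {"@type": "HowToStep", "position": 1,
         "text": "Remove the wheel from the frame."},
        {"@type": "HowToStep", "position": 2,
         "text": "Lever off the tire and swap in the new tube."},
    ],
}

# Embed the output in the page inside <script type="application/ld+json">.
print(json.dumps(howto, indent=2))
```

After adding markup like this, the Rich Results Test and the new Search Console report below are the places to confirm Google parses it without errors.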

Here are screen shots of what it looks like in search:

Here are screen shots of what this looks like in the Google Assistant:

How-to Search Console report. Google also added a new How-to enhancement report in Search Console that shows you your errors, warnings and valid items for pages with HowTo structured data. Here is a screen shot of this report.

FAQ results. Google also announced new search results for FAQs in search and Google Assistant. This is designed for FAQ pages that provide a list of frequently asked questions and answers on a particular topic. Adding this structure data helps Google show questions and answers directly on Google Search and the Assistant. The documentation for this markup can be found here and for Assistant can be found here.

Google warns you not to confuse this with QA pages, which are for forums or other pages where users can submit answers to questions.
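
FAQ markup can be sketched the same way: a JSON-LD object embedded in the page. The question and answer below are invented, and Google’s documentation has the authoritative property list:

```python
import json

# Minimal FAQPage structured data; the Q&A content is an invented example.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Do you ship internationally?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Yes, we ship to most countries.",
            },
        },
    ],
}

# Embed the output in the page inside <script type="application/ld+json">.
print(json.dumps(faq, indent=2))
```

Note the difference from QAPage: here the site itself supplies the single accepted answer, which is why Google reserves FAQPage for pages without user-submitted answers.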

Here is how FAQ results look in search:

Here is how FAQ results look in Assistant:

Google also launched a Google Search Console enhancement report for FAQ structured data.

Why we care. Making your search results more prominent in Google search may drive more clicks to your web site from Google search. But at the same time, if the searcher can get the full answer directly in Google search or on Google Assistant, you may never see that searcher end up on your web site. We recommend testing this markup on the relevant pages and see if it leads to more traffic and conversions for your business.


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.


What defines high-quality links in 2019 and how to get them? Search Engine Watch


Links are a way for users to get relevant information on the internet. They are also a ranking factor for your pages. Of the 200+ ranking factors Google is known to use, links share the top three spots with content and RankBrain.

However, having links is not enough. Their quality matters. Google takes into account not only the number of backlinks a web page has but their quality as well. Sadly, some SEOs, in the rush to make their sites appear popular to search engine crawlers, invest in low-quality links. High-quality link building, however, is the only way to get desirable results.

Chasing the quickest and cheapest results, some SEOs resort to the wrong methods of generating backlinks. These are often blackhat techniques that can lead to Google penalties and destroyed reputations. What's surprising is that even SEO agencies employ poor link-building techniques. A 2018 study conducted by Assertive Media of 230 UK link-building agencies found that nine out of 10 used unscrupulous techniques.

To avoid their fate, you need to know what makes a high-quality link.

So, what determines high-quality links?

Below are three factors that determine the quality of the link.

a) Authority of the linking page

The authority of the page linking to you is important to Google. The logic is that if an authoritative page has noticed your content, it must be impactful and relevant. This page-level authority, which Google models as PageRank, passes through the link and signals your own page's authority.

b) Authority of the linking site

The site-wide authority of the linking website is also an important indication to Google that your site is authoritative. Much like page authority, website authority (measured by the domain rating (DR) and domain authority (DA) scales on Ahrefs and Moz, respectively) is transmitted by the linking website.

c) Content relevance

PageRank and domain authority are of no consequence if the linked content is irrelevant. For instance, the prospect of getting a link from NYTimes.com sounds amazing. However, the link is useless to you if the subject matter of the article doesn't relate to the subject matter of your website, because the basis of all link-building is the relevance of the linked information to the user.

How to get high-quality links in 2019?

It’s no secret that high-quality link building is still one of the most essential SEO skills. However, to master it means mastering several other skills, necessary for getting others to link to your site.

Getting links that meet all three criteria discussed above, and doing so consistently enough that your website is strengthened by high-quality links, takes deliberate effort. It requires you to spend an ample amount of time doing the right things.

The following are some of the most effective things you can do to improve your link-building efforts in 2019:

1. Publish content strategically

Content marketing is at the heart of quality link-building. Before you can expect backlinks, you have to provide useful content. But that’s not enough. Having a strategy will help you see what type of content performs best, where to publish it and when to change tactics.

For instance, according to Backlinko, visuals, list posts, research data, and in-depth guides perform the best. With a sound strategy, you can distribute the various types of content across a diverse range of top ranking sites. Ultimately, a content marketing strategy will help you publish content that not only aids the user but also gets you high-quality links.

For example, in 2015, we published an in-depth article at Effective Inbound Marketing in which we called on the most noted link-building experts in the industry and asked them for their thoughts on guest blogging. The article took nearly a month to complete but was the best-performing piece on several article-ranking sites for the day. Suffice to say, it generated tons of organic backlinks, as it was widely discussed on several blogs. This is mainly because it answered a problem people had at the time, and the strategy we used ensured that the thought leaders shared the article with their followers.

2. Use email outreach

In link-building, often you get only what you ask for. Whether it’s link reclamation, reverse image search, submitting your blog to relevant directories, pitching guest posts or using broken links for link-building, many techniques will require you to email people.

Don’t shy away from it. Approach it knowing the challenges of reaching people by email in 2019. In order to avoid ending up in people’s spam folders, I’d suggest the following approach:

  • Personalize your email to the recipient.
  • Address their questions and aim to solve a problem rather than being direct in your request.
  • Identify the best time to reach out to people, and you’ll get a higher response rate.

3. Be active in your online communities

First, you need a presence on various online platforms that allow you to create a profile with a link to your website. For instance, Product Hunt, the place where users meet to discuss the latest technologies, is also a DA 75 site. All it takes to get a link from it is creating a profile. For best results, be active on the site by reviewing products, commenting, upvoting, and downvoting the content regularly.

Crunchbase is a similar community of people who want to stay on top of the digital shift. By creating an account, you automatically get a high-quality link from the DA 91 website. The more active you are, the more you benefit from your account. These and other online communities are a valuable source of high-quality links, but you must remain active to reap maximum rewards.

4. Polish and optimize your website

Poor website design is bad for SEO. To start with, if your design turns people off, they will leave your site quickly. Google will take it as a signal of poor service.

Another design factor to consider is website speed. Research shows that 47% of users won’t wait longer than two seconds for a website to load.

You also need to take into account the diversity of the devices people use to access your content. More specifically, you have to optimize your website for portable devices, such as smartphones, as a substantial amount of traffic is continually coming from them. Smartphones and tablets make up more than 61% of internet traffic.

In 2019, having a flexible site design that can be modified on the fly is crucial. The ability to quickly tweak your website allows you to stay responsive to the needs of consumers while adopting new design trends as they develop. Settling on a rigid design denies you this chance, whereas a CMS-based design such as WordPress gives you the convenience of making changes easily.

Conclusion

For more than two decades, link-building has been a sound SEO strategy for many. This is unlikely to change in the coming years. If you hope to succeed, you must learn to recognize a high-quality link and, more importantly, how to get one.

Ayodeji Onibalusi is the Founder of Effective Inbound Marketing.


The SEO metrics that really matter for your business Search Engine Watch


Whether you are a business owner, marketing manager or simply just interested in the world of ecommerce, you may be familiar with how a business can approach SEO.

Perceptions of SEO and what success looks like vary from person to person, ranging from a sophisticated technical grasp to a knowledge of just the essentials.

At all levels, measurement and understanding of search data are crucial, and different metrics will stand out, from rankings to the finer details of goals and page speed.

As you may know, you can’t rely solely on ranks as a method to track your progress. But there are other, simple ways to measure the impact of SEO on a business.

In a recent AMA on Reddit, Google's own Gary Illyes urged SEO professionals to stick to the basics, and this way of thinking can be applied to the measurement of organic search performance.

In this article, we will look to understand the best metrics for your business when it comes to understanding the impact of SEO, and how they can be viewed from a technical and commercial perspective. Before we start, it's worth mentioning that this article uses Google's own demo analytics account for screenshots. If you need further info to get to grips with it, check out this article, or access the demo version of Google Analytics.

Each of the following is a commercial SEO metric: data that means something to everyone in a business.

Organic traffic

This is undoubtedly a simple way, if not the simplest way, of understanding the return on any SEO effort. The day-to-day traffic from search engines is the key measure for many marketers, and any increase can often be tied to an improved level of search visibility (excluding seasonal variation).

In a world where data drives decisions, these figures are pretty important and represent a key part of any internet user’s session, whether that is to get an answer, make a purchase or something else.

In Google Analytics, simply follow this path: Acquisition -> All Traffic -> Channels to see the organic traffic received within your chosen time period.

Identifying traffic sources in Google Analytics

You might be asking, “how can I know more?”

Google might have restricted access to keyword data back in 2011, but you can still dig down into your traffic from organic search to look at landing pages and locations.

Organic traffic data – Filtered by landing page 

Not all search traffic hits your homepage; some users head to your blog or to specific landing pages, depending on their needs. For some searches, however, like those for your company name, your homepage will be the most likely option.

To understand the split of traffic across your site, use the “Landing Page” primary dimension and explore the new data, split by specific page URL.

Understanding the traffic split using Google Analytics

Organic traffic data – Filtered by location

Within the same section, the organic search data can be split by location, such as city, to give even further detail on the makeup of your search traffic. Depending on how your business operates, the locations shown may be within the same country or across international locations. If you have spent time optimizing for audiences in specific areas, this view will be key to monitor overall performance.

Screenshot of search data filtered by city

Screenshot of the city wise breakdown of the search traffic in Google Analytics

Revenue, conversions, and goals

In most cases, your website is likely set up to drive conversions, whether that is product sales, document downloads, or leads.

Part of understanding the success of SEO is understanding its contribution to the goals of the business, whether they are monetary or lead-based.

For revenue-based data, head to the Conversions section within Google Analytics, then select Product Performance. Within that section, filter the secondary dimension by source/medium to show just the sales that originate from search engine traffic.

Screenshot of the product performance list to track search originated sales

If your aim isn’t totally revenue based, perhaps a signup form or some downloadable content, then custom analytics goals are your way of fully understanding the actions of visitors that originate from search engines.

Within the Conversions section, your goal completions can be split by source, allowing you to focus solely on visits from organic search.

Graph on source wise split of goal conversions

If a visitor finds your site from a search and then buys something or registers their details, it really suggests you are visible to the right audience.

However, if you are getting consistent organic search visits with no further actions taken, that suggests the key terms you rank for aren't totally relevant to your website.

SEO efforts should focus on reaching relevant audiences. You might rank #1 for a search query like "cat food", but if you only sell dog products, your optimization hasn't quite worked.

Search and local visibility

If your business has a web and/or physical store presence, you can use the tools within Google My Business to look further into, and beyond, the performance of the traditional blue links.
Specifically, you can understand the following:

  • How customers search for your business
  • How someone sees your business
  • What specific actions they take

The better your optimization, the more of these actions you will see.

Doughnut graph of search volume seen in Google Analytics

Graph of customer actions

Graph of listing sources for Google my business

Average search rankings

Rankings for your key terms on search engines have traditionally been an easy way to quickly get a view of overall performance. However, a “quick Google” can be hard to draw conclusions from. Personalized search from your history and location essentially skews average rank to a point where its use has been diminished.

A variety of tools can be used to get a handle on average rankings for specific terms. Google Search Console is the free way to do this, while freemium tools like SEMrush and Ahrefs also offer the ability to understand average rank distribution.

With search rankings becoming harder to accurately track, the measure of averages is the best way to understand how search ranking relates to and impacts the wider business.

Graph on average positioning of the website in search
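To see why an average needs to reflect search volume, here is a minimal sketch (the queries and numbers are invented for illustration) of computing an impression-weighted average position from the kind of rows a rank tracker or a Search Console export provides:

```python
# Hypothetical (query, impressions, average position) rows, as a rank
# tracker or a Search Console export might provide them.
rows = [
    ("infographic maker", 12000, 3.2),
    ("free infographic templates", 8000, 5.7),
    ("how to make an infographic", 4000, 9.1),
]

def weighted_average_position(rows):
    """Impression-weighted average rank across all tracked queries,
    so high-volume queries count for more than low-volume ones."""
    total_impressions = sum(imp for _, imp, _ in rows)
    weighted = sum(imp * pos for _, imp, pos in rows)
    return weighted / total_impressions

print(round(weighted_average_position(rows), 2))  # 5.02
```

Weighting by impressions stops a #1 ranking on an obscure query from masking a poor rank on the query that actually drives your traffic.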

Technical metrics – Important but not everyone pays attention to these

When it comes to the more technical side of measuring SEO, you have to peel back the layers and look beyond clicks and traffic. Technical metrics help complete the wider picture of SEO performance, and they can uncover additional opportunities for progress.

Search index – Through search consoles and other tools

Ensuring that an accurate index of your website exists is one of the fundamentals of SEO, because if only part of your site, or the wrong pages, are indexed, your overall performance will suffer.

Although it is a small part of overall SEO work, it's arguably one of the most crucial.

One quick way is to enter the "site:" search operator followed by the URL of your site's homepage to see the total number of pages that exist in a search engine's index.

To inspect the status of a specific page on Google, Google Search Console is your best option. The newest version of Search Console provides a quick way to bring up results.

Screenshot of the latest Google Search Console

Search crawl errors

As well as looking at what has been indexed, any website owner needs to keep an eye out for what may be missing, or if there have been any crawl errors reported by Google. These often occur because a page has been blocked, or the format isn’t crawlable by Google.

Head to the “Coverage” tab within Google Search Console to understand the nature of any errors and what page the error relates to. If there’s a big zero, then you and your business naturally have nothing to worry about.

Screenshot of viewing error reports in Google Search Console

Click-through rate (CTR) and bounce rate

In addition to where and how your website ranks for searches, a metric to consider is how often your site listing is clicked in the SERPs. Essentially, this shows the percentage of impressions that result in a site visit.

This percentage indicates how relevant your listing is to the original query and how well your result ranks compared to your competitors.
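The calculation itself is simple; a quick sketch with invented example numbers:

```python
def ctr(clicks, impressions):
    """Click-through rate: the percentage of impressions that
    resulted in a click on your listing."""
    if impressions == 0:
        return 0.0
    return 100 * clicks / impressions

# e.g. a listing shown 5,000 times and clicked 150 times
print(f"{ctr(150, 5000):.1f}%")  # 3.0%
```

Search Console reports this percentage for you, but knowing the formula makes it easy to sanity-check exported data.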

If people like what they see and can easily find your website, then you’ll likely get a new site visit.

Google Search Console is again the best go-to resource for the most accurate data. Just select the Performance tab and toggle the CTR tab to browse data by query, landing page, country of origin, and device.

Screenshot of a CTR performance graph on the basis of query, landing page, country of origin, and device

If someone does venture onto your site, you will want to ensure the page they see is relevant to their search; after all, search algorithms love to reward relevance. If the page doesn't contain the required information or isn't user-friendly, the user will likely leave to find a better resource without taking any action, which is known as a bounce.

In some cases, one visit may be all that is needed, therefore a bounce isn’t an issue. Make sure to view this metric in the wider context of what your business offers.

Mobile friendliness

The unveiling of mobile-friendliness as a ranking factor was widely reported in 2015. It reflects the evolution of browsing behavior, with mobile traffic often greater in volume than desktop traffic for some sites.

Another report in the ever useful Google Search Console gives a clear low-down of how mobile-friendly a site is, showing warnings for any issues. It's worth saying that this measure isn't an indication of how likely a conversion is, but rather of the quality of your site on a mobile device.

Graph for tracking the mobile-friendliness of a website

Follow your metrics and listen to the data

As mentioned at the start of this article, data drives decisions. In all areas of business, certain numbers will stand out. With SEO, a full understanding comes from multiple data points, with positives and negatives to be taken at every point of the journey.

Ultimately, it often comes down to traffic, ranks, and conversions, but the numbers that drive the business are underpinned by metrics that don't often see the light of day yet are just as important.

As a digital marketer, it is always a learning experience to know how data drives the evolution of a business and ultimately, how successes and opportunities are reported and understood.

Matthew Ramsay is Digital Marketing Manager at Digitaloft. 


How Venngage turned search into their primary lead source Search Engine Watch


Venngage is a free infographic maker that has catered to more than 21,000 businesses. In this article, we explore how they grew their organic traffic from about 275,000 visitors per month in November 2017 to about 900,000 today — more than tripling in 17 months.

I spoke with Nadya Khoja, Chief Growth Officer at Venngage, about their process.

Venngage gets most of their leads from content and organic search. The percentage varies from month to month in the range of 58% to 65%.

From this trend, Nadya extrapolated that by December 2019 (nine months from now), Venngage will enjoy three million organic search visitors per month.

Screenshot of Venngage's statistics

In 2015, when Nadya started with Venngage, they saw 300 to 400 registrations a week. By March of 2018, this was up to 25,000 a week. Today it’s 45,000.

While Nadya had the advantage of not starting from zero, that is impressive growth by any reasonable measure. How did they do it?

Recipe

There are a lot of pieces to this puzzle. I’ll do my best to explain them, and how they tie together. There is no correct order to things per se, so what is below is my perspective on how best to tell this story.

The single most important ingredient: Hypothesize, test, analyze, adjust

This critical ingredient is surprisingly not an ingredient, but rather a methodology. I’m tempted to call it “the scientific method”, as that’s an accurate description, but perhaps it’s more accurate to call it the methodology written up in the books “The Lean Startup” (which Nadya has read) and “Running Lean” (which Nadya has not read).

This single most important ingredient is the methodology of hypothesize, test, analyze, and adjust.

What got them to this methodology was a desire to de-risk SEO.

The growth in traffic and leads was managed through a series of small and quick iterations, each one of which either passed or failed. Ones that passed were done more. Ones that failed were abandoned.

This concept of hypothesizing, testing, analyzing, and adjusting is used both for SEO changes and for changes to their products.

The second most important ingredient

This ingredient is shared knowledge. Venngage marketing developed “The Playbook”, which everyone in marketing contributes to. “The Playbook” was created both as a reference with which to bring new team members up to speed quickly, as well as a running history of what has been tested and how it went.

The importance of these first two ingredients cannot be overstated. From here on, I am revealing things they learned through trial and error. You have the advantage to learn from their successes and failures. They figured this stuff out the hard way. One hypothesis and one test at a time.

Their north star metrics

They have two north star metrics. The first one seems fairly obvious. “How many infographics are completed within a given time period?” The second one occurred to them later and is as important, if not more so. It is “how long does it take to complete an infographic?”

The first metric, of course, tells them how attractive their product is. The second tells them how easy (or hard) their product is to use.

Together these are the primary metrics that drive everything Venngage does.

The 50/50 focus split

As a result of both the company and the marketing department having a focus on customer acquisition and customer retention, every person in marketing spends half their time working on improving the first north star metric and the other half on improving the second.

Marketing driving product design

Those north star metrics have led to Venngage developing what I call marketing driven product design. Everywhere I ever worked has claimed they did this. The way Venngage does this exceeds anything ever done at a company I’ve worked for.

“How do I be good?”

This part of Nadya’s story reminds me of the start of a promo video I once saw for MasterClass.com. It’s such a good segue to this part of the story that I cropped out all but the good part to include in this article.

When Steve Martin shed light on an important marketing question

I’ve encountered a number of companies through the years who thought of marketing as “generating leads” and “selling it”, rather than “how do we learn what our customers want?”, or “how do we make our product easier to use?”

Squads

The company is structured into cross-functional squads, a cross-functional squad being people from various departments within Venngage, all working to improve a company-wide metric.

For example, one of the aspects of their infographic product is templates. A template is a starting point for building an infographic.

As templates are their largest customer acquisition channel, they created a “Template Squad”, whose job is to work on their two north star metrics for their templates.

The squad consists of developers, designers, UI/UX people, and the squad leader, who is someone from marketing. Personally, I love this approach, as it diffuses the marketing focus and causes marketing to permeate everything the company does.

There is another squad devoted to internationalization, which, as you can infer, is responsible for improving their two north star metrics for users in countries around the world.

Iterative development

Each template squad member is tasked with improving their two north star metrics.

Ideas on how to do this come from squad members with various backgrounds and ideas.

Each idea is translated into a testable hypothesis. Modifications are made weekly. As you can imagine, Venngage is heavy into analytics, as without detailed and sophisticated analytics, they wouldn't know which experiments worked and which didn't.

Examples of ideas that worked are:

  • Break up the templates page into a series of pages, each containing either a category of templates or a single template.
  • Ensure each template page contains SEO keywords specific for the appropriate industry or audience segment. This is described in more detail further in this document.
  • Undo the forced backlink each of the embedded templates used to contain.
    • This allowed them to get initial traction, but it later resulted in a Google penalty.
    • This is a prime example of an SEO tactic that worked until it didn’t.
  • Create an SEO checklist for all template pages with a focus on technical SEO.
    • This eliminated human error from the process.
  • Eliminate “React headers” Google was not indexing.
  • Determine what infographic templates and features people don’t use and eliminate them.

Measuring inputs

I personally think this is really important. To obtain outputs, they measured inputs. When the goal was to increase registrations, they identified the things they had to do to increase registrations, then measured how much of that they did every week.

Everyone does SEO

In the same way that marketing is something that does not stand alone, but rather permeates everything Venngage does, SEO does not stand alone. It permeates everything marketing does. Since organic search traffic is the number one source of leads, they ensure everyone in marketing knows the basics of technical SEO and understands the importance of this never being neglected.

Beliefs and values

While I understand the importance of beliefs and values in human psychology, it was refreshing to see this being proactively addressed within an organization in the context of improving their north star metrics.

They win and lose together

Winning and losing together is a core belief at Venngage. Nadya states it minimizes blame and finger-pointing. When they win, they all win. When they lose, they all lose. It doesn’t matter who played what part. To use a sports analogy, a good assist helps to score a goal. A bad assist, well, that’s an opportunity to learn.

SEO is a team effort

While it is technically possible for a single person to do SEO, the volume of tasks required these days makes it impractical. SEO requires quality content, technical SEO, and the building of backlinks through content promotion, guest posting, and other tactics. Venngage is a great example of effectively distributing SEO responsibilities throughout the marketing department.

To illustrate the importance of the various pieces fitting together, consider that while content is king, technical SEO is what gets content found, but when people find crappy content, it doesn’t convert.

You can’t manage what you don’t measure

This requires no elaboration.

But what you measure matters

This probably does justify some elaboration. We’ve all been in organizations that measured stupid stuff. By narrowing down to their two north star metrics, then focusing their efforts to improving those metrics, they’ve aligned everyone’s activity towards things that matter.

The magic of incremental improvements

This is the Japanese concept of Kaizen put into play for the development and marketing of a software product.

Done slightly differently, this concept helped Britain dominate competitive cycling at the 2008 Olympics in Beijing.

Customer acquisition is not enough

Venngage developed their second north star metric after deciding that acquiring new customers was not, in and of itself, any form of the Holy Grail. They realized that if their product was hard to use, fewer people would use it.

They decided a good general metric of how easy the product is to use was to measure how long people take to build an infographic. If people took “too long”, they spoke to them about why.

This led them to change the product in ways to make it easier to use.

Link building is relationship building

As a reader of Search Engine Watch, you know link building is critical and central to SEO. In the same way that everyone in Venngage marketing must know the basics of technical SEO, everyone in Venngage marketing must build links.

They do so via outreach to promote their content. As people earn links from the content promotion outreach, they record those links in a shared spreadsheet.

While this next bit is related to link building, everyone in Venngage marketing has traffic goals as well.

This too is tracked in a simple and reasonable way. Various marketers own different “areas” or “channels”. These channels are broken down into specific traffic acquisition metrics.

As new hires get more familiar with how things work at Venngage, they are guided into traffic acquisition channels which they want to work on.

Learning experience, over time

My attempt here is to provide a chronology of what they learned in what order. It may help you avoid some of the mistakes they made.

Cheating works until it doesn’t

Understanding the importance of links to search ranking, they thought it would be a good idea to implement their infographics with embedded backlinks. Each implemented infographic contained a forced backlink to the Venngage website.

They identified a set of anchor text they thought would be beneficial to them and rotated through them for these forced backlinks.

And it worked, for a while. Until they realized they had invited a Google penalty. This took a bit to clean up.

The lessons learned:

  • The quality of your backlinks matter.
  • To attract quality backlinks, publish quality content.

Blog posts brought in users who activated

At some point, their analytics helped them realize that users who activated from blog posts were ideal users for them. So they set a goal to increase activations from blog posts, which led to the decision to test whether breaking up templates into category pages and individual pages with only one template made sense. It did.

Website design matters

Changing the website from one big template page to thousands of smaller ones helped, and not just because it greatly increased the number of URLs indexed by Google. It also greatly improved the user experience. It made it easier for their audience to find templates relevant to them, without having to look at templates that weren’t.

Lesson learned: UI/UX matters for both users and SEO.

Hybrid content attracts

Hybrid content is where an article talks about two main things. For example, talking about Hogwarts house sorting within the context of an infographic. This type of content brings in some number of Harry Potter fans, some of whom have an interest in creating infographics. The key to success is tying the two different topics together well.

Content is tuneable

By converting one huge templates page into thousands of small template pages, they realized that a template or set of templates that appeal to one audience segment would not necessarily appeal to others. This caused them to start to tune templates towards audience segments in pursuit of more long tail organic search traffic.

How did they figure out what users wanted in terms of better content? They used a combination of keyword research and talking with users and prospects.

Some content doesn’t make the cut

After they caught onto the benefits of tuning content to attract different audience segments, they looked for content on their site that no one seemed to care about. They deleted it. While it decreased the amount of content on their site, it increased their overall content quality.

Traffic spikes are not always good news

When they initially started creating forced backlinks in their infographics, they could see their traffic increase. They saw some spikes. Their general thought was that more traffic is good.

When they experienced the Google penalty, they realized how wrong they were. Some traffic spikes are bad news. Others are good news.

When your website traffic shows a sudden change, even if you’re experiencing a spike in organic search traffic, you must dig into the details and find out the root cause.

Lesson learned: There is such a thing as bad traffic. Some traffic warns you of a problem.

Links from product embeds aren’t all bad

They just needed to make the embedded links optional, allowing the customer to decide whether or not to include a backlink. While this did not cause any change to their levels of organic search traffic, it was necessary to resolve the Google penalty.

Boring works

Incremental continuous improvement seems repetitive and boring: a one percent tweak here, a two percent tweak there. But over time, you've tripled your organic search traffic and your lead flow.

It isn't necessarily fun, but it delivers results.

Lesson learned: What I’ll call “infrastructure” is boring, and it matters. Both for your product and your SEO.

Figure out what to measure

The idea of measuring the amount of time required to complete an infographic did not occur to them on day one. This idea came up when they were looking for a metric to indicate to them how easy (or difficult) their product was to use.

Once they decided this metric made sense, they determined their baseline, then iteratively improved the product to make completing an infographic a little faster.

As they did so, the feedback from the users was positive, so they doubled down on this effort.

Lesson learned: What you measure matters.

Teach your coworkers well

They created “The Playbook”, which is a compendium of the combined knowledge they’ve accumulated over time. The playbook is written by them, for them.

Marketing employees are required to add chapters to the playbook as they learn new skills and methods.

Its primary purpose is to bring new team members up to speed quickly, and it also serves as a historical record of what did and did not work.

One important aspect of continuous improvement is for new people to avoid suggesting experiments that previously failed.

Additionally (and I love this), every month everyone in marketing gives Nadya an outline of what they’re learning and what they’re improving on.

Their marketing stack

While their marketing stack is not essential to understanding their processes, I find it useful to understand what software tools a marketing organization uses, and for what. So here is theirs. This is not a list of what they’ve used and abandoned over time, but rather a list of what they use now.

  • Analytics: Google Analytics and Mixpanel
  • Customer communications: Intercom
  • Link analysis and building: Ahrefs
  • Link building outreach: Mailshake
  • Project management: Trello
  • General purpose: G Suite

In closing

To me, what Nadya has done at Venngage is a case study in how to do SEO right, and most of doing it right is not technical SEO work.

  • Help senior management understand that some things that are not typically thought of as SEO (website design for example) can have serious SEO implications.
  • Get senior management buy-in to include these non-SEO functions in your SEO efforts.
  • Understand which few basic metrics matter for your company, and how to measure them.
  • Distribute required SEO work through as many people as reasonably possible. Include people whose job functions are not necessarily SEO related (writers, designers, UI/UX, and more).
  • Test and measure everything.
  • Win big through a continuous stream of small incremental improvements.

Venngage has led by example, and the guidelines and pointers shared above can help your organization improve its search performance and increase sales.

Kevin Carney is the Founder and CEO of the boutique link building agency Organic Growth. 

Related reading

Three fundamental factors in the production of link-building content
How to conduct a branded search audit
How to write SEO-friendly alt text for your images


Three ideas to create a high-converting product page Search Engine Watch


Tough competition in the ecommerce market makes retailers continuously search for new ideas to improve web stores’ UX. Optimizing the product page is one of the key areas in this quest for enhancements.

We reviewed the best practices of ecommerce leaders and success stories of smaller merchants, and came up with three hacks that make any product page convert more visitors into customers.

1. Optimize product descriptions

A good product description is a top factor influencing customers’ desire to purchase. The problem is customers want to get answers to their questions, but they don’t want to read a lot. Average web-surfers give a web page no more than 15 seconds to capture their attention. If a product description fails to meet this deadline, it fails to convert.

Customers think about different aspects of a product: Some are interested in materials, some are more concerned about durability. To make a product page convert well, you have to strike a balance between being informative and brief. Here are the best practices in product description derived from the success of market leaders:

  • Start with a unique value proposition: A brief product description that welcomes a potential shopper must clearly explain what is so special about this item. A selling product page doesn’t just list features; it shows what particular benefits customers get when they buy the item.
  • Avoid visual overload: Structure key information using headers and collapsible sections to save space on the page. This makes product pages more transparent and interactive, as well as minimizes the time required to get the key ideas.

The screenshot below shows how Oliver implements these principles on their product pages. They hide the detailed information about product features, materials and delivery options in expandable sections.

Screenshot of how Oliver implemented an expandable reading section on page

Mulberry went a step further and combined tabs for Description, Details, Material, and Size Charts with pop-ups for Delivery and Returns. The result? All types of customers get comprehensive information about the product without reloads or scrolling.

Screenshot of how Mulberry used pop ups to add product descriptions and details to save scroll time

The makers of The Sims 3 also proved that clarity and order drive conversion. They tested six versions of the “game launcher,” each highlighting specific benefits with a simpler design and less information. As a result, conversion increased by up to 128%.

2. Give people more images to describe items

Human beings are very good at processing visual information, much better than at reading. This means pictures and colors on product pages create the first impression of items and thus are even more important than descriptions.

  • Size matters: A product image is the only way for a customer to feel the product. So make sure that shoppers can zoom in to examine the product in detail (its fabric and tiny parts). These are not just words: when Skinner Auctions scaled their catalog images from 250 pixels to 350 pixels, conversions rose by 63%, and the number of bidding visitors who actually filled out all the online forms required to place a bid rose by an impressive 329%.
  • Angles matter as well: Surprisingly, it is a common mistake to show the product only facing forward. Customers want to see the interior pockets of a purse, the back of a dress, and the outsole of a shoe. A well-selling page features the product from different angles or even provides a video showing how it looks in motion. Look at ASOS, they allow you both to inspect the skirt’s texture and buttons and to watch a short video clip.

Screenshot of how ASOS helps buyers understand the texture and other product details

  • Customers want to try on items: Online shoppers are concerned about how items will suit real people like themselves rather than professional models. Many successful web stores show their products on people with different body shapes. This helps customers imagine themselves with items and make purchase decisions easier and faster.
  • We believe people more than models: Amazon and ModCloth ask their customers to share personal photos in the product reviews. Such a gallery is included in the product description to show customers how items look in everyday life and make the product page more trustworthy.

Screenshot example of adding real customers' pictures to encourage buyers to make purchases

3. Dialog with customers

Do online retailers have fewer opportunities to talk to their customers than brick-and-mortar stores do? Not really. Though communication between web stores and shoppers doesn’t happen face to face, merchants can still say everything customers want to hear and ask for everything they need to know.

Add an FAQ and tips to the description to clarify any doubts. An FAQ has several benefits as it:

  • Answers the questions of the customers that are already on the page.
  • Attracts new visitors to product pages from the browser’s search results.
  • Helps keep product descriptions short.

Apart from a full-fledged FAQ, you can try short tips as RollerSkateNation.com did. Their sincere advice was not aimed at increasing sales directly. Instead, it showed customers how to replace kids’ roller skates less frequently by buying a larger size and wearing double socks. Customers felt taken care of, and purchases increased by 69%.

Screenshot of how RollerskateNation's added Adam’s pro-tip to the page to increase sales

The position of the tips and the FAQ section is also important. In the above case study, RollerSkateNation managed to further boost revenue by 99% by placing their hint below the product description. Customers had enough time to process key details and then got really useful advice as a surprise.

Use reviews to build trust. When it comes to making a purchase decision, reviews are almost as important as product descriptions and prices. Most shoppers look for reviews and, at best, they can read credible feedback right on the product page. This way customers don’t have to leave the web store and are less likely to choose another vendor. The case study of Express Watches proves that a well-designed Reviews section can increase conversion by 59%. What does “well-designed” mean? The product page should let shoppers sort and filter reviews, and add images and star ratings. To show even more credibility, you can pick some reviews and put them forward as testimonials.

And be sure to handle negative reviews carefully. Respond with professionalism and care. In fact, a well-handled negative review can be even more convincing than a dozen positive ones.

Ask customers how to improve conversion. Small details, like words and button colors, influence the success of product pages. Though A/B tests make attempts to polish the web store less risky, don’t be shy to ask customers directly about their impression. For example, Amazon introduced a new feedback feature that shows how shoppers rate the size of the item.

Screenshot example of showing buyer preferences for products

By the way, this is a great CX feature per se that allows customers to quickly understand which size to take without exploring the size guide. But note their poll about the feature’s usefulness: why not ask customers directly when you can?

What’s next?

However good best practices are, they work well nine times out of ten. Unfortunately, there is no guarantee that your case isn’t the tenth one. Trust seals normally improve conversion as they make the website look trustworthy. But Icouponblog managed to increase their conversion by 400% by removing a security badge. What does this mean for you? The theory is worth reading, but real results appear only after you test and try. Devote enough time to validate your ideas, and you will definitely find the way to a high-converting product page.

Maria Marinina is a Digital Marketing Manager at Iflexion.

Related reading

How to write SEO-friendly alt text for your images
Less than 10% of marketers to focus on Digital PR in 2019, wise or unwise
How to perfectly balance affiliate marketing and SEO
International insight from Google Analytics


Using Python to recover SEO site traffic (Part three) Search Engine Watch


When you incorporate machine learning techniques to speed up SEO recovery, the results can be amazing.

This is the third and last installment of our series on using Python to speed up SEO traffic recovery. In part one, I explained how our unique approach, which we call “winners vs losers,” helps us quickly narrow down the pages losing traffic to find the main reason for the drop. In part two, we improved on our initial approach by manually grouping pages using regular expressions, which is very useful when you have sites with thousands or millions of pages, as is typically the case with ecommerce sites. In part three, we will learn something really exciting: how to automatically group pages using machine learning.

As mentioned before, you can find the code used in part one, two and three in this Google Colab notebook.

Let’s get started.

URL matching vs content matching

When we grouped pages manually in part two, we benefited from the fact that the URL groups had clear patterns (collections, products, and the others), but it is often the case that there are no patterns in the URL. For example, Yahoo Stores’ sites use a flat URL structure with no directory paths. Our manual approach wouldn’t work in this case.

Fortunately, it is possible to group pages by their contents because most page templates have different content structures. They serve different user needs, so their structures necessarily differ.

How can we organize pages by their content? We can use DOM element selectors for this. We will specifically use XPaths.

Example of using DOM elements to organize pages by their content

For example, I can use the presence of a big product image to know the page is a product detail page. I can grab the product image address in the document (its XPath) by right-clicking on it in Chrome and choosing “Inspect,” then right-clicking to copy the XPath.

We can identify other page groups by finding page elements that are unique to them. However, note that while this would allow us to group Yahoo Store-type sites, it would still be a manual process to create the groups.
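As a rough illustration of the idea, here is a minimal sketch of XPath-based page classification using the lxml library. The XPath and the sample HTML are made up for this example; in practice you would copy the real XPath from Chrome’s “Inspect” as described above.

```python
from lxml import html

# Hypothetical XPath for a product detail page's main image --
# in practice, copy the real one from Chrome's "Inspect" menu.
PRODUCT_IMAGE_XPATH = '//*[@id="main-product-image"]'

def classify_page(html_text):
    """Label a page 'product' if the product-image element is present."""
    tree = html.fromstring(html_text)
    if tree.xpath(PRODUCT_IMAGE_XPATH):
        return "product"
    return "other"

# Made-up sample pages.
product_page = '<html><body><img id="main-product-image" src="a.jpg"/></body></html>'
category_page = '<html><body><ul class="grid"><li>item</li></ul></body></html>'

print(classify_page(product_page))   # product
print(classify_page(category_page))  # other
```

Adding one such check per template gives you a simple rule-based classifier, but as the article notes, building those rules is still a manual process.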

A scientist’s bottom-up approach

In order to group pages automatically, we need to use a statistical approach. In other words, we need to find patterns in the data that we can use to cluster similar pages together because they share similar statistics. This is a perfect problem for machine learning algorithms.

BloomReach, a digital experience platform vendor, shared their machine learning solution to this problem. To summarize it, they first manually selected and cleaned features from HTML tags, such as class IDs, CSS stylesheet names, and others. Then, they automatically grouped pages based on the presence and variability of these features. In their tests, they achieved around 90% accuracy, which is pretty good.

When you give problems like this to scientists and engineers with no domain expertise, they will generally come up with complicated, bottom-up solutions. The scientist will say, “Here is the data I have, let me try different computer science ideas I know until I find a good solution.”

One of the reasons I advocate practitioners learn programming is that you can start solving problems using your domain expertise and find shortcuts like the one I will share next.

Hamlet’s observation and a simpler solution

For most ecommerce sites, most page templates include images (and input elements), and those generally change in quantity and size.

Hamlet's observation for a simpler approach based on domain-level observationsHamlet's observation for a simpler approach by testing the quantity and size of images

I decided to test the quantity and size of images, and the number of input elements, as my feature set. We were able to achieve 97.5% accuracy in our tests. This is a much simpler and more effective approach for this specific problem. All of this is possible because I didn’t start with the data I could access, but with a simpler domain-level observation.

I am not trying to say my approach is superior, as they have tested theirs in millions of pages and I’ve only tested this on a few thousand. My point is that as a practitioner you should learn this stuff so you can contribute your own expertise and creativity.

Now let’s get to the fun part and write some machine learning code in Python!

Collecting training data

We need training data to build a model. This training data needs to come pre-labeled with “correct” answers so that the model can learn from the correct answers and make its own predictions on unseen data.

In our case, as discussed above, we’ll use our intuition that most product pages have one or more large images on the page, and most category type pages have many smaller images on the page.

What’s more, product pages typically have more form elements than category pages (for filling in quantity, color, and more).

Unfortunately, crawling a web page for this data requires knowledge of web browser automation and image manipulation, which are outside the scope of this post. Feel free to study this GitHub gist we put together to learn more.

Here we load the raw data already collected.
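The loading code itself isn’t reproduced in this copy. As a stand-in, here are small made-up versions of the two data frames described below; the real notebook loads pre-collected data from the crawl.

```python
import pandas as pd

# Made-up sample rows -- the real notebook loads pre-collected crawl data.
# form_counts: one row per URL, with counts of <form> and <input> elements.
form_counts = pd.DataFrame({
    "url": ["/product/blue-shirt", "/collections/shirts"],
    "form_count": [2, 1],
    "input_count": [5, 1],
})

# img_counts: one row per image, so each URL usually contributes many rows.
img_counts = pd.DataFrame({
    "url": ["/product/blue-shirt", "/collections/shirts", "/collections/shirts"],
    "file_size": [150_000, 12_000, 9_500],
    "height": [800, 200, 200],
    "width": [800, 200, 200],
})

print(form_counts.shape, img_counts.shape)  # (2, 3) (3, 4)
```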

Feature engineering

Each row of the form_counts data frame above corresponds to a single URL and provides a count of both form elements, and input elements contained on that page.

Meanwhile, in the img_counts data frame, each row corresponds to a single image from a particular page. Each image has an associated file size, height, and width. Pages are likely to have multiple images, and so there are many rows corresponding to each URL.

It is often the case that HTML documents don’t include explicit image dimensions. We are using a little trick to compensate for this. We are capturing the size of the image files, which is roughly proportional to the product of the width and height of the images.

We want our image counts and image file sizes to be treated as categorical features, not numerical ones. When a numerical feature, say new visitors, increases, it generally implies improvement, but we don’t want bigger images to imply improvement. A common technique to do this is called one-hot encoding.

Most site pages can have an arbitrary number of images. We are going to further process our dataset by bucketing images into 50 groups. This technique is called “binning”.
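The two techniques can be sketched together in a few lines of pandas. The file sizes below are made up, and three bins are used instead of the article’s 50 to keep the output readable.

```python
import pandas as pd

# Hypothetical image file sizes (bytes) for a handful of images.
img_sizes = pd.Series([1_200, 45_000, 150_000, 8_000, 300_000])

# "Binning": bucket the continuous file sizes into a fixed number of
# groups (the article uses 50; 3 keeps this sketch readable).
bins = pd.cut(img_sizes, bins=3, labels=["small", "medium", "large"])

# One-hot encoding: turn the categorical bins into 0/1 indicator columns
# so the model can't interpret "bigger" as "better".
one_hot = pd.get_dummies(bins, prefix="img_size")
print(one_hot.columns.tolist())
# ['img_size_small', 'img_size_medium', 'img_size_large']
```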

Here is what our processed data set looks like.

Example view of processed data for "binning"

Adding ground truth labels

As we already have correct labels from our manual regex approach, we can use them as the ground truth labels to feed the model.

We also need to split our dataset randomly into a training set and a test set. This allows us to train the machine learning model on one set of data, and test it on another set that it’s never seen before. We do this to prevent our model from simply “memorizing” the training data and doing terribly on new, unseen data.
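A minimal sketch of this split using scikit-learn’s train_test_split, with a made-up feature matrix and labels standing in for the real engineered features:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix (rows = pages) and page-group labels
# taken from the earlier regex-based grouping.
X = np.arange(20).reshape(10, 2)
y = np.array(["product"] * 5 + ["category"] * 5)

# Hold out 30% of the pages so the model is scored on data it never saw;
# stratify keeps the label proportions similar in both sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)
print(X_train.shape, X_test.shape)  # (7, 2) (3, 2)
```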

Model training and grid search

Finally, the good stuff!

All the steps above, the data collection and preparation, are generally the hardest part to code. The machine learning code is generally quite simple.

We’re using the well-known scikit-learn Python library to train a number of popular models using a bunch of standard hyperparameters (settings for fine-tuning a model). Scikit-learn will run through all of them to find the best one. We simply need to feed in the X variables (our engineered features above) and the Y variables (the correct labels) to each model, call the .fit() function, and voila!
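The search itself can be sketched like this, using synthetic stand-in data and a deliberately tiny hyperparameter grid (the notebook’s actual grid is larger):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.svm import LinearSVC

# Stand-in data; in the article, X is the engineered image/input-element
# features and y the page-group labels.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Try each candidate model over a small grid of hyperparameters,
# using 5-fold cross-validation to score each combination.
candidates = {
    "linear_svm": (LinearSVC(), {"C": [0.1, 1, 10]}),
    "logistic": (LogisticRegression(max_iter=1000), {"C": [0.1, 1, 10]}),
}
for name, (model, grid) in candidates.items():
    search = GridSearchCV(model, grid, cv=5)
    search.fit(X, y)
    print(name, round(search.best_score_, 3))
```

GridSearchCV handles the cross-validation loop for you; `best_score_` and `best_params_` then tell you which combination won.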

Evaluating performance

Graph for evaluating image performances through a linear pattern

After running the grid search, we find our winning model to be the linear SVM (0.974), with logistic regression (0.968) coming in a close second. Even with such high accuracy, a machine learning model will make mistakes. If it doesn’t make any mistakes, then there is definitely something wrong with the code.

In order to understand where the model performs best and worst, we will use another useful machine learning tool, the confusion matrix.

Graph of the confusion matrix to evaluate image performance

When looking at a confusion matrix, focus on the diagonal squares. The counts there are correct predictions, and the counts outside are failures. In the confusion matrix above we can quickly see that the model does really well labeling products, but terribly at labeling pages that are neither products nor categories. Intuitively, we can assume that such pages don’t have consistent image usage.

The code to put together the confusion matrix and to plot the model evaluation can be found in the Google Colab notebook linked above.
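As a rough illustration of both steps, a minimal confusion-matrix sketch with scikit-learn and matplotlib could look like this. The labels and predictions below are made up; the notebook works with the real model’s output.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt
from sklearn.metrics import ConfusionMatrixDisplay, confusion_matrix

# Hypothetical true vs predicted page groups.
y_true = ["product", "product", "category", "category", "other", "product"]
y_pred = ["product", "product", "category", "product", "category", "product"]
labels = ["product", "category", "other"]

cm = confusion_matrix(y_true, y_pred, labels=labels)
print(cm)  # correct predictions sit on the diagonal

# Plot it: rows are true labels, columns are predicted labels.
disp = ConfusionMatrixDisplay(cm, display_labels=labels)
disp.plot(cmap="Blues")
plt.savefig("confusion_matrix.png")
```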

Resources to learn more

You might be thinking that this is a lot of work just to tell page groups apart, and you are right!

Screenshot of a query on custom PageTypes and DataLayer

Mirko Obkircher commented in my article for part two that there is a much simpler approach, which is to have your client set up a Google Analytics data layer with the page group type. Very smart recommendation, Mirko!

I am using this example for illustration purposes. What if the issue requires a deeper exploratory investigation? If you already started the analysis using Python, your creativity and knowledge are the only limits.

If you want to jump onto the machine learning bandwagon, here are some resources I recommend to learn more:

Got any tips or queries? Share it in the comments.

Hamlet Batista is the CEO and founder of RankSense, an agile SEO platform for online retailers and manufacturers. He can be found on Twitter.

Related reading

Complete guide to Google Search Console
Google tests AR for Google Maps
Considerations for businesses across local search, hyperlocal SEO, and UX
robots.txt best practice guide
How to pick the best website audit tool for your digital agency
