Directory submission is a tactic that has evolved dramatically since it first emerged. First, it is rarely called directory submission anymore, simply because the term has acquired negative connotations over the years.
Second, the goals have changed: we no longer focus on link acquisition. Come to think of it, link building as a whole has undergone the same evolution: it has become more integrated, meaning we now pursue tactics beyond link building while still hoping to earn some links along the way.
Some of the non-link-building benefits of getting listed (which may still result in links) include:
- Proactive reputation management (i.e. making sure your business name is mentioned a lot across the web)
- Discoverability (i.e. making sure your business is there when people use the directory search to find what they need). This comes with traffic and leads, which is always nice.
Getting listed: the opportunities
If you think directories are dead, think again: there are plenty of new and old directories out there that can send you traffic and leads. Here are just a few categories to look into.
SaaS and B2B directories
These come in several types and forms. Some are more traditional (free to join, with the option of a one-time charge for a premium review), while others charge a monthly or yearly fee:
- Business.com ($299 annual fee)
- Tuugo.us ($9.99 per month for a premium listing)
- Yalwa.com ($4.95 per month for a premium listing)
These deserve a separate article (which you can find here). Apart from sending local traffic (from people trying to discover a local service), they are also quite useful for so-called local citation building; in other words, they help search engines associate your business with important locations.
Getting listed: the smart way
There are many more useful directories out there that can still drive sales, but choose wisely; in many cases, it’s an investment of some sort. In addition, it’s paramount to stay away from penalized directories. Here are a few tools I use to evaluate whether any directory or platform is worth the investment:
Find whether the platform ranks in Google
Does Google think a directory is good enough to rank it high in search results? Search positions are among the most reliable signs of a site’s health.
Not many sites will let you see these stats for free; Serpstat is one of the most affordable options.
Simply run the domain through Serpstat to quickly see where it ranks and how its rankings are distributed across different search engines. There are also tools to check whether the domain is ever featured in Google, which is another important health signal. Here is the list of tools you can use.
Find whether the platform has any traffic
Since creating an alternative traffic source is one of the main goals here, this check is vital. There aren’t many reliable ways to evaluate a website’s traffic unless you own it, but these tools are decent:
- Alexa.com: its major data source is its own toolbar, which may mean the data is somewhat limited. Still, it is the oldest player in the field and therefore quite trustworthy
- SimilarWeb.com: read more about their data sources here: “global ISP data, and thousands of add-ons, extensions, apps and plugins, plus a team of web crawlers that scan thousands of websites”.
Check whether your subcategory is linked to from elsewhere
I wouldn’t be an SEO if I paid no attention to backlinks, but in my defense, links are not just a sign of SEO ‘authority’; they signal quality too: if someone links to a page, it is probably a good one.
I use Ahrefs’ bulk backlink analysis feature to quickly run a lot of pages and sections and pick the best ones.
[NB: I only mention directories that have proven worth the investment based on their rankings and traffic.]
Have you listed your website in some directories and seen some solid traffic and leads? Share your tips and resources in the comments.
Navneet Panda, after whom the Google Panda update is named, is a co-inventor of a newly granted patent that focuses on site quality scores. It’s worth studying to understand how the process it describes determines the quality of sites.
Back in 2013, I wrote the post Google Scoring Gibberish Content to Demote Pages in Rankings, about Google using ngrams from sites and building language models from them to determine whether those sites were filled with gibberish or spammy content. I was reminded of that post when I read this patent.
Rather than explaining what ngrams are in this post (which I did in the gibberish post), I’m going to point to an example of ngrams at the Google n-gram viewer, which shows Google indexing phrases in scanned books. This article published by Wired also focused on ngrams: The Pitfalls of Using Google Ngram to Study Language.
An ngram phrase could be a 2-gram, a 3-gram, a 4-gram, or a 5-gram; pages are broken down into two-word, three-word, four-word, or five-word phrases. If a body of pages is broken down into ngrams, those ngrams can be used to create language models or phrase models to compare against other pages.
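To make that concrete, here is a minimal sketch of how a page’s text can be broken into n-word phrases. This is purely illustrative (the function name and sample sentence are my own), not Google’s implementation:

```python
from typing import List

def ngrams(text: str, n: int) -> List[str]:
    """Break text into overlapping n-word phrases (ngrams)."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

page = "the quick brown fox jumps over the lazy dog"
print(ngrams(page, 2)[:3])  # ['the quick', 'quick brown', 'brown fox']
print(ngrams(page, 5)[:1])  # ['the quick brown fox jumps']
```

A real system would also normalize punctuation and tokenization, but the sliding-window idea is the same for 2-grams through 5-grams.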
Language models like the ones Google used to create gibberish scores for sites could also be used to determine the quality of sites, if example sites were used to generate those language models. That seems to be the idea behind the patent granted this week. The summary section of the patent tells us about this use of the process it describes and protects:
In general, one innovative aspect of the subject matter described in this specification can be embodied in methods that include the actions of obtaining baseline site quality scores for a plurality of previously-stored sites; generating a phrase model for a plurality of sites including the plurality of previously-scored sites, wherein the phrase model defines a mapping from phrase-specific relative frequency measures to phrase-specific baseline site quality scores; for a new site, the new site not being one of the plurality of previously-scored sites, obtaining a relative frequency measure for each of a plurality of phrases in the new site; determining an aggregate site quality score for the new site from the phrase model using the relative frequency measures of the plurality of phrases in the new site; and determining a predicted site quality score for the new site from the aggregate site quality score.
The newly granted patent from Google is:
Predicting site quality
Inventors: Navneet Panda and Yun Zhou
US Patent: 9,767,157
Granted: September 19, 2017
Filed: March 15, 2013
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicting a measure of quality for a site, e.g., a web site. In some implementations, the methods include obtaining baseline site quality scores for multiple previously scored sites; generating a phrase model for multiple sites including the previously scored sites, wherein the phrase model defines a mapping from phrase specific relative frequency measures to phrase specific baseline site quality scores; for a new site that is not one of the previously scored sites, obtaining a relative frequency measure for each of a plurality of phrases in the new site; determining an aggregate site quality score for the new site from the phrase model using the relative frequency measures of phrases in the new site; and determining a predicted site quality score for the new site from the aggregate site quality score.
In addition to generating ngrams from the text on sites, some implementations of this patent also include generating ngrams from the anchor text of links pointing to pages of the sites. Building a phrase model involves calculating the frequency of each n-gram on a site “based on the count of pages divided by the number of pages on the site.”
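As a rough illustration of that frequency calculation (the helper names and the two-page toy site are my own, not from the patent), a per-site relative frequency might be computed like this: count the pages containing each ngram, then divide by the total number of pages on the site:

```python
from collections import Counter
from typing import Dict, List, Set

def page_ngrams(text: str, n_values=(2, 3, 4, 5)) -> Set[str]:
    """Collect the distinct 2- to 5-word phrases appearing on one page."""
    words = text.lower().split()
    grams = set()
    for n in n_values:
        grams.update(" ".join(words[i:i + n]) for i in range(len(words) - n + 1))
    return grams

def relative_frequencies(pages: List[str]) -> Dict[str, float]:
    """Count of pages containing each ngram, divided by pages on the site."""
    page_counts = Counter()
    for text in pages:
        page_counts.update(page_ngrams(text))
    total = len(pages)
    return {gram: count / total for gram, count in page_counts.items()}

site = ["best cheap widgets online", "buy cheap widgets today"]
freqs = relative_frequencies(site)
print(freqs["cheap widgets"])  # appears on 2 of 2 pages, so 1.0
```

Counting each ngram once per page (rather than once per occurrence) matches the page-count wording quoted above.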
The patent tells us that site quality scores can impact the rankings of pages from those sites:
Obtain baseline site quality scores for a number of previously-scored sites. The baseline site quality scores are scores used by the system, e.g., by a ranking engine of the system, as signals, among other signals, to rank search results. In some implementations, the baseline scores are determined by a backend process that may be expensive in terms of time or computing resources, or by a process that may not be applicable to all sites. For these or other reasons, baseline site quality scores are not available for all sites.
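To tie the pieces together, here is a toy sketch of the overall flow the patent describes: build a phrase model from sites that already have baseline quality scores, then use it to predict a score for a new, unscored site. The frequency-weighted averaging used here is my own assumption for illustration; the patent does not spell out the exact mapping function, and the site data is invented:

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def build_phrase_model(scored_sites: List[Tuple[Dict[str, float], float]]) -> Dict[str, float]:
    """Map each phrase to a frequency-weighted average of the baseline
    quality scores of the previously-scored sites it appears on."""
    weighted = defaultdict(float)
    weights = defaultdict(float)
    for freqs, baseline in scored_sites:
        for phrase, rel_freq in freqs.items():
            weighted[phrase] += rel_freq * baseline
            weights[phrase] += rel_freq
    return {p: weighted[p] / weights[p] for p in weighted}

def predict_site_quality(model: Dict[str, float], freqs: Dict[str, float]) -> float:
    """Aggregate phrase-level scores for a new, unscored site."""
    known = {p: f for p, f in freqs.items() if p in model}
    if not known:
        return 0.0
    total = sum(known.values())
    return sum(model[p] * f for p, f in known.items()) / total

# toy training data: (phrase -> relative frequency, baseline quality score)
scored = [
    ({"cheap widgets": 1.0, "free shipping": 0.5}, 0.2),   # low-quality site
    ({"widget reviews": 1.0, "free shipping": 0.2}, 0.9),  # high-quality site
]
model = build_phrase_model(scored)
print(predict_site_quality(model, {"cheap widgets": 0.8}))
```

A new site dominated by phrases common on low-scoring sites inherits a low predicted score, which is the "expensive baseline process once, cheap phrase-based prediction everywhere else" idea the summary describes.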
The post Using Ngram Phrase Models to Generate Site Quality Scores appeared first on SEO by the Sea ⚓.