Monthly Archives: April 2018
There are many situations where it is a good idea to test PPC bidding strategies on campaigns. In this article, we discuss the reasons why you might want to avoid using automated bid strategies.
Read more at PPCHero.com
Whether you’re hunting for a 200GB MicroSD card or a new laptop, we’ve got a dozen tech deals to check out this weekend.
Feed: All Latest
DocuSign CEO Dan Springer was all smiles at the Nasdaq on Friday, following the company’s public debut.
And he had a lot to be happy about. After pricing the IPO at a better-than-expected $29, the company raised $629 million. Then DocuSign finished its first day of trading at $39.73, up 37% in its debut.
Springer, who took over DocuSign just last year, spoke with TechCrunch in a video interview about the direction of the company. “We’ve figured out a way to help businesses really transform the way they operate,” he said about document-signing business. The goal is to “make their life more simple.”
But when asked about the competitive landscape, which includes Adobe Sign and HelloSign, Springer was confident that DocuSign is well-positioned to remain the market leader. “We’re becoming a verb,” he said. Springer believes that DocuSign has convinced large enterprises that it is the most secure platform.
Yet the IPO was a long time coming. The company was formed in 2003 and raised over $500 million over the years from Sigma Partners, Ignition Partners, Frazier Technology Partners, Bain Capital Ventures and Kleiner Perkins, amongst others. It is not uncommon for a venture-backed company to take a decade to go public, but 15 years is atypical, even among those that ever reach this coveted milestone.
Dell Technologies Capital president Scott Darling, who sits on the board of DocuSign, said that now was the time to go public because he believes the company “is well positioned to continue aggressively pursuing the $25 billion e-signature market and further revolutionizing how business agreements are handled in the digital age.”
Sales are growing, but the company is not yet profitable. DocuSign brought in $518.5 million in revenue for its fiscal year ending in 2018, up from $381.5 million the year before and $250.5 million the year before that. Losses for this year were $52.3 million, down from $115.4 million last year and $122.6 million in 2016.
Springer says DocuSign won’t be in the red for much longer. The company is “on that fantastic path to GAAP profitability.” He believes that international expansion is a big opportunity for growth.
Google’s mobile-first index is here, causing fresh uncertainty about potential SEO impacts – but there are a number of proactive steps to take to manage risk and maximize ranking opportunities.
Rather than passively wait to feel the impact of the shift to mobile-first indexation, we advise companies to take six specific actions to prepare for opportunities and protect site performance as the mobile-first index is rolled out throughout 2018.
Brands that have been prioritizing mobile performance shouldn’t experience a negative impact from the mobile-first index, but an honest and systematic re-evaluation is required. Companies that have allowed the mobile and desktop experience to diverge over the years will likely experience change – rankings could be lost (or gained) as a result of the switch.
Before diving in and making changes to prepare for the mobile-first index, we recommend running a full audit of current desktop and mobile rankings in all the regions your company does business in, along with top performing pages.
Track this performance over time and any losses or gains in keyword visibility should be clear to see – along with their potential causes. The common thread across the six actions below is Google’s determination to provide accurate answers to users in the channel that is used most frequently – mobile.
Keep that at the heart of your SEO strategy and things should be fine – but having a plan certainly helps.
- Action one – go mobile-responsive
Even today, too few marketers and SEO professionals meaningfully differentiate between responsive, mobile-friendly and standalone mobile sites – but that difference will become especially important in 2018.
A responsive website adjusts (or responds) based on user activity and the device used. Typical features of a responsive site include minimal navigation, images optimized for mobile and content that shifts seamlessly according to the size of the display.
In comparison, a mobile-friendly design is often anything but mobile-friendly, attempting to show content on a mobile device as it appears on a desktop, giving users the frustrating experience of having to manually zoom in or squint at small fonts.
Finally, some brands still operate standalone mobile sites, completely separate from the desktop experience. With responsive and mobile-friendly sites, there shouldn’t be any difference in content from a desktop version of a site.
However, a mobile-friendly site may be disproportionately skewed towards the desktop experience with an impact on factors like mobile site speed, navigation and general usability – and these are all areas of concern when considering how Google evaluates quality in 2018.
With a separate mobile site, marketers need to make sure that the mobile version contains everything (useful) that the desktop site does – which could be a lot of work, depending on your mobile strategy so far.
For some brands still lingering with standalone mobile sites, the shift to the mobile-first index may be the nudge needed to move to a fully responsive approach to the site.
Whether you operate responsive, mobile-friendly or a standalone mobile site, the first action we recommend is to identify any differences and either add to or completely overhaul the mobile sites you manage.
While desktop sites will continue to factor into rankings as a secondary consideration (and it is vanishingly unlikely that longstanding sites with many well-earned rankings will be wiped off the SERPs), making sure the mobile experience contains all the relevant content of the desktop experience – including all structured data, meta descriptions, alt text and schema – is an important protective step.
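To make that parity check concrete, here is a minimal, hypothetical sketch in Python that diffs the named meta tags and image alt text extracted from a desktop and a mobile version of a page. It uses only the standard library and toy HTML strings; in a real audit you would fetch each version with the appropriate User-Agent (or use a dedicated crawler), and the class and function names here are my own:

```python
from html.parser import HTMLParser

class MetaCollector(HTMLParser):
    """Collects named <meta> tags and image alt text from an HTML page."""
    def __init__(self):
        super().__init__()
        self.items = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and "name" in attrs:
            self.items.add((attrs["name"], attrs.get("content", "")))
        elif tag == "img" and attrs.get("alt"):
            self.items.add(("img-alt", attrs["alt"]))

def parity_gaps(desktop_html, mobile_html):
    """Return meta/alt items present on desktop but missing on mobile."""
    d, m = MetaCollector(), MetaCollector()
    d.feed(desktop_html)
    m.feed(mobile_html)
    return d.items - m.items

# Toy example: the mobile page drops the description and an alt attribute.
desktop = '<meta name="description" content="Blue widgets"><img src="w.png" alt="widget">'
mobile = '<img src="w.png">'
gaps = parity_gaps(desktop, mobile)
```

Anything the diff reports is a candidate for the "add to or completely overhaul" work described above.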
- Action two – optimize site speed versus competitors
The mobile-first index flips previous logic – when 80% of evaluations about rankings were based on desktop crawling and indexing, site speed considerations were less of a concern.
However, as Google crawls mobile sites while mimicking a (not-very-good) mobile connection, slow performance, elements that struggle to load and broken links will quickly use up crawl equity and signal that your site is less efficient at delivering the answers users want relative to your competitors.
In addition to Google tools, we regularly use platforms like GTMetrix, Pingdom, DareBoost and WebPageTest.org to get a complete view of speed issues.
Particularly for international sites, testing mobile speed from different locations and comparing these measurements to those of your key competitors will help establish practical targets to aim for. Although Google frequently mentions a target page speed of under three seconds as being ideal, in practical reality and SEO terms, aiming to be better than your competitors should be enough.
As with SEO in general, speed optimization recalls an old joke: ‘you don’t have to run faster than the bear to get away. You just have to run faster than the guy next to you.’
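Combining Google’s oft-mentioned sub-three-second ideal with the ‘outrun the competitor, not the bear’ principle, a practical speed target could be sketched like this. The timing helper and the take-the-stricter-of-both rule are my assumptions – real audits should rely on the tools named above, which measure full page rendering rather than a single HTML fetch:

```python
import time
from urllib.request import urlopen

def measure_load(url, timeout=10):
    """Crude single-request timing: fetch the HTML and read the full body.

    This ignores images, scripts and rendering, so treat it only as a
    rough, repeatable baseline for comparing your page to competitors'.
    """
    start = time.perf_counter()
    with urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.perf_counter() - start

def practical_target(competitor_seconds, google_ideal=3.0):
    """Aim to beat the fastest competitor, but never settle for slower
    than Google's frequently mentioned ~3-second ideal."""
    return min(min(competitor_seconds), google_ideal)

# e.g. competitors measured at 4.1s, 2.4s and 5.0s -> aim for under 2.4s
target = practical_target([4.1, 2.4, 5.0])
```

Measuring from several locations (as recommended for international sites) just means running the same baseline from different vantage points and keeping the worst case as your target.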
The challenge for SEO professionals is to identify slow-loading elements that can be improved without too much damage to the brand experience or taking away content that is useful for users.
- Action three – optimize the customer journey
Understanding the intent of site visitors and reducing barriers from their first click in the SERPS to the information they are looking for should result in positive user experiences – and minimize the risk that comes from a site experience that causes confusion, fruitless clicking around and pushes customers away.
Although there’s some fuzziness about quite how Google interprets the quality of a user’s visit – and how it rewards that quality in terms of rankings – we advise researching the different types of mobile journeys your customers take in a systematic way and making them more efficient.
Though much ‘best practice’ SEO advice has in the past been based around engagement and keeping visitors on the site, we all know that site visitors often stick around because they’re being frustrated by unclear navigation and a poor approach to customer journey planning.
Users are more impatient with poor customer journeys on mobile – and we must anticipate that Google will feel the same. Though helping visitors get the answers they seek more quickly may actually decrease dwell time, we’re confident that Google and other search engines will differentiate between a short visit followed by a swift return to the SERPs, and a short visit that successfully ends the user’s search.
Evaluating bounce rates and the success of the mobile user journey using heat-mapping tools like Hotjar or user research panels like Peek User Testing will bring in objective data to answer whether your visitors are engaged and loving your content, or hitting barriers and getting increasingly annoyed.
In the mobile-index era, we predict that this annoyance will have a greater impact on rankings – and so is a risk to be managed carefully.
While taking steps to understand your assets and protect your rankings is important, the shift to the mobile-first index is also a big opportunity to get ahead of competitors who are less prepared. Knowing that others will be slow to react gives an extra incentive to put real effort into SEO strategies that will positively differentiate your brand from competitors.
- Action four – prioritize content formatting that excels on mobile
A lot of the content marketing produced by brands (such as infographics, interactive microsites, mega pages and even video, depending on the platform) still displays poorly on mobile devices.
Taking a mobile-first mindset and prioritizing everyday content and content marketing assets that work particularly well on mobile devices will resonate best with both customers and search engines. Fortunately, there are a lot of methodologies that can be used to provide depth of content that is engaging and easily navigable on mobile.
One of the biggest changes is the resurgence of expandable content areas like tabs, accordions and filters. Filters that hide content not relevant to a visitor’s specific query, tabs that reveal further information when clicked and accordions that expand the page are all familiar to site visitors – and allow a single web page to be seamlessly used in multiple ways by multiple audiences.
While these have sometimes been seen by Google and other search engines as a potentially sneaky way to cram content into a page, Google is on record as stating that content hidden to make a mobile site more efficient and speedier to explore will be taken into full consideration.
While competitors may have a responsive or mobile-friendly site and feel that this is enough preparation, many will likely still take a desktop-first mindset, creating overloaded pages that are tedious to wade through on mobile devices.
Adopting a customer-focused, mobile-first mindset – arranging content so it can be skimmed easily via logical headings, bolded main points, pull-out quotes, numbered lists, bullet points and more – will support mobile visitors and differentiate you from competitors, while allowing search engine bots to crawl effectively.
- Action five – evaluate AMP and progressive web apps
Accelerated Mobile Pages (AMP) templates are easily applied in the code, with well-established procedures for serving the speedy AMP version to search engines while the slower (but perhaps more visual) non-AMP version is still recognized for ranking purposes via a canonical tag.
Progressive Web Apps use browser feature detection to give a fast, app-like experience that can be loaded from a mobile home screen or simply visited with a direct link. Websites that have a lot of moving parts and a lot of returning traffic, for example in e-commerce or other transactional sites, are the most well suited for Progressive Web Apps as they can massively streamline the user experience.
In both cases, although implementation is comparatively straightforward, you can bet that a minority of companies in your industry will have a systematic approach to using these technologies.
Being fast, being relevant and being right are key watchwords for future mobile-first SEO and using technologies that help speed, indexation and the user experience is a positive and proactive step.
- Action six – identify competitors to beat
As discussed, not every competitor will be thinking systematically about the mobile-first index, or the changing nature of SEO in general. That opens up the possibility that, by being faster and more focused, some previously difficult-to-rank-for keywords will become more attainable.
We advise clients to use their business and industry knowledge to identify competitors that rank ahead of them but may be less responsive to change and underprepared for the mobile-first index.
Building these target keywords into your mobile strategy and wider SEO strategy – including off-site SEO and link earning – should result in some strong opportunities.
Conclusion – manage risk, capitalize on opportunities
For some, the mobile-first index won’t result in anything transformational – if you’ve been following best practice for years and your main competitors have been doing likewise, there probably won’t be any game-changing shifts.
However, in any period of uncertainty there are opportunities to take advantage of and risks to manage – and in competitive SEO niches, taking every chance to get ahead is important.
Whatever your starting point – the mobile-first index is the new normal in SEO, and now is the time to get to grips with the challenge – and potential.
A popular search engine developed by Google Inc. of Mountain View, Calif. uses PageRank® as a page-quality metric for efficiently guiding the processes of web crawling, index selection, and web page ranking. Generally, the PageRank technique computes and assigns a PageRank score to each web page it encounters on the web, wherein the PageRank score serves as a measure of the relative quality of a given web page with respect to other web pages. PageRank generally ensures that important and high-quality web pages receive high PageRank scores, which enables a search engine to efficiently rank the search results based on their associated PageRank scores.
A continuation patent covering an updated PageRank was granted today. The original patent was filed in 2006, and reminded me a lot of Yahoo’s TrustRank (which is cited by the patent’s applicants as one of a large number of documents that this new version of the patent builds upon).
I first wrote about this patent in the post titled, Recalculating PageRank. It was originally filed in 2006, and the first claim in the patent read like this (note the mention of “Seed Pages”):
What is claimed is:
1. A method for producing a ranking for pages on the web, comprising: receiving a plurality of web pages, wherein the plurality of web pages are inter-linked with page links; receiving n seed pages, each seed page including at least one outgoing link to a respective web page in the plurality of web pages, wherein n is an integer greater than one; assigning, by one or more computers, a respective length to each page link and each outgoing link; identifying, by the one or more computers and from among the n seed pages, a kth-closest seed page to a first web page in the plurality of web pages according to the lengths of the links, wherein k is greater than one and less than n; determining a ranking score for the first web page from a shortest distance from the kth-closest seed page to the first web page; and producing a ranking for the first web page from the ranking score.
The first claim in the newer version of this continuation patent is:
What is claimed is:
1. A method, comprising: obtaining data identifying a set of pages to be ranked, wherein each page in the set of pages is connected to at least one other page in the set of pages by a page link; obtaining data identifying a set of n seed pages that each include at least one outgoing link to a page in the set of pages, wherein n is greater than one; accessing respective lengths assigned to one or more of the page links and one or more of the outgoing links; and for each page in the set of pages: identifying a kth-closest seed page to the page according to the respective lengths, wherein k is greater than one and less than n, determining a shortest distance from the kth-closest seed page to the page; and determining a ranking score for the page based on the determined shortest distance, wherein the ranking score is a measure of a relative quality of the page relative to other pages in the set of pages.
Producing a ranking for pages using distances in a web-link graph
Inventors: Nissan Hajaj
Assignee: Google LLC
US Patent: 9,953,049
Granted: April 24, 2018
Filed: October 19, 2015
One embodiment of the present invention provides a system that produces a ranking for web pages. During operation, the system receives a set of pages to be ranked, wherein the set of pages are interconnected with links. The system also receives a set of seed pages which include outgoing links to the set of pages. The system then assigns lengths to the links based on properties of the links and properties of the pages attached to the links. The system next computes shortest distances from the set of seed pages to each page in the set of pages based on the lengths of the links between the pages. Next, the system determines a ranking score for each page in the set of pages based on the computed shortest distances. The system then produces a ranking for the set of pages based on the ranking scores for the set of pages.
Under this newer version of PageRank, we see how it might avoid manipulation by building trust into a link graph like this:
One possible variation of PageRank that would reduce the effect of these techniques is to select a few “trusted” pages (also referred to as the seed pages) and discover other pages which are likely to be good by following the links from the trusted pages. For example, the technique can use a set of high-quality seed pages (s_1, s_2, . . . , s_n), and for each seed page i = 1, 2, . . . , n, the system can iteratively compute the PageRank scores for the set of web pages P using the formula:

∀ p ≠ s_i, p ∈ P: R_i(p) = Σ_{q ∈ P, q → p} w(q → p) · R_i(q) / OutDeg(q)

where R_i(s_i) = 1, and w(q → p) is an optional weight given to the link q → p based on its properties (with a default weight of 1).
Generally, it is desirable to use a large number of seed pages to accommodate the different languages and a wide range of fields which are contained in the fast growing web contents. Unfortunately, this variation of PageRank requires solving the entire system for each seed separately. Hence, as the number of seed pages increases, the complexity of computation increases linearly, thereby limiting the number of seeds that can be practically used.
Hence, what is needed is a method and an apparatus for producing a ranking for pages on the web using a large number of diversified seed pages without the problems of the above-described techniques.
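Read literally, the quoted seed-page variation can be sketched in a few lines of Python. This is an illustrative toy with my own names, default link weights of 1 and no damping – not the patented system:

```python
def seeded_pagerank(links, seed, iters=50):
    """Iteratively propagate rank outward from one trusted seed page.

    `links` maps each page to the pages it links to. On every pass, each
    non-seed page's score becomes the sum of its in-neighbours' scores
    divided by their out-degree (default link weight of 1, as quoted).
    """
    pages = set(links) | {p for outs in links.values() for p in outs}
    rank = {p: 0.0 for p in pages}
    rank[seed] = 1.0  # R_i(s_i) = 1
    for _ in range(iters):
        new = {p: 0.0 for p in pages}
        new[seed] = 1.0
        for q, outs in links.items():
            for p in outs:
                if p != seed:
                    new[p] += rank[q] / len(outs)
        rank = new
    return rank

# Tiny graph: the seed endorses "a" and "b"; "a" also links to "b".
graph = {"seed": ["a", "b"], "a": ["b"], "b": []}
scores = seeded_pagerank(graph, "seed")
```

Because every seed requires its own full propagation like this, the cost grows linearly with the number of seeds – exactly the scalability problem the patent describes as motivating its distance-based approach.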
The summary of the patent describes it like this:
One embodiment of the present invention provides a system that ranks pages on the web based on distances between the pages, wherein the pages are interconnected with links to form a link-graph. More specifically, a set of high-quality seed pages are chosen as references for ranking the pages in the link-graph, and shortest distances from the set of seed pages to each given page in the link-graph are computed. Each of the shortest distances is obtained by summing lengths of a set of links which follows the shortest path from a seed page to a given page, wherein the length of a given link is assigned to the link based on properties of the link and properties of the page attached to the link. The computed shortest distances are then used to determine the ranking scores of the associated pages.
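The summary’s ranking procedure – shortest distances over a link graph from a set of trusted seeds, with the kth-closest seed used so no single seed dominates – can be sketched roughly as follows. This is an illustrative reading of the claims, not Google’s implementation; the edge-length map, function names and the plain Dijkstra are my assumptions:

```python
import heapq

def kth_seed_distances(link_lengths, seeds, k=2):
    """Shortest distance from each page to its kth-closest seed page.

    `link_lengths` maps (source, target) link pairs to the lengths the
    system assigns them; the patent derives a page's ranking score from
    this distance (closer to trusted seeds = higher quality).
    """
    # Build an adjacency list from the assigned link lengths.
    adj = {}
    for (a, b), w in link_lengths.items():
        adj.setdefault(a, []).append((b, w))

    best = {}  # page -> list of distances, one per seed that reaches it
    for s in seeds:
        # Plain Dijkstra from this seed.
        dist = {s: 0.0}
        heap = [(0.0, s)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue
            for v, w in adj.get(u, []):
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(heap, (dist[v], v))
        for page, d in dist.items():
            best.setdefault(page, []).append(d)

    # Score only pages reached by at least k seeds; score = kth-closest.
    return {p: sorted(ds)[k - 1] for p, ds in best.items() if len(ds) >= k}
```

A page a few short hops from several trusted seeds ends up with a small kth-seed distance (high quality), while a page reachable only through long or scarce paths scores poorly – which is how this approach resists manipulation through manufactured links.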
The patent discusses the importance of a diversity of topics covered by seed sites, and the value of a large set of seed sites. It also gives us a summary of crawling and ranking and searching like this:
Crawling Ranking and Searching Processes
FIG. 3 illustrates the crawling, ranking and searching processes in accordance with an embodiment of the present invention. During the crawling process, web crawler 304 crawls or otherwise searches through websites on web 302 to select web pages to be stored in indexed form in data center 308. In particular, web crawler 304 can prioritize the crawling process by using the page rank scores. The selected web pages are then compressed, indexed and ranked in 305 (using the ranking process described above) before being stored in data center 308.
During a subsequent search process, a search engine 312 receives a query 313 from a user 311 through a web browser 314. This query 313 specifies a number of terms to be searched for in the set of documents. In response to query 313, search engine 312 uses the ranking information to identify highly-ranked documents that satisfy the query. Search engine 312 then returns a response 315 through web browser 314, wherein the response 315 contains matching pages along with ranking information and references to the identified documents.
I’m thinking about looking up the many articles cited in the patent, and providing links to them, because they seem to be tremendous resources about the Web. I’ll likely publish those soon.
Europe’s sweeping privacy law, GDPR, goes into effect May 25th, and Facebook is being forced to push users through new consent flows agreeing to the terms of service changes required to comply with the law. That’s why, during today’s successful Q1 2018 earnings call, Facebook CFO David Wehner warned that “we believe MAU or DAU might be flat or down in Q2 due to the GDPR rollout.” He also said that while Facebook doesn’t expect a significant impact on ads from GDPR, there may be a slight impact, and it will be monitoring for that. Wehner noted that GDPR will affect the global online advertising industry as a whole, so it may be hard to tell what the exact repercussions are for Facebook.
Wehner later clarified that’s “what we’re expecting given that you’re having to bring people through these consent flows, and we have been modeling it and expect there would be a flat to down impact on MAU and DAU.” Facebook went on to describe how if users change their ad privacy settings through the GDPR prompts to allow less targeting, ads could be less effective, so advertisers would pay less for them.
“Fundamentally we believe we can continue to build a great ads business” while continuing to protect people’s privacy, Wehner explained. He said what’s important is Facebook’s relative value to advertisers, which theoretically shouldn’t change since all ad platforms are impacted by GDPR.
Facebook unveiled its GDPR-related changes and how users will be asked to consent to them last week, and drew heavy criticism. Facebook employed “dark patterns” in the design of the consent flow, coercing users to agree to the changes without fully considering them. Meanwhile, it minimized the size and visual prominence of the buttons to revoke permissions from Facebook or reject the changes outright and terminate their account.
Facebook was likely trying to minimize the disruption to the user experience and thereby its user count with this shady design methodology. Just the fact that Wehner said Facebook has to “bring people through these consent flows” rather than describing them as giving user choice or anything about Facebook’s commitment to privacy shows that it views GDPR as merely a hurdle, not something users deserve for protection.
Read our full story on Facebook’s Q1 2018 earnings:
Benefits of robots: 1. They never get tired. 2. They can lift very heavy things. 3. They can walk through (controlled) conflagrations at the University of Michigan.
Feed: All Latest