Stackery, a four-year-old Portland startup, wants to help development teams deliver serverless applications on AWS more easily, and today it announced several enhancements to its platform.
With serverless applications, the development team defines a set of trigger events and the cloud infrastructure vendor — in this case AWS — provides exactly the resources required to run each event and no more. This frees developers from having to worry about provisioning the right amount of resources for the application.
“Stackery is a secure serverless platform for AWS. We’re geared toward teams who are moving from laptop through production, and [we provide the tools] that they need to design, develop, and then deliver modern applications for those teams,” Stackery CEO Tim Zonca told TechCrunch.
In general, the product provides a virtual whiteboard where development teams can build serverless applications in a highly visual way; it then helps with testing and deploying the app on AWS. Zonca says the updates announced today focus on building security and governance into the platform, while offering a full set of continuous delivery tools in a modern Git-driven delivery system.
“We realized that we could fill in some of the gaps [for developers] and help them take what we have developed as a set of best practices around securely delivering applications over the course of the last year, and just bake them into the product, so that those teams don’t have to think about those practices in a serverless world,” Zonca explained.
For starters, Stackery now reviews code for known vulnerabilities as developers push the application to their Git repository, whether that’s Bitbucket, GitLab or GitHub. “We’ve introduced the ability to audit function code for known vulnerabilities, and we do this by just using common tooling out there,” he said.
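The article doesn’t name the specific tooling Stackery uses, but the general idea behind this kind of audit — comparing a project’s pinned dependency versions against a database of known-vulnerable releases — can be sketched in a few lines. The advisory data below is hypothetical and purely for illustration; real tools such as `npm audit`, `pip-audit`, or OSV-based scanners query live advisory feeds.

```python
# Illustrative sketch of a dependency audit: compare pinned dependency
# versions against a (hypothetical) database of known-vulnerable releases.
# Real audit tools query live advisory feeds instead of a hardcoded dict.

# Hypothetical advisory data: package name -> set of vulnerable versions.
KNOWN_VULNERABLE = {
    "left-pad": {"1.0.0"},
    "event-stream": {"3.3.6"},
}

def audit(dependencies: dict) -> list:
    """Return human-readable findings for any vulnerable pinned versions."""
    findings = []
    for package, version in dependencies.items():
        if version in KNOWN_VULNERABLE.get(package, set()):
            findings.append(f"{package}@{version} has a known vulnerability")
    return findings

if __name__ == "__main__":
    deps = {"left-pad": "1.0.0", "lodash": "4.17.21"}
    for finding in audit(deps):
        print(finding)  # prints "left-pad@1.0.0 has a known vulnerability"
```

A real pipeline would run a check like this on every push and fail the build when findings are non-empty, which is roughly the workflow the quote describes.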
The company is also helping test that code, which gets a bit tricky when ephemeral serverless infrastructure is involved. “We allow people to automate the spinning up of temporary ephemeral testing environments, and then help them plug in the automation for their system testing or integration testing or unit testing, and even provide an environment associated with this pull request for humans to go in and actually log on and do usability testing,” Zonca said.
When an application has passed all the testing, and is ready to be deployed to staging or production environments, Stackery can automatically promote that change set. Companies can then choose to do a final review before deployment or simply allow it to deploy automatically once the application passes all the contingencies the team set up.
Stackery was founded in 2016 and has raised $7.4 million, according to Crunchbase data.
There’s nothing that beats that organic #1 position in Google’s SERPs when it comes to brand visibility, increase in traffic, trust factor boost, reduction in cost per lead, and so on.
Everyone who’s anyone in online business knows this, which is why the struggle to grab that marketer’s Holy Grail can look like a cut-throat business to many SEO novices.
However, even SEO pros get confused when Google throws a wrench into the intricate workings of the rankings machine. Google’s core algorithm updates can mess up even the best SEO strategies, especially if you react in a panic to a drop in the rankings.
Today, I’ll share with you the three things I’ve learned from 2019 Google algorithm updates that will help you future-proof your SEO. First, however, take a look at the hints that Google rolled out alongside those updates to see if you’re building your SEO strategy on a healthy foundation.
2019 Google core algorithm updates and what they tell us
In 2018 alone, Google reported 3,234 updates to its algorithm. That’s just a bit shy of nine updates per day.
All of them change how the algorithm evaluates and ranks websites, though most only slightly.
However, three of them were so-called ‘core algorithm updates’ – meaning that their impact on rankings was likely significant for most indexed websites. Google announced these updates (in March, June, and September of 2019), which is not something it normally does. That should give you an idea of how important they were in the grand scheme of all things SEO-related.
Websites were affected differently, with some seeing increases in their rankings and traffic, and others plummeting to Google’s page #3. Many of the sites that experienced significant drops are in the Your Money, Your Life (YMYL) niche.
(Verywellhealth.com shows a significant drop after the March core update)
“The sensitive nature of the information on these types of websites can have a profound impact on peoples’ lives,” says Paul Teitelman of Paul Teitelman SEO Agency. “Google has long struggled with this and at least one of these core algorithm updates was designed to push trustworthy YMYL content to the top while sinking those websites that contain dubious and untrustworthy information.”
Google signaled a path forward with these updates. If you were not paying attention, here are the key takeaways:
- Google signals an intent to keep rewarding fresh, complete, and unique content. Focus on answering the searcher’s questions thoroughly and precisely.
- E-A-T (Expertise, Authoritativeness, Trustworthiness) guidelines are more important than ever. Things like backlinks from reputable websites, encryption, and who authors your posts can make or break your organic rankings.
- Google wants to see you covering a wide range of topics from your broader niche. Increase your relevance with content that establishes you as the go-to source in your niche.
SEO is far from an exact science.
If anything, it’s educated guesswork based on countless hours of testing, tweaking, and then testing again.
Still, there are things that you can do to future-proof your SEO and protect your websites from reacting too violently to core algorithm updates.
Based on Google’s recent hints, here are three things that you should focus on if you’re going after those page #1 rankings in the SERPs.
Three tips to future-proof your website’s SEO
Keep the focus on high-quality, actionable content
I know you’re tired of hearing it by now, but high-quality content is a prerequisite for ranking at the top of the SERPs and staying there.
This means that you need to pinpoint a specific question the searcher wants answered and then write a piece of content that answers it in detail. Does it need to be 5,000 words long? That depends on the question but, in most cases, it doesn’t. What it needs to be is concise and thorough, resolving any questions the searcher might have while reading it.
Ideally, you will want your content to be 1,500+ words. According to research by Backlinko’s Brian Dean, Google tends to reward longer content.
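Length is the one part of this advice that is mechanical enough to check before you hit publish. A minimal sketch, assuming the article’s 1,500-word guideline (it is a rule of thumb from Backlinko’s research, not an official Google threshold):

```python
# Rough pre-publish length check. The 1,500-word figure is the article's
# guideline, not an official Google threshold.
def meets_length_guideline(text: str, minimum_words: int = 1500) -> bool:
    return len(text.split()) >= minimum_words

draft = "word " * 1600  # stand-in for a real article body
print(meets_length_guideline(draft))  # True
```

Quality still trumps length, of course — this only catches drafts that are obviously too thin.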
My advice is to ask yourself the following questions when you’re writing:
- Am I providing the reader with a comprehensive answer to their question?
- Is my content more thorough than what’s already on the #1 page of the SERPs?
- Am I presenting the information in a trustworthy way (citing sources, quoting experts)?
- Is my content easy to understand, and free from factual, stylistic, and grammar errors?
If your answer to each of these questions is yes, you’re already doing better than (probably) 95% of your competitors.
Improve the E-A-T score of your website
In SEO, E-A-T stands for Expertise, Authoritativeness, and Trustworthiness.
In other words – who is authoring blog posts and articles that are published on your website? Are they penned by an expert in the field or by a ghostwriter?
Why should people trust anything you (or your website) have to say? That’s the crux of E-A-T.
The concept appears in Google’s Quality Raters’ Guidelines (QRG), and SEO experts have debated for years whether or not it has any bearing on the actual organic rankings.
In 2018, Google cleared all doubts around it, announcing that QRG is, in fact, their blueprint for developing the search algorithm. “You can view the rater guidelines as to where we want the search algorithm to go,” Ben Gomes, Google’s vice president of search, assistant and news, said in a CNBC interview.
We have no idea if Google’s core algorithm can evaluate E-A-T parameters as well as an actual human rater. Still, if that’s Google’s end goal, it’s a good idea to pay attention to E-A-T now, whether or not the algorithm fully implements it yet. It almost certainly will at some point.
To improve your E-A-T score, focus on the following:
- Add an author byline to your posts – every post that you publish should be authored by someone. Use your real name (or your author’s real name), and start building a reputation as an expert in the field.
- Create your personal website – even if you’re trying to rank your business site, make sure to have a personal branding website of your own (and of any regularly contributing authors). Those websites should be maintained – you don’t need to SEO the heck out of them but you should publish niche-relevant content regularly.
- Get featured on Wikipedia and authority websites – QRG clearly instructs raters to check for author mentions on Wikipedia and other relevant sites. That stands to reason because experts in the field will often be quoted by other publications.
- Get mentioned on forums – the same logic applies here. If people name-drop you on relevant forums, it means they feel you have something important to say.
- Secure your site with HTTPS – security is an important E-A-T factor, especially if you’re selling something via your website. An unsecured website will have a low E-A-T score so make sure to invest in encryption to boost trustworthiness.
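Google doesn’t publish an E-A-T formula, so none of this can be scored exactly. Still, two of the items above — an author byline and HTTPS — are mechanical enough to check before publishing. A purely illustrative sketch, using naive string heuristics and hypothetical page data:

```python
# Illustrative pre-publish checks for two mechanical E-A-T items.
# These heuristics are NOT Google's scoring; they only catch obvious misses.
from urllib.parse import urlparse

def uses_https(url: str) -> bool:
    """Check that a page URL uses the https scheme."""
    return urlparse(url).scheme == "https"

def has_author_byline(html: str) -> bool:
    """Naive heuristic: look for common byline markers in the page HTML.
    Real sites should use structured data (e.g. schema.org author markup);
    these string checks are only for illustration."""
    markers = ('rel="author"', 'class="author"', '"@type": "Person"')
    return any(marker in html for marker in markers)

# Hypothetical page for demonstration.
page_url = "https://example.com/post"
page_html = '<article><a rel="author" href="/about">Jane Doe</a></article>'

print(uses_https(page_url))          # True
print(has_author_byline(page_html))  # True
```

The rest of the list — Wikipedia mentions, forum name-drops, reputation — is human judgment and can’t be reduced to a script.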
Build quality backlinks and establish a social presence
Quality backlinks are still a very important ranking factor.
However, according to a report released by Backlinko, it’s not about one or two backlinks, regardless of how strong they are.
What moves the ranking needle are sustainable, evergreen link-building strategies – backlinks from trusted, niche-related websites that are acquired by white hat SEO methods such as blogger outreach, guest posting, and collaborations with other influencers in the niche. The more of these types of backlinks you get, the better your organic rankings.
Additionally, getting backlinks from a greater number of referring domains ensures that your rankings are protected if, for example, a couple of those websites get shut down or penalized in the future. When you’re playing the link-building game, it pays to think ahead.
(Image Source: https://backlinko.com/google-ranking-factors)
And, while they don’t carry the same weight as true backlinks, you’d be wrong to underestimate the value Google’s ranking algorithm places on social media signals.
A truly authoritative website – and all the authors that write for it – will have a strong social media presence. They will use it to amplify their message, build additional authority, and drive traffic to their website. Ahrefs’ Tim Soulo does this better than any other SEO expert that I know.
All of this will affect the aforementioned E-A-T parameters. If nothing else, it will spread your name far and wide, signaling to Google that you’re not a complete nobody who just happens to run a website or write a blog about a certain topic. The stronger your social media presence, and the more followers, comments, and shares you earn, the better it is for your E-A-T.
Get people to trust you and the algorithm will follow
Pretty soon, the key to top rankings will be how believable and trustworthy you are. Google’s current insistence on E-A-T parameters clearly demonstrates that. Everything else will be just the icing on the cake after that – the fancy schema you’re using, the on-page SEO gimmicks, and all the other loopholes SEO experts are now using to rank their websites.
I’m interested to hear what you think about the direction that Google is taking with this year’s algorithm updates. Have any of your websites been affected? Leave a comment below and let’s discuss.
The post 2019 Google core algorithm updates: Lessons and tips to future-proof your SEO appeared first on Search Engine Watch.