Panda, Penguin, Hummingbird – you’re probably familiar with these as some of the major Google ranking algorithm updates that shaped the Internet of the early 2010s.
Back in the day, Google was rather transparent regarding the focus points of its updates – duplicate content, spammy links, keyword stuffing, unresponsive design, etc.
Public discussion of ranking updates meant SEO experts and webmasters understood which web practices could boost a site’s rankings and which could lower them. However, it also made it easier to uncover the updates’ weak spots and exploit them with gray-hat and black-hat SEO techniques.
To combat the banned practices and improve the quality and relevance of its results, Google began, as of 2017, referring to all major algorithm changes as Core Updates.
Ever since, Google has released less information on what the updates aim to improve, leaving webmasters to work out good and bad SEO practices through in-depth analyses and competitor observations.
Still, with some experts’ estimates that Google rolls out between 500 and 600 algorithm changes each year, it’s hard to keep track of which optimization techniques are in Google’s favor and which can get you a penalty.
To help you protect your website’s Google search rankings, we’re bringing you 10 SEO practices to avoid.
- Not Optimizing for Mobile
In 2020, the share of organic search engine visits in the US via mobile reached its peak at 64%. The number represents a tremendous rise in mobile Internet use, compared to 2013 when only 27% of organic traffic came through mobile devices.
Similarly, in 2021, mobile devices accounted for 74% of clicks on search ads in the US.
Despite the huge numbers and Google switching to mobile-first indexing back in 2018, 24% of the top million most popular websites globally are not mobile-friendly.
The convenience of mobile browsing has driven its consistent, undeniable rise over desktop and tablet use. As a result, Google has already begun penalizing websites that don’t meet its mobile-friendly and responsive standards.
What to do: Start by running your website through Google’s free mobile-friendly test and optimizing the weak points.
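Before reaching for external tools, you can run a quick first-pass audit yourself. The sketch below is a minimal, simplified check (not Google’s actual mobile-friendly test): it only looks for the responsive viewport meta tag, one common signal of a mobile-ready page, using Python’s standard-library HTML parser.

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Flags whether a page declares a viewport meta tag."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name", "").lower() == "viewport":
                self.has_viewport = True

def has_responsive_viewport(html: str) -> bool:
    """Return True if the page declares a viewport meta tag."""
    checker = ViewportChecker()
    checker.feed(html)
    return checker.has_viewport

print(has_responsive_viewport(
    '<head><meta name="viewport" '
    'content="width=device-width, initial-scale=1"></head>'))  # True
print(has_responsive_viewport('<head><meta charset="utf-8"></head>'))  # False
```

A missing viewport declaration is only one of many possible issues; Google’s own test also evaluates tap-target sizes, font legibility, and content width.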
- Slow Loading Speed
On average, a desktop page loads in 10.3 seconds. On mobile, page load time is 22 seconds. Yet, more than half of mobile visitors will abandon a website if it takes longer than three seconds to load.
Google recognized users’ tendency to bounce when waiting too long for a page to load and thus rolled out Largest Contentful Paint (LCP) as a part of its Core Web Vitals. LCP measures how long it takes for a page’s main content – e.g., images and text – to load.
A good LCP result is less than 2.5 seconds, whereas a load time of over 4 seconds can drop your search rankings significantly.
What to do: Check your site’s LCP by running it through tools such as PageSpeed Insights, Lighthouse, Search Console, Chrome DevTools, etc. Then, optimize your website accordingly and regularly perform page speed tests to maintain good results.
- Not Checking the Vitals
Besides the previously mentioned LCP, Core Web Vitals also include FID and CLS:
- FID, i.e., First Input Delay, refers to the time between a user’s first interaction with your site – clicking a link or tapping a button – and the browser’s response.
- CLS, i.e., Cumulative Layout Shift, measures unexpected layout shifts within the viewport during a page’s entire lifecycle.
In other words, Core Web Vitals measure factors Google deems important for user experience – performance, responsiveness, and visual stability. As the search engine furthers its efforts to provide an impeccable user experience to searchers, websites that score poorly in any of the three areas are set to face a drop in their rankings.
What to do: The same tools we listed for measuring LCP will provide you with your other Core Web Vitals results. Analyze them, optimize your site following the recommendations, and regularly test to keep up with Google’s expectations.
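To make sense of the numbers those tools report, it helps to know the published "good" and "poor" cut-offs for each metric. The sketch below encodes the thresholds documented on web.dev (2.5 s / 4 s for LCP, 100 ms / 300 ms for FID, 0.1 / 0.25 for CLS) into a simple rating helper; actual field measurement still requires a tool like PageSpeed Insights or the Chrome UX Report.

```python
# "Good" / "poor" thresholds per Core Web Vital, as documented on web.dev.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Classify a measured value as good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2.1))   # good
print(rate("FID", 180))   # needs improvement
print(rate("CLS", 0.3))   # poor
```

Google generally recommends hitting the "good" threshold for at least 75% of page loads, so a single lab measurement is only a starting point.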
- Building Low-Quality Backlinks
Before the Penguin update, paid “guest posting sites” and link insertion on so-called link farms were the foundation of increasing a site’s domain authority. Even though Google has devoted particular care to undermining these bogus websites, link farms unfortunately still exist, in more or less sophisticated forms.
Link farms are websites with little to no organic traffic that artificially create poor-quality content in large amounts. Although obtaining a backlink from a site with a DA over 70 or 80 might seem tempting, there’s a realistic chance this practice can backfire.
Artificial link building is considered a black-hat SEO technique, whether you pay for a guest post, link insertion in an existing blog post, or agree on link exchange. There’s no benefit from appearing on a link farm, as your website will get no traffic via that channel and will be associated with poor content and web practices in the eyes of Google.
What to do: Avoid bad SEO practices and allow your website to organically achieve backlinks by creating valuable content.
- Irrelevant Anchor Texts
Whether it’s internal linking or writing a guest post (for a reputable, quality website, mind you, not a previously discussed link farm), the choice of your anchor text deserves careful consideration.
Anchor text is a major SEO opportunity to relate a specific web page on your website with a certain word or a phrase you wish to rank for.
Instead of generic anchor texts such as “click here” and “read more,” opt for branded anchors and long-tail keywords related to your page.
What to do: Before deciding on an anchor text, thoroughly research queries users search for to find your brand.
- Keyword Stuffing
Keyword stuffing is an ancient web practice of cramming a website blog or a landing page with the same repetitive keywords to achieve a better search ranking. The approach yielded unnatural-sounding text, but it worked – at least until the rollout of the Panda algorithm update in 2011.
Thanks to AI, Google can now understand the meaning of a body of text, even if you don’t repeatedly use your key phrase and opt for synonyms instead. Moreover, the search engine can easily correlate a text with a user query without the text and the user’s question needing to contain the same terms.
What to do: Write with a natural flow and use keywords sporadically and organically. Instead of repeating the focus keywords, research synonymous phrases to cast your net wider.
Let’s say the page you’re linking to lists website design companies in Chicago. Use tools such as Keyword Planner, Answer the Public, or Ubersuggest, type in your keyword, such as “Chicago web design agency,” and uncover different variations to convey your message.
- Neglecting User Experience
SEOs, web designers, and business owners would have it much easier if a single, unified website UX metric existed. However, the quality of user experience on a web page is reflected across website metrics – bounce rate, pages per session, time on page, etc.
The process of improving your site’s UX thus begins with an in-depth and analytical examination of your website metrics.
While sometimes the devil really is in the details, for most websites, bad UX is caused by some of the well-known bad practices:
- no mobile optimization
- poor-quality content
- slow loading speed
- unintuitive navigation
- unappealing design
What to do: Gather website insights through tools such as Google Analytics, Hotjar, Optimizely, etc. Then, integrate the findings into regular A/B testing to monitor what works best for your UX.
- Not Bothering With Alt Tags
Alt tags refer to the alternative text that describes the images on your website. Besides making your website more accessible, taking your time to write quality and suitable alt tags can help boost your site’s rankings in the image search results.
What to do:
- Be specific and concise.
- Describe the image’s content.
- Use keywords only if they fit the context.
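Auditing alt text by hand gets tedious on large sites. As a minimal sketch (not a substitute for a full accessibility audit), the snippet below uses Python’s standard-library HTML parser to list images that have missing or empty alt attributes.

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collects <img> tags with missing or empty alt attributes."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.missing.append(attr_map.get("src", "<no src>"))

def imgs_missing_alt(html: str) -> list:
    """Return the src of every image lacking alt text."""
    audit = AltAudit()
    audit.feed(html)
    return audit.missing

print(imgs_missing_alt(
    '<img src="logo.png" alt="Acme logo"><img src="hero.jpg">'))
# ['hero.jpg']
```

Note that purely decorative images are a legitimate exception: for those, an intentionally empty `alt=""` is the recommended practice, so a human should review what the script flags.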
- Broken Links & Bad Redirects
Empty or non-existent pages can quickly drain your crawl budget. If bots come across broken links repeatedly, your website might be regarded as outdated and thus penalized by Google.
Bad redirects can negatively affect your relationship with both Google and users. Sending users to a page other than the one they intended to visit can signal deceitful behavior, which can not only lower your rankings but get your website penalized altogether.
What to do: Check for broken links and bad redirects in your Search Console, and make sure your users land on the page they wanted.
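When triaging a crawl report, the HTTP status code of each link tells you what kind of attention it needs. The helper below is a rough, assumed classification scheme (the status-code groupings are standard HTTP, but the triage labels are our own), meant to be fed with codes from a crawler or Search Console export.

```python
def link_health(status: int) -> str:
    """Rough triage of an HTTP status code from a link check."""
    if 200 <= status < 300:
        return "ok"
    if status in (301, 302, 307, 308):
        # Redirects aren't inherently bad, but each one should be
        # verified to land on the page the user actually wanted.
        return "redirect"
    if status in (404, 410):
        return "broken"
    return "needs review"

print(link_health(200))  # ok
print(link_health(301))  # redirect
print(link_health(404))  # broken
print(link_health(503))  # needs review
```

Sorting a full link inventory by these buckets makes it easy to fix the "broken" group first and then manually spot-check where each "redirect" entry actually leads.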
- Not Working on Your Content
Quality, informative, and valuable content is the backbone of your website. Whether you completely disregard writing a blog, don’t regularly update your best-performing articles, or publish blogs for the sake of creating content, poor site content practices will have adverse effects on your search rankings.
Having great content makes your website an authority on the topic for users and Google alike. Thus, Google will want to promote your website in the search results, and users will love reading your enlightening pieces.
What to do: Find a balance between quality and quantity and write useful articles that answer users’ questions.
Rick Seidl is a digital marketing specialist with a bachelor’s degree in Digital Media and Communications, based in Portland, Oregon. With a burning passion for digital marketing, social media, and helping small businesses establish their presence in the digital world, he is currently quenching his thirst by writing about digital marketing and business strategies for DigitalStrategyOne.