Sites that rely heavily on JavaScript (JS) should pay attention to Google’s recent announcement, because it could help them earn better SEO scores than they otherwise would. Gary Illyes, an analyst at Google, recently spoke about a flurry of complaint emails he had received from SEO professionals who reported seeing fake sites in the SERPs.
He highlighted the extended loading times of these sites as a particularly serious issue, and he recommended that JS-heavy sites make sure their actual content loads first, ahead of the marginal boilerplate content that currently tends to load dead last.
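One way a JS-heavy page could act on that advice is to render its main content immediately and defer the boilerplate until the browser is idle. The sketch below is a minimal, hypothetical illustration of that approach: the element IDs, the inlined article HTML, and the boilerplate fetch URL are all assumptions, not anything Google or Illyes prescribed.

```javascript
// Hypothetical sketch: render the article body right away, then fill in
// boilerplate (footer links, related posts, cookie notice) only when the
// browser is idle. Element IDs and the fetch URL are assumptions.
document.addEventListener('DOMContentLoaded', () => {
  // Main content goes in first so crawlers and users see it immediately.
  const main = document.getElementById('main-content');
  main.innerHTML = window.__ARTICLE_HTML__; // assumed to be inlined by the server

  // Boilerplate waits until the main thread is free.
  const renderBoilerplate = () => {
    fetch('/fragments/boilerplate.html') // hypothetical endpoint
      .then((res) => res.text())
      .then((html) => {
        document.getElementById('boilerplate').innerHTML = html;
      });
  };

  if ('requestIdleCallback' in window) {
    requestIdleCallback(renderBoilerplate);
  } else {
    setTimeout(renderBoilerplate, 0); // fallback for browsers without idle callbacks
  }
});
```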
Many of the pages he described managed to load only the boilerplate content, which can make duplicate sites harder to spot. Sites that are being flagged as duplicates can easily change that by making their real content load first, which might earn them a higher SEO score as well as a better reception from visitors.
Illyes’s recent threads on Mastodon and LinkedIn make it clear that Google is not going to hand out rankings easily. Sites will have to work for their rankings, and users come out ahead when they are given that level of priority.
Google constantly updates its algorithm, but sites can often get better results by focusing on fundamentals. Faster loading times are well known to boost SEO, and sites that have struggled to improve their rankings may want to start here, since streamlining the loading process helps visitors see the important content first.
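Site owners who want to check whether their important content actually appears quickly can measure it in the browser. The snippet below is a minimal sketch using the standard PerformanceObserver API to log the Largest Contentful Paint time; how a site reports or acts on that number is up to its owners and is not specified in the announcement.

```javascript
// Minimal sketch: log the Largest Contentful Paint (LCP) time so a site owner
// can see how quickly the main content appears for visitors.
// PerformanceObserver and the 'largest-contentful-paint' entry type are
// standard browser APIs.
new PerformanceObserver((entryList) => {
  const entries = entryList.getEntries();
  const lastEntry = entries[entries.length - 1]; // latest LCP candidate
  console.log('LCP (ms):', lastEntry.startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });
```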
Extended loading times make it far more likely that a user will click away, which can destroy a site’s rankings in the SERPs and make them hard to recover later on.
Read next: Google Introduces Continuous Desktop Scrolling For Search Results In The US