Now that you’ve got at least one full-time marketer on the payroll, I’d strongly recommend they set aside an hour or three to sort out the mass of duplicate content Google is seeing, caused by the way the blog repeats the full text of posts.
“In practical terms, an autonomous data network is one that configures itself. All data on the network is automatically split into chunks and encrypted”
Search for that on Google, just like it is. It’s a phrase from a recent blog post.
The 1st result will be the blog post. See the note at the bottom of the search results, below that single result:
“In order to show you the most relevant results, we have omitted some entries very similar to the 1 already displayed.
If you like, you can repeat the search with the omitted results included.”
Click the link.
Notice that there are 14 results now? That’s 1 good, primary result and 13 results in the secondary or supplementary index (where Google stuffs duplicate content).
In other words, Google sees 1 original and 13 bad copies. This is equivalent to a quality score grade of ‘Fail’.
The impact is Panda-algorithm related, where site quality affects organic search rankings. It may not be a massive issue for this one article; the problem is that it drags down the entire site, acting as a glass ceiling that limits the reach of everything published on the blog and the main site.
Reducing excerpt lengths, keeping tag archives out of the index, yada yada - basic SEO 101 kind of stuff. Half of it should be fixable in a few minutes and the other half in 2-4 hours, unless the CMS or template files are problematic, in which case it could take double that.
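To make the excerpt half of the fix concrete, here’s a minimal sketch of the idea: instead of echoing the full post body onto index, tag, and archive pages, the templates emit a short plain-text excerpt. The function name, the 55-word default (borrowed from WordPress’s default excerpt length), and the “[...]” marker are my placeholders - whatever the actual CMS offers for excerpts is what should be used.

```python
import re

def make_excerpt(html, max_words=55):
    """Turn a full post body into a short excerpt.

    Hypothetical helper for illustration only: strips markup, collapses
    whitespace, and trims to max_words so listing pages no longer repeat
    the whole article (which is what Google was flagging as duplicates).
    """
    text = re.sub(r"<[^>]+>", " ", html)            # drop HTML tags
    words = re.sub(r"\s+", " ", text).strip().split(" ")
    if len(words) <= max_words:
        return " ".join(words)
    return " ".join(words[:max_words]) + " [...]"
```

For the other half - the tag and archive pages themselves - a `<meta name="robots" content="noindex,follow">` in those templates keeps them out of the index while still letting Google follow the links to the real posts.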
Just a suggestion…