Duplicate content caused by tags and categories.
Duplicate content caused by too many URL parameters.
We can fix this with robots.txt and the URL Parameters tool in Google Webmaster Tools (GWT).
Also, deindex all categories and tags using the URL Removal Tool in Google Webmaster Tools:
- Click Google Index on the GWT dashboard and select Remove URLs.
- Then enter the category path and choose to remove the whole directory. This removes every URL containing that path: yourwebiste.com/category/ffff and all of its child URLs are removed at once.
- Tags follow the same process: enter the tag path (or whatever string you renamed it to), or block the directory using robots.txt. Check out my robots.txt tutorial, and also see How to use tags and categories in WordPress.
- You can also find these types of duplicate content with Yoast.
- Remove the replytocom variable with the Yoast plugin:
- SEO >> Permalinks >> select Remove the replytocom variable.
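Blocking the category and tag directories in robots.txt looks like this. A minimal sketch, assuming the default WordPress /category/ and /tag/ permalink bases (yours may differ if you renamed them):

```
User-agent: *
Disallow: /category/
Disallow: /tag/
```

Note that robots.txt only stops crawling; it does not by itself remove pages already in the index, which is why the URL Removal Tool step above is still needed.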
I have a two-year-old website that was hit by Panda and still has not recovered, due to duplicate content caused by internal links, 404 Not Found errors, and WordPress tags and categories.
I simply moved the site to a new domain. The website ranked high for a few days, maybe 15, but then I got a message from Google Webmaster Tools about low-quality, thin content that adds no value for the user.
I was unable to pin the problem down completely even after changing to the new URL, but this time I got lucky: I found thin, low-quality content caused by media attachment URLs.
Thin content caused by WordPress Attachments
Actually, the WordPress media default URL looks like wordpress.com/?attachment_id=1223,
but with pretty permalinks the WordPress media URL slug is wordpress.com/postname/attachment-name.
So you end up with wordpress.com/postname and wordpress.com/postname/media-name (a page with no content, only a title and the media image).
Solution: the Attachment Pages Redirect plugin. The Yoast SEO plugin was not working for me, but it also has an option to redirect attachment URLs to the post page.
Find thin content by word count, modification date, number of paragraphs in the article, heading count, images, and other factors.
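The factors above can be sketched as a quick script. This is a rough illustration, not any plugin's actual logic; the function names and the 300-word threshold are my own assumptions:

```python
import re

def thin_content_signals(html):
    """Count rough quality signals in a post's HTML: word count,
    paragraphs, headings, and images -- the factors listed above."""
    text = re.sub(r"<[^>]+>", " ", html)  # strip tags to count words
    return {
        "words": len(re.findall(r"\b\w+\b", text)),
        "paragraphs": len(re.findall(r"<p[\s>]", html, re.I)),
        "headings": len(re.findall(r"<h[1-6][\s>]", html, re.I)),
        "images": len(re.findall(r"<img[\s>]", html, re.I)),
    }

def looks_thin(signals, min_words=300):
    # Hypothetical rule: flag pages under 300 words with no headings.
    return signals["words"] < min_words and signals["headings"] == 0
```

Run this over your exported posts and review anything flagged by hand; raw counts are only a starting point, not a verdict.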
Check the Google WMT URL Parameters tool to control parameters like replytocom.
Test your robots.txt with the GWT robots.txt Tester (the upgraded testing tool).
Try to write a better robots.txt.
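You can also sanity-check your rules locally with Python's standard urllib.robotparser before uploading. Note it uses simple prefix matching and does not support Googlebot-style `*` wildcards; the rules and URLs below are just examples:

```python
from urllib.robotparser import RobotFileParser

# Example rules blocking the duplicate-content directories discussed above.
# (Paths assume the default WordPress /category/ and /tag/ bases.)
rules = """\
User-agent: *
Disallow: /category/
Disallow: /tag/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Prefix matching: anything under /category/ or /tag/ is disallowed.
print(rp.can_fetch("*", "https://example.com/category/seo/"))  # False
print(rp.can_fetch("*", "https://example.com/my-post/"))       # True
```

This catches typos like a missing leading slash before you rely on the GWT tester.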