If you’re an SEO professional, you don’t ever want to receive this notification on your (or your client’s) Search Console dashboard.
This dreaded message means Google has taken a manual action on your website, which can be due to any of the following reasons:
Security threats and search engine penalties are critical issues that can hurt your rankings and your bottom line. You need to avoid them from the outset. To achieve great SEO results, you need to ensure that your website is well-protected and properly optimized.
I had the pleasure of moderating an SEJ ThinkTank webinar on October 18, presented by Eric Kuan, Webmaster Relations Specialist at Google. Kuan talked about website security and manual actions, and how webmasters and SEO professionals can avoid getting hacked and penalized. Here is a recap of his presentation.
According to Google, the number of hacked sites increased by approximately 32 percent in 2016 compared to 2015, and this trend is not expected to go down. Aggressive hackers prey on vulnerable websites, and if you don’t secure your outdated site now, you might be the next target of their attack.
Hacked spam is the most common type of website compromise. Spammers inject content into a legitimate website in order to drive traffic to a malicious or deceptive site. A hacker might, for example, redirect your visitors to pharmaceutical, gambling, or pornographic websites, which can cause real damage to your actual site.
Malware is any piece of software that was written with the intent of doing harm to data, devices, or people. Malware can directly affect your website users, which is why Google provides strict warnings.
Credit card skimming is a fairly new security threat that affects e-commerce platforms. It is also one of the most dangerous compromises for consumers, since their credit card data is stolen directly, and it can seriously hurt your reputation if you don't address it immediately.
A botnet is a network of computers infected with malicious software and remotely commanded and controlled by cybercriminals called botmasters. Botnets steal your resources in order to do malicious things like crack passwords or attack other sites. This type of compromise is difficult to detect and often bypasses anti-virus and security tools.
You should always put web security at the top of your list.
Hackers are constantly looking for exploits. Check your log files constantly so you can spot and fix any compromises right away.
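As a rough illustration of that kind of routine log review, here is a minimal Python sketch. The log path and the suspicious patterns are assumptions for a typical Apache/Nginx combined-format access log; adapt both to your own server and stack.

```python
import re
from collections import Counter

# Hypothetical path to a combined-format access log; adjust for your server.
LOG_PATH = "/var/log/nginx/access.log"

# Patterns that often show up in probe/exploit attempts (illustrative, not exhaustive).
SUSPICIOUS = [
    r"POST /(?!wp-admin|contact)",   # POSTs to paths you don't expect to accept POSTs
    r"xmlrpc\.php",                  # common brute-force target on WordPress
    r"base64_decode|eval\(",         # payloads passed in query strings
    r"\.php\?.*=(http|ftp)://",      # remote file inclusion attempts
]

pattern = re.compile("|".join(SUSPICIOUS), re.IGNORECASE)
hits = Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if pattern.search(line):
            ip = line.split(" ", 1)[0]   # first field is the client IP in combined format
            hits[ip] += 1

# Print the IPs triggering the most suspicious requests for manual review.
for ip, count in hits.most_common(10):
    print(f"{ip}\t{count} suspicious requests")
```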
Pay attention, because it only takes a single weak link to break the entire chain. You can get 98-99 percent of website security right, but if you neglect that remaining 1-2 percent, you're still vulnerable to compromises. Hackers can exploit that one weak link and undo all the security measures you've put in place.
Whether you're a small business or a big brand, you can be affected by website compromises. No one is 100 percent immune to these types of security issues.
If your site is hacked, a lot of damage can happen:
Aside from the damage above, fixing a hacked site, finding the vulnerability, and recovering lost data can be extremely difficult. Hackers will constantly try to keep a hacked website hacked, so they use techniques such as cloaking and file injection to prevent you from spotting the compromise.
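One way to check for cloaked hacked content is to compare what a page serves to a normal browser against what it serves to a search engine crawler. The sketch below is a minimal, illustrative example: the URL is a placeholder, the User-Agent strings are assumptions, and dynamic pages always show some harmless differences, so treat the output as a prompt for manual review. Search Console's fetch tools remain the authoritative way to see exactly what Google sees.

```python
import difflib
import requests

URL = "https://example.com/"   # placeholder; replace with a page on your site

HEADERS_BROWSER = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
HEADERS_CRAWLER = {
    # A Googlebot-style User-Agent; cloaking scripts often key off strings like this.
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}

as_browser = requests.get(URL, headers=HEADERS_BROWSER, timeout=10).text
as_crawler = requests.get(URL, headers=HEADERS_CRAWLER, timeout=10).text

# Show lines that differ between the two responses; injected spam or redirects
# frequently appear only in the crawler version.
diff = difflib.unified_diff(
    as_browser.splitlines(), as_crawler.splitlines(),
    fromfile="browser", tofile="crawler", lineterm=""
)
changes = list(diff)
print(f"{len(changes)} differing lines")
print("\n".join(changes[:40]))   # preview the first differences for manual review
```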
The best way to avoid this inconvenience is to secure your website properly in the first place.
Here are the steps you can take to keep your website from getting hacked.
Once you’ve added and verified ownership of your site in Search Console, Google will send you critical website notifications such as vulnerability and hacking warnings that you need to pay attention to.
Talk to everyone who works on your site – developers, marketers, SEO professionals, etc. – and make sure that they understand the importance of security.
This is one of the most effective ways to recover your site when it has been compromised, but not all webmasters do it. If you have a backup of your site, it will be much easier to revert it to its original state prior to the hack.
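As an illustration only, here is a minimal Python sketch of a dated backup for a typical setup with a web root and a MySQL database. The paths, the database name, and the reliance on mysqldump reading credentials from ~/.my.cnf are all assumptions; adapt them to your hosting environment.

```python
import datetime
import pathlib
import subprocess
import tarfile

# Hypothetical paths and database name; adjust for your own hosting setup.
WEB_ROOT = "/var/www/html"
DB_NAME = "my_site_db"
BACKUP_DIR = pathlib.Path("/var/backups/site")

BACKUP_DIR.mkdir(parents=True, exist_ok=True)
stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")

# 1. Archive the site files (themes, plugins, uploads, config).
files_archive = BACKUP_DIR / f"files-{stamp}.tar.gz"
with tarfile.open(files_archive, "w:gz") as tar:
    tar.add(WEB_ROOT, arcname="html")

# 2. Dump the database; credentials come from ~/.my.cnf to keep them out of the script.
db_dump = BACKUP_DIR / f"db-{stamp}.sql"
with open(db_dump, "wb") as out:
    subprocess.run(["mysqldump", DB_NAME], stdout=out, check=True)

print(f"Backup written: {files_archive} and {db_dump}")
# Copy these archives to off-server storage so a compromised server can't destroy them.
```

Keep several dated backups rather than a single one, since a compromise is often discovered only after the most recent backup was taken.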
Keeping your software updated is the easiest thing you can do to prevent your site from being compromised. Most of the compromises Google sees are from outdated software such as content management systems (CMS), plugins, etc.
If you’re using a CMS or e-commerce platform, sign up for their newsletter and be on the lookout for emails saying you need to update the software due to security risks.
Talk to the people who are working on your site because sometimes making updates to the software can cause certain plugins to break or become incompatible.
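If your CMS happens to be WordPress, a small check like the sketch below can flag an outdated core install, for example from a cron job. The install path is hypothetical, and the WordPress.org version-check endpoint and response shape are assumptions based on what the API exposes at the time of writing; most CMS dashboards already surface the same information.

```python
import re
import requests

# Hypothetical path to a WordPress install; version.php defines $wp_version.
VERSION_FILE = "/var/www/html/wp-includes/version.php"

with open(VERSION_FILE, encoding="utf-8") as f:
    match = re.search(r"\$wp_version\s*=\s*'([^']+)'", f.read())
installed = match.group(1) if match else "unknown"

# WordPress.org publishes the current core version via its version-check API.
resp = requests.get("https://api.wordpress.org/core/version-check/1.7/", timeout=10)
latest = resp.json()["offers"][0]["current"]

if installed != latest:
    print(f"WordPress {installed} is installed, but {latest} is available - update soon.")
else:
    print(f"WordPress {installed} is up to date.")
```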
If you're not an expert in securing your site, or you think you need an added layer of protection, you can invest in security software.
HTTPS encrypts the information transmitted between your website and its users, which is a good practice that helps keep user data secure. This is related to, but different from, securing your website against intrusion.
Google urges using HTTPS everywhere. If you have limitations, then at least use it on any page where sensitive data such as passwords or credit card numbers is passed. Google's Chrome browser will now warn users when sensitive information is entered on non-HTTPS pages.
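Here is a minimal sketch of a quick HTTPS sanity check, using a placeholder domain: it confirms that the plain-HTTP URL redirects to HTTPS and that the certificate validates.

```python
import requests

DOMAIN = "example.com"   # placeholder; replace with your own domain

# 1. The plain-HTTP URL should redirect to HTTPS.
resp = requests.get(f"http://{DOMAIN}/", allow_redirects=True, timeout=10)
final = resp.url
print("Redirects to HTTPS" if final.startswith("https://") else f"Still on HTTP: {final}")

# 2. Requesting the HTTPS URL with verification on (the default) confirms the
#    certificate chain is valid; an invalid or expired cert raises SSLError.
try:
    requests.get(f"https://{DOMAIN}/", timeout=10)
    print("Certificate validated")
except requests.exceptions.SSLError as err:
    print(f"Certificate problem: {err}")
```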
A manual action is a ranking adjustment Google applies to a site that is manipulating Google Search. Manipulative behavior includes:
Make sure to follow and understand Google’s Webmaster Guidelines. Don’t resort to manipulative behavior to game the search engine – it will do you no good.
If you’re building a good website for your users, you aren’t going to get penalized. A manual action is reserved for webmasters trying to do something tricky in order to manipulate search rankings.
Talk to your users about how you can improve their experience on your website. Google focuses on bringing users to sites that would be most helpful for them. Therefore, if you listen to what your users need and give them what they want, you should have no difficulty ranking well.
You should also make sure that your website’s technical SEO components are on point so that Google understands it properly.
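As a small illustration of that kind of technical check, the sketch below (placeholder domain, deliberately crude checks) verifies that robots.txt is reachable and doesn't block the whole site, that a sitemap responds, and that the homepage returns 200 without an obvious noindex directive.

```python
import requests

DOMAIN = "example.com"   # placeholder; replace with your own domain
BASE = f"https://{DOMAIN}"

# 1. robots.txt should exist and should not accidentally block the whole site.
robots = requests.get(f"{BASE}/robots.txt", timeout=10)
blocked = any(line.strip().lower() == "disallow: /" for line in robots.text.splitlines())
print(f"robots.txt: HTTP {robots.status_code}, site-wide block: {blocked}")

# 2. A sitemap should be fetchable (declared in robots.txt or at a conventional URL).
sitemap = requests.get(f"{BASE}/sitemap.xml", timeout=10)
print(f"sitemap.xml: HTTP {sitemap.status_code}")

# 3. The homepage should return 200 and should not carry a noindex directive.
#    This string match is crude and may flag pages that merely mention "noindex".
home = requests.get(BASE, timeout=10)
noindex = ("noindex" in home.text.lower()
           or "noindex" in home.headers.get("X-Robots-Tag", "").lower())
print(f"homepage: HTTP {home.status_code}, noindex found: {noindex}")
```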
A reconsideration request is a request to have Google review your site after you fix problems identified in a manual action notification. Reconsideration requests are manually reviewed by the Google Webmasters team.
If you aren't sure what the problem is, get help from experts.
Watch the video recap of the webinar presentation and Q&A below.
Here’s the SlideShare of the presentation as well.
Join the next SEJ ThinkTank on Wednesday, November 15 at 2:00 PM Eastern featuring Christopher Hart and Christoph Trappe from ScribbleLive. Find out how you can break down organizational silos and align SEO and content teams to obtain bigger wins.