Semantic SEO 2016 Relevant and Proof Terms Unveiled The Unfair Advantage of Semantic Analysis to Double your SEO and Triple your AdSense Results Chief Editor: Alberto Moreno Nieto
Disclaimer note The techniques described in this book are the result of numerous experiments and research. The authors of this book are not responsible for the impact these techniques have on the positioning of your website nor the income of your business.
Authors Semantic SEO 2016 - Relevant and Proof terms unveiled is a collective work created by Seowaz.com Team The authors of the work are: Alberto Moreno Nieto Brian Smith Paloma Pericet JesĂşs MarĂn Daniele Siddi
ISBN digital edition: 978-1-326-54476-8 No part of this publication may be reproduced or transmitted in any form or any means, electronic, or mechanical, including photocopy, recording, scanning or any information storage retrieval system, without explicit permission in writing from the Author (Chief Editor). Copyright 2016 by Alberto Moreno.
Testimonials “Semantic search is changing the way people use the web. Semantic SEO 2016 provides a concise summary of the state of the art of SEO techniques, with powerful, actionable tips to take advantage of semantic search. If you are serious about improving your search traffic, Semantic SEO 2016 is an excellent resource.” Andy Kirkmaf --- SEO Consultant ---
All of us publish webpages on the Internet with the objective of those pages being visited. After deploying classic rules of SEO (external links, meta-tags, images, anchor links, infographics, H1, ...), it is the content that provides the positive differentiating factor. 'Semantic SEO 2016' provides the techniques and pathway toward the highest Google positions by focusing on content as the primary factor. Angel Maria Herrera --- CEO of www.samastah.com ---
Since the arrival of the Panda algorithm, "content is king." We have heard this particular phrase over and over again. But the question we are posed with is: How can we produce better web content than that which currently exists on other webpages? This book teaches not only how, but also provides you with the tools, examples, and necessary steps to be taken in order to maximize the potential of your webpage content. It also identifies the "do's and don'ts," the elements you need to improve in order to cope with SEO challenges as a basic daily task. Besides this, the book is brief and to the point, focusing on Panda, Penguin, and Hummingbird, as well as a number of diverse SEO elements. Identifying what those SEO elements are, and what web tools help us to improve against those SEO algorithms, is a basic lesson in SEO. If content is king, this book teaches you how to achieve the power you are seeking. Ivan Kydystuk --- CEO of SIU Sp. z o.o. ---
Introduction We have learned these techniques the hard way: we fought, we were defeated, stood up, studied, shared, investigated, and innovated, and after many frustrations we arrived at guidelines that helped us to greatly improve our SEO and that may help many other webmasters achieve huge improvements. Many of the techniques exposed in this book are already explained in many other books, but we have repeated them here for one important reason: they matter. Google currently pays attention to about 250 signals, and every one of these techniques aims to get a good rating in most of them. Some of the techniques are absolutely exclusive to this book and, curiously, are the ones that may skyrocket your SEO or AdSense results. Statistical correlations provide evidence that new signals like Relevant and Proof terms strongly affect SEO, and here we explain how to take advantage of them. Semantic SEO is evolving in intelligent, unique and surprising ways. Here we will show you incredible discoveries that will let you make the best of these changes. We will overview all marketing types, and also explore all the signals that affect SEO, but we will dig deeper into Semantic SEO, which is what Google is currently focusing on to strengthen its algorithms and which is still very new for most SEO specialists.
What is SEO The Internet has been the biggest revolution in telecommunications in recent decades, allowing most information services to migrate to a globalized network. Search engines like Google are the biggest showcases of these services. However, the ranking of your site on search engines may not always be what you expect. Why is that? Because your website is not properly optimized for visibility on these search engines. SEO (Search Engine Optimization) is the optimization of your website’s position in search engine results. The purpose of this process is to improve the visibility of your website, which depends on various factors such as its structure, accessibility, and content. In recent years, Google has made a special effort to improve its interpretation of text to gauge the quality of the content of a web page.
How do you improve your SEO The optimization of your website revolves around several fundamental aspects. Among these are the quality of your content, the keywords in use, and the links to your website from external websites. SEO can be divided into two different sections: ● Off-page SEO, which includes backlinking and social network optimizations. ● Onsite SEO, which includes the creation of quality content, user experience, and technical aspects.
Both aspects are needed for optimizing your site's ranking among primary search engines such as Google, Bing, and Yahoo! In order to optimize your ranking using SEO strategies, you must design your website to create a great user experience. This means that it is mandatory to have a clear topic for your site and develop a site that satisfies your users' needs and offers useful information within an intuitive design. In the process of website ranking, the user experience is very important because if visitors do not find the information they seek, they will immediately leave the page. The latest improvements in search engines have made black hat SEO (bad practice) techniques almost totally useless, and you should always avoid them. These techniques are penalized when it comes to ranking a website, so the result of their use is extremely negative for your business. Among these techniques are things such as including hidden text on a page or overusing keywords to try to achieve an unnatural ranking. SEO techniques are intended to rank specific pieces of content within the context of what users are looking for.
Google will give a favorable ranking of your site so long as it is based on providing valuable information to the user.
Visit http://www.searchmetrics.com/knowledge-base/ranking-factors-infographic-2015/ for a detailed list of ranking factors. This book is divided into several sections focusing specifically on creating high quality content using the appropriate tools.
Other online marketing types When promoting your site effectively, there are several approaches to improving your visibility on the Internet, such as positioning through paid advertising (SEM), organic positioning (SEO), the use of social networks, email marketing, and display advertising online. The more the Internet talks about your site, the more links you get and therefore the more relevant your content is in relation to other pages. Besides links, Google uses other signals, such as: ● Social networks. ● Sentiment when people write about your website (whether users are happy or unhappy with your service).
Get your users to talk about your site. The trail of comments and reviews will appeal to Google and improve your ranking. This book does not try to deal with social and backlinking techniques to promote your website; we will mainly focus here on Semantic SEO. For learning backlinking techniques we recommend you check out the Point Blank SEO Link Building Course (pointblankseo.com), which introduces ways of improving your backlinking whilst avoiding penalizations. Here we list the different types of advertising and online promotion.
Affiliate marketing In affiliate marketing, advertisers pay a commission to websites that advertise their products. For each sale made using the affiliate's website, the website receives a commission. The main advantage for advertisers is that they only pay commission for sales that are made and not returned. For websites that advertise these products, the big advantage is that if the website is very specialized, the payment performance may be much higher than standard advertising. There are many services using this model, from websites that sell subscriptions and have already developed an affiliate program to big companies like Amazon. Affiliate links include a personalized link that detects what source the client comes from, that is, who owns the link and to whom the affiliate share should be assigned. Similarly, the products you advertise on your website must be relevant to its topic; otherwise the content will not be as attractive and might seem alien to the interests of your user base.
Search Engine Marketing (SEM) Search Engine Marketing (SEM) involves paying search engines to position your website using text ads whenever a user performs a search that could be related to your page and may be lucrative. AdWords is a tool offered by Google that lets you pay for placing those ads. The system on which AdWords is based is the Pay-Per-Click auction, which means that advertisers will bid for a specific keyword. The one that pays more will rank better. The ad links will appear at the top of the search engine results marked in a way that will differentiate them from links ranked using organic processes. Each time a user clicks on your link, you pay for it according to the AdWords auction offers. This system, used properly, has an immediate advantage over organic ranking. For example, if you sell a product for $300 and you pay $5 for each click, you will get a nice profit, provided that at least one of every ten visitors ends up buying your product. The fact that you’re paying the most for your ad for a specific keyword does not equate to your ad being automatically ranked in the top ad positions. AdWords works using the Ad Rank system in which a combination of the highest bid for a keyword and the quality of your page are used in order to rank results. This way, Google ensures that even within the ads, websites have to compete in quality. The higher the quality of your ads, the lower the cost of your bids. SEM has several very important benefits: ● Instant traffic without waiting for SEO ranking. ● The ability to determine the size of your potential customer base. ● If your traffic does not generate enough profit you can compare your results with competitors to find out why they are profitable and your site is not. ● You can conduct fast experiments to optimize your user experience and conversions to sales. ● You can quickly analyze which keywords give you the highest conversion to sales allowing you to define your content strategy. ● Increasing your traffic will likely increase your popularity in social networks and this will positively affect your SEO rank.
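To make the arithmetic of that $300 product / $5 click example concrete, here is a minimal Python sketch (not part of the original text); the conversion rates are illustrative placeholders.

```python
def cost_per_sale(cpc: float, conversion_rate: float) -> float:
    """Average ad spend needed to generate one sale."""
    return cpc / conversion_rate

product_price = 300.0   # revenue per sale, as in the example above
cpc = 5.0               # cost paid per AdWords click

# Illustrative conversion rates: the fraction of paid visitors who end up buying.
for conversion_rate in (0.005, 0.01, 0.02, 0.10):
    spend = cost_per_sale(cpc, conversion_rate)
    print(f"{conversion_rate:.1%} conversion -> ${spend:.0f} ad spend per sale, "
          f"${product_price - spend:.0f} gross margin")
```

At a one-in-ten conversion rate the campaign is clearly profitable, exactly as the example suggests; at one percent it loses money, which is why measuring conversions matters before scaling your bids.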
Social network marketing Marketing through social networking is a marketing strategy through the Internet using media such as blogs, forums, and social networks. Social networking has stood out as an immediate way to inform a large audience of developments and news quickly and with little effort. Learn to move through social networks like a fish in water, and let your audience grow and generate greater loyalty.
Promoting in social networks Google constantly implements new features in its search engine so you always have something new to offer to global users. The major social networks today are Twitter, Facebook, and Google+, as well as other social networks that are more picture-oriented such as Instagram and Pinterest. The profiles that are in use in these social networks are also affected by Google PageRank, so managing the activity and the content of these profiles is always recommended. By using these social networks, you can analyze the engagement level of your users by their interactions with your posts. By doing this you also detect what kind of news and general information interests your users thus allowing you to focus more on that type of content. Social networks can be the main window of interactivity with your users since it is on these online services that most content is shared. By showing activity in your social profiles, Google analyzes the activity of your posts and grants them the appropriate relevance according to their usefulness. Social network engagement provides a good showcase for your business. One of the immediate ways you can increase this engagement is to create attractive content in a more advertising-oriented, direct style that helps to attract the attention of users. When creating real engagement with your users, employ an easily recognizable style that comfortably facilitates interaction with your followers. Due to the high degree of specialization required in social network marketing, the role of the community manager is gaining great importance in all online businesses. The community manager develops an effective strategy for your site ensuring agile interaction with users.
Social network advertising Social network advertising is a type of online advertising that focuses on social networking services. This kind of advertising has the following advantages: ● Instant experiments on different kinds of users. ● A great platform to speed up your growth on social networks. ● A lot of targeting available: geo-targeting, behavioral, social, etc.
You can explore and learn how to use advertising on the biggest social networks: Facebook, Twitter, LinkedIn and YouTube. ‘Returning’ users will help you improve your SEO ranking. Focus not only on instant revenue, but also on user engagement with your brand.
Email marketing The email marketing strategy is the use of electronic mail with an informative or commercial interest, depending on the purpose of your business. The email marketing strategy is one of the most direct and traditional ways to contact your users. This does not mean it is the most effective way, but if done correctly, it definitely attracts the attention of users. For this to happen you need a strategy that both fulfills the needs of your users and avoids sending emails too often, so as not to be categorized as spam. The great advantage of this type of model is that the user will read your ads directly from their own email account. Combined with the usual routine of checking personal email from a mobile device, email marketing allows you to almost instantly reach your entire user base using few resources. When using email marketing, you must consider spam filters that could classify your content as inappropriate. It is also not advisable to send too many emails as they can tire the user, leading them to ignore the sender or unsubscribe from your user list.
Online display advertising Advertising is one of the great business markets that exists today, especially among online businesses. Online display advertising is the most classic form of advertising on the Internet that uses a variety of formats to advertise a service or additional site within another domain. Online advertising is more customizable, allowing the possibility of selecting the type of users you want depending on their preferences, age, or gender. Online display advertising is usually displayed as a banner located on a visible part of the target page in order to attract the attention of users. This is one of the main forms of digital advertising. This type of advertising provides two big advantages over traditional advertising in newspapers or other media. Firstly, it is completely measurable, so we know exactly which users were captured by analyzing the results as well as the income generated from these operations. This allows us to optimize it to maximize revenue. Secondly, we can create a completely targeted advertising campaign aiming our offer at a specific demographic. Adjusting the features of advertising according to the media in which it appears provides a better conversion ratio (i.e. a higher percentage of users that perform actions such as purchasing on our website after the visit). Using this method is a simple and practical way to attract new users. Online display advertising is a payment model so it is important to define your goals in advance in
order to optimize your budget. There are different ways in which you can do this: ● Cost per thousand impressions (CPM): a fee is paid for every thousand times your ad appears on the page. ● Cost per click (CPC): a fee is paid each time a user clicks on the ad.
Trying different types of ads will help to optimize your marketing.
Main algorithms that influence SEO in Google Google is currently the largest search engine with around 80% of the global market. The key to its success lies in the algorithms conceived at the very beginning of the company. These algorithms are programs and formulas that select the results that best suit the intention of the user for a specific search. We learn about how Google works internally from the information the company shares, and by intensive use of statistical correlation. In the last few years, Google has released three main algorithms that aim to punish websites that prioritize SEO over user experience. Back in 2012, a great many SEO experts learned that many bad practices led to higher ranks. This is no longer the case thanks to these new algorithms. Now webmasters need to focus more and more on quality to obtain higher rankings.
Google Penguin Google Penguin is the name given to the algorithm that penalizes websites that do not follow the webmaster guidelines Google has developed in terms of ‘inbound links.’ First announced in April 2012, Google Penguin launched a war against what is known as black hat SEO. This algorithm severely penalizes the search engine rank of pages that use positioning techniques that Google considers invalid. These techniques may even result in the removal of the website from the search engine. When the first version of Google was created, the intention was to score every page according to the external referral links received from other sites. The idea was very good, and created highly-relevant results. Over time, many companies found ways to create links that had nothing to do with the actual value of the page, resulting in many pages ranked in high positions, but with little real value to the user. One of the main innovations of Google Penguin has been to penalize pages that gain position by using link farms, or by creating low-quality links. Penguin penalizes the links that are not reliable indicators of the quality of your website. This means Google’s algorithms will use all possible intelligence to detect the different types of links that point to your page. They might be links that point to your domain from another one of your own domains, a friend doing you a favor with a link, links introduced indiscriminately on third-party pages, or paid links. In theory, the only links that are allowed are those inserted by a 'human' into a page, someone who has liked your site and wants to say something about it. All other links are theoretically prohibited. Penguin algorithm information is unrelated to the content, and is not fully covered by this book. As this book is focused on onsite SEO, we recommend the course at www.pointblankseo.com for a very good explanation of how the Penguin algorithm works.
Google Panda The Google Panda algorithm bases its effectiveness on categorizing low-quality websites by their content, penalizing their ranking. Its first version was released in February 2011, making low-quality pages with heavy advertising fall from the top ranks. In its latest updates, Google Panda allows new and small sites to reach the higher positions as long as their content is of high quality. This helps new entrepreneurs looking to gain a foothold on the Internet. The objective of this algorithm is to ensure that search results are useful and to penalize spammers who rank artificially. The quality of content is a major concern, and the Panda algorithm pushes webmasters and page administrators to align their goals with the search engine.
Google Hummingbird Hummingbird interprets the text, trying to make sense of each of the elements within a context. Thus, the interpretation of long-tail keywords has been improved. The more specific and explanatory the content, the better the ranking. Concepts relating to keywords are more fully developed in the Keywords section.
Rating metrics User experience and interactivity with the page is one of the elements that will affect your ranking, and this corresponds to several factors verifiable through tools such as Google Analytics, Clicky, and Woopra. To improve your SEO, it is important that you spend some of your time improving these metrics. These ratios help Google to assess the quality of your page.
Click-through Rate In Webmaster Tools, you will see the ratio of clicks you receive for each search on Google. This is a very important factor to optimize your ranking. If the user is not interested in your site and does not click, Google will detect this and lower the positioning of your website. You must improve this ratio in the following ways: ● Optimize the title of your article to make it more attractive to the user. ● Optimize the description tag of your article to make it more appealing.
Note that the click-through ratio also changes depending on your rank in Google, as the first results always receive more clicks. Check your searches with a low ratio of clicks in Webmaster Tools, and optimize them to improve your ranking.
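As a rough illustration of that workflow, the following Python sketch scans a search-analytics export and flags queries with a low click-through ratio. The file name, column names, and thresholds are assumptions; adapt them to whatever your own export actually contains.

```python
import csv

# Flag queries whose click-through rate looks too low, so their titles and
# descriptions can be rewritten. File name, column names and thresholds
# below are placeholders.

CTR_THRESHOLD = 0.02      # 2%; pick a value that makes sense for your rankings
MIN_IMPRESSIONS = 100     # ignore queries with too little data

with open("search_queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        impressions = int(row["Impressions"])
        clicks = int(row["Clicks"])
        if impressions < MIN_IMPRESSIONS:
            continue
        ctr = clicks / impressions
        if ctr < CTR_THRESHOLD:
            print(f"{row['Query']}: {ctr:.1%} CTR over {impressions} impressions")
```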
Time on site The time on site is the average amount of time that users spend on the website, from the time they arrive from the search engine until they finally exit from the last page. This metric strongly measures the level of engagement with the site, analyzing their real interest in the content. It’s important that you find ways to increase the time on site of your visitors. Actions to increase time on site ● Improve the overall design of the website through user surveys and A/B experiments (https://support.google.com/analytics/answer/1745147?hl=en). ● Study which pages within your site generate more time spent on the page to create similar content.
● Include videos in articles. ● Dedicate effort to include interactive elements. ● Include content created specifically for the page, instead of generic content. ● Enter polls, petitions, and registry items for the user to interact with the site. ● Set external links to open in new tabs. ● Divide your content into different pages in the same domain and relate these pages together with the appropriate tags, rel="next" and rel="prev", and make sure they have different URLs. ● Allow the users to interact with the page by adding comments or discussing with other members of the site.
Returning visitors Returning visitors is an indicator of user engagement. When users return to a website, Google interprets this behavior to mean that visitors have found what they want and are willing to come back for more content. By creating a user base, we gain a regular stream of visits, which must be nurtured by incorporating new content to maintain the interest of the visitors. Checking the volume of return visitors to the site can show what kind of content is most interesting compared with the number of visits that the site generates. Therefore, the rate of returning users is an indicator of how the website's image is improving, which will lead the site to be considered worthy of higher positions. Actions to improve the number of returning visitors ● Improve user loyalty by sending a newsletter that informs users of new developments on the website. ● Update the content of the page to provide relevant new information about the topic of the site. ● Create contests to increase the interest of your users. ● Use social networks so that users can receive information about your page updates directly. ● Let users collaborate through comments, and even contribute articles where appropriate. ● Post previews of your upcoming articles so you can generate hype in your user base. ● Spread information on social networks to advertise your content.
Bounce rate By definition, the bounce rate measures the level of user abandonment when they don’t find the content they are looking for. A high bounce rate doesn’t necessarily mean that the user experience is bad; maybe the user is getting what he wants from your site. However, even when the user has found exactly what they wanted, leaving without interacting with the site is still counted as a bounce. For a realistic picture, combine the bounce rate data with the time spent on the page so you can properly observe the activity of users on your site and check whether they have found what they really want.
Actions for reducing the bounce rate ● Include articles related to what the user may be reading at the moment. ● Add related categories to fully explore the site. ● Relate highly-relevant information to the page title. ● Avoid using pop-ups to not interrupt the user experience. ● Divide long articles into multiple pages to promote internal traffic using the tags rel="next" and rel="prev". ● Provide the user with content they are searching for according to the selected keywords.
Keywords What are they and why they are so important Keywords are the most important terms used to position your website. More specifically, keywords are the words that you enter into the search engine, and in return you get a result according to the entered term. That is, if you search for "chocolate" using Google, the most appropriate results for that word will appear, such as its definition, where you can buy it, or its nutritional values. Be clear about the purpose of your page, as well as the main topics that will be addressed in it. If you are interested in making a website about sports, it is not recommended to position your content with keywords from different fields, such as cooking terms, because this will affect your Authority Rank. When choosing keywords, you should consider current competitors and the use of these terms on existing and ranked pages. To do this you can use the Google AdWords tool. Using this tool, you can choose the most important terms to use. Depending on the audience you want to attract, you can configure the keyword search for more accurate results by language and geographic location. The Google AdWords tool was created by Google in order to include additional paid advertising in the search engine (SEM), but it also includes numerous free utilities that are very useful for SEO purposes. You can also use specialized online tools to find your best keywords, like SECOCKPIT (http://secockpit.com/). SECOCKPIT will give you hundreds of keyword suggestions, along with data about SEO competitiveness, and will also allow you to discover the keywords that your competitors are using. Short Tail Keywords are short search terms composed of one or two words. These are used to generate general search results. More specific results are found using what are called long tail keywords. These are generally composed of three or more words and are used for more specific searches with less competition. Suggestions from Google when searching, as well as searches related to the term in question, allow you to expand your understanding of the most used keywords to reach different results. When looking for “sports” in Google, the search engine itself suggests adding terms like “Olympic” or “water” using Google Instant, the prediction system included in the search engine. In turn, Google provides related searches, e.g. “types of sports,” “sports list,” or “sports shop.” These terms help you to follow a specific strategy for generating your content, leveraging the traffic generated to redirect to your website. For example, if your goal is to create a web page about football in Spain, consider creating a lot of in-depth articles that cover multiple aspects of football in Spain. By using such keywords, the conversion ratio increases, especially if you position descriptive phrases, such as “teams La Liga 2015,” or natural phrases like “trip to Barcelona from Madrid by bus.” Although such phrases have less traffic than short tail keywords, using this method will help you in different ways:
● Google will see your content as useful and not spammy. ● Your content will be more relevant and your articles will address more specific topics. ● The short tail keyword will also rank with time as Google realizes that you are writing good content, which is in the best interests of the user.
In fact, when you search any general term, like ‘Hotels’, you will see that today most of the articles or sites that appear have longer titles. In 2012, it would be likely that some terms with just the title ‘Hotels’ would appear on the first page. Now Google is searching in the long tail to try to find out which sites are more related to the original ‘user intention.’
Image 1 – Example ‘Hotels’ results
Short Tail keywords are those that will send traffic to your pages once they are better positioned in the search engine through the various processes explained in this book. It is recommended that the terms that most narrow your search appear as early as possible in the title. For example, ‘Hotels’ narrows the user intention down more than ‘Cheap’, so it should appear first in your title as often as possible. Use your keywords early in your article (in the first paragraph), in your H1 and H2, in bold and in your text, but not so much that Google flags it as spammy. Usually, more than five percent keyword density in the text is considered a spammy technique. Once these keywords are already positioned, add more terms related to the structure of your page to improve your results in Google. Organic positioning requires effort, but mostly it requires time, at least if you want to follow a method that ensures a high and legitimate ranking, without resorting to tricks that would surely risk a penalty in your site’s ranking. When choosing keywords to position your page using Google AdWords, select those that best suit your purpose, while also assessing the average monthly searches and competitiveness of the keyword. Note that the lower the competition is, the easier the ranking of that term will be, thereby encouraging specialization in the subject of your website. This way, you prompt Google to think that you're not just considering the ranking, but also about offering new, quality content to the user. Link related articles together using internal links and also associate and categorize them using breadcrumbs. For example: www.mydomain.com/diets/carbohydrates/pumpkin MyDomain >> diets >> carbohydrates >> pumpkin
How to pass spam filters in long tail positioning When your goal is to position yourself for many similar long keywords, it is unwise to create many articles with similar keywords. For example, don’t create a group of articles like the following: ● 'Buy cheap flats'. ● 'Buy cheap flats in the Manhattan area'. ● 'Buy cheap flats in Queens'.
This series of articles will be punished by Google and users because the web is saturated with spam like this. When you want to achieve long tail strategies put the words that most narrow your search in the title of the article, and then create the different sections of the long tail keywords in the article, like Wikipedia does. Notice how the Wikipedia article about Star Wars is organized:
Image 2 - Different sections of the long tail keywords If you search ‘Star Wars setting’ in Google, you will find that the first result is the specific section of the main article, ‘Star Wars’ in Wikipedia.
Image 3 – Result of specific section in a webpage
Make sure that the article is written and organized in a sensible way that provides value to the user. Using this technique, we would organize the first example in this way: We would create an article named ‘How to buy cheap flats’ and two main sub-sections: ● Manhattan area ● Queens
Money keywords Money Keywords are those with high Cost Per Click (CPC). Those keywords are expensive because they reflect a clear user intention and additionally this intention is associated with high commercial value. For example, if you search ‘iphone 6 prices’ surely you have more intention to buy an iphone than if you are searching ‘iphone 6 backgrounds’. Ranking only using these types of keywords will detract value from your page so you have to avoid using this type of content exclusively. The best way to do this is to generate useful and reliable content related to these keywords. Thus, the positioning comes more naturally, offering useful content and preventing your site being categorized as spam. Before you choose to use a money keyword, keep in mind that it is always possible to position yourself using synonyms after researching the terms through the different tools available. The diversity of these keywords means you can receive more traffic than usual, increasing the number of visits, while combining this strategy with the creation of relevant content. Imagine a website that has five articles: ● New York hotels. ● Las Vegas hotels. ● Chicago hotels. ● Paris hotels. ● Los Angeles hotels.
If the site has only five pages, and all of their titles match searches that have a high CPC (cost per click) the search engine automatically thinks that it is a website designed exclusively for that objective. Following this strategy, the search engine understands that the aim of the site is not to
provide a useful service to the user, and it will reduce the website’s rank, valuation, and relevance. However, if these items are among many others that provide relevant information, Google spam filters won’t be triggered, like in the following examples: ● Ten things you should know before traveling to New York. ● Regulations in US airports. ● The four best restaurants in Las Vegas.
Or even narrow down the user intention further: ● Hotels in New York close to Wall Street. ● Three hotels in Las Vegas for non-gamblers. ● Where to stay when visiting Chicago. ● The ten most romantic hotels in Paris. ● Hotels in Los Angeles where you will see Hollywood stars.
The fast and easy ways to improve your website ranking are often penalized by Google because they have already been exploited by many spammers. Spammers usually try to get a higher ranking without working too much or trying to offer value to the user.
The ten most common mistakes When positioning your page using keywords there are always mistakes that can hurt the value of your website. Below you can find the ten most common errors in the use of SEO, and keywords that you should avoid at all costs: ● Repeating keywords in the same link, increasing the possibility of being penalized. For example: http://sneakers.domain.com/sneakers-sport-sneakers-general. ● Using underscores to separate terms in a URL, for example “domain.com/sneakers_sport” should be “domain.com/sneakers-sport”. ● Choosing keywords that are not suited to your content; rather than attract more traffic to the website, this can cause Google to penalize you. ● Linking to articles penalized by Google, or linking to or being linked by domains of low
quality. ● Mass linking to articles or domains automatically (site-wide links). ● Repeating your keyword in more than five percent of the overall text. ● Excessive use of money keywords. ● Using generic or nonspecific keywords, leaving the theme of your website in ambiguous terrain and making it difficult for Google to match the intention of your user with the intention of your article. ● Not considering the competitiveness and popularity of the keywords. ● Choosing keywords that deliver a lot of traffic, but do not match the correct user intention (these users won’t be interested in your products). ● Frequent changes in the format of generated URLs, making Google see pages with different names and duplicate content (for example: http://www.domain.com/SneakersSport and http://www.domain.com/sneakers-sports, where differences in capitalization and formatting create duplicate pages).
How to choose the right keywords Search engines try to emulate the human mind and the preferences of their users. Therefore, the key terms for different concepts should have a logical connection to the central theme of your website. However, these terms do not always have the search demand that you might initially think. Google services provide an online tool to check the popularity of your keywords. To analyze, search, and define titles for your articles we recommend using the following tools: ● AdWords. ● Google Trends. ● WordStream (http://www.wordstream.com/). ● SECOCKPIT http://secockpit.com/.
Google Trends is an application where you can find the trend in the number of searches for keywords over time, and compare with other terms. The downside of this tool is that it doesn’t provide raw search numbers, but relative graphs of the search terms in the selected time period. In this way Google Trends is only a minor help in checking the trend of the keyword you want to use. Because context is very important in any business, there are times when different terms are more
popular. In the chart below, the overall comparison of the terms “football” and “basketball” shows the great popularity of basketball during the 2014 World Basketball Championship in Spain. Interpreting these contexts is vital to receiving more traffic, taking advantage of the moment of popularity of a keyword to improve your position. The choice of keywords is equally important in addressing your audience. Depending on the terms you choose, your content should be directed to a niche audience who are interested in the information or product you offer, as well as their related elements. In this way you satisfy a specific demand with your content. If we continue with the example of a sports website, it is more likely that the users you are attracting have interests in various sports fields, such as tennis or Formula 1, than unrelated fields like architecture. The competition is especially relevant in the choice of the terms that you want to work with. Note that the less competition there is for a keyword, the easier it is to get a high position in the searches users make. If you choose to position your page using these terms, that will help you increase visits to your site from the start. Identifying these needs is necessary when creating your website and your content, because search engines categorize your website depending on the terms you use. When first launching a website, focus on a small range of keywords. This helps your page to get indexed in a category, allowing you to focus your future efforts on other keywords. Remember that optimal positioning is not obtained immediately, so try at first to work on the most relevant keywords while gradually increasing their number and variety on your page. Choosing keywords is crucial to the success of your content. For this we propose the following list of main ideas for your choice: ● Focus your domain or subdomain on a specific area of knowledge to improve your Authority Rank. ● Take advantage of trending timelines so you can anticipate events and write articles about them long before they arrive. ● Use tools like SECOCKPIT to find opportunities and watch the keywords used by your competition.
Keyword density in the text When including different keywords within your content, it is important to consider how they relate to the text. For best results it is recommended that the main keywords add up to around 3% of the total text. That is, if your text is about 'Layers in Adobe Photoshop' and has a length of 1000 words, the number of times you should repeat these words is as follows in the example below:
Word | Repetitions
Layers in Adobe Photoshop | 7 times (1000/4 * 0.03)
Adobe Photoshop | 15 times (1000/2 * 0.03)
Adobe | 30 times (1000 * 0.03)
Photoshop | 30 times (1000 * 0.03)
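If you want to check these proportions automatically, here is a minimal Python sketch of a keyword-density counter consistent with the table above (density = occurrences * words in the phrase / total words). The file name and phrases are placeholders, not part of the original text.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Share of the text's words taken up by occurrences of the phrase,
    matching the calculation used in the table above."""
    words = re.findall(r"\w+", text.lower())
    phrase_words = phrase.lower().split()
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return hits * len(phrase_words) / len(words) if words else 0.0

with open("article.txt", encoding="utf-8") as f:   # placeholder file name
    article = f.read()

for phrase in ("Layers in Adobe Photoshop", "Adobe Photoshop", "Adobe", "Photoshop"):
    density = keyword_density(article, phrase)
    note = "over the ~5% spam threshold" if density > 0.05 else "ok"
    print(f"{phrase}: {density:.1%} ({note})")
```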
It is also recommended that the keywords appear once on the following HTML tags: ● H1, H2. ● Alt (images). ● Title. ● Description. ● First paragraph of your article.
The number of times the keywords appear in the different types of HTML tags (H1, H2, H3, ALT on images, anchor text of incoming links, URLs, and the DESCRIPTION tag) is also influential. A low density of keywords may harm your SEO because Google will not understand the subject of your article, and an unnaturally high density may harm it even more.
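To verify these placements, a small script can parse the page and report where the keyword actually appears. The sketch below is only an illustration of that idea: it uses the third-party BeautifulSoup library and a hypothetical article.html file and keyword, none of which come from the original text.

```python
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

def check_keyword_placement(html: str, keyword: str) -> dict:
    """Report whether the keyword appears in the tags recommended above."""
    soup = BeautifulSoup(html, "html.parser")
    kw = keyword.lower()

    def contains(text):
        return bool(text) and kw in text.lower()

    description = soup.find("meta", attrs={"name": "description"})
    first_paragraph = soup.find("p")
    return {
        "title": contains(soup.title.string if soup.title else ""),
        "description": contains(description.get("content", "") if description else ""),
        "h1": any(contains(h.get_text()) for h in soup.find_all("h1")),
        "h2": any(contains(h.get_text()) for h in soup.find_all("h2")),
        "img alt": any(contains(img.get("alt", "")) for img in soup.find_all("img")),
        "first paragraph": contains(first_paragraph.get_text() if first_paragraph else ""),
    }

with open("article.html", encoding="utf-8") as f:   # placeholder file name
    print(check_keyword_placement(f.read(), "pumpkin diet"))
```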
URLs URL structure has been an important source of “over optimization” in search engines, and therefore its use these days with SEO remains delicate. Here are points to consider: ● Use your keyword once in the URL or subdomain (and no more than once). ● The closer to the beginning of the URL the keyword is, the more valuable it is. ● Putting several words in the subdomain name or a lengthy subdomain name can be penalized by Google Penguin. ● Repeating the keyword in the URL, including the domain name, can be considered spam. ● Entering the keyword in the domain name (EMD technique) is only recommended if it is relevant to your domain without SEO. Domains like www.my-keyword.com have been exploited and used for SEO for many years, and now Google revises the relevance of these domains for a particular keyword more thoroughly. ● Separate words in the URL with the character ‘-’, not ‘_’ (a minimal slug helper is sketched after this list).
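The slug helper mentioned in the last point could look like the following short Python function, which simply applies the lowercase and hyphen rules above; the example titles are placeholders.

```python
import re

def slugify(title: str) -> str:
    """Build a URL slug following the guidelines above:
    lowercase, hyphen-separated, no underscores or stray punctuation."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)   # collapse everything else into hyphens
    return slug.strip("-")

print(slugify("Sneakers Sport"))        # sneakers-sport
print(slugify("Hotels in New York!"))   # hotels-in-new-york
```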
URL changing When we change our website we change the structure of URLs, sometimes to improve SEO, sometimes to add relevant information to the URL, or even to meet legal requirements to avoid prohibited content. Whenever you make changes that affect the structure of URLs, please note the following important points to prevent loss of indexing: ● Always redirect the old URL to the new URL using an HTTP 301 (Permanent Redirect); a minimal sketch follows below. ● Do not try to optimize URLs if you are not completely sure how to change them, or if you have been penalized precisely because of URL over-optimization. ● If you are changing the domain to another domain name, always redirect all internal pages via 301 redirect and communicate the change in Webmaster Tools. ● It is HIGHLY recommended to run practical tests that check for mistakes when replacing the URLs. This step is extremely important and delicate, so it is essential to ensure that it is all done well. Compatibility between the old URLs and the new ones must be maintained; sometimes the code has bugs or the developer hasn’t tested the URL generation thoroughly, so it is appropriate to implement "tests". For an example of how to do this, read the 'Tests on the contents' section.
Google published a guide on how to properly use 301 redirects: (https://support.google.com/webmasters/answer/93633?hl=en).
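How you implement the 301 redirects depends on your stack: on Apache or nginx they usually live in the server configuration, and Google's guide above covers the general rules. Purely as an illustration, here is a minimal sketch for a site that happens to run on Flask; the old and new paths are placeholders.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Old paths mapped to their replacements (placeholder values).
OLD_TO_NEW = {
    "/diets/pumpkin.html": "/diets/carbohydrates/pumpkin",
    "/sneakers_sport": "/sneakers-sport",
}

for old_path, new_path in OLD_TO_NEW.items():
    # Each rule returns a permanent (301) redirect to the new location.
    app.add_url_rule(
        old_path,
        endpoint=f"redirect_{old_path}",
        view_func=lambda new_path=new_path: redirect(new_path, code=301),
    )

if __name__ == "__main__":
    app.run()
```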
SEMANTIC SEO Vocabulary What it is We define vocabulary as the set of words within a language that a person uses. In this particular case we study a concrete kind of vocabulary using our research on proof and relevant terms. Having a combination of both kinds of terms in your work has been shown to have a strong, positive effect on search engine rankings. As a writer, you need to keep in mind that the context matters above all, especially if we are talking about online articles. Choosing the proper concepts for creating an article is mandatory in order to offer great content. That is something that you should never forget. Proof terms These words show that the author of the article is an expert on the topic that is being covered. We can identify proof terms as those words that belong to a particular field. These are common terms that you can find in most articles that share the same topic. In this sense, proof terms show a level of expertise about a particular topic, and are mandatory for making the article credible to the user and to search engines. As an example of this concept, if you are going to write about ‘laser hair removal’, demonstrate that this area belongs to your field of expertise with terms such as the following:
follicles candidate rejuvenation complimentary plastic depilatory band effective tweezing underarms rubber electrolysis investment fillers efficient removal permanent treatment dermatologists discoloration dermatology effectively concentrated previous shoulders effectiveness reduction uncomfortable unwanted growth bikini consultation also underarm redness estimated laser requires recommendations melanin ingrown newsletter ineffective inhibit light frequently aesthetic advertising copyright candela irritate affordable stubborn removing permanently pigment
Fragment of word cloud generated by Seologies Terms Discovery for ‘laser hair removal’. Relevant terms These kinds of terms are not as common as the previous proof terms. In this case relevant terms are those related to the main topic, but not always present in articles covering the subject. In other words, these terms are usually present as part of a deeper subtopic. These terms show search engines, and your visitors, that you are elaborating your subject more than most of your competitors. For example, if we were writing about “weight loss” it would be a good idea to include terms such as ‘diagnosis’ for medical reasons or ‘jogging’ as an example of sports the visitor should do. By
elaborating these subtopics alongside the main topic you will be adding value to your main topic resulting in a better ranking in the search engines according to the present SEO policies.
Knowledge Graph The Knowledge Graph (KG) is a knowledge database integrated in and by Google. It allows the collection and association of information in an effective manner, showing the most useful data with regard to what you are looking for.
Image 4 - Knowledge Graph for 'Cook Omelette'
The aim of the Knowledge Graph is to understand human knowledge, associating and relating it in the most effective way possible. Currently, the KG helps Google to: ● Identify the websites that are providing accurate and relevant information. ● Instantly return specific results to users, understanding exactly what the user is searching for (user intention). ● Return unique and very effective answers to many questions when users access Google via their smartphone (Google Now).
Some of this structured information currently appears alongside the search you make, offering a description of the content you are looking for through the most relevant data. Thus, if you are looking for a classic author then their date of birth, summary of their work, and a list of their most important works will appear, and also, other relevant information. The search collection is done using what is known as semantic analysis. This method uses the
disambiguation of queries to provide the most relevant information about specific user searches. This is an improvement on earlier search engines, offering a successful semantic interpretation through a series of interconnected terms. However, these engines still do not understand natural human language perfectly, so queries need to be specific to surface a specific term. The Knowledge Graph is also Google's answer to users' appetite for services such as Apple's Siri, which offers direct responses to users' questions. Thanks to the Knowledge Graph, you can get information directly, without having to access any link. The best way to exploit this component of Google is to offer truthful, detailed information in a very readable way. Make your content easily understood by Google by increasing your Flesch Readability Score.
Flesch–Kincaid Readability Score The Flesch-Kincaid Readability Score measures your content readability in the following ways: ● 100 - Very easy for your audience to read. Sentences are usually 12 words long and contain words with no more than two syllables. ● 65 - Average English proficiency. Sentences, on average, are 15 to 20 words long. Words are usually two syllables long. ● 30 - A bit above average difficulty. The average sentence is 25 words long. Words may contain two or more syllables. ● 0 - Very difficult to read. Sentences are 35 words long and the average word contains more than two syllables.
As you can see, the higher the rating, the easier it is for your consumers to read your content. Technical writing is usually scored low due to its advanced nature and specific targeting. This can work for consumers in niche markets who understand certain aspects of a product or service. For example, if you were a government wireless communications firm like LGS Innovations, your content would contain words that the average U.S. citizen may not be privy to. How to improve the Flesch-Kincaid readability score Sentences per paragraph You normally want to keep this number as low as possible. There are many theories on the ideal number of sentences per paragraph, but five seems to be a good number. However, this is an average number and the results can be meaningless depending on the content of your document.
Novels with one-sentence dialogs will have a very low number, while non-fiction or technical work will tend to have more sentences per paragraph. The number depends on the style of your document, though in general each paragraph should contain sentences relating to one thought or topic. Characters per word This measures the average length of the words in your document. In general, shorter words are easier to understand than longer words. However, the length of the words you will need to use tends to vary directly with the reading level of your audience. Children's books require very short words, while technical articles will almost always have much longer words due to the technical vocabulary they must reference. That doesn't excuse the author from using long words unnecessarily. Keep it short. Mahan's Sea Power averages 4.7 characters/word, while Twain's Huckleberry Finn had 3.8 characters/word. Longer words will have more syllables and this increases the Flesch-Kincaid Grade level of your document. Words per sentence This also should be kept to a low number. Experts tend to favor a range of 17 to 20 words per sentence. This is an average number that you should treat as such. Flesch Reading Ease Score: This rates your document on a scale of 1-100. The higher the number, the easier it is to read and understand the document. A score of 70 or higher is usually recommended. This score is computed by analyzing the average number of words per sentence and syllables per word. You want to have shorter sentences and shorter words.
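For reference, the Flesch Reading Ease score described above is computed from exactly those two averages. The sketch below implements the standard published formula in Python; the word, sentence, and syllable counts are illustrative placeholders.

```python
def flesch_reading_ease(total_words: int, total_sentences: int, total_syllables: int) -> float:
    """Standard Flesch Reading Ease formula: higher scores are easier to read."""
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

# Short sentences made of short words score high ...
print(round(flesch_reading_ease(total_words=120, total_sentences=10, total_syllables=150)))  # ~89
# ... long sentences full of long words score low.
print(round(flesch_reading_ease(total_words=300, total_sentences=10, total_syllables=540)))  # ~24
```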
Semantic SEM Optimization How to even triple your AdSense results Not only will these semantic techniques help you to optimize your content to improve your SEO, but they will also help you to dramatically improve your AdSense income. The reason is very simple: the semantic analysis algorithms applied to your content also govern your AdSense income. Just as it doesn’t make any sense to try and guess what content is required to improve your ranking, it also doesn’t make sense to try and guess which text in your article has more potential to attract more ads and generate more revenue. Depending on the text of your articles, it might be that Google doesn’t find any ads to display to the user, or that the ads are completely irrelevant, which will result in a very low CPC. For example, if your article mentions “triglycerides”, the CPC that advertisers pay in the United States is 0.72 USD (data extracted from Google Keyword Planner). However, when the article mentions “high triglycerides”, the CPC goes up to 4.67 USD, and just for mentioning “triglyceride levels”, the CPC goes up to 10.68 USD. You can optimize your AdSense results manually or with the help of tools.
Manual Optimization of AdSense Once your article has been written by an expert, locate the words that are really important and find out the CPC of related terms. If, for example, you write about ‘weight loss’, it is very possible that you’ve written words such as ‘BMI’, ‘triglycerides’, or ‘cardiovascular’ in your article. Find all of the terms that are really associated with ‘weight loss’ and create a table with the associated CPC (Use Google Keyword Planner for this):
Keyword | CPC (USD)
weight loss pills | 1.70
weight loss | 4.16
rick ross weight loss | 7.49
extreme weight loss | 3.89
weight loss calculator | 4.87
best weight loss pills | 1.00
dr oz weight loss | 1.27
melissa mccarthy weight loss | 4.22
miranda lambert weight loss | 2.08
cardiovascular system | 6.57
cardiovascular disease | 5.83
cardiovascular | 5.05
cardiovascular exercise | 2.48
cardiovascular endurance | 0.00
cardiovascular technologist | 7.27
what is cardiovascular disease | 3.93
atherosclerotic cardiovascular disease | 4.57
cardiovascular consultants | 7.43
cardiovascular fitness | 0.00
bmi calculator | 0.49
bmi | 0.56
bmi chart | 0.73
calculate bmi | 0.20
how to calculate bmi | 0.28
bmy | 0.47
bmi formula | 0.21
bmi chart for women | 1.14
what is my bmi | 0.14
triglycerides | 0.72
triglyceride | 3.32
high triglycerides | 4.67
how to lower triglycerides | 1.58
what are triglycerides | 2.25
triglyceride levels | 10.68
lower triglycerides | 0.00
lowering triglycerides | 0.31
triglycerides | 0.00
Once you have completed the table with all of the terms (and depending on the purpose of the article), you can choose to insert terms that have increased demand for advertisers, and they will pay you more for them. Enter ‘cardiovascular fitness’ and they won’t pay you for your ads, talk about ‘cardiovascular consultants’ and your CPC will go up to 7.43. After seeing these variations, does it interest you to analyze the CPC before going back to write your article? How much money do you think you could be losing by not taking this information into account?
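If you keep the table in a small script, sorting it makes the highest-paying subtopics obvious at a glance. Here is a minimal Python sketch using a few of the CPC figures collected above; extend the dictionary with your own Keyword Planner data.

```python
# CPC figures for a few related terms, as collected above with Google Keyword Planner.
cpc_usd = {
    "triglyceride levels": 10.68,
    "cardiovascular consultants": 7.43,
    "cardiovascular disease": 5.83,
    "high triglycerides": 4.67,
    "cardiovascular exercise": 2.48,
    "bmi chart for women": 1.14,
    "triglycerides": 0.72,
    "bmi calculator": 0.49,
    "cardiovascular fitness": 0.00,
}

# Highest-paying subtopics first: candidates to mention (naturally) in the article.
for term, cpc in sorted(cpc_usd.items(), key=lambda item: item[1], reverse=True):
    print(f"{cpc:6.2f} USD  {term}")
```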
Assisted AdSense Optimization To more quickly optimize your article, and have a better perspective of the terms that you can use, we recommend that you use the content optimization for ads at https://www.seologies.com. You will discover the most profitable terms much faster and you can more effectively define your article guidelines.
Keep in mind that the optimizer will help you find terms that can generate higher profit if you put them in your article. The goal is to help you create an article optimized for AdSense, though it is still necessary to create user-oriented articles that are interesting and relevant. Don’t try to force your writing to include the terms; rather, use them wisely. For example, if your article is about ‘weight loss’, Seologies will show you some topics to bring up in the text of your article that will attract ads with high CPC:
Term | CPC (USD)
Nutritionist Online Schools | 89.01
Nutritionist Degree Programs Online | 65.38
Bariatric Surgery Los Angeles | 64.93
Nutritionist Schools Online | 37.74
Morbid Obesity Treatment Centers | 11.22
Endocrine Treatment For Breast Cancer | 11.16
Liposuction For Morbidly Obese | 8.28
Foods That Lower Cholesterol And Triglycerides Naturally | 7.12
(and hundreds of terms more)
After evaluating this big list, you may consider covering these subtopics in your article, knowing that they are highly rewarded by advertisers. Don’t repeat the same highly-rated keywords too often in your article. This could trigger the “money keyword filter” and decrease your SEO as soon as Google Panda crawls your site. In fact, you don’t need to repeat the keywords very often, because the main goal of using these keywords is not to increase your ranking on them, but to attract good ads. Be sure that your article still matches your user’s intention after including these “ad attractors”, otherwise it will be categorized as deceptive content and penalized in the long term.
Content With the change in Google ranking policy, content becomes very important when it comes to showing the quality of a site. Content is the author’s text, displayed on the page, and relevant to a topic. It is important to create professional content, focusing on it being especially useful to your users. In this section we show you the keys to ranking your website using the content of your site, and why it is important when creating your website.
The importance of quality The content of your website is not only important for search engines to classify the information you offer; it is also important to keep users returning to your page. The users will also share your content more often if it is really interesting and relevant. The activity that your items have through social sites like Facebook or Twitter makes Google interpret your content as valuable, and thus improves the ranking of the page. In turn, Google concedes particular importance to the content shared on its social network, Google+. Every time you make a new entry on your page, you must focus on offering value to the user. To do this, your content must bring something new to what already exists on the Internet. There is a lot of content on the Internet and plagiarism of indexed texts is penalized by Google and other major search engines. To avoid this, make sure your articles are original, creating unique and useful content to rank you above your competitors. The value of your content is also measured in terms of the research you've done on the subject matter, and how it is shown in the text. For example, one strategy is to include in your new text answers to questions that people already have. Elements within a long sentence are difficult for a machine to interpret, resulting in the site being poorly rated by search engines. Care about spelling and grammar. A poorly written text, in terms of either structure or spelling, can mean the difference between being on the first page or relegated to the last position. Before uploading new content to your website, make sure it is correctly written. In addition to how content is written, Google values diversity. Include information in different formats. Including videos or infographics in new entries improves user interaction. Structure content in different ways, creating lists and more visual styles, which are always valuable to users and ultimately also to search engines. Ranking in search engines using good content works as long as it is done with the high quality that users need.
Content strategy
After deciding on the topic of your website you need to set realistic goals. Organic positioning takes time, so at the beginning of your project results will not be immediately apparent. The purpose of SEO content is to attract as much valuable traffic as possible and, over time, to establish your site as one of the main references of your industry. There is no great secret to it other than diligence. Depending on your goal, you'll need a particular type of content that fits your business idea. Thus, if your project is based on online sales you need to create a detailed description of each product on sale and prevent search engines from misunderstanding the content of the article. Writing your articles clearly prevents search engines and visitors from confusing concepts. By writing concise messages and trying to anticipate the intention of users when they perform specific searches, you will get better organic positioning. Stick to the issue at hand, avoiding rambling and creating noise or irrelevant information within the content. The less the author is distracted from the main subject of the article, the better the result will be in all aspects, especially if you make it easy for the search engines to identify the content.
Variety is important too
When creating new content for your articles, invest time in adding multimedia files for a positive impact on the user experience, increasing satisfaction and usefulness. Alongside the text it is always advisable to include images or videos that increase user interaction and the time the user spends on the page. To display images in an attractive way you can create a slideshow. These images have an effect on the SEO valuation of your page, so it is important to lay these elements out well, including the HTML attributes (alt and title) with the text required for the search engines to interpret the content. Infographics are very useful for capturing the user's attention. This type of image visually displays data on a particular theme, so the author can communicate the objective of the post more directly, making it more understandable and appealing to the reader.
Content curation
Three years ago the "curation" concept, which consists of receiving news from other media and rewriting it in your own words, became fashionable. It seemed to provide an article that was new in the eyes of Google, but without the cost of hiring a researcher. Google tries to promote the original creator of the content or the information in its ranking, which is fair. Google does a deep analysis of the relevant and proof terms of each page and favors pages that use a certain set of vocabulary for the first time.
Thus, if you rewrite an article from another person without contributing anything new, Google will detect that you are basing your content on another article. This will decrease the rating of your whole subdomain. The easiest way to avoid these dead ends is to use Seologies, which shows you relevant and proof terms from not one, but thousands of articles, allowing you to combine them in different ways and create different sets of vocabulary.
Links
When the content of your site links to other sites, the links you include may be harmful or beneficial to the ranking of your website depending on which page is linked, and in what context. Please note the following rules:
● Never link to domains penalized by Google (you can review the history of organic traffic of any specific domain using SEMrush).
● Link to pages that are related to the topic of your article.
● Linking too much to the same site from different articles can make Google think you are selling links, or that the target domain name is yours. Make sure that you link to a page because it offers value to the user. You shouldn't link for any other reason, since it will eventually be spotted by Google's fraud detection patterns.
● A lot of links to the same article or page with the same text from your website will make Google think it is a Site Wide Link (automatic link), and it may penalize both the destination site and yours, too.
● Create links with different and creative anchor texts. For example, to link to "pumpkin diet" from different articles, do not always use the text "pumpkin diet", because Google may think that it is an automatic link, or that you are trying to position for that keyword, which makes the anchor text lose value. Texts like "vegetarian diet," "two-day diet," and "slimming pumpkin diet" are examples of varied texts.
Image 5 - Example of Anchor Text
New information
Google is eager to find new information and new features compared to what is already on the Internet. It is important that every article you create offers unique findings and analysis produced by your editorial team. If, for example, you talk about a new DNS service from Google, instead of copying the original press release it would be much more effective, in terms of SEO, to evaluate different DNS servers and provide a comparison of them, thereby adding new information on the topic. You can use Seologies to research hundreds of pages simultaneously.
Fresh and updated information
Posting new items and updating those that are getting old gives you an important time advantage for a few days in the search engines. The freshness of content is an important relevance signal for positioning (https://moz.com/blog/google-fresh-factor). The following techniques also work:
● Rewriting old articles that received good traffic and for which there is some fresh news (check Google Trends to make sure that the term that brings you visits is still in demand)
● Using plugins or platforms that allow user interaction, so that visitors can add new content (comments, user reviews and FAQs).
Ten most common content mistakes that writers make
When creating your own online business, you must learn from the mistakes of others in order to improve in comparison to them. The following list can help you improve your content and prevent you from making these common mistakes:
● Including pages with duplicate content, either from your site or elsewhere, which severely affects your ranking (you can use www.copyscape.com to prevent this).
● Pages with little content and few proof or relevant terms.
● Composing content thinking only about search engines, leaving out the needs and intentions of users.
● Always including the same anchor text in links in your articles, or unoriginal, repetitive texts.
● Content developed for the site by authors who are not classified as experts.
● Articles with fewer than 500 words.
● Not focusing on interesting or relevant content that attracts the attention of users so that they share your page on social networks, which would create natural collaboration.
● Not optimizing the appearance of your content on the SERPs (search engine results pages).
● Not using title formatting tags (H1, H2, H3) or bold format to highlight the importance of the different terms.
● Not adding multimedia content, such as pictures or videos, to increase the on-page time.
Ten steps to creating perfect content
The strategy for creating the content of your page is very important, as it will identify what type of page it is and what its specialty is. To improve the quality of your content, use these simple tips:
● Make abundant use of relevant and proof terms in your article.
● Learn about users' needs and interests and adapt your content to the interests of your potential audience, making guides, explanations of common terms or any other type of FAQ guide.
● Use a striking title, between 50 and 70 characters long, so that it appears intact in the search engine header (you can use Headlinr, www.Headlinr.com, to brainstorm potential titles).
● Create content encouraging your users to participate with comments and share via social networks. ● Structure your content intuitively, creating consistent and tidy content so that users do not get lost when reading. ● Use multimedia content such as images and videos, and include relevant text in their description so that they are positioned in the various modes of search engines. ● Use a natural keyword density, with synonyms where necessary to avoid over-use. ● Create unique content for users interested in your site so they come back for more. ● No matter how interesting your content is, if you do not socialize and share it you won’t get the desired result (social networks are indispensable).
Additional content ratings
Domain authority and hidden compensation laws
Sometimes you will see that really poor content ranks much better than many other relevant articles. This is because Google has found other signs that the website is relevant to that particular search. For example, when you search for "Apple", Google will show you www.apple.com as the first result: although that page does not contain an extensive article about the company, Google understands that it is a domain with high relevance for the following reasons:
● The click-through rate of people looking for "Apple" who choose that domain is huge.
● The number of incoming links and their quality.
● The relevance of the website on social networks.
● The freshness of the article (how recently it was written).
● Domain authority.
● Author specialization in the type of content that is written.
● Domain age (the more years it has been registered, the better).
● Years until the domain expires (the more years that remain, the better).
The history and authority of a domain are very important when trying to rank new content. If you want to develop your SEO faster, consider buying an authoritative domain in a specific content category instead of trying to develop one from scratch.
Images
Google is able to recognize if your website's images have been copied, or copied with slight modifications. The search engine assigns more value to images that are unique on the Internet, since Google considers that the site has used more resources to create original and unique media.
Google+
The services that Google has acquired during its expansion include all sorts of tools, including its foray into social networks with Google+. This should be considered when positioning yourself on the web, providing additional information through your G+ profile. The integration of Google+ into the Google search engine has made this tool one of the most used worldwide.

You can increase your ranking possibilities by using Google+ for your website and creating a unique domain profile. This social network is a very important space where you'll encounter less competition than on others, like Facebook, and it includes possibilities that the other social networks lack. As an integrated Google workspace, this network lets you unify both your Gmail and Hangouts chat.

As for the usefulness of Google+, its use has a major impact on ranking results. When a user navigates using a Google account, the search results are customized to include the activity of the social network itself. Thus, if a user's contacts share content, Google interprets this information as potentially interesting to the user. Given this possibility, it is important to create appropriate content on your G+ account, especially if it becomes popular enough for other users to share it with a +1. Content you generate on this social network can include normal, followed links (that is, links without rel="nofollow"), which means you are explicitly connecting that content to yours and passing on added value. Also, making use of Author Rank when publishing these articles always improves the result, taking into account the relevance of the author on the issues in which he or she has specialized. The more activity your profile has through interaction with users, the more relevance your website will get when displayed in search engine results, especially on Google. Google Chrome users will have a more personalized search because, when it is installed, the user may choose to share "anonymous" information with Google, which Google uses to improve the focus on the user.

To make proper use of this network it is always advisable to follow a guide. You can begin with these actions:
● Use the same header as the original article to enhance the search results.
● Use text formats that make your posts appeal to users.
● Don't fear writing long posts that provide useful content to your reader; write in your own style, using your own words.
● Use related images to trigger interest in your posts.
● Send posts to your circles, or even via email, to secure a more personal engagement.
● Organize your posts in Google+ using the appropriate hashtags for your topic, such as #football.
● Prioritize obtaining +1s and shares of your content to improve your visibility.
● Create entries that invite readers to add comments in order to enhance interactivity with your users.
● Organize your posts to be more interactive with features included in Google+, such as inserting videos, launching applications, or saving posts in My Favorites.
Google+ interaction is especially relevant to the ranking of your website or online business. Because of the importance of user interaction within the search engine, the reviews and comments of visitors add great value, and they are worth actively encouraging. These comments open our eyes to possible enhancements to our service; solving the problems that earned us a bad rating will improve external opinion about our business. Google will analyze not only the number of comments, but also the kind of feedback received, when setting a rank. By having an active profile on Google+, with your contact information up to date, you are likely to find your business in the results of a 'local' search. This helps to give an improved picture of your business, better capturing the attention of users searching for offline businesses such as restaurants and shops.
The author
When you publish a document on your site it is important to link to your Google+ profile by using the rel="author" attribute (which we will explain in full detail later). This allows Google to identify that the item has been created by a particular author and to consider the context of the article. By collecting information about your profile and your friends on Google+, the search engine will try to find out if you are an expert in the area you write about. To achieve relevance as an author, follow these steps:
● Create your Google+ profile.
● Indicate in your profile that you are the author of the sites for which you create content.
● Add your Google+ profile to related groups in which you are an expert.
● Add friends to your profile who are also experts in these matters.
● Create and participate in discussions of the topics in which you are an expert.
● Link the pages on which you write articles with your Google+ profile by using the author tag <a href="http://...URL-of-the-author's-Google+-profile" rel="author">Author name</a>.
Note: in 2013, Google said that author markup was no longer used, but at the end of 2015 it stated that it may start to use it again, recommending that websites add this markup back.
The domain or subdomain
Google also identifies a domain or subdomain as an expert by taking into account the following parameters:
● Backlinks to the domain from other domains with a high Authority Rank in one specific area
● The number of articles in the same domain concerning the same subject, which can indicate expertise in that area
Since the search engine can classify your domain or subdomain as an expert source, based on the content and backlinks, make sure that the topics of your articles are related to the topic of your domain. Some websites, like about.com, create different subdomains for every area of knowledge they write about to optimize the Authority Rank of every specific subdomain (search site:about.com on Google to understand how they do this).
Image 6 - Examples of subdomains on about.com
Technical Platform
Secure HTTPS protocol
On August 6, 2014, Google announced that using the secure navigation protocol (HTTPS) would improve the ranking of websites, since the level of security it provides is an advantage for the user. The HTTPS protocol encrypts communications between the user's browser and the server, so that if someone accesses the same network as the user, the communication will not be readable and the security of the connection is therefore heightened. HTTPS communication was already mandatory for transfers of personal data and credit card information, but, in 2014, Google was the first search engine to announce that it would prioritize pages that were fully encrypted. When you implement HTTPS on your pages, note the following facts:
● In HTTPS, all images, CSS, and embedded JavaScript should be served over HTTPS as well.
● Use local paths whenever you link to internal resources, e.g., /images/logo.png instead of http://www.mydomain.com/images/logo.png.
● Use protocol-relative URLs whenever you link to external resources. For example, instead of http://www.otherdomain.com/images/image.png use //www.otherdomain.com/images/image.png.
● Make sure your security certificate has not expired.
● Make sure your security certificate uses a key of at least 2048 bits.
● Confirm the exact domain/subdomain that is assigned to your security certificate.
● Make a "permanent redirect" (301) from each HTTP page to the equivalent HTTPS page.
When you migrate content from HTTP to HTTPS, use the guidelines included in the "Change URLs" section. Furthermore, to link safely you should check that the linked content also uses the HTTPS protocol. For added security, check that your SSL certificate is valid and up to date. A quick command-line check like the sketch below can confirm both the redirect and the certificate.
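The following is a minimal sketch of such a check, assuming curl and openssl are installed and that www.mydomain.com is a placeholder for your own domain. It verifies that the HTTP version of the home page answers with a permanent redirect and prints the validity dates of the certificate.

#!/bin/sh
# MINIMAL HTTPS CHECK: VERIFY THAT THE HTTP HOME PAGE RETURNS A 301 REDIRECT
# TO HTTPS AND PRINT THE VALIDITY DATES OF THE CERTIFICATE.
# "www.mydomain.com" IS A PLACEHOLDER.
DOMAIN="www.mydomain.com"

# -I ASKS ONLY FOR THE HEADERS; -w PRINTS THE STATUS CODE AND THE REDIRECT TARGET
curl -s -I -o /dev/null -w "status: %{http_code}\nredirects to: %{redirect_url}\n" "http://$DOMAIN/"

# ASK THE SERVER FOR ITS CERTIFICATE AND SHOW ITS VALIDITY DATES
echo | openssl s_client -connect "$DOMAIN:443" -servername "$DOMAIN" 2>/dev/null | openssl x509 -noout -dates

If the first command does not print a 301 status and an https:// target, review your redirect rules; if the notAfter date printed by the second command is close, renew the certificate.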
Speed optimization
The loading speed of your website is one of the parameters that Google takes into account when
ranking your site. The faster your page, the more satisfied your users will be with it, and the longer they'll stay. This is one of the reasons why Google chooses to display your site in the search engine results pages. To ensure you meet all of Google's speed requirements, perform the following steps:
● Audit your pages with PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights.
● Use the Google Chrome PageSpeed Insights extension to check your page: https://developers.google.com/speed/pagespeed/insights_extensions.
● Install the Google PageSpeed module for your Apache or Nginx server: https://developers.google.com/speed/pagespeed/module.
Tips to increase the speed of your page (a quick way to measure loading times from the command line is sketched after this list):
● Use intelligent cache tools in your web server.
● Use external tools to cache repeated data (Memcache, Xcache, etc.).
● Optimize the queries to your database to make it faster (how depends on the type of database you use).
● Provision extra RAM and CPU for your web server, or use load balancing.
● Use servers in the different areas of the world that you want to reach; a lower ping time corresponds to a higher speed.
● Use a third-party CDN (content delivery network), like maxcdn.com, for static files like images, CSS, and scripts.
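As a rough complement to the Google tools above, a simple timing measurement can be taken from the command line. This is only a sketch, assuming curl is available; the URL is a placeholder, and it is worth running it several times and from different locations.

#!/bin/sh
# ROUGH TIMING MEASUREMENT FOR ONE PAGE: DNS LOOKUP, CONNECTION TIME,
# TIME TO FIRST BYTE AND TOTAL DOWNLOAD TIME. THE URL IS A PLACEHOLDER.
URL="https://www.mydomain.com/"

curl -s -o /dev/null -w "dns lookup: %{time_namelookup}s\nconnect: %{time_connect}s\ntime to first byte: %{time_starttransfer}s\ntotal: %{time_total}s\ndownloaded: %{size_download} bytes\n" "$URL"

A high time to first byte usually points to slow server-side processing or missing caches, while a large gap between the first byte and the total time points to heavy pages that a CDN or image optimization can help with.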
Automatic Tests
To perform tests on your pages, you can use an online monitoring tool such as Pingdom (https://www.pingdom.com), which can check HTTP headers and search for specific text in the HTML. Or, if you have the skills, you can schedule simple Linux scripts that perform these checks. Changes to your website platform are subject to potential errors, and can therefore harm the positioning of a given web page. Errors can be categorized as follows:
● System or network errors that harm the availability or operation of the website itself, and may be one-off (for catastrophic events) or systematic.
● Server overload errors.
● Errors due to computer attacks and denial of service.
● Errors displaying the content.
● Content errors that are involuntarily committed by the editors, which are often the hardest to find.
● Changes in the structure of valid URLs.
● Changes in the meta tags of specific or multiple pages.
Tests are essential tools to keep your website in a healthy state and to detect problems early enough to take the appropriate measures. It is recommended that the server where you run the tests is in a different location than the one used to serve your website, so that you can also catch errors that depend on server and network issues. We will show you how to create some tests on the Linux platform. The command we will use to launch HTTP requests is curl, a common command-line application available on all platforms. Make sure it is installed, and if not, install it with your distribution's package manager. Using a text editor, we will write a file listing the URLs that are most relevant to our web traffic and that we want to verify, with one URL per line.
Name this file requests.txt. It has the following structure:
http://www.mydomain.com
http://www.mydomain.com/pages/sports
http://www.mydomain.com/pages/sports/football
http://www.mydomain.com/pages/sports/football/champions-league-2015.html
http://www.mydomain.com/pages/sports/basket/nba-ranking.html
http://www.mydomain.com/pages/sports/search?q=mike+tyson
It will be our task to update the file by adding or removing URLs as the web changes over time.
Now we need a script that allows us to regularly launch requests to these URLs and inform us if there are problems. Consider test.sh, a text file with the following lines:

#!/bin/sh
#
# THE FIRST COMMAND WRITES THE DATE IN THE RESULTS FILE, DELETING ITS
# FORMER CONTENT
date +%c > result.txt

# SET A FLAG THAT WILL DETERMINE IF THERE WAS ANY ERROR
FAIL=0

# THIS LOOP READS THE INPUT FILE AND DIVIDES IT INTO LINES, PUTTING
# EACH URL IN THE LINE VARIABLE AND RUNNING THE LOOP BODY FOR EACH URL
for LINE in `cat requests.txt`
do
    # LAUNCH A HEAD REQUEST TO THE WEBSITE, WHICH RETURNS ONLY THE HEADERS,
    # THEN FILTER OUT ONLY THE FIRST LINE (CONTAINING THE HTTP STATUS)
    # AND THEN LOOK FOR THE 200 CODE IN THIS LINE, SENDING THE OUTPUT TO
    # /dev/null, WHICH IS EQUIVALENT TO DISCARDING IT.
    # FROM THIS COMMAND WE ARE INTERESTED IN THE RETURN CODE, WHICH WILL BE
    # 0 IF IT FINDS THE STRING, AND 1 IF IT DOESN'T
    curl -s -I $LINE|head -n 1|grep 200 >/dev/null

    # CHECK WHETHER THE RETURN CODE DIFFERS FROM ZERO
    if [ $? -ne 0 ]; then
        # IF SO, SET THE FLAG
        FAIL=1
        # WRITE A LINE WITH THE FAILING URL IN THE OUTPUT
        echo "$LINE failed" >> result.txt
    fi
done

if [ $FAIL -eq 1 ]; then
    echo "There have been errors while processing the web"|mail -s "mydomain.com Tests Error" your.mail@mydomain.com -A result.txt
fi
This script must have execution permissions, which are given with the command:
chmod +x test.sh
The script also uses the mail command. If you do not have it installed yet, you can add it with your package manager. cURL and the other commands used here are available on all major Linux distributions. If you launch this script, it runs the battery of tests and sends you an email in the event that any URL fails to return the 200 code. Now you need a tool to schedule the execution of test.sh periodically. This tool is the cron table, or crontab. Let's see how it works:
crontab -e
This command opens a special text file that contains lines like this:
0 */12 * * * /home/user/test.sh
A crontab entry has five time fields (minute, hour, day of month, month, day of week) followed by the command to run. As it stands, this entry runs the script every twelve hours, at minute 0 of hours 0 and 12. If you want more frequent tests, replace */12 with */4 in the hour field to execute the script every four hours. Once the file is saved, if there are no syntax errors, the execution will be scheduled and our automated tests are in place. A useful exercise for the interested reader is to modify the test script to check a list of URLs that have to return the 301 code (permanent redirect). Now, going a little further, it might be useful to run tests that look for a specific string within the text of a page. For example, suppose we want to write a list of URLs, each associated with a text string to search for. With a few changes to the previous script we can achieve this. Consider a search_text.txt file as follows:
http://mydomain.com|Home Page
http://mydomain.com|Pages
http://mydomain.com|a href="/pages"
http://mydomain.com/pages/sports|Basket
Each URL is separated from the text string to search for by the "|" sign, which cannot appear in a URL.
The central part of the script (inside the for loop), in this case, is this:
# THESE COMMANDS DIVIDE THE LINE IN TWO USING THE SEPARATOR '|'
URL=`echo $LINE|cut -d '|' -f1`
TEXT=`echo $LINE|cut -d '|' -f2`

# THE -I OPTION, WHICH ONLY FETCHES THE HEADERS, AND THE head COMMAND, WHICH
# ONLY KEEPS THE FIRST LINE, HAVE DISAPPEARED FROM THIS COMMAND. NOW THE grep
# COMMAND SEARCHES THE WHOLE CONTENT OF THE PAGE
curl -s $URL|grep "$TEXT" >/dev/null
By launching these scripts every time you update your code, you can be sure that certain important sections of your source code (URL generation and content generation) keep functioning correctly.
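To put those pieces together, here is one way the complete text-check script could look. It is only a sketch under the assumptions already made (the search_text.txt format above and the same mail command as in test.sh); note that it reads whole lines so that search strings containing spaces stay intact.

#!/bin/sh
# POSSIBLE ASSEMBLY OF THE TEXT-CHECK TEST: READS search_text.txt, WHERE EACH
# LINE HAS THE FORM "URL|text to find", AND MAILS result.txt IF ANY PAGE DOES
# NOT CONTAIN ITS EXPECTED STRING.
date +%c > result.txt
FAIL=0
# READ WHOLE LINES (NOT WORDS), SO SEARCH STRINGS WITH SPACES STAY INTACT
while IFS= read -r LINE
do
    # SPLIT THE LINE AT THE FIRST '|': URL ON THE LEFT, TEXT ON THE RIGHT
    URL=`echo "$LINE"|cut -d '|' -f1`
    TEXT=`echo "$LINE"|cut -d '|' -f2-`
    # DOWNLOAD THE WHOLE PAGE AND LOOK FOR THE EXPECTED TEXT
    curl -s "$URL"|grep "$TEXT" >/dev/null
    if [ $? -ne 0 ]; then
        FAIL=1
        echo "$URL does not contain '$TEXT'" >> result.txt
    fi
done < search_text.txt
if [ $FAIL -eq 1 ]; then
    # THE -A FLAG ATTACHES result.txt; THE EXACT FLAG MAY DIFFER BETWEEN
    # mail IMPLEMENTATIONS
    echo "There have been errors while processing the web"|mail -s "mydomain.com Tests Error" your.mail@mydomain.com -A result.txt
fi

The same skeleton also covers the 301 exercise mentioned above: keep the -I and head -n 1 filters from test.sh and grep for 301 instead of 200.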
Web design and usability
Improving usability
When you improve your page design and usability you improve the time spent on your page and the user experience, and in this way you also improve your organic positioning. Sometimes you think you are making improvements when you are actually making things worse, perhaps by complicating the usability of the page. This error is very common, not only among independent website creators, but also in large companies that think they know what the user needs in terms of design. An example of this was Google's customizable homepage (iGoogle), which was withdrawn eight years after its release. To analyze the impact of a new design in terms of experience and usability it is advisable to carry out A/B experiments, which consist of showing users different options for the same page and then observing how they interact with them. There are tools available for this, such as Optimizely (https://www.optimizely.com/) and Google Analytics (https://support.google.com/analytics/answer/1745147?hl=en).
Above the fold
An excess of ads can dramatically reduce the quality assessment of your site and thus your ranking. A specific additional rule for this is called "above the fold," which refers to the amount of content that a user can see without needing to scroll down. If visitors have to scroll to see the content because the top ads occupy most of the space, it is very likely that you will be penalized by the "above the fold" rule. This penalty was specifically designed for companies whose sole purpose is to get more interaction with their ads. Through the creation of content, these companies manage to attract traffic to their site, but the site is designed so that users are enticed to click on the ads.
Table of contents and sections within an article
As we discussed previously in the section on long tail keywords, using a table of contents in your article can help Google understand your content better and, therefore, improve your rankings, too. Imagine you want to rank for "Star Wars." First, define a title for your article that includes the words "Star Wars." An example is "Star Wars: everything you wanted to know about it." Create a table of contents targeting different sections (which will also be your long tail for SEO):
<h1>Star Wars Movies</h1>
<a href="#thephantommenace">The phantom menace</a>
<a href="#theclonewars">The Clone Wars</a>

Identify the different sections in the content with this markup:

<a id="thephantommenace">The Phantom Menace</a>

Google interprets these sections and gives them weight when showing results related to the content on your page, especially the titles of the sections. Pay attention to how Wikipedia plays with these structures by performing some searches:
https://www.google.com/search?q=starwars+setting&ie=utf-8&oe=utf-8&hl=en
Image 7 - Example wikipedia section result
https://www.google.com/search?num=30&hl=en&q=starwars+movies
Image 8 - Example wikipedia section result
Structuring your articles in sections will help users to understand your content, increase the quality rating of your article, and help you with long tail SEO, which are all very interesting advantages. You can learn more about how to create a table of contents in HTML here: https://css-tricks.com/automatic-table-of-contents/
Robots.txt
The /robots.txt file tells robots (and search engines) which pages of your website they can't access. It is particularly useful if you want to prohibit certain pages from being accessed by Google and other search engines for various reasons, such as a site under construction that should not yet be visible to search engines. For example, this robots.txt removes access for all robots to three specific folders and prohibits the 'BadBot' robot from accessing the 'private' folder:

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /tmp/

User-agent: BadBot # substitute 'BadBot' with the bot's name
Disallow: /private/

Robots.txt was the first form of control to limit the access of search engines and other robots to websites or specific sections of websites, and it has become the de facto standard for this kind of control. Search engines usually accept these directives, but not all of them do, and not completely, so treat the file as a "recommendation." Your web server must return these lines (in a specific syntax) as plain text in response to a request for the URL /robots.txt.
https://en.wikipedia.org/wiki/Robots_exclusion_standard
https://www.google.com/webmasters/tools/robots-testing-tool
Setting up robots.txt incorrectly can be dangerous, as it may even cause your website to completely disappear from the search engines. Also, disallowing Google from crawling your images folder could lead to a serious drop in your rankings, as image analysis is currently an important part of SEO. Search Engine Watch has published an article explaining how blocking scripts, CSS, and images can affect your rankings.
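Because a single wrong Disallow line can do a lot of damage, it may be worth adding a small sanity check to the automated tests described earlier. The following is only a sketch, assuming curl is installed and that www.mydomain.com is a placeholder; it warns if the live robots.txt blocks the whole site or folders that typically hold CSS, scripts, or images.

#!/bin/sh
# QUICK SANITY CHECK OF THE LIVE robots.txt. THE DOMAIN IS A PLACEHOLDER.
DOMAIN="http://www.mydomain.com"

ROBOTS=`curl -s "$DOMAIN/robots.txt"`

# WARN IF THE WHOLE SITE IS BLOCKED ("Disallow: /" WITH NOTHING AFTER THE SLASH)
echo "$ROBOTS" | grep -iE '^Disallow: */ *$' >/dev/null && echo "WARNING: robots.txt blocks the entire site"

# WARN IF FOLDERS THAT USUALLY HOLD CSS, SCRIPTS OR IMAGES ARE BLOCKED
echo "$ROBOTS" | grep -iE '^Disallow: */(css|js|scripts|images|img|assets)' >/dev/null && echo "WARNING: robots.txt blocks CSS, JavaScript or image folders"

Adjust the folder names to match your own site structure; the point is only to catch accidental blocks before Google does.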
Adaptability for mobile
One of the most recent Google policies, when it comes to ranking your page correctly, concerns the adaptability of your website to mobile devices. Due to the increasing traffic from these devices, lacking a page designed for screens of this size means losing a share of this growing market. One of the first actions to take is to adapt the display of your page to the mobile interface, making the menus accessible with buttons large enough to navigate with a simple touch. Fortunately, you can count on various tools that will detect and indicate potential errors in the design of your site for mobile devices, such as PageSpeed Insights or the Mobile-Friendly Test Tool (https://www.google.com/webmasters/tools/mobile-friendly/), part of Google Webmaster Tools. As a general rule, to adapt the design of your page you should take into consideration the size of the font you choose, ensuring that it isn't too small to read, and make sure that links are easy to tap. When designing the content of your page, as well as the overall design of the interface, many developers opt to create a specific subdomain for the mobile version, usually adding an "m." before the name of the main domain. This helps to have more control over mobile traffic. However, in order to avoid penalties for duplicate content on the mobile and original versions of the website, it is essential to use the canonical tag, as described in the corresponding section. Adaptability for mobile devices is not just about making the text fit the screen size, but also about ensuring that the content is visible and accessible from any device. Googlebot, the robot used by Google for crawling, should be able to access the CSS stylesheets and JavaScript code. You can emulate the rendering of these elements on a mobile operating system and check whether everything is fully compatible. If these items do not render properly, your site's ranking in search engines may suffer. So try to think of all kinds of users when designing your website; by using an adaptable design, you can be sure to leave no potential user out. When planning your web design for mobile devices you need to optimize performance, especially the loading speed. This is essential to make users feel comfortable navigating the page, thus avoiding immediate bounces and improving your traffic metrics. To make development easier, the major browsers currently include mobile device emulation. This function allows you to browse the mobile version of the page from your computer as if you were using a smartphone or tablet. To emulate a mobile device in Firefox, press Ctrl + Shift + I and click on Responsive Design Mode.
To emulate it in Google Chrome, press F12 and click on Toggle Device Mode.
Relevant Meta Tags
Meta tags are labels that are part of the source code of your website. They describe its content and the way it is classified, and they are an important help for search engines to correctly understand your page. These HTML tags include relevant information such as the author of the text, the description of the page, and the title, among other features. On some platforms, like WordPress, there are plugins that automatically include these tags in your HTML. The following is a series of meta tags that are important for the proper positioning of your page.
Rel=author
<a href="https://plus.google.com/â&#x20AC;Ś.?rel=author">Author name</a>
This link in our article identifies the original author. In order to make it work properly, we edit our Google+ profile and indicate that we are authors of the domain on which the content appears. Although in previous versions of Google the author appeared directly in the search results, this feature has been removed. However, despite not appearing as user information, Author Rank is still useful for your articles. The use of this tag is important when indexing your articles: the authority that your writers have acquired with their work can help new content rank higher.
Rel=prev, rel=next
These HTML meta tags are used to link the different pages of the same article together. When you index an article that spans several pages, you need a label to relate the different parts, so that the search engine, when indexing the content, takes into account that it is one entry with multiple related pages. This meta tag should be implemented on all of the pages that belong to the same entry. Suppose we have an article divided into three pages as follows:
http://www.maindomain.com/article?type=history&page=1
http://www.maindomain.com/article?type=history&page=2
http://www.maindomain.com/article?type=history&page=3
Inside the main <head> tag, we'll have the following on the first page:
<link rel="next" href=" http://www.maindomain.com/article?type=history&page=2" />
On this first page there is no rel="prev" tag, since there is no previous page. On the second page (http://www.maindomain.com/article?type=history&page=2) we would find the following structure:
<link rel="prev" href=" http://www.maindomain.com/article?type=history&page=1" /> <link rel="next" href=" http://www.maindomain.com/article?type=history&page=3" />
On the last page, the meta tag would read as follows:
<link rel="prev" href=" http://www.maindomain.com/article?type=history&page=2" />
Thus, all the pages of the same article are linked to each other, ensuring the search engine interprets them as parts of a single entry on your page.
Canonical
The canonical meta tag indicates that the page you are currently on might be duplicate content, or have content similar to another page that is already hosted on your domain and can be accessed from another link. When you include this meta tag in the <head> of the various URLs, these versions are linked, avoiding duplicate content penalties. The tag has various uses, especially if you have different versions of the same page that differ in appearance but have the same content. If you set up your page to look different without loading the stylesheet (for slower connections or mobile versions) and without redirecting the URL, you'll need to use this meta tag. To configure it, include the original URL you want to be treated as canonical in the <head> section of each duplicate page, as follows:
<link rel="canonical" href="http://www.maindomain.com/original-text"/>
For example, if we have the page http://www.maindomain.com/sneakers-sport, and we want to tell Google that its content is a copy of http://www.maindomain.com/Sneakers-Sport (the same URL with different capitalization), then we can include this tag in the HEAD section of the first page:
<link rel="canonical" href=" http://www.maindomain.com/Sneakers-Sport "/>
Canonical pages should be kept inside the same subdomain.
Rel=nofollow
The "nofollow" attribute identifies links that the webmaster does not want Google to follow. Every page indexed on Google has a PageRank and an Authority Rank, and every link you add to your page passes on part of this authority. If you link to a good page because it adds relevant information to your article, this link will benefit both the linked page and your page. However, if you link to a page with bad content, material unrelated to your content, or inbound links from bad sites, this will harm your page, too. In some cases you cannot review every link, such as links in user comments. To avoid being penalized if a user adds a link to a non-reputable site, you should include rel="nofollow" in the link tag. If you link to a page without using this attribute you are expressly stating that you have verified the contents of that site and trust it, which means taking responsibility for it in the eyes of Google. If you take a look at Wikipedia's external links, you will always find "nofollow". It is advisable never to use "nofollow" on a link to your own internal content, because it does not offer any advantages. The rel="nofollow" attribute is used as follows:
<a href="http://page.com" rel="nofollow">Page</a>
Tags for languages
By including a meta tag identifying the language of the different versions of your website, you unequivocally indicate which language the content of your site is written in. Search engines like Google can then immediately identify the language and position the domain in its proper location and region. For example, to indicate that the content is written in Spanish (Spain), add the following line to the code of your website:
<meta http-equiv="content-language" content="es-ES" />
Additionally, if you have the same content in multiple languages, you can use "alternate" link tags within the <head> section of your HTML to identify the URLs where each language version lives, as in the following example:
<link rel="alternate" hreflang="fr" href="http://www.mydomain.com/fr/help.html" /> <link rel="alternate" hreflang="it" href="http://www.mydomain.com/it/help.html" />
The hreflang attribute has been associated with higher rankings, so we recommend that you use it - it's helpful.
Tags for mobile
Since the mobile market is growing quickly, it is very important that your website displays successfully on any kind of mobile device. Before indexing your pages for mobile devices, Google will check that your website is fully compatible. You can achieve this using one of two approaches:
● Adaptive design: you detect your visitor's device and generate a page that matches it.
● Responsive design: a single page can be displayed on every kind of device thanks to a responsive CSS file that specifies how to display it in every situation.
It is increasingly common to request “responsive” designs as this allows you to have one page displayed correctly in any format. When you hire a designer, make sure they have experience in “responsive” designs. If you prefer to have different pages for different devices, you can use meta tags to specify which page to show in the search engine results for every kind of device:
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://..." type="text/html" /> <link rel="alternate" media="handheld" href="http://..." type="text/html" /> <meta name="mobile-agent" content="format=html5;URL=..." /> <meta name="mobile-agent" content="format=xhtml;URL=..." />
See these tags on Google (https://developers.google.com/webmasters/mobile-sites/) as well as specifications for responsive designs, for a greater understanding of how to use them. If your website was developed using WordPress or another blogging platform, make sure the template you use is responsive.
The PageSpeed Insights tool has been created to check whether your site is well suited to mobile so that you can optimize your ranking in Google to improve compatibility with these devices.
Robots=noindex, follow
The "robots" meta tag allows different combinations depending on how you want Google to treat your content:
● <META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW"> - You don't trust the quality of your content, but you trust the quality of your links.
● <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"> - You don't trust the links on the page either.
● <META NAME="ROBOTS" CONTENT="INDEX, FOLLOW"> - You trust your page completely.
● <META NAME="ROBOTS" CONTENT="INDEX, NOFOLLOW"> - You trust the quality of your page, but not the quality of the links.
Noindex tells Google not to index the page's content in the result pages; therefore, Google won't try to rate its quality. If you have pages with bad content, this may affect the total quality rating of your site, so adding "noindex" to those pages will help you rank other pages better. Bad content can be content copied from other sites, content that is not intended to be found on Google, or very short articles. Nofollow tells Google that you don't trust the links, so it won't try to penalize you if you sell those links.
Description
The description meta tag will be shown in the Google results page. A good description tag will increase your click-through rate and therefore improve your ranking. This tag should describe the objective of the article in no more than 155 characters so that the entire message is displayed in the SERPs (Search Engine Results Pages). Although this meta tag is not required, it is advisable to use it in order to improve the click-through rate. Be sure that this description is not duplicated on your site (a rough command-line check is sketched after the example below); duplicate description tags and click-through rates can be found in your Webmaster Tools console. To customize the description of your page, include the following tag in the head section:
<meta name="description" content="Description of the page you want to show">
Title
The title tag sets the title of your page in the search results, so you can customize the way it is shown in the search engines. As with the description, it is important that the content is not duplicated. The length should not exceed 70 characters, or it won't be displayed completely in the results. To set the title, include this line in the head section of the page:
<meta name="Title" content="Page’s title">
It is recommended that the keyword you want to rank for appears as early as possible in the title.
Keywords
The "keywords" tag was originally designed to tell Google and other search engines which keywords to consider for indexing. However, following the latest updates, Google has revealed that this tag does not affect search results or rankings, and if it does, it has a negative effect. The reason is that Google may interpret it as over-optimization of the website in order to obtain a position that does not correspond to your content and site work. In the absence of this tag, the keywords of your content are determined by the density of those words, which should not exceed five percent. Remember that Google has an advanced system of semantic interpretation, so it can accurately identify the meaning of your page. The keyword tag format is as follows, in case you decide to use it:
<meta name="Keywords" content="keyword1, keyword2, keyword3">
We recommend removing this tag in all circumstances.
Sitemap
Creating a map of your site is essential to inform Google of all the pages your website contains, thus helping search engines and bots to visit your domain. By including a sitemap of your domain, you can improve positioning in search engines, ensuring that every page of your domain can be found in this map. These maps are commonly written as an XML file, but everything depends on how you have built and designed your domain. When creating the sitemap, check Google Webmaster Tools to catch any possible failure or item not being included. As a recommendation, when generating your map, do not exceed 50MB in size, nor include more than 50,000 URLs; divide the map into several files if necessary to meet these conditions. Each sitemap can declare additional namespaces for the kinds of entries it contains:
● General URL set: xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
● Image: xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
● Video: xmlns:video="http://www.google.com/schemas/sitemap-video/1.1"
● Mobile: xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0"
● News: xmlns:news="http://www.google.com/schemas/sitemap-news/0.9"
When including your sitemap in your domain, several tags are mandatory, while others are optional:
● <urlset> (mandatory): the tag that wraps all the information on the URLs in the sitemap.
● <url> (mandatory): information about a specific URL.
● <loc> (mandatory): the URL of the page; images and videos are referenced inside the <url> entry with their own tags.
● <lastmod> (optional): indicates the last modification of the specified URL; it is formatted as YYYY-MM-DDThh:mmTZD, with the time part optional.
● <changefreq> (optional): indicates how frequently the page changes and may take the values always, hourly, daily, weekly, monthly, yearly or never.
● <priority> (optional): indicates the priority of the URL, where 1.0 is the maximum and 0.0 the minimum (the default is 0.5).
This is an example:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.imagedomain.com/foo.html</loc>
    <image:image>
      <image:loc>http://imagedomain.com/image.jpg</image:loc>
    </image:image>
    <video:video>
      <video:content_loc>http://www.videodomain.com/video123.flv</video:content_loc>
      <video:player_loc allow_embed="yes" autoplay="ap=1">
        http://www.videodomain.com/videoplayer.swf?video=123
      </video:player_loc>
      <video:thumbnail_loc>http://www.imagedomain.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Videogames video</video:title>
      <video:description>Discover the best games up to date.</video:description>
    </video:video>
  </url>
</urlset>
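Since the 50,000-URL and 50MB limits mentioned above are easy to exceed on large sites, a small check can be added to your routine tests. This is only a sketch: it assumes the sitemap is available as a local file called sitemap.xml (a placeholder name) and that each <loc> entry sits on its own line.

#!/bin/sh
# CHECK A LOCAL SITEMAP AGAINST THE LIMITS MENTIONED ABOVE:
# NO MORE THAN 50,000 URLs AND NO MORE THAN 50MB.
SITEMAP="sitemap.xml"   # PLACEHOLDER FILE NAME

# COUNT LINES CONTAINING A <loc> TAG (ASSUMES ONE PER LINE) AND THE FILE SIZE IN BYTES
URLS=`grep -c "<loc>" "$SITEMAP"`
BYTES=`wc -c < "$SITEMAP"`

echo "$SITEMAP contains $URLS URLs and weighs $BYTES bytes"

if [ "$URLS" -gt 50000 ]; then
    echo "WARNING: more than 50,000 URLs, split the sitemap"
fi
# 50MB = 52428800 BYTES
if [ "$BYTES" -gt 52428800 ]; then
    echo "WARNING: larger than 50MB, split the sitemap"
fi

If either warning appears, split the file into several sitemaps and reference the parts from a sitemap index file.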
Rich Snippets
Rich Snippets are tags within the code that specify what type of content you are showing to the user. These labels are intended to simplify Google's interpretation of the data on your page. We can write the same piece of information in countless ways:
● I am 35 years old and an Aquarius.
● I was born in early 1980.
● Date of birth: 02/10/1980.
● Date of birth: 10/02/1980.
● Born: 10th February 1980.
There are so many ways of saying the same thing that search engines have created a standard called 'Rich Snippets' to better understand the data shown to the user. This standard allows information to be defined with codes that make it more easily identifiable to a computer program. You can inspect the snippets of any page with the Google Testing Tool: https://developers.google.com/structured-data/testing-tool/ Here you will find all the information about Rich Snippets: https://developers.google.com/structured-data/ There is currently a (Rich Snippets) structured format for the following data types:
● Products (https://developers.google.com/structured-data/rich-snippets/products).
● Recipes (https://developers.google.com/structured-data/rich-snippets/recipes).
● Reviews (https://developers.google.com/structured-data/rich-snippets/reviews).
● Events (https://developers.google.com/structured-data/rich-snippets/events).
● Software applications (https://developers.google.com/structured-data/rich-snippets/sw-app).
● Videos (https://developers.google.com/structured-data/rich-snippets/videos).
● Articles (https://developers.google.com/structured-data/rich-snippets/articles).
Check the websites of your competitors using the Google Testing Tool to see how they use Rich Snippets. Here's an example of how the internal code (HTML) of a page would look using Rich Snippets:

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="brand">ACME</span>
  <span itemprop="name">Executive Anvil</span>
  <img itemprop="image" src="anvil_executive.jpg" alt="Executive Anvil logo" />
  <span itemprop="description">Sleeker than ACME's Classic Anvil, the Executive Anvil is perfect for the business traveler looking for something to drop from a height.</span>
  Product #: <span itemprop="mpn">925872</span>
  <span itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.4</span> stars, based on
    <span itemprop="reviewCount">89</span> reviews
  </span>
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    Regular price: $179.99
    <meta itemprop="priceCurrency" content="USD" />
    $<span itemprop="price">119.99</span>
    (Sale ends <time itemprop="priceValidUntil" datetime="2020-11-05">5 November!</time>)
    Available from:
    <span itemprop="seller" itemscope itemtype="http://schema.org/Organization">
      <span itemprop="name">Executive Objects</span>
    </span>
    Condition: <link itemprop="itemCondition" href="http://schema.org/UsedCondition"/>Previously owned, in excellent condition
    <link itemprop="availability" href="http://schema.org/InStock"/>In stock! Order now!
  </span>
</div>
Notice how the span tags carry the attributes "itemprop", "itemscope," and "itemtype". These attributes are not rendered, but they help Google better understand what the page is showing.
Name of the website
Indicate the name of your website using these tags on the main page of your site:
<head itemscope itemtype="http://schema.org/WebSite">
<title itemprop='name'>Your Website Name</title>
<link rel="canonical" href="https://example.com/" itemprop="URL">
The requirements for the name of the site are:
● Use a natural name; it is better to use "My company" rather than "My company inc."
● It must be reasonably similar to the domain name.
Check that you set it up correctly with the Google structured data Testing Tool: https://developers.google.com/structured-data/testing-tool/
Navigation bar - Breadcrumbs
It is important that you organize your website into categories and explain to Google how it is categorized. To do this most websites use breadcrumbs, or a "navigation bar": links, usually found at the top of the page, that indicate the category you are in and what the higher categories are. Here is an example of how to indicate the hierarchy of your site using HTML:

<div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://www.example.com/windows-apps" itemprop="url">
    <span itemprop="title">Windows Apps</span>
  </a> ›
  <div itemprop="child" itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
    <a href="http://www.example.com/windows-apps/browsers-plugins" itemprop="url">
      <span itemprop="title">Browsers and plugins</span>
    </a> ›
    <div itemprop="child" itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
      <a href="http://www.example.com/windows-apps/browsers-plugins/browsers" itemprop="url">
        <span itemprop="title">Browsers</span>
      </a>
    </div>
  </div>
</div>
Check your breadcrumb effectiveness with the Google Testing Tool: https://developers.google.com/structured-data/testing-tool/
Google appreciates that your page has a navigation bar because it helps the user to navigate the site intuitively.
Rich snippets for videos
When you include videos on your pages, be sure to include the correct HTML markup (https://developers.google.com/structured-data/rich-snippets/videos) so that Google can understand it:
<div itemscope itemtype="http://schema.org/VideoObject">
  <span itemprop="name">Title of video</span>
  <span itemprop="description">Video description</span>
  <img itemprop="thumbnailUrl" src="thumbnail1.jpg" alt="thumbnail text"/>
  <meta itemprop="uploadDate" content="2015-02-05T08:00:00+08:00"/>
  <meta itemprop="duration" content="PT1M33S" />
  <link itemprop="contentUrl" href="http://www.example.com/video123.flv" />
  <link itemprop="embedUrl" href="http://www.example.com/videoplayer.swf?video=123" />
  <meta itemprop="interactionCount" content="2347" />
</div>
Sistrix is an online tool that automatically generates the microformat markup for YouTube videos (by entering the URL): https://www.sistrix.com/video-seo/ Here is an example of a micro snippet created with this tool:

<div itemprop="video" itemscope itemtype="http://schema.org/VideoObject">
  <h2><span itemprop="name">https://youtube.com/devicesupport</span></h2>
  <meta itemprop="duration" content="T03M56S" />
  <meta itemprop="thumbnailURL" content="http://i.ytimg.com/vi/UKY3scPIMd8/hqdefault.jpg" />
  <meta itemprop="embedURL" content="https://youtube.googleapis.com/v/UKY3scPIMd8" />
  <div id="schema-videoobject">
    <object width="640" height="380" classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000"
      codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6">
      <param name="src" value="https://youtube.googleapis.com/v/UKY3scPIMd8" />
      <embed width="640" height="380" type="application/x-shockwave-flash" src="https://youtube.googleapis.com/v/UKY3scPIMd8" />
    </object>
  </div>
  <span itemprop="description">https://youtube.com/devicesupport
  http://m.youtube.com</span>
</div>
Tools for content creation
When creating unique content for your project it is essential to work with the best tools to get the SEO results you want. Sometimes being an expert in a particular subject is not enough to create quality content; you also need to learn from the experience of other authors who are already well positioned in the major search engines. To create content that ranks your site properly, you need a series of tools that offer diverse functions. The following list covers the tools best suited to producing that content.
Seologies (www.seologies.com)
This online tool has been designed to reveal the proof and relevant terms associated with a specific keyword. In other words, Seologies gives you the expert vocabulary that should appear in your content so that it is positioned in a specialized and relevant way. The latest updates to Google's algorithms focus on improving the search engine's interpretation of content, and these improvements are achieved using semantic analysis. Seologies helps content creators discover what to add to their texts, making their work simpler and more dynamic. The terms discovery tool covers two kinds of terms:
● Proof terms: terms that give credibility to a text and therefore should be added to your text to make search engines trust your content.
● Relevant terms: terms that specialize a subtopic and make your content seem unique and elaborate.
Finding ideas for a new article
If you do not know how to approach a topic, turn to Seologies for the inspiration you need. Seologies gathers information from different sources in a single tool. It also allows you to see how the original source uses this vocabulary on the web.
Verify your content's quality
Seologies includes an additional tool to measure the quality of your content given a search term. Just paste your main keyword and content, and you will get a rating and some recommendations about how to improve it.
This measurement is based mainly on how many relevant and proof terms you are using in your document.
CopyScape
Before uploading content to your website it is important to ensure that what you have written will not be considered plagiarism. Copying content from other sources carries a serious penalty, so it is advisable to do a pre-check before you publish your page. Copyscape is an online tool that detects any hint of plagiarism and identifies the source; by changing the indicated sections, Copyscape helps you avoid the penalty for plagiarized content. Once you complete the tool registration and pay the user fee, you can paste the text that you have written, and the tool tells you on which pages identical fragments can be found. If you haven't written the content yourself, it is very important to use this tool even if you have full confidence in your editor. Conversely, if you have written the content, it is still important to use the tool, because you sometimes unwittingly take too much inspiration from third-party texts.
Readability Score
This website will help you optimize the readability of your documents by helping you understand the different factors that affect it: https://readability-score.com/
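As an illustration of one factor such tools measure, here is a rough sketch of the well-known Flesch Reading Ease formula with a deliberately naive syllable counter. This is our own example, not the scoring used by readability-score.com.

import re

def count_syllables(word):
    # Approximate syllables as groups of consecutive vowels (very rough)
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n_words = max(1, len(words))
    # Flesch Reading Ease: higher scores mean easier-to-read text
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

print(round(flesch_reading_ease("Short sentences help. Simple words help too."), 1))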
Tools for finding keywords
Even if you have keywords in mind before you start creating your content, we recommend a prior check to see whether they are suitable for your page. In the following section, we explain the essential tools for choosing keywords according to their relevance and their use by your competitors.
SECockpit
This is an analytical tool that generates keyword reports tailored to a wide range of needs. SECockpit was designed especially for professionals who work in SEM and SEO.
Google Trends
This resource shows keywords according to their popularity over time. The article's author can check the search history and the current trend of the topics they want to add to their page. When you combine this tool with AdWords, you can clearly see how useful the keywords you are considering for your content really are.
Using Google Trends, you can search for as many terms as you need and compare their activity on the chart. For more realistic results, you can filter the terms by category to make a more specific comparison. Since you have to consider your target audience, indicating the region will show you the current results in your geographical area, so you can judge whether it is worthwhile to create the content you have in mind.
Google Trends displays news related to the searched keyword to explain movements in the trend. With the timeline included in the graph, you can check the relative effect of traffic or news events on keyword popularity. The tool also forecasts the future trend of a term based on the previously collected data. This also helps you identify "seasonal" terms, whose popularity depends on the time of year, which explains the behavior of terms such as "swimwear" that are much more popular in summer.
Together with Google AdWords for the selection of appropriate keywords, Google Trends lets you decide which terms are suitable given their progression over time. In turn, using all the information that Google collects through its service, the tool identifies the regions where a term is used the most. This is useful if you are targeting a particular geographical area, and it allows you to better target campaigns to specific locations, especially if you intend to sell products. With Google Trends, you can also check the popularity of your trademark relative to the competition, so you can track the progress and trending of the terms you work with.
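If you prefer to pull this kind of data programmatically, here is a minimal sketch using the unofficial, third-party pytrends package. This is an assumption on our part: the package is not mentioned in the book, it is not an official Google API, and its interface may change.

from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(kw_list=["swimwear", "coat"], timeframe="today 5-y", geo="US")

# Seasonal terms such as "swimwear" show clear yearly peaks in this series
interest = pytrends.interest_over_time()
print(interest.tail())

# Which regions search for the term the most
print(pytrends.interest_by_region().sort_values("swimwear", ascending=False).head())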
AdWords AdWords is a “pay per click” Google tool that allows you to advertise on its search engine and only pay for each click you receive. This tool is especially useful for SEO experts as it allows you to advertise your site on the search engine, and it offers you access to privileged information, like: ● Finding out the search volume for a particular keyword ● Finding keywords and related searches ● Knowing the rate per click for different terms
Google AdWords is essential for creating an effective strategy when selecting keywords in which you want to specialize. AdWords offers you the possibility to check the history of use of these keywords.
Other tools
Google search parameters
When you do a search in order to check your current ranking, consider adding these useful parameters to the URL:
pws=0
When you do a search on Google, it automatically adapts the results to your preferences by studying exactly which websites you usually click on. In short, you naturally start to see your own website at the top of the Google results. The way to block this personalization and see the actual Google results is to add &pws=0 at the end of the Google URL: https://www.google.com/?q=iphone+prices&pws=0
/ncr
When you enter google.com in the address bar, Google automatically redirects you to the domain of the country where you are. To avoid this redirection, add /ncr to the domain name: https://www.google.com/ncr
gl=[country code]
If you want to see the Google results for a particular country, without being influenced by the country in which you are doing the search, add &gl=[COUNTRY CODE] to the search URL. For example: http://www.google.com/?hl=en&pws=0&gl=US&q=Forex Country codes follow the ISO 3166-1 standard.
hl=[language code]
To choose the language in which you want to see the results and the search engine interface, add &hl=xx to the search URL, where "xx" is the language code. For example: http://www.google.com/?hl=es&pws=0&q=FOREX
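As a convenience, here is a small sketch that assembles a depersonalized search URL with these parameters. It is our own helper, not an official Google interface; the example values are illustrative.

from urllib.parse import urlencode

def google_search_url(query, country="US", language="en"):
    params = {
        "q": query,
        "pws": 0,          # disable personalized results
        "gl": country,     # ISO 3166-1 country code for the results
        "hl": language,    # interface/results language code
    }
    return "https://www.google.com/search?" + urlencode(params)

print(google_search_url("forex", country="US", language="en"))
# https://www.google.com/search?q=forex&pws=0&gl=US&hl=en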
Google Webmaster Tools
Google Webmaster Tools (WMT) is a set of tools that every site manager should know how to use to optimize the operation of their website. WMT can run a full audit on the status of the entire website, identifying potential flaws, whether design flaws or flaws of other kinds. Thus, it is possible to control every aspect of your site's positioning within the Google search engine.
Webmaster Tools also lets you check whether your sitemap has a valid structure, and you can check and produce a robots.txt file to handle the indexing of your pages. With this tool you can see how often the Googlebot visits your site and the indexing frequency, allowing you time to adjust. It also analyzes the traffic coming to your site from different points of view, providing relevant information to enhance the content you add. It analyzes the keywords used by visitors arriving from the search engine so you can strengthen those you deem necessary and take advantage of their effectiveness.
Using this analytical tool, Google delivers a report that shows how your site is indexed in the search engine and which errors you can fix on your site to improve SEO. When you add more content to your website, it is quite possible that several links to old entries no longer work, and searching for them one by one is time consuming. Webmaster Tools will scan all the links on your pages to verify that they still work and add value to the website.
WMT will be one of your main sources of information about the status of your domain, so it is imperative that you have a good handle on it. Here are some articles with helpful ideas for optimizing your SEO with Webmaster Tools:
● http://www.entrepreneur.com/article/236366
● http://searchenginewatch.com/sew/how-to/2273660/how-to-use-google-webmaster-tools-to-maximize-your-seo-campaign
● http://www.seochat.com/c/a/google-optimization-help/how-to-improve-your-click-through-rate-in-google-organic-search-results/ (improving click-through)
● http://www.searchenginenow.com/how-to/fix-duplicate-title-tags-google-webmaster-tools-paginated-posts/ (duplicated title tags)
Screaming Frog
The free version of this tool can crawl the links of the websites you specify. This application provides useful information for reviewing the tags that matter for SEO within your site. By setting the Screaming Frog options (http://www.screamingfrog.co.uk/) you can check the status of links, images, JavaScript, and CSS, or follow external and internal links with the "nofollow" attribute, among other functions. The application also displays information detected on the pages of your website, such as meta keywords, outgoing and internal inbound links, and the content of every page of your site, including H1 and H2 titles, in its analysis.
Screaming Frog is a useful application for the SEO analysis of your website, especially if you have to fix nomenclature errors in your URLs, such as underscores or non-ASCII characters. Its most immediate function is to verify that all links in the domain work correctly. Analyzing your site, you can quickly detect broken links (404), 301 and 302 redirects, and 5xx server errors. It is important to analyze and fix these links, as they have a strong bearing on the ranking of your page.
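To show the kind of audit this automates, here is a toy sketch that checks the status of every link found on a single page. It is not Screaming Frog, only an illustration; it assumes the third-party requests and beautifulsoup4 packages are installed, and the example URL is hypothetical.

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def audit_links(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(page_url, a["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, in-page anchors, etc.
        try:
            status = requests.head(link, allow_redirects=False, timeout=10).status_code
        except requests.RequestException:
            status = "error"
        # 404s, 301/302 redirects and 5xx responses are the ones worth fixing
        print(status, link)

audit_links("http://www.example.com/")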
XML-Sitemaps
For users who do not want the complication of creating a map of their website by hand, there are tools that make this task much simpler. XML-Sitemaps (https://www.xml-sitemaps.com/) generates a sitemap from the attributes you enter in the tool, such as the change frequency or the last time the domain was modified. You only have to enter the address of your primary domain and the tool will analyze all internal links, creating an XML file that you can attach to your domain using Google Webmaster Tools. This tool is free if you use it for fewer than 500 pages.
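For reference, here is a minimal sketch of the kind of XML a sitemap generator produces, built by hand for a couple of hypothetical URLs. It is an illustration of the sitemap format, not the XML-Sitemaps service itself.

import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    # pages is a list of (loc, lastmod, changefreq) tuples
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    ("http://www.mydomain.com/", "2016-01-15", "weekly"),
    ("http://www.mydomain.com/blog/", "2016-01-20", "daily"),
]))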
MailChimp
When creating your mailing list, you need a tool that covers all the possibilities offered by email marketing. MailChimp (http://mailchimp.com/) is currently one of the most popular email marketing services, thanks to its ease of use and its free basic version, which provides all the functions necessary to run a campaign via email. Once you register with the MailChimp service you will be able to create email marketing campaigns. With the free version you have the option of sending emails to up to 2,000 users in a single action. The life cycle of an email is longer than interactions on social networks, since the latter depend on context and immediacy, while you can always read your emails again by simply opening your inbox.
When you want to design a campaign, MailChimp offers different types of templates to facilitate your work. You can create your email in plain text, use a predesigned template, or build it manually with a simple drag & drop editor that lets you customize every element of the message. MailChimp supports various forms of campaign, from the traditional newsletter to unformatted text, and you can also paste in HTML from an external editor, which makes it easier to reuse existing content.
One of the most interesting capabilities is the A/B split model. You create two different designs and the service sends each option to a 10 percent segment of your user base to collect data about which of the two is more effective. When the experiment is finished, MailChimp sends the winning option to the remaining 80 percent of users.
To create an email campaign, it is important to identify the target, focus the content, and pick an important issue to attract attention. The first line the user reads has to lure them into opening the message, so designing the subject line properly may be the difference between the success and failure of a campaign. The body of the message should contain the essential information and encourage the user to visit your page, so it must be a simple and attractive offer for conversion into visits. If your users respond directly to the email, set up an automatic response providing the essential information directly.
When creating the content for your campaign, make the emails as personalized as possible to avoid the impression of a generic email or spam. Create messages that encourage interaction, and above all check the spelling and the proper configuration of the campaign. The verification step is essential to avoid mistakes or a negative impression: it is the image of your business that is at stake.
MailChimp also offers an analytical report of the emails you have sent. These reports indicate the open rate of your messages, the visited links, and the number of clicks, as well as the users who drop out of your mailing list. By measuring the social behavior of your subscribers you can improve the service you give through this channel, which has proven to deliver a high conversion ratio, making campaigns effective.
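To make the A/B split mechanics concrete, here is a minimal sketch of the 10/10/80 division described above. It is our own illustration of the idea, not MailChimp's implementation, and the subscriber list is hypothetical.

import random

def ab_split(subscribers, test_fraction=0.10):
    # Shuffle a copy, carve out two equal test groups, keep the rest aside
    shuffled = subscribers[:]
    random.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    group_a = shuffled[:n_test]
    group_b = shuffled[n_test:2 * n_test]
    remainder = shuffled[2 * n_test:]  # receives the winning version later
    return group_a, group_b, remainder

a, b, rest = ab_split([f"user{i}@example.com" for i in range(2000)])
print(len(a), len(b), len(rest))  # 200 200 1600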
Online Advertising
There are many tools available for this purpose, for example Google AdSense, Yahoo!, and various affiliate programs. Thanks to the widespread availability of high-speed connections, advertising now uses media that consume more bandwidth, including banners, animated images, and even embedded videos, collectively known as rich media.
Google AdSense
AdSense is a Google service that pays websites for advertising placement. If your website has significant traffic, you can register with Google AdSense and insert advertising related to your website. Google AdSense will automatically detect the topic of your content and insert related advertising.
It is especially important to read the Terms of Service carefully and not engage in activities that Google AdSense does not allow, since Google is quite inflexible in this regard and can permanently exclude you from the service when it detects fraudulent use. The AdSense service examines your page to categorize it by content so that it can offer relevant advertising that matches the interests of your visitors. This results in a higher conversion of views into clicks, raising the value of the advertising space on your website. Therefore, the CTR (click-through rate) needs to be watched carefully, as the CTR history indicates the trend of the page and how much it can be improved to attract more sponsors.
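For reference, CTR is simply clicks divided by impressions. A tiny worked example with made-up numbers:

# Illustrative numbers only, not real AdSense data
impressions = 12000
clicks = 180
ctr = clicks / impressions
print(f"CTR = {ctr:.2%}")  # 1.50%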
Google PageSpeed Insights
https://developers.google.com/speed/pagespeed/insights
From PageSpeed Insights, you can check that your website is displayed correctly on different devices. This tool shows the degree of compatibility with mobile devices and computers. The audit recommends aspects that can be improved, particularly with respect to Google's ranking policies.
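If you want to run this check from a script, the service also exposes a public API. Here is a rough sketch; the endpoint version, parameters, and response shape are our assumptions and may differ over time, and heavier usage may require an API key.

import json
import urllib.parse
import urllib.request

# Assumed public endpoint (v5 at the time this sketch was written)
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_report(url, strategy="mobile"):
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{API}?{query}") as response:
        return json.load(response)

report = pagespeed_report("http://www.example.com/")
print(sorted(report.keys()))  # inspect the sections of the audit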
Healing a site affected by Panda
Since the Google Panda algorithm was launched, many web pages have been penalized because of their main domain structure and positioning strategy. The generic advice is to focus on making content that is impeccable in the eyes of Google instead of trying to put link-building strategies into practice. The Google Panda penalty applies to any site that Google determines does not have adequate quality, and a few low-quality pages can completely ruin the traffic to your entire website. If your website is affected by Panda and you have not solved the identified problems, your ranking will be radically affected. When you succeed in lifting a Panda penalization, it is typical for your ranking to improve beyond what it was before the penalty. To recover your website from a Google Panda penalty, perform the following steps:
1. Incorporate a Terms of Service section and a privacy policy.
2. Include a contact section with phone numbers, a visible address, and company information.
3. Use a domain that is more than one month old and has an expiration date at least five years in the future.
4. Register your site with Google Webmaster Tools and check the issues sections to fix any problem flagged by Google.
5. Check your website with Google PageSpeed Insights and make sure that it complies with all of its recommendations for desktop and mobile devices.
6. Make a list of the pages that attract the most SEO traffic and audit them with Seologies to make sure they meet the quality requirements (usually 80 percent of your traffic comes from fewer than 20 percent of your pages).
7. The remaining pages of your website can be de-indexed with the robots meta tag 'noindex, follow'.
8. Make sure the images on your site are unique and not copied from other websites.
9. Check your website with a tool like Xenu to make sure you don't have outbound links to nonexistent pages.
10. Create content with the user in mind to increase the time they spend on the page.
11. Add a navigation bar to your website where the user can find hierarchical access to related content.
12. Make sure the most important content of your page is visible without requiring the user to scroll (http://googlewebmastercentral.blogspot.com.es/2012/01/page-layout-algorithm-improvement.html).
13. Make sure none of your pages repeat the meta description or title (https://www.youtube.com/watch?v=W4gr88oHb-k).
14. Check the language meta tags if your site is available in several languages.
15. Make sure your page has a modern and attractive design.
16. Search your site on Google (site:www.mydomain.com) and make sure that every result page follows these rules (see the sketch after this list):
● There are no duplicated titles.
● There are no duplicated descriptions.
● The content of every page has a good rating according to Seologies, or you add the "noindex" tag to the pages that do not have a good rating and that you do not want to change.
● Check every result page with Copyscape to make sure your content is not copied from other sites.
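Here is a minimal sketch of the duplicate title and description check from step 16. The URL list is hypothetical, and this is our own illustration rather than an official Google or Seologies tool; it assumes the third-party requests and beautifulsoup4 packages are installed.

from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def find_duplicates(urls):
    titles, descriptions = defaultdict(list), defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        if soup.title and soup.title.string:
            titles[soup.title.string.strip()].append(url)
        meta = soup.find("meta", attrs={"name": "description"})
        if meta and meta.get("content"):
            descriptions[meta["content"].strip()].append(url)
    # Any title or description shared by more than one URL is a duplicate
    return (
        {t: u for t, u in titles.items() if len(u) > 1},
        {d: u for d, u in descriptions.items() if len(u) > 1},
    )

dup_titles, dup_descriptions = find_duplicates([
    "http://www.mydomain.com/",
    "http://www.mydomain.com/about",
])
print(dup_titles)
print(dup_descriptions)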
The main goal of the Google Panda algorithm is to penalize pages that do not provide useful content to users. Google is not against your website; it is against business models that do not add value to the user and create almost empty, worthless pages. It is important that you change your mindset when managing your website: always think of bringing value to the web instead of seeking immediate profits. The quality signals that Google looks for in your site are also signs of quality for users, so your website will naturally be more credible and generate higher sales if you adapt to Google's requirements.
Forbidden Techniques Google always looks for any pattern of behavior that seems suspicious in everything you publish on the web. Here are some examples of patterns of fraudulent behavior that can be penalized: ● Excessive repetition of keywords in the text, URLs, headers (h1, h2), and image alt text. ● Multiple inbound links with the same text. ● Inbound links always appear on specific dates or at specific times. ● Overly positive reviews. ● Excessive comments on Google+ by inactive or potentially false users. ● Comments using the same IP address. ● Forcing users to vote for you on Google+ to access certain content.
These are some, but not all, of the techniques you should avoid. In short, any unnatural activity that tries to improve the position of your webpage will eventually be found by Google and penalized. Increasingly, only content that is really useful to the end user will appear on the front page of results.
Epilogue
In this book we have explained everything we have learned in our years of investigation. Some of these techniques may seem silly or extravagant, and yet, when you apply them, you will see extraordinary improvements in your SEO. We are sure that every simple guideline we have offered affects your ranking in some way. If you encountered anything in this book that seems deceptive to the user or bad for them, please discard it. It is possible we did not explain it correctly or you misunderstood the meaning of that section. We will never advise you to do anything against the user experience, because that will always be penalized in the long term. Do not try to find short-term rewards in SEO. Instead, plant the seeds of a good site focused on the user and let them grow until they become a jungle. Thank you for reading this book, and happy SEO.
The SeoWaz Team