SEO Optimization 2013 - Part 2


A few days ago we introduced some interesting questions about optimization, which you can read here. Today we develop further points of that discussion, outlining the next steps of SEO and revealing a few "tricks of the trade". We left off the long list of interventions after talking about the H1 tag.

Another factor never to be overlooked is content. Original, unique, rich: it is a fundamental signal of the quality of your site. Remember that pages containing more than 2,400 words are what Google considers best, but never go below 300 words if you want to stay safe; 400-600 words are enough for a more than acceptable page.

Webmaster Tools and Screaming Frog reveal valuable information about the state of your links. Never exceed the maximum limit of 100 links per page. As for anchor text, it must be said that you should never overdo "rich" anchors, meaning anchors made up of keywords. Every now and then a formula such as "click here" or "read more" does no harm, because it keeps you from stuffing keywords into the text, which would make it over-optimized in places and less user-friendly. Let's say that three internal links per page is the ideal amount.
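Just as a hedged illustration (the page, URLs and product names are invented), here is what a short block of text with three internal links and mixed anchor text might look like:

```html
<!-- Example page excerpt: three internal links, mixing
     keyword-rich anchors with an occasional generic one -->
<h1>Handmade Leather Bags</h1>
<p>
  Our <a href="/catalog/leather-travel-bags">leather travel bags</a>
  are cut and stitched in our own workshop.
  If you are curious about the process,
  <a href="/blog/how-we-tan-leather">read more</a> on the blog,
  or browse the full <a href="/catalog/">product catalog</a>.
</p>
```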

As for images, it is essential that each one carries an ALT attribute which describes it and which contains a relevant keyword. Do not let the file name be a number or a random string: it must be descriptive. The image dimensions should be set carefully, and an image can itself be used as a link, taking the place of an anchor text.
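A minimal HTML sketch of these image guidelines, with invented file names and URLs:

```html
<!-- Descriptive file name, ALT text containing a relevant keyword,
     explicit dimensions, and the image used as a link
     in place of an anchor text -->
<a href="/catalog/leather-travel-bags">
  <img src="/images/brown-leather-travel-bag.jpg"
       alt="Brown leather travel bag, handmade"
       width="640" height="480">
</a>
```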


A small aside on rel="nofollow" is now necessary. Google introduced this attribute to discourage passing value to pages it does not fully approve of (think of the links generated by comment spam on blogs, forums and so on). It is a kind of defense against low-quality outgoing links, which would eventually damage good sites (blogs, forums), since the traffic they generate ends up on sites that are not as good, indeed of poor quality.

The robots.txt file helps prevent pages from being indexed, as does the noindex tag. Pages with mediocre, duplicate or thin content may require interventions of this kind, to avoid the spider's censure and a penalty; it is a way of making them invisible to the crawler.

A sitemap, in HTML or XML and submitted through Webmaster Tools, will reveal the degree of indexation of every single page of your site, allowing you to make targeted, complementary interventions.

302 redirects, on the other hand, should be completely avoided, because they represent an SEO failure. A permanent 301 redirect can be used instead, but with wisdom. We cannot stress enough how dangerous duplicate content is. Content that is not original, but already present on other pages of the web, is a sign of poor quality: Google disapproves of it! If you notice duplicate content, act, depending on the case, like this: change the URL; rewrite the content as needed; set up a 301 redirect to a canonical page or site; or implement the rel="canonical" tag to identify the source page. The canonical page is, in fact, the preferred version of a series of similar pages, which is why it has to be clearly identified (a small markup sketch of these options appears below).

Watch out, too, for broken links. A tool like Xenu Link Sleuth can help you find and fix errors of this kind. Pay attention to your code as well: stick to the W3C standards, which Google favors.

Now let's talk about loading speed, which is essential for increasing the quality of the site. Pingdom's free speed test tool can help you understand how fast your site really is. Reducing image sizes, working on browser caching and using CSS sprites to combine images, where possible, are valuable actions in case of excessive delays. Merging your CSS and JavaScript files and compressing or minifying them is a further step (see the before/after snippet below).

The number of inbound links and their quality are two additional factors to consider. Open Site Explorer will give you an overview of this traffic. Since links pointing only to the home page will achieve little, watch out for your landing pages! Trusted domains are the most authoritative ones, from which to receive valid traffic, unlike weak or even banned domains. Furthermore, inbound links should never come from a single domain, but from a more or less long list of domains. Domain Authority is a metric that scores the authority of a domain from 0 to 100, and helps you choose the best sources of links for your site.
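To make the nofollow, noindex and canonical points above concrete, here is a hedged markup sketch (all URLs are invented): a duplicate or thin page can either declare its preferred version with rel="canonical" or be kept out of the index with a robots meta tag, while outbound links you do not want to endorse can carry rel="nofollow". In practice you would normally pick noindex or canonical depending on the case, not both at once.

```html
<!DOCTYPE html>
<html>
<head>
  <title>Leather travel bags - printer-friendly version</title>

  <!-- Option A: point the spider to the preferred version of this page -->
  <link rel="canonical" href="http://www.example.com/catalog/leather-travel-bags">

  <!-- Option B: keep a thin or duplicate page out of the index entirely
       (choose one approach or the other, not both) -->
  <meta name="robots" content="noindex, follow">
</head>
<body>
  <!-- An outbound link we do not want to endorse or pass value to -->
  <a href="http://spammy-directory.example.org" rel="nofollow">low-quality directory</a>
</body>
</html>
```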
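And, still as an invented illustration, a before/after view of the speed advice: several stylesheets and scripts collapsed into one merged, minified file each, so the browser makes fewer requests.

```html
<!-- Before: five separate requests for styles and scripts -->
<link rel="stylesheet" href="/css/reset.css">
<link rel="stylesheet" href="/css/layout.css">
<link rel="stylesheet" href="/css/widgets.css">
<script src="/js/menu.js"></script>
<script src="/js/slider.js"></script>

<!-- After: one merged and minified stylesheet, one merged and minified script -->
<link rel="stylesheet" href="/css/site.min.css">
<script src="/js/site.min.js"></script>
```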


Facebook, Twitter, Delicious, Pinterest and the other major social networks heavily influence rankings. Click here if you want to learn, for example, about the growing importance of Facebook. Do not forget to include share buttons, and content worthy of being shared: it is nothing short of essential. Finally, it will be useful not only to monitor your own site, but to compare it constantly with its competitors: it may be well structured and have everything in place, but if its competitors are more effective, it will be hard to stand out in this ranking war. That's all for today. Click here if you want to ask us some questions.
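As a hedged sketch of the share-button advice (the page URL is invented, and the endpoints shown are simply the plain share links commonly used for Twitter and Facebook; check each network's own documentation for its official widgets):

```html
<!-- Plain share links; the page URL being shared is invented -->
<a href="https://twitter.com/intent/tweet?url=http%3A%2F%2Fwww.example.com%2Fseo-2013-part-2"
   target="_blank">Share on Twitter</a>

<a href="https://www.facebook.com/sharer/sharer.php?u=http%3A%2F%2Fwww.example.com%2Fseo-2013-part-2"
   target="_blank">Share on Facebook</a>
```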

