Google Webmaster Help: List of Questions and Answers by Matt Cutts

List of Google Webmaster Help Questions and Answers by Matt Cutts, in point form!
On-site and off-site Search Engine Optimization (SEO) to improve site visibility.

How to Rank #1 on Google
What are effective techniques for building links?
Can I use Robots.txt to Optimize Google’s Crawl?
Is over optimization bad for a website?
Should I use underscores or hyphens in URLs?
More than one H1 on a page: good or bad?
Can dofollow comments on my blog affect its reputation?
Is it better to have keywords in the URL path or filename?
Should internal links use rel=”nofollow”?
Do multiple links from one page to another page count?
How can I identify causes of a PageRank drop?
What are the top things to do in SEO?
How important is the frequency of updates on a blog?
How do you protect your blog from hackers?
Should I tweak my titles and descriptions to improve my CTR?
Should large corporations use rel=canonical?
Does the ordering of heading tags matter?
Do site load times have an impact on Google rankings?
Can the geographic location of a web server affect SEO?
Is excessive whitespace in the HTML source bad?
What’s a preferred site structure?
Which is better: HTML Sitemap or XML Sitemap?
Are footer links treated differently than paragraph links?
How do meta geo tags influence the search results?
Will PageRank split for links with or without trailing slash?
How much content should be on a homepage?
Is the time left before your domain registration expires an SEO factor?
How are site: results ranked?
Is Google putting more weight on brands in rankings?
Should I use pipes or dashes in my titles?
Do ids in heading tags affect search engines?
Can my blogroll affect my blog’s reputation in Google?
Will a link to a disallowed page transfer PageRank?
Does Google consider the URL of an image?
Does Google remove PageRank from incoming links that no longer exist?
Does position of keywords in URL affect ranking?
Will showing recent posts on the homepage cause a duplicate content issue?
Is the same content posted under different TLDs a problem?
Will SEO still exist in five years?
What impact does “page bloat” have on Google rankings?
Does PageRank flow through image links?
Is there a limit to how many 301 (Permanent) Redirects I can do on a site?
How does URL structure affect PageRank?
Does the number of subdirectories in a URL affect its ranking?
How do PageRank updates work?
Should I use nofollow in links to my disclaimer and privacy policy?
How does Google treat sites where all external links are no-follow?
Is there such a thing as building too many links?
Why do porn sites have lower PageRank?
If I don’t need to block crawlers, should I create a robots.txt file?
Will setting the rel=”canonical” attribute of a page to itself cause a loop?
Is there an advantage to using rel=”canonical” over a 301 redirect?
If I report the same news story as someone else, is that duplicate content?
What is Google Caffeine indexing?
Does indexing a mobile website create duplicate content issues?
How can I make sure that Google knows my content is original?
How important is it to have keywords in a domain name?
Should I keep a domain parked without content before I launch the website?
Do human “quality raters” influence which sites are impacted by Panda?
Should I structure my site using subdomains or subdirectories?
Can I fetch an https URL as Googlebot in Webmaster Tools?
How does Google consider site-wide backlinks?
What is Google’s view on guest blogging for links?
Does Google take action on spammy guest blogging activities?
If I quote another source, will I be penalized for duplicate content?
How will Google interpret links to URLs ending with a campaign tag?
Do AdWords customers get special treatment in organic search results?
Why doesn’t Google release an SEO quality check-up calculator?
Is freshness an important signal for all sites?
Do you think that “Search Engine Optimization” should be renamed?
What has having your own blog taught you about SEO?
Should I incorporate synonyms for important terms into my site?
Why do paid links violate Google’s guidelines while other ads don’t?
How long does a reconsideration request take to process?
Will Google get a webspam team outside of the US?
Why does the estimated number of results change when going from page 1 to page 2?
How many messages did Google send about unnatural links?
What is the ideal keyword density of a page?
My site doesn’t have much text. Is that a problem?
How can I make the pages on my site unique?

What are effective techniques for building links?

Organic link building is, in my opinion, one of the most difficult tasks for SEOs at SMEs. Can you please list five effective ways of organic link building, other than creating great content?

1) Participate in the community by answering questions and helping others.
1a) If you have something valuable to add, post a comment.
1b) People will remember that you answered their question and will be willing to link to your site.
2) Dig into a research subject; original research is more likely to attract links.
2a) Example: post your research about products you have used and which works best.
2b) Visitors will start linking to that research if they find it useful, and spread it.
3) Run a service or product that people find useful.
3a) Example: Chrome extensions / Firefox plugins / games / open source software.
3b) You can offer a free version and a premium version to attract more people.

Can I use Robots.txt to Optimize Google’s Crawl?

Can I use robots.txt to optimize Googlebot’s crawl? For example, can I disallow all but one section of a site (for a week) to ensure it is crawled, and then revert to a ‘normal’ robots.txt?

1) The answer is NO. Robots.txt is not the right method to do that.
2) If you want certain pages to be crawled, link to them from the root page.
3) PageRank from the root page then flows more to those selected pages than to others.
4) Restructure your site so that important pages are linked from the root for better Googlebot crawling.
5) The root is where most of your PageRank enters, because people commonly link to the root of your website. (The pattern the question describes is sketched below for illustration.)
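For illustration only, the kind of temporary robots.txt the question describes would look like this sketch (the section name is hypothetical). As the points above explain, it is not a recommended approach:

User-agent: *
Disallow: /
Allow: /important-section/

Note: Allow is an extension to the robots.txt standard that Googlebot supports. Blocking the rest of the site for a week would simply prevent those pages from being crawled during that time, which is why linking important pages from the root is the better route.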

Is over optimization bad for a website?

Is over optimization bad for a website? E.g. excessive use of nofollow

1) Excessive use of nofollow links does not get you into trouble.
2) Instead of worrying about going overboard, focus on making the site better for the users who visit it.

Should I use underscores or hyphens in URLs?

Underscores vs hyphens in URLs: does it make a difference? my-page vs. my_page?

1) Underscores and dashes (also known as hyphens) make a difference.
2) Hyphens are treated as separators, while underscores are not.
3) Hyphens break a phrase up into multiple keywords.
4) Underscores merge the words into one single keyword.
5) Hyphens are preferred if you wish to index multiple keywords for an article (see the example below).
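Example: my-page is read as the two words "my" and "page", while my_page is read as the single keyword "mypage".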

More than one H1 on a page: good or bad?

More than one H1 on a page: good or bad?

1) You can use H1 tags multiple times on the same page.
2) Use them where they make sense, and sparingly.
3) H1 tags should be used for headers/headings, not everywhere you can fit them on a page.
4) If the entire page is H1 tags, it looks cruddy when CSS is turned off.

Can dofollow comments on my blog affect its reputation?

Are there negative SEO implications to having a blog with do-follow comments? What about commenting on do-follow blogs?

1) Yes. Dofollow comments can affect your blog's reputation.
2) Spam comments can severely affect the site if comment URLs are set to dofollow.
3) However, this can be mitigated if the comments are moderated for good quality information.
4) The PageRank of a post is divided equally among all commenter links.

Is it better to have keywords in the URL path or filename?

Would you rather have keywords in the path name, or all together at once in a filename?
Example: http://example.com/tools/hammers/acme-metal-pounder
Example: http://example.com/tools-hammers-acme-metal-pounder

1) It doesn't really make much difference.
2) However, for user experience, it's best to put keywords in the path name.
3) Savvy users are less likely to click a long URL with many dashes.

Should internal links use rel=”nofollow”?

Should “back to top” links have rel=”nofollow”?

1) The answer is no.
2) Don't add nofollow if you are linking within your own website.
3) If you are linking to an external site you don't trust or endorse, add rel="nofollow". See the sketch below.
4) Using rel="nofollow" within your site does not sculpt PageRank.
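As a markup sketch (the URLs are hypothetical placeholders), an internal link needs no nofollow, while a non-endorsed external link gets the attribute:

Internal link: <a href="/about.html">About Us</a>
External, non-endorsed link: <a href="http://untrusted.example.com/" rel="nofollow">Some Site</a>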

Do multiple links from one page to another page count?

If we add more than one link from page A to page B, do we pass more PageRank juice and additional anchor text info? Also, can you tell us if links from A to A count?

1) Based on the original PageRank formula (see the formula sketch after this list):
1a) You pass more PageRank from page A to page B if there is more than one link.
1b) A link from page A to itself passes PageRank back to page A, so no juice leaks.
2) Obsessing over PageRank sculpting isn't the most productive use of your time.
2a) PageRank sculpting means recirculating or hoarding PageRank within your website.
2b) Create great content so that a lot of people link to your site.
2c) That way you get more PageRank juice, which then flows naturally within your site.
3) Placing essential links on your main page helps PageRank circulate better.
4) Set your site logo so that when people click it, it points to your main page.
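For reference, this is the original PageRank formula from the Brin and Page paper, where T_1 ... T_n are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor (commonly set to 0.85):

PR(A) = (1 - d) + d * ( PR(T_1)/C(T_1) + ... + PR(T_n)/C(T_n) )

Each link from a page contributes a share of that page's PageRank, which is why multiple links can pass more PageRank under this original formula.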

How can I identify causes of a PageRank drop?

I use the Google Toolbar to monitor PageRank. I read on the Internet that it gives old and quite unreliable data. Can I have reliable realtime PageRank information about the sites I administer? And how can I identify causes of a PageRank drop?

1) Information provided by the Google Toolbar is only updated every 3-4 months.
2) It is updated irregularly to prevent webmasters from obsessing over PageRank.
3) Webmasters should pay more attention to titles, accessibility and good content.
4) There isn't anything unreliable about it, since PageRank is rounded to a 0-10 scale.
5) There are a few reasons why PageRank can drop:
5a) A reputable site stopped linking to your site, or that site has disappeared.
5b) Googlebot is confused by your website's internal links (www vs non-www).
5c) If you are selling links that pass PageRank, your PageRank drops by about 30%.
5d) Selling PageRank links violates Google's quality guidelines.
5e) This can be avoided by setting such links to rel="nofollow".
6) If this penalty happened to you, to lift the PageRank penalty:
6a) Remove the links you were selling.
6b) Submit a reconsideration request acknowledging what went wrong.
6c) If Google sees a good-faith effort, they will lift the penalty.
Example of a Reconsideration Request:
"Hey, I was selling links that passed PageRank. I saw my PageRank drop, so I've removed those links. You can verify it. Please let me regain my trust with Google."

What are the top things to do in SEO?

What are some general guidelines and recommendations you would make to people who want to increase their site's visibility on Google?

1) Make your site crawlable. This can be tested by using a text browser.
1a) If you can visit all pages on your site via text links, you are in good shape!
1b) Having a sitemap (a list of the pages of your site) will greatly help crawlability.
2) Interesting content on your site also helps in making people link to you.
3) Think about people in your niche and let them know about you.
3a) If your friends or colleagues run sites in a related niche, you can link to each other.
4) Newsletters, tutorials and viral content can help attract more links.

How important is the frequency of updates on a blog?

Some people are under the impression that blogs are good for SEO only if they’re updated frequently. How much does frequency play into PageRank for blogs & other dynamic sites? Isn’t the content more important than the simple # of posts per day/week?

1) Users like to see new content whenever they show up, rather than a static page.
2) It is more important to create useful content than to post frequently.
3) From a search engine perspective, it's better to aim for great quality content.
4) Quality content attracts more links and keeps users coming back to your site.
5) Original pieces of content do better than follow-on posts without any insight.
6) A follow-on post is like being the 100th person to write about iPhone leaks.

How do you protect your blog from hackers?

I just visited your blog. I noticed it was built with WordPress. How do you keep it safe from hackers? Ever since I got PR 5 last month – I’ve got dozens of hack attempts a minute.

1) Update to the latest WordPress version.
2) Use a long password.
3) Set .htaccess to allow only your IP address to access the WordPress admin pages. A sketch follows below.
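A minimal .htaccess sketch (Apache 2.2 syntax; the IP address 203.0.113.5 is a placeholder for your own), placed inside the /wp-admin/ directory:

# Allow only your own IP address to reach the WordPress admin pages
Order Deny,Allow
Deny from all
Allow from 203.0.113.5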

Should I tweak my titles and descriptions to improve my CTR?

Are title and description tags helpful to increase the organic CTR — clicks generated from organic (unpaid) search — which in turn will help in better ranking with a personalized search perspective?

1. Yes. This helps your visitors convert better for Return on Investment (ROI).
2. Make your snippet and title compelling to invite users to click on them.

Should large corporations use rel=canonical?

In regards to the new canonicalization tag, does it make sense for large corporations to consider placing that tag on every page due to marketing tracking codes and large levels of duplicate URLs like faceted pages and load balancing servers?

1. Yes. You can place it on every page, but do consider your site structure.
2. Take time to study your site structure and assess which pages should be canonicalized.
3. Misusing the canonicalization tag might shoot you in the foot (hurt your site).
4. It's best to use canonicalization with absolute links rather than relative links.

Example of URL:
Absolute : http://example.com/anime/bleach.png
Relative : ../anime/bleach.png

5. Canonicalization is used on pages that have similar content (duplicate content).
6. Place the preferred page's URL in the canonical tag of the duplicate content pages.

How do I specify a canonical URL?
Assuming the preferred page is http://example.com/anime?id=bleach
Create a link tag with rel=canonical and href set to the preferred page.
Example: <link rel="canonical" href="http://example.com/anime?id=bleach" />
Copy this link tag into the <head> section of all non-canonical versions of the page, such as http://example.com/anime?id=bleach&sort=episode.

Extra Information
Canonical tags are basically mini 301 redirects.
You can use them to redirect a page to a preferred page within a subdomain/domain.

Does the ordering of heading tags matter?

I’m using a template website (I’m an amateur!). The h1 tag appears below the h2 tag in the code. Does the spider still know what’s going on?

1. Don't worry about H1 and H2 ordering, as Google handles it very well.
2. However, avoid making the whole page H1 or H2.
3. Google can process pages with syntax errors, as well as ugly, broken or non-HTML pages.
4. Google crawls and processes these pages for potentially good information for visitors.

Do site load times have an impact on Google rankings?

What impact do site load times have on Google rankings?

1. None.
2. However, if a site takes 20-30 seconds to load, this might cause a timeout.
3. A timeout prevents Googlebot from fetching and caching your site, which affects your ranking.
4. Whether a site takes 1 second or 2 seconds makes no difference to Google rankings.
5. Combine several JavaScript or CSS files into one to make pages load faster.
6. Use gzip or minification to compact your pages so they return faster for users; see the sketch below.
7. Don't worry about the search engine perspective right now; focus on user experience.
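As a minimal sketch, gzip compression can be enabled on an Apache host via .htaccess, assuming the mod_deflate module is available:

# Compress text resources before sending them to the browser (requires mod_deflate)
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>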

Can the geographic location of a web server affect SEO?

Does the geographic location of the web host have any significant effect on ranking for organic SEO?

1. Yes. Google looks at the IP address of your web server.
2. If your server is based in Japan, Google assumes the site is useful for users in Japan.
3. Google also looks at TLDs (examples: .jp / .kr).
4. You can set your site in Google Webmaster Tools to target a specific country.
5. You can also specify that certain parts of your site target a specific country.
Example: jp.example.com for Japan or kr.example.com for Korea.

Is excessive whitespace in the HTML source bad?

Excessive white space in the HTML source is bad. Fact, myth, or somewhere in between?

1. Google ignores excessive white space, so it doesn't cause any harm.
2. Use whatever amount of white space is reasonable for you.
3. Clean HTML with nice indentation makes a site easy to maintain and upgrade.

What’s a preferred site structure?

There seems to be little impact on human visitors where in the site’s structure a given page is, so: Is it better to keep key content pages close to root, or have them deep within a topical funnel-structure, e.g.: food/fast-food/burgers/hamburgers.php

1. The advice below is not SEO advice but behavioral advice.
2. Visitors are more likely to find a page when fewer clicks are required from the root page.
3. This gives you a better chance of improving your ROI.

Which is better: HTML Sitemap or XML Sitemap?

HTML sitemap vs XML Sitemap. Which one is yummy for Google search engine spider?

1. Both are yummy to Google's spiders.
2a. Google does not guarantee to crawl every page listed in an XML Sitemap.
2b. However, XML Sitemaps do help Google's spiders discover new pages.
3. An HTML sitemap is better, as it helps both users and Google find all the pages on your site.
4. Start with an HTML sitemap first, then set up an XML Sitemap if you have the time (a minimal sketch follows below).
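A minimal XML Sitemap sketch following the sitemaps.org protocol (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2012-12-12</lastmod>
  </url>
</urlset>

Submit the sitemap's URL in Google Webmaster Tools so Google's spiders can use it to discover new pages.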

Are footer links treated differently than paragraph links?

Does Google treat links in footers differently than links surrounded by text (e.g. in a paragraph)?

1. Yes. The original PageRank algorithm treated all links the same, but Google has since modified it.
2. Links in the footer do not carry the same editorial weight as links within a paragraph.
3. This is done to account for relevance, reputation and trust.

How do meta geo tags influence the search results?

How do meta geo tags influence the search results?

1. Google does not look at meta geo tags; it looks at the IP address, gTLD and ccTLD instead.
2. ccTLD means country-code TLD (examples: .kr for Korea / .jp for Japan).
3. Set your subdomain/domain in Google Webmaster Tools to target a specific country.

Will PageRank split for links with or without trailing slash?

There are a lot of rumors about whether or not a slash at the end of a domain makes a difference. Is there a difference between getting a backlink pointing to "www.website.com" vs "www.website.com/"? If you get links to both, would your PageRank be split?

1. PageRank will not be split, as Google does a very good job at canonicalization.
2. Example: Google knows a site with www and without www is the same.
3. The same applies to links with and without a trailing slash.
4. Canonicalizing means combining pages that point to the same content.
5. This is done to prevent PageRank from splitting between the two variations.

How much content should be on a homepage?

More or less content on a homepage. There is certainly a difference of opinion here.

1. It's good to have more content on the homepage for Googlebot to find and cache.
2. For image sites, have captions alongside describing each picture/photo.
3. Having 5 to 10 posts on the main page at any given time is a good start.

Is the time left before your domain registration expires an SEO factor?

I was told that this is a factor, and that domains with less than one year left are seen as spam. But if this is correct, how does Google manage auto-renewal of domains, which get renewed on the expiration day?

1. No. Google does not view a site as spam if its domain is registered for less than 3 years.
2. The time factor is not used as a signal in search quality or Google's rankings.
3. Renewing a domain name 2-3 years at a time is mainly for convenience.

How are site: results ranked?

If you do a site:example.com query on Google, on what basis are the results ranked? Is this the order in which Google gives importance to each and every page on the website? Most of the time, the top listed pages get more search traffic.

1. Google uses a similar PageRank algorithm to rank these results.
2. Google prefers shorter URLs (for example, ones closer to the root page).
3. Go through your server logs and find the pages that drive the most traffic.
4. Those high-traffic pages are likely the ones listed at the top.

Is Google putting more weight on brands in rankings?

Can you verify that Google is putting more weight on “brands” in search engine rankings? If the answer is “Yes” — what is Google’s definition of a brand?

1. No. Google does not put weight on brands.
2. Google puts weight on trust, authority, reputation, PageRank and high quality.
3. For example, if you type Eclipse into Google Search:
3a. You will find Eclipse.org (the Eclipse development IDE).
3b. You will find NASA's Eclipse site.
3c. You will find Eclipse (the Twilight book).
4. Google returns the results it thinks are best for users.
5. In short, try to make a great site and become a known authority in your niche!
6. People link to the kind of site they enjoy talking about.

Should I use pipes or dashes in my titles?

Does Google have any suggestions (or data) on the impact of pipes versus dashes in the title tag?

1. There isn't much difference between using vertical pipes and horizontal dashes.
2. Both pipes and dashes are separators, so either one should be fine.
3. Dashes are more common than pipes, but both are handled well by Google.
4. You might see a little bit of impact in terms of how users click through.

Do ids in heading tags affect search engines?

Does using a class or an id in a header tag, instead of a plain header, interfere with the way search engines see and understand headings?

1. No. IDs in H1 or any other HTML tags do not interfere with Google's search engine.
2. However, image attributes such as width, height and alt are used by Google.
3. Keep your HTML syntax clean to make the site easier to maintain and upgrade.

Can my blogroll affect my blog’s reputation in Google?

I keep a "blogroll" page on my blog with links to all my friends' blogs. Will that affect my blog's reputation in Google? Recently my friend's page dropped from PR5 to 0 for such a page.

1. Who you link to certainly affects your site's reputation.
1a. Linking to sites that Google considers junk or spam will hurt your site's reputation.
1b. Selling links in a blogroll poses a very high risk of ruining your site's reputation.
2. The PageRank drop might be a canonicalization issue, not the blogroll.
3. In short, a blogroll does not kill a site's reputation; linking to bad sites does.

Will a link to a disallowed page transfer PageRank?

If a page is disallowed in the robots.txt, will a link to this page transfer/leak link juice?

1. Yes. If external sites link to that page, Google may still display it in search results.
2. Even if Googlebot is prevented from crawling a page, the page can still appear in results.

Does Google consider the URL of an image?

Does Google take into account the URL of an image when searching – e.g. could example.com/cats/lolcatz/m.jpg appear in a Google search for ‘lolcatz’?

1. Yes. Google uses keywords from the URL to determine a keyword's relevance to an image.
2. However, spamming keywords inside an image URL does not work on Google.
Example: example.com/cats/lolcatz/lolcatz1/lolcatz.jpg, where lolcatz is spammed.
3. Google also determines an image's keywords from its metadata and the surrounding text.

Does Google remove PageRank from incoming links that no longer exist?

Websites lose back links due to other websites going out of business or closing (Geocities, AOL, member pages). Does Google remove the back link juice that once came from these pages?

1. Yes. Google removes that PageRank once those pages no longer exist on the web.
2. A site's PageRank is determined by the links currently pointing to the site.
3. Current links are the ones that aren't stale (active sites linking to you).

Does position of keywords in URL affect ranking?

Does the position of keywords in the URL have a significant impact?
Is example.com/keyword/London better than example.com/London/keyword?

1. Yes, but only very slightly; Google treats it as a second-order effect.
2. It's best to focus on having the right keywords in the URL.
3. Limit the URL to about 8 keywords for a better visitor click-through rate.

Will showing recent posts on the homepage cause a duplicate content issue?

Hi, I'm noticing more people are using their blog's API to pull the latest X posts up to the front page of their website. This gives a refreshing feel to the home page, but is this considered duplicate content?

1. No. If content resides on a page that refreshes/updates often (such as the homepage),
1a. Google can disambiguate it and attribute the actual content to its permanent page.
1b. You will therefore need permalinks for each post displayed on the homepage.
2. The best practice is to show post excerpts on category, search and home pages.
3. A post excerpt is the first paragraph of the post or a snippet of it (a teaser).

Is the same content posted under different TLDs a problem?

Is the same content posted under different TLDs considered content duplication? We are an international company with a similar website in several countries, all in English. Atreyu, Spain

1. Yes, if you have the same content on multiple domains that all end in .com.
1a. Google's algorithm will detect this as duplication and show only one result.
1b. However, it's different under different country TLDs (e.g. example.us / example.co.uk).
2. Spammers tend not to build sites on different country TLDs because of the hassle/cost.
2a. They stick to one particular top-level domain.
2b. Sites that use several country TLDs with identical content tend to be real businesses.
2c. Therefore, Google won't penalize a site for registering different country TLDs.
2d. However, make sure content is localized for each country (e.g. the currency).

Will SEO still exist in five years?

1. Yes. SEO puts your site's best foot forward in the eyes of search engines.
2. It can be seen as polishing a resume: SEO improves site visibility.
3. However, Google tries to make it so you don't need to be an SEO expert.
4. SEO makes your site well represented and gives the right impression to visitors.
5. Neither spamming nor black-hat techniques are required for good SEO.

What impact does “page bloat” have on Google rankings?

What impact does ‘page bloat’ have on Google rankings? Most of the winners in SEO seem to have very simple pages (very few images, HTML-only design) – sometimes to the detriment to the user in a poorly designed page.

1. No impact, as Google can handle pages that have many images, Flash and lots of content.
2. Google does not limit a page's content to 100 KB (kilobytes).
3. Therefore, don't worry about having too much content on a page.

Does PageRank flow through image links?

1. Yes. PageRank flows through image links just as it does through regular text hyperlinks.

Is there a limit to how many 301 (Permanent) Redirects I can do on a site?

1. No. You can have an unlimited number of redirects from an old site to a new site.
2a. However, if a redirect requires multiple hops, Google might stop crawling before it reaches your new site.
2b. To illustrate multiple hops, assume page A is the old page and page Z is the new one.
2c. Page A redirects to page B, followed by C, D, E and so on until page Z (a total of 25 hops).
2d. Beyond about 3 hops, Googlebot may give up before crawling to the last hop.
3. It is ideal to have only 1 hop (from the old page straight to the new page); see the sketch below.
4a. A 301 redirect is a permanent redirect, while a 302 redirect is a temporary redirect.
4b. Use a 301 redirect if you will never be using the old page again.
4c. Use a 302 redirect if the old page is under construction and you are pointing to a temporary page.
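A minimal one-hop sketch using Apache's mod_alias in .htaccess (old-page.html and new-page.html are placeholders):

# Permanently (301) redirect the old URL straight to the final URL, one hop only
Redirect 301 /old-page.html http://example.com/new-page.html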

How does URL structure affect PageRank?

Does switching example.com/year/month/day/article to example.com/article have anything to do with how PageRank flows within a site?

There are two ways you can see this. Good: it is helpful to visitors, as they can tell from the URL itself whether an article is old and go look for a newer version. Bad: it looks ugly, and the unnecessary dates, which can be found in the article itself, make the URL harder for visitors to remember. Google is not worried about how deep a set of directories you have on your site. If your deep page is linked from the root page, it will get a slice of PageRank.

Does the number of subdirectories in a URL affect its ranking?

How much of a difference does the number of levels in the URL make? Does example.com/keyword give a higher ranking for the keyword than example.com/etc/etc/keyword?

No. How much PageRank a page receives does not change based on whether it sits at a high or low level in the directory structure. If the page is linked directly from the root, or relatively close to the root page, it gets a big portion of PageRank. The root is where most people link to your site, hence it serves as the PageRank reservoir.

How do PageRank updates work?

On what basis is the PR increased at each PR update?

PageRank is Google's opinion of how reputable a page is. It is computed continuously, with little human intervention, by machinery that analyzes new links pointing to your site. PageRank is truncated to the 10 levels that are visible in the Google Toolbar, and the Toolbar values for all websites are updated roughly every 3 months.

PageRank is based on who links to you and how reputable they are. Highly reputable sites pass high-quality PageRank, which in turn increases your site's PageRank. However, if a website is caught selling links that pass PageRank, Google will penalize it by reducing its PageRank by about 30%, since it is manipulating the Google search algorithm.

Should I use nofollow in links to my disclaimer and privacy policy?

Is it a good thing to put ‘nofollow’ in links to a disclaimer, privacy statement and other pages like that with the internal PageRank in mind? I hear different stories about this.

You should only use nofollow when you truly do not want a page to be indexed by Google's search engine. Even then, if someone links to that page from another external site, it might still be displayed in Google's results. Putting nofollow on links to a disclaimer or privacy statement will not cause a spam penalty. Don't worry too much about PageRank sculpting; focus your effort on making great content that will attract links.

How does Google treat sites where all external links are no-follow?

How does Google treat sites where all external links are no-follow? I understand the purpose of no-follow is for webmasters to indicate which links are paid, but when sites like Wikipedia make EVERY outbound link no-follow, that defeats the purpose.

Wikipedia switched its outbound links from dofollow to nofollow to stop users from spam-editing its articles for PageRank purposes. This might change in the future if Wikipedia allows URLs posted by its senior, trusted contributors to be set to dofollow when they add related and useful links.

Is there such a thing as building too many links?

Is there such a thing as building too many links, if you're following Google's webmaster guidelines exactly? So many that you would get banned, even if you're following the rules?

You can acquire as many links as you can, as long as you do it legitimately. This can be done by creating a fantastic website or blog that visitors want to tell their friends about, so that they start linking to your site. If those links are attracted on their own, based on the merit or interest of your site, you naturally don't need to work so hard at getting links.

Why do porn sites have lower PageRank?

What are the technical reasons porn sites have such low PageRank? None go over PR6. A lack of links from trusted sites? Rampant link exchanges with low quality sites? Penalties for affiliate links? Or is there a general penalty on the industry?

PageRank does not correspond to pure popularity but rather to the reputation of a site. For example, the Iowa Real Estate Commission (an official government site) gets a fair number of links even though people seldom visit the website. Conversely, people rarely link to porn sites even though they are frequented by many people. If PageRank measured popularity on Google Search, porn sites would have many quality incoming links, but that isn't the case. Very few links from reputable sites such as The New York Times, The Wall Street Journal and CNN point to the porn industry.

If I don’t need to block crawlers, should I create a robots.txt file?

Is it better to have a blank robots.txt file, a robots.txt that contains "User-agent: * Disallow:", or no robots.txt file at all?

It is better to have one than nothing, as your web host might otherwise serve a 404 error page, which could lead to various weird behaviors. An empty robots.txt is equivalent to allowing everybody to crawl your site, as is "User-agent: *" with an empty "Disallow:" value. In short, it is recommended to have a robots.txt that tells bots what content is allowed and disallowed to be crawled and indexed. A minimal sketch follows below.
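A minimal robots.txt sketch that allows every crawler (the empty Disallow value means nothing is blocked):

User-agent: *
Disallow: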

Will setting the rel="canonical" attribute of a page to itself cause a loop?

With canonical tags, can you point a page to itself? So if www.example.com/webpage points to www.example.com/webpage, will this cause a loop?

No. It won't cause a loop in Google's search engine; Google handles it just fine even if you have hundreds of pages each pointing to themselves. However, for other search engines such as Yahoo and Bing, there isn't a definitive answer unless one of their representatives states it publicly.

Is there an advantage to using rel=”canonical” over a 301 redirect?

It takes longer for Google to find the rel=canonical pages but 301 redirects seem to lose impact (link juice) over time. Is there similar churn with rel=canonical?

It is best to use a 301 redirect if you have control over your site. If you don't (for example, on a free host), use a rel=canonical tag in the page header. Note that all of these methods evaporate a tiny amount of PageRank (aka link juice), similar to a user clicking a link; PageRank decay is Google's way of keeping everything balanced. Take note that a 302 redirect will pass no link juice, while a canonical tag or 301 redirect will pass roughly 90%-99% of the link juice (ranking power).

If I report the same news story as someone else, is that duplicate content?

I have a news website. I heard Google doesn’t like duplicate content, but I can’t make up news! What can I do to stay in Google’s favor?

If you slap up wholesale duplicate content taken from another site, or merely rehash it, your content might not be displayed in Google's search results, because showing such results (with 17 versions by other people doing the same thing) is not good for the user experience. Concentrate on your specialties/expertise in a particular topic and create original content out of it, or apply a different insight/perspective/point of view to a story that came out earlier.

What is Google Caffeine Indexing?

Is Caffeine indexing really helpful for SEO? What is Caffeine indexing exactly?

Google Caffeine is the switch from a batch method of indexing to an incremental method. It shortens the time between Google crawling a page and that page being placed into Google's search index. When the 9/11 attacks happened, the news spread like wildfire and Google could not keep up; Caffeine was implemented to allow breaking news and other website content to be indexed quickly.

Once content is crawled for the first time, it is immediately indexed and placed into Google's search results. Overall, this makes the indexed content of the web about 50% fresher. Be reminded that Google Caffeine is not an algorithm change: it neither affects how a site's pages are ranked nor removes spam sites from Google's search results.

The old pre-Caffeine system ("Google Decaf") worked by indexing your site (use site:example.com to check whether your pages are already in Google's cache), but the pages it indexed were not pushed into Google's main search results, where people type in keywords to search, for a few days or weeks (it took up to 4 months back in the year 2000).

Does indexing a mobile website create duplicate content issues?

Does indexing a mobile website create duplicate content issues with the main site?

No, it will not create a duplicate content issue, because Google has two different bots (also known as user agents) that crawl your site: one for the mobile version and one for the desktop browser version. When you browse your website via a mobile browser, such as on an iPhone or Android device, the mobile Googlebot's results are served, whereas when you browse on a desktop browser, the regular Googlebot's results are returned.

How can I make sure that Google knows my content is original?

Google crawls site A every hour and site B once a day. Site B writes an article; site A copies it, changing the time stamp. Site A gets crawled first by Googlebot. Whose content is original in Google's eyes and will rank highly? If it's A, then how does that do justice to site B?

If someone is ripping off your content without your consent, you can file a DMCA notice. You can also submit a spam report to Google if you find an auto-generated site that creates posts by scraping other people's content and passing it off as its own, because that is not a high-quality site.

How important is it to have keywords in a domain name?

How would you explain “The Power of Keyword Domains” to someone looking to take a decision what kind of domains to go for?

If you're registering a new domain name and you want to compete in a particular SEO niche, there are two strategies you can take. The first is a brandable domain name, such as Twitter, Digg or Facebook: something people can remember easily, but not necessarily containing a keyword. The second is keywords in the domain name, such as a verb + noun combination like watchanimeonline.com. Google's algorithm has been tweaked a little to put less weight on keyword-laden domain names compared with normal names, just to be fair to other sites when displaying search results. Personally, if you are trying to shoot for a big success, go for a brandable name, as visitors can remember you easily.

Should I keep a domain parked without content before I launch the website?

I have a parked domain and want to launch a new website on it. Are there any pitfalls I should avoid? Should I keep my domain parked or put some sort of stub page there?

You should avoid having an empty page with no content or paragraphs. Google has a parked-domain detector that checks whether a page is useful to visitors. If it is a blank page, Google will not index it in its search results, because returning a result without any valuable information ruins the visitor's search experience. A common example of a blank or dummy page is an advertisement showing a smiling girl with her backpack, plus some links neither related nor useful to what the visitor searched for. Instead, write some content describing when the site will launch and a summary of what it will cover. Google will then place it in a special bucket and re-index that specific page once the real content is launched.

Do human “quality raters” influence which sites are impacted by Panda?

If you have human ‘quality raters’ evaluating the SERPs and influencing which sites may be impacted by Panda, how do you confirm that consumers are more satisfied with the results?

No. Human quality raters do not influence the Panda algorithm in any direct sense. These folks only rate search results, on a 1-10 scale, in a blind, side-by-side test of two sets of results generated by Google's search engine: one from the current version the public uses and one from a patched version. This feedback helps Google's developers determine whether their changes produce more accurate and relevant search results for visitors, thereby improving quality and the search experience. In short, the human quality raters do not downvote specific sites; they follow a checklist of guidelines to assess whether a site is useful for users, based on its content, against their sample keyword searches.

Should I structure my site using subdomains or subdirectories?

I’m interested in finding out how Google currently views subdomains — whether there’s any difference between a site structured as a group of subdomains and one structured as a group of subdirectories.

They are roughly equivalent; it depends on your convenience and usage. For instance, if you are using different platforms such as WordPress (blog) + Drupal (CMS) + phpBB (forum), it's easiest to use subdomains, whereas if you are focused on building one authoritative site, it's best to go with subdirectories. Subdomains used to gain extra attention in Google and Yahoo, since a subdomain and its main domain could each be displayed multiple times; however, people abused this, and Google now clusters those results when the subdomains belong to the same root domain. The good thing about subdirectories is that you don't need to create another subdomain or configure the system; just make an additional subfolder under the root. Authoritative sites love this latter method. Google itself uses subdomains, such as maps.google.com, play.google.com and news.google.com, because those products are distinct from google.com's main job of searching other people's sites. If a section is not related to your main purpose, it's best to work with subdomains.

Can I fetch an https URL as Googlebot in Webmaster Tools?

The Webmaster Tools “Fetch as Googlebot” feature does not allow one to fetch an https page, making it not very useful for secure sites – any plans to change that?

The Google team has tested this, and it works for them. The issue lies in whether you have verified ownership of the site for both http and https. To do so, simply register both versions in Google Webmaster Tools to prove you are the owner of the site, then try again. If it still does not work, give a shoutout in the Webmaster Help forum with some feedback. The team will then look into the root cause that prevents you from fetching the https (secure) URL.

How does Google consider site-wide backlinks?

Are sitewide backlinks considered good or bad by Google? Or do they just count as 1 link from the whole domain?

Site-wide links are outbound links, commonly located in a blogroll, footer, widget or header, that attract clicks from visitors; clicking one sends the user to an external site. These links are often irrelevant to the post, page or article the user is currently viewing, and in Google's eyes they can be seen as spam or useless links. An example would be a site in Polish carrying a site-wide link to an English site about renting cheap apartments. Regarding the second question, Google only counts the link once if it comes from the same domain, similar to how Google's algorithm handles keyword stuffing: mentioning the same long-tail or short-tail keyword a few times within an article is fair, but beyond that Google completely ignores the extras. Internal links such as Copyright Policy or Privacy Policy links are not affected by this algorithm. Manual investigations by webspam analysts are also performed when Google receives a spam report about a site, to verify whether the algorithm did a good or bad job, and to revert or correct it if necessary.

What is Google’s view on guest blogging for links?

What is Google’s view on guest blogging for links?

There are two ways to interpret this question. The first is inviting a high-quality guest (e.g. Lisa Barone, Vanessa Fox, Danny Sullivan) to publish a blog article on your site to impart his or her knowledge or insight on a particular topic. Indirectly, this attracts many visitors to your site to read their articles, and it also helps you establish your reputation, or become popular, if you are one of those new but good blog writers stepping into the spotlight. The second is the extremist who outsources: having someone write an article (300-500 words minimum) for them on a particular topic. The purchased article is published on the main site, and after it is indexed in Google's search results, automatic spinning robots create differently worded copies of it (changing synonyms or sentences) to be published on article-bank sites such as Ezine Articles. This is done to attract clicks from other sites and PageRank backlinks to the main site.

Does Google take action on spammy guest blogging activities?

Currently, guest blogging is the favorite activity of webmasters for link acquisition. Due to its easy nature, lots of spammy activities are going on like article spinning etc. Is Google going to hammer websites for links acquired by guest blogging?

Yes. Google does take action on spammy guest blogging by using its algorithm to check whether your backlinks come from low-quality sites. These kinds of sites containing spam content can indirectly hurt your PageRank and reputation. Spammy guest blogging means performing many guest posts where each article is spun, and/or allowing many guest bloggers to publish spun articles on your site. Article spinning uses a robot or plugin to create multiple versions of an original article instead of creating original content. [Ayumilove Notes] However, this can be seen as a double-edged sword, where one could point multiple spam sites at a competitor's site to downgrade it. Google would need to look into this issue as well to avoid having it abused by webmasters.

If I quote another source, will I be penalized for duplicate content?

Correct quotations in Google. How can you quote correctly from different sources without getting penalized for duplicated content? Is it possible to quote and refer to the source?

Yes and no. Google penalizes you, by not displaying your article in its search results or by ranking it lower, if you simply quote paragraphs from one or multiple sites into one post without adding any of your own thoughts or insights. Give your article some perspective, such as why you agree or disagree with the other person's quote, based on your analysis, your technical research, or your review of someone's product to contradict or support them. This also gives the reader something to read other than the clone articles found on many other sites, and perhaps a reason to bookmark your page and come back for new insights on their favorite topic. Just include a blockquote with a link to the original source, and write your own content below it. Having done that, you're in pretty good shape, without needing to worry about getting dinged for duplicate content. A markup sketch follows below.
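As a markup sketch (the URL is a hypothetical placeholder), a quote with attribution and your own commentary below it:

<blockquote cite="http://example.com/original-article">
A quoted paragraph from the original source.
</blockquote>
<p>Source: <a href="http://example.com/original-article">Original Article</a></p>
<p>Your own analysis of why you agree or disagree with the quote goes here.</p>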

How will Google interpret links to URLs ending with a campaign tag?

Will Google interpret links to URLs ending with a campaign tag like ?hl=en (www.example.com?hl=en) as a link to www.example.com or to a completely different page? What about the SEO effect of inbound links?

Google has a crawl-indexing team that develops and maintains algorithms for canonicalizing links, so links that refer to the same page are combined even if each has one or more different tags attached, whether parameter tags or tracking tags. If you don't trust the search engine to do it right, you have other options. You can use rel="canonical" in the header of the page to cluster multiple URLs into one page. If it's a tracking URL that you do not want Google to index, you can have the page do a 301 redirect to your main page. You can also use the URL Parameters tool in Google Webmaster Tools (google.com/webmasters) to check whether a URL is showing up twice in Google's search results and have those parameters stripped for your account.

Do AdWords customers get special treatment in organic search results?

I am an important AdWords customer and recently I have seen a drop in ranking for my site. Why can’t I get advice on optimizing my site for Google’s search results through my AdWords point of contact?

No. AdWords clients do not get any special treatment in organic search results, or access to premium content, even though they are paying Google for one of its advertising services. However, these customers can go through the exact same support channels as everyone else to get advice on optimizing their sites to rank better in Google's search results. Google does this to be fair to everyone on the web; its quality teams deny such requests for special treatment.

Why doesn’t Google release an SEO quality check-up calculator?

Why doesn't Google release an SEO quality check-up calculator? This would help people optimize their websites in the right direction. Is Google willing to do this, or does it just want people to keep guessing what's wrong with their websites?

Google does not release an SEO quality calculator in order to prevent spammers from taking advantage of the algorithm to rank their spun or copied articles high in Google's search results, pushing legitimate webmasters down. However, Google does provide Google Webmaster Tools to help regular webmasters check their sites. One feature this tool provides is a report of how fast your site loads, so you can optimize your pages to load faster for Googlebot and for your visitors. If a site loads fast, visitors are happy not to wait long for the content. This indirectly improves the user experience, helps shape webmasters toward producing quality content, and makes the world wide web a better place.

Is freshness an important signal for all sites?

Google has expressed in the past that frequently updated pages get a boost in rankings (QDF), that seems to favor blogs and news sites over company sites, which have less reason to be updated often. How important of a signal is “freshness”?

QDF means Query Deserves Freshness. Articles published on the web about current events or issues deserve QDF; examples include an earthquake, a tsunami or Hurricane Sandy. If content is evergreen (it does not change as time flows), or is navigational content or a research paper, QDF does not apply to it much. QDF is just one of Google's signals for ranking your site. If your website is not in a breaking-news or latest-technology niche, you don't need to worry about QDF. Google is good at detecting whether your article deserves QDF, so changing your article's publish date along with a few random new words may not help much.

Do you think that “Search Engine Optimization” should be renamed?

Hi Matt, do you think that search engine optimization is descriptive in the way it is used today? Do you think we need to call it something else?

There are different ways of interpreting SEO, good and bad. For those already in the field, SEO (Search Engine Optimization) is a set of tools for marketing one's site using paid or free methods such as Google AdWords. Another reading is SEO as Search Experience Optimization, where a webmaster analyzes and acts to assist visitors. For instance, the questions they ask themselves are: Would users like the snippet they see for the page? Once they land, do they convert well? Are they happy? Do they want to bookmark it, tell their friends about it, come back to it? Sometimes this mysterious-sounding name makes a non-SEO-literate person relate it to something else, like a TV program such as CSI (Crime Scene Investigation), and conclude that SEO professionals are worthless, shady criminals or work undercover (example: black hats / gray hats). In short, changing the name SEO to whatever you pick would not do any good, since as time passes there will always be bad apples who spoil its reputation (black hats who misuse and abuse SEO for their own gain). In Matt's personal opinion, think of SEO in broader terms, such as making a site faster and more accessible for visitors and helping people with keyword research, instead of link farming.

What has having your own blog taught you about SEO?

Have you learned something about SEO that you wouldn't know if you hadn't had your blog?

Matt Cutts created his own blog to have a personal space to post his opinions and thoughts about Google without worrying about them being altered: basically an unvarnished channel. It helps him step into the mindset of a webmaster (site owner) a lot better, and it also keeps him from becoming so obsessed with penalizing websites that he treats each of them as spam whenever something is wrong, since every webmaster, including him, wants to do better at providing a user-friendly site for their visitors. Being obsessed with spam clouds one's judgment in separating the good (legitimate) sites from the bad (illegitimate) ones.

Should I incorporate synonyms for important terms into my site?

If two terms are used essentially interchangeably, does Google realize that the terms are interchangeable? Should you be trying to use both terms, or just focus on one term to get the best search engine traffic? An example is EMR and EHR (EMR = Electronic Medical Record, EHR = Electronic Health Record)

Yes. Google has a synonym team responsible for relating one word to another. For example, "USB drive" and "flash drive" are related and can be used interchangeably. However, Google is not perfect at linking every pair of words and might not quickly pick up new words or phrases that pop up in the future. Therefore, you can mention the two related terms within your article by writing naturally. Ask a friend or family member to read it aloud; if it sounds stilted or artificial, rewrite the paragraph. A bad example is this: "SEO guy enters pub, bar, club, restaurant, tavern and orders whiskey, beer, tequila and cocktail." That sentence sounds spammy, since it tries to cram all the keywords into one sentence. In general, write the synonyms users would type, in a natural way, so Google does not have to guess what your page is really about.

Why do paid links violate Google’s guidelines while other ads don’t?

Google gives penalties to sites that buy paid links, but doesn't penalize sites with other ads such as AdSense, Chitika, etc. Why? Both are sources of income for a site, and a site needs money to grow or sustain itself.

There are two types of ad links: those that manipulate search engines and those that do not. Google is not against advertising, since it is a useful tool for webmasters to reach their target audience, educate them about a product or service on their site, or get more traffic by offering freebies, special deals or promotions. However, when you pay for links that pass PageRank, you are fundamentally paying to manipulate search engines, which violates Google's guidelines if undisclosed. A layman's example would be paying a radio station to play a musician's song more frequently. To avoid being penalized by Google, such external links must be tagged with the rel="nofollow" attribute; this directly informs Googlebot that the links are paid. Google is not concerned with which advertising network you use (Google AdSense, Chitika, Facebook, DoubleClick), but rather with whether these external links are tagged nofollow or use a JavaScript redirect to reach the other site. [Ayumilove Note] Googlebot cannot tell whether the links posted on your site are paid or given out of courtesy, so it treats every external link without nofollow as an endorsement that passes PageRank.

How long does a reconsideration request take to process?

I’ve been waiting for 2 months to hear back regarding a reconsideration request. Is this normal? There is no one I can contact about it.

It is not normal. Typically, you will get a reply within 3 business days from Google's reconsideration request team about your site's status and whether it has violated one or more of Google's Webmaster Guidelines. Send another reconsideration request if no feedback is received after a couple of weeks. Meanwhile, ask for help with your site at Google's Webmaster Help Forum. You will receive one of three types of reply: (1) Yes, we think you're in good shape, and your reconsideration request has been granted. (2) No, we still think you have some work to do; keep improving your site based on Google's Webmaster Guidelines. (3) We have processed your reconsideration request. The third reply means there might be multiple issues (some have been resolved, but not all). [Ayumilove Note] If you get the third type of reply, ask in the Google Webmaster Forum for advice on what to improve on your site. Once you have made those changes, send in another reconsideration request with the forum link pasted in the description, stating that you have made all of these corrections, and ask them to reply in the forum thread if something is still wrong with your site. Google's reconsideration request email replies do not contain any useful hints on how to fix your site, so asking them to reply in your forum thread helps you figure out what went wrong. Forum: http://groups.google.com/a/googleproductforums.com/forum/#!forum/webmasters

Will Google get a webspam team outside of the US?

European markets are small compared with the USA, so will Google have webspam teams for those smaller markets?

Google already has webspam teams based around the world, in locations such as Hong Kong and Zurich, to deal with spam in dozens of popular languages, so that the anti-spam algorithms can be tuned more accurately for each language. In short, Google does not aim to be a US-centric or English-centric company; its goal is to be international. It will take some time for Google to cover less common languages, but it will get there. As of December 12, 2012, Google handles 65 languages: Afrikaans, Albanian, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Catalan, Chinese, Croatian, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Haitian Creole, Hebrew, Hindi, Hungarian, Icelandic, Indonesian, Irish, Italian, Japanese, Kannada, Korean, Lao, Latin, Latvian, Lithuanian, Macedonian, Malay, Maltese, Norwegian, Persian, Polish, Portuguese, Romanian, Russian, Serbian, Slovak, Slovenian, Spanish, Swahili, Swedish, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Vietnamese, Welsh and Yiddish.

Why estimated results change when going from page 1 to page 2?

How reliable is the site: query and why does the total count sometimes change from page 1 to page 2?

Google's estimated result counts are only accurate to about three significant digits. As you dig into deeper result pages, Google refines its search results and provides a better estimate. In short, the change is not caused by the site: query itself; by digging deeper into Google's data you get a different, more refined estimate of how many results there are.
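To illustrate what "accurate to about three significant digits" means, here is a small sketch (the counts below are made up):

```python
import math

def to_sig_figs(n: int, digits: int = 3) -> int:
    """Round a count to the given number of significant digits."""
    if n == 0:
        return 0
    exponent = int(math.floor(math.log10(abs(n))))
    factor = 10 ** (exponent - digits + 1)
    return round(n / factor) * factor

true_count = 1_234_567          # hypothetical number of matching pages
print(to_sig_figs(true_count))  # 1230000 -- the rough estimate page 1 might show
# Deeper result pages can report a different, more refined number.
```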

How many messages did Google send about unnatural links?

At a search conference, someone said that earlier this year, Google sent out 700K warnings to websites about shady links. True? Not true?

Not true. Of the roughly 700,000 warning messages Google sent to webmasters earlier this year, about 90% concerned black hat spam and only about 3% concerned unnatural links. Tiffany (a member of Google's webspam team) had shown the audience at a search conference a graph of the 700,000 messages sent in January and February 2012. However, most people misinterpreted it, assuming all of these messages were about unnatural links, since Google was taking action on link networks at the time. Hopefully this debunks the myth, discussed in black hat forums, that Google is trying to get everyone to file reconsideration requests because of unnatural links to their sites. Unnatural links are basically a sudden surge of backlinks pointing to your site to inflate its PageRank.
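As a quick back-of-the-envelope check on that split (the percentages are taken from the answer above):

```python
total_messages = 700_000
black_hat = round(total_messages * 0.90)  # ~630,000 black hat warnings
unnatural = round(total_messages * 0.03)  # ~21,000 unnatural-link warnings
print(black_hat, unnatural)               # 630000 21000
```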

What is the ideal keyword density of a page?

What is the ideal keyword density: 0.7%, 7%, or 77%? Or is it some other number?

Google's algorithm does not use keyword density to rank your article in the top 10 search results. Mentioning the keyword once or twice certainly helps Google understand what the article is about, but stuffing your article with keywords unnaturally does not help; it hurts. In short, write the article with keywords in a natural way, then check whether it sounds artificial by reading it aloud or having someone else read it to spot anything stilted; if they can get through it without getting annoyed, you are doing relatively well. Don't get scammed by people who try to sell you keyword density software that stuffs keyword phrases into your article as many times as possible, since it will not rank your articles higher. There are no hard and fast rules for getting higher Google Search rankings, and keyword density certainly isn't one of them.
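For reference, keyword density is just occurrences divided by total words. A minimal sketch of the calculation (the sample text is made up) shows how simple, and therefore how gameable, the metric is:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return exact-word keyword occurrences as a fraction of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

sample = "USB drives are handy. A flash drive fits in your pocket."
print(f"{keyword_density(sample, 'drive'):.1%}")  # 9.1% -- matches 'drive', not 'drives'
```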

My site doesn’t have much text. Is that a problem?

Yes, it is a problem for both Google and users. This happens when a webmaster uses Flash content (SWF) for navigation or content, or perhaps only displays photos on their blog. It's best to add a description to each photo's filename, some descriptive text around the photo, and a description in its alt="" attribute. Use Flash only for decoration, since some users disable Flash and many mobile phones do not support it. These steps make it easier for Google to index your site based on the keywords in your article, and users have an easier time identifying a picture's content from the filename or alt text while a large high-definition image is still loading or their browser is slow.
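A small script like the one below can flag images that are missing descriptive alt text; this is a minimal sketch, assuming BeautifulSoup is installed, and the HTML sample is hypothetical:

```python
# Flag images with missing or empty alt text.
from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = """
<img src="IMG_0042.jpg">
<img src="blue-widget-pro.jpg" alt="Blue Widget Pro, top view">
"""

soup = BeautifulSoup(html, "html.parser")
for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print("Missing alt text:", img.get("src"))
# Descriptive filenames (blue-widget-pro.jpg rather than IMG_0042.jpg) help too.
```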

How can I make the pages on my site unique?

We have an ecommerce site with around 1,000 product pages. How can we create unique meta details for those pages?

Instead of asking this question, you should be asking yourself how many of those pages you can bring up to a high enough quality standard to provide value to your users. If you can't manage to have something unique on all thousand pages compared with your competitors' sites, why should these pages rank higher than someone else's affiliate page content? As food for thought, think about what would make these product pages compelling enough that people want to patronize your ecommerce site: is the site easy to navigate, does it have comments and reviews from verified product owners similar to Amazon.com, and so on.
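On the mechanical side of unique meta details, one common approach is to build each description from the product's own attributes instead of repeating one boilerplate string. A minimal sketch with hypothetical product data:

```python
# Hypothetical catalog entries; in practice these come from your product database.
products = [
    {"name": "Blue Widget Pro", "material": "aluminium", "price": 19.99},
    {"name": "Red Widget Mini", "material": "steel", "price": 9.99},
]

def meta_description(product: dict) -> str:
    """Build a per-product meta description from its own attributes."""
    return (f"{product['name']}: {product['material']} construction at "
            f"${product['price']:.2f}. Read reviews from verified owners.")

for product in products:
    print(f'<meta name="description" content="{meta_description(product)}">')
```

Templating only removes duplication; the larger point above still stands: each page needs genuinely unique value (reviews, specs, photos) to deserve its ranking.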

How to Rank #1 on Google (Matt Cutts Mashup by SamApplegate)

Matt, your webmaster help videos are a big success and I think everybody loves them. It’s a great help for webmasters. But, if one wanted to rank number 1, how would you go about doing this? You know, on what basis are the results ranked?
77% Keyword Density + Links to Porn Sites + Use AdSense = #1 Spot (JOKE!)
[youtube url=http://www.youtube.com/watch?v=b7W0o65tTIQ]
