What should a site owner do if they think they might be affected by Panda?

How will a webmaster know whether a website has been hit by Panda?

And if a site has already been hit, how can it recover from Panda?

Answer: You need to read, understand, and implement the Google Webmaster Guidelines, not just after a site is hit, but before it happens.

Implementation is the key.

Improve your site's content on a regular basis if possible.

Understanding the Google Webmaster Guidelines correctly is the success factor for implementation.

Answer from Matt Cutts, Google Software Engineer:

Published on Sep 11, 2013

Reprinted with the sole purpose of reminding webmasters and website owners to read the Google Webmaster Guidelines.

Do not pay for SEO; Google provides all the information you will ever need.

But if you do not have time to read and follow simple, common-sense instructions, do not blame Google if your site drops off the first page of Google's organic search results.
Some website owners keep looking for one single reason their site was “hit” by an algorithm update… ;-(

What counts as a high-quality site?

Our site quality algorithms are aimed at helping people find “high-quality” sites by reducing the rankings of low-quality content. The recent “Panda” change tackles the difficult task of algorithmically assessing website quality. Taking a step back, we wanted to explain some of the ideas and research that drive the development of our algorithms.

Below are some questions that one could use to assess the “quality” of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our take at encoding what we think our users want.

Of course, we aren’t disclosing the actual ranking signals used in our algorithms because we don’t want folks to game our search results; but if you want to step into Google’s mindset, the questions below provide some guidance on how we’ve been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?

Writing an algorithm to assess page or site quality is a much harder task, but we hope the questions above give some insight into how we try to write algorithms that distinguish higher-quality sites from lower-quality sites.

What you can do

We’ve been hearing from many of you that you want more guidance on what you can do to improve your rankings on Google, particularly if you think you’ve been impacted by the Panda update. We encourage you to keep questions like the ones above in mind as you focus on developing high-quality content rather than trying to optimize for any particular Google algorithm.

One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.

We’re continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search. As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture. This way your site will be more likely to rank well for the long-term. In the meantime, if you have feedback, please tell us through our Webmaster Forum. We continue to monitor threads on the forum and pass site info on to the search quality team as we work on future iterations of our ranking algorithms.

Some ideas on how to evaluate the quality of a site:

Have a question? Ask it in our Webmaster Help Forum: http://groups.google.com/a/googleprod…

Want your question to be answered on a video like this? Follow us on Twitter and look for an announcement when we take new questions: http://twitter.com/googlewmc

More videos: http://www.youtube.com/GoogleWebmaste…
Webmaster Central Blog: http://googlewebmastercentral.blogspo…
Webmaster Central: http://www.google.com/webmasters/


All About SEO on WordPress.com


SEO DOs and DON’Ts

DO:

  • Regularly publish original content.
  • Use a few precise categories and tags.
  • Write for human ears.
  • Build your traffic in smart, organic ways.
  • Choose simple, meaningful post slugs.
  • Create a descriptive tagline.
  • Include keywords selectively.

DON'T:

  • Start duplicate sites.
  • “Stuff” your site with irrelevant, broad categories, tags, or buzzwords.
  • Write with search engines in mind.
  • Purchase or exchange meaningless “backlinks.”
  • Buy into SEO fads.
  • Worry too much about SEO at the expense of writing good content!

We get a lot of questions about SEO here on WordPress.com, and no wonder — you work hard on your site and want to get the word out! SEO stands for Search Engine Optimization. SEO recommendations are intended to help your site rank higher and more accurately in search engines, like Google. Say you write a blog about sailboats. When someone Googles “sailboats,” how many pages of results do they have to scroll through before they see a link to your blog? The goal behind having good SEO is to increase your website’s SERP (Search Engine Results Page) ranking.


On the busy internet, it can be tough to make your “sailboat” stand out from all the others.

Ideally, you want your link to be on the first page of results. The best ways to accomplish this are:

  • consistently publish useful, original posts about sailboats; and
  • promote your blog in intelligent ways to people who are looking for information about your topic.

The more traffic your blog receives for sailboat-related searches, the higher it will climb in Google’s results. No mystery to that, right? But if you look around the internet, you’ll find dubious advice about how to increase your blog’s SERP ranking. Some of the suggestions you’ll find are just extra busywork, but some can actually end up hurting you with Google.

Common myths about SEO

Myth: I need a plugin for SEO.

Fact: WordPress.com has great SEO right out of the box — you don’t have to do anything extra. In fact, WordPress takes care of 80-90 percent of the mechanics of SEO for you, according to Matt Cutts, head of Google’s webspam team. All of our themes are optimized for search engines, which means they are designed to make it easy for the Googlebot (and other search engines) to crawl through them and discover all the content.

Myth: I need to regularly submit Sitemaps to Google so it knows I’m blogging regularly.

Fact: Every WordPress.com blog has an XML Sitemap. To view your Sitemap, type yourblogname.wordpress.com/sitemap.xml in your browser’s address bar. What you see there is code, so it’s not meant to be easily readable by us. For the Googlebot, however, it’s a “what’s hot” guide to the latest and greatest on your site. WordPress.com also automatically sends notifications to Google every time you publish or update a post or page. This is similar to how your subscribers get email updates. Every time you post, you’re telling Google, “Hey! Check this out.”
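For illustration, here is roughly what a single entry in an XML Sitemap looks like; the URL and dates below are made up, and on WordPress.com this file is generated for you automatically:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page or post on the site -->
  <url>
    <loc>http://yourblogname.wordpress.com/2013/03/22/example-post/</loc>
    <lastmod>2013-03-22</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

The `<loc>` element is the page address, and `<lastmod>` is what tells the Googlebot something changed since its last visit.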


Here’s how this blog’s XML Sitemap looks in Chrome.

Myth: The more tags and categories I use for a post, the better it is for Google.

Fact: Using a bunch of tags and categories that have little to do with your posts won’t increase your site’s visibility. Actually, Google doesn’t rely on tags or categories — it can tell what your post is about from its content (or it should be able to), as Matt Cutts explains here. Plus, any post on WordPress.com with too many categories and tags will be excluded from the Reader Topics pages. It’s best to use only a few, carefully selected categories and tags for each post — those that are most relevant to what the post is about. Likewise, avoid overly broad tags: “catamaran” is a better tag than “boat.”

Myth: Creating several identical sites about sailboats and making frequent use of sailboat-related terminology in my posts will help me get a lot of sailboat-related traffic.

Fact: Google frowns on duplicate content, and if you have multiple identical sites, your search ranking will suffer for it. Also, while it’s a good idea to use accurate keywords in your posts and post titles, going overboard with so-called “keyword stuffing” will hurt your SERP rank. Strive for clear, natural-sounding writing that reads like it was intended for human ears, not search engine crawlers.

Myth: One effective way to improve my blog’s SERP rank is to purchase or exchange links (sometimes known as “backlinks”) with as many bloggers as possible, so that there’s a lot of traffic going to my blog.

Fact: If you blog about sailboats, the more sailboat-focused sites and articles that organically link to your blog as a fantastic source of sailboat info, the better. On the other hand, Google won’t be impressed if it sees a ton of links to your sailboat blog from blogs about, say, marketing, basketry, lipstick, electronics, or SEO tactics.

Think of it this way: Google wants people to use its search engine as much as you want them to visit your website, so its goal is to return the most useful results for any given query. The more tactics bloggers come up with to fool Google into ranking their sites higher than they deserve to be, the more Google corrects its search algorithms to screen out such bad behavior.

Paying for backlinks is a case in point: in April 2012, Google introduced its controversial Penguin algorithm that improved screening for this bad practice, and many bloggers with excessive backlinks found that their SERP rank plummeted. The moral of this story is that while SEO fads might bump your site artificially for a bit, in the long run, they won’t work.

Myth: SEO requires a strategy and possibly an expert…

For more read original post: http://en.blog.wordpress.com/2013/03/22/seo-on-wordpress-com/

For more information about social media networking, how to add a podcast to your website, SEO tips and tricks, social media good practices, online tools, and how to market your site, visit the New York Web Designer Agency Website.

Duplicate Content – SEO Best Practices for E-commerce


How to fix duplicate content on multiple online e-commerce sites?

What is duplicate content?

Content can be considered duplicate if it is repeated across multiple TLDs, or within the same domain and its subdomains.
Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin. Examples of non-malicious duplicate content could include:
• Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
• Store items shown or linked via multiple distinct URLs
• Printer-only versions of web pages
Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.

Many sites have a press release section, or a news section that re-posts relevant articles. Since it’s all duplicate content, would they be better off removing these sections (even if they have plenty of other unique content)?

What content can be considered duplicate?

• visible text on the page,
• images and their “alt” and “title” attributes,
• links and their “title” attributes,
• the meta “title” tag – most important (source: Google Webmaster Tools),
• the meta “description” tag,
• the meta “keyword” tag (some search engines might ignore it),
• the same video and links to the video sources,
• even parameter-driven URLs can be seen as a form of duplication by search engines (source: http://www.webmasterworld.com/google/4258404.htm),
• and maybe more – add yours in the comments.
Again, if most of the above is repeated exactly the same across multiple pages and/or multiple TLDs, then it might be a problem.
I hope I am correct, or very close to correct, about these duplicate content definitions. 😉
I looked at Google Webmaster Tools (the best reference for online businesses) and found a very nice post.

When do we need to start worrying?

… Google tries to filter out duplicate documents so that users experience less redundancy. You can find details in this blog post, which states:
1. When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster.
2. We select what we think is the “best” URL to represent the cluster in search results.
3. We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL.
Here’s how this could affect you as a webmaster:

It is clear that authentic online businesses that sell similar products on a couple (or several) of their own websites should not worry much; instead, they need to concentrate on and commit to making their content as unique as possible.
Unless, of course, they are spamming and using “black hat SEO” techniques.
(In case the YouTube iframe does not work, use this link: http://www.youtube.com/playlist?list=PL6720C94A83F0F740)

How to fix duplicate content on multiple online e-commerce sites?

1) Some CMSes, like Mambo, osCommerce, and Zen Cart, are well known for not being search friendly. But it depends on the context of the page and what you are trying to achieve.
How can it be fixed? Depending on the context of the page, you may wish to try implementing the canonical tag.
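For illustration, the canonical tag is a single `<link>` element placed in the `<head>` of the duplicate page; the URL below is a made-up example:

```html
<!-- Hypothetical example: tells search engines which URL is the
     preferred ("canonical") version of this product page -->
<link rel="canonical" href="http://www.example.com/gutters/stainless-steel-gutter" />
```

With this in place, search engines are asked to consolidate ranking signals from the duplicate URLs onto the one canonical URL.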
2) Should we add a “noindex, nofollow” meta tag?
No, you should not.
It would prevent Google and other search engines from indexing a couple of your online shopping carts.
If your long-term goal is to hide an online shopping cart from search engines, a much faster and better way exists:
• Remove content from my own site from Google’s search results
• Remove content from another site from Google’s search results
But make sure you read this first: URL removal explained, Part I: URLs & directories
Using the tools for other purposes may cause problems for your site, so read this: When NOT to use the URL removal tool.

You can later send a “resubmit” request, but what is the logic in that if you would like to sell from all of your shopping carts?

Update your content on static pages (About, testimonials, and so on) and on dynamic category and product pages, and make it more or less unique.

Of course, it is hard to make policies, terms and conditions, or help pages unique.
Yes, it is possible to have some unique content even if you and your competitors sell the same products from the same suppliers on hundreds of sites.
• Be creative; it is not the easy way!
• Look at Google or other keyword tools and try to see how people search: what web search queries they type into the search box.
Search for the same “product names” as your competitors. I guarantee that you will be very surprised by the huge (or nonexistent) amount of variation in the “Local”, “Global”, and “Competition” results.
• Look at the “Local” and “Global” results depending on your geographic and demographic targets.
• “Competition” matters mainly if you use pay per click, but it is also important otherwise.
• So, compare the results and try to see the pattern that makes your title or product name unique yet popular in search results:

manufacturer name;
type of product;
size (very often users know what size they are looking for).

Try to “geo and demographic” your content if possible. Find something unique to highlight in your product that people might be looking for.

Compare results for (lower or upper case does not matter in this example):
• Plywood,
• 2 inch Plywood,
• Waterproof 2 inch Plywood,
• Waterproof 2 inch plywood in New York, or as close as possible. It all depends on where you can ship and how much it will cost to ship, or whether you offer “local pick up in store only”.
• Stainless steel gutter
• Made in USA
• Locally Made
• Factory direct
• Free Shipping
• Cheap “something” – nothing to be ashamed of. Your competitor might sell more because of this term.

Choose a correct linking structure and navigation.

Is there such a thing as building too many links if you’re following Google’s webmaster guidelines exactly? So many that you would get banned, even though you’re following the rules?

Look closely at how you link to your products, and use unique, descriptive, short (2–5 word) title attributes where possible in the HTML for links and images.
Test it by mousing over links in Firefox or Chrome to see that your links and images have title attributes.
Use clean URLs if possible, with clear category and product names.
Try to use the name of the category in a short, searchable way in your URL.
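As a made-up illustration of a descriptive title attribute and a clean, category-based URL (the domain, path, and product are hypothetical):

```html
<!-- Product link: short, descriptive title attribute and a clean URL
     that contains the category ("gutters") and the product name -->
<a href="http://www.example.com/gutters/stainless-steel-gutter"
   title="Stainless steel gutter">Stainless Steel Gutter</a>

<!-- Matching image with descriptive alt and title attributes -->
<img src="/images/stainless-steel-gutter.jpg"
     alt="Stainless steel gutter" title="Stainless steel gutter" />
```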

Is there such a thing as building too many links?

It all depends on what you do and how you do it.

If you are a spammer, good luck with that.
If you are a business owner, then proper use of social media sites helps you build good backlinks (read the Google webmaster guidelines).
People tend to blame whoever gives the advice instead of using their imagination during the implementation process.
For example, using Twitter is helpful, but done incorrectly (with no imagination, just posting 5–15 links to the same site per day, mentioning people who will never follow back, or using an account in “bad standing”), it will not work.
You need to talk to people, make your tweets fun to read, and in between link to something on your site that you want them to pay attention to.
Or this, from two years ago:

As you might see, that “rule” does not change.
Provide relevant content, try to be different while selling the same products, and use social media properly; that will help.
Very good ideas, techniques, or structures can be damaged at implementation time.
If you find this post informative and useful, leave a comment and I will continue. Some might find this post irrelevant or not useful; if so, please leave your suggestions in the comment area too.
About Author of this post:
Personal page: http://www.youneeditall.com
Twitter: @infotechusa – Web Design & SEO
LinkedIn: Nikolay Gul – Web Design, Web Development in Syracuse New York


Government Mandated Monopolies

By Alla Gul (MBA) – Our Contributor

Why are drug companies such as Pfizer allowed to establish and maintain monopolies (patents) on drugs – a form of entry barrier?

Conventional wisdom might suggest that monopoly is generally bad for consumers because of the absence of competition.
A) What type of barrier is this?
B) Do you agree that drug companies should have this government mandated monopoly?  Why?

In economics, a monopoly (from Greek monos / μονος (alone or single) + polein / πωλειν (to sell)) exists when a specific individual or an enterprise has sufficient control over a particular product or service to determine significantly the terms on which other individuals shall have access to it. (This is in contrast to a monopsony, which relates to a single entity’s control over a market to purchase a good or service, and to an oligopoly, where a few entities control a market.) Monopolies are thus characterized by a lack of economic competition for the good or service that they provide and a lack of viable substitute goods. The verb “monopolize” refers to the process by which a firm gains persistently greater market share than what is expected under perfect competition.

A) Monopoly is the situation in which there is a single seller of a product for which there are no close substitutes (Mankiw, 2004, p.314). A monopoly remains the only seller in the market because other firms cannot enter the market and compete with it. This might happen for the following reasons (“Monopoly: A Brief Introduction”, n.d.). First, a single firm owns a key resource.

Second, the government gives a single firm the exclusive right to produce some goods or services. Finally, the costs of production make a single producer more efficient than a large number of producers. As a result, all of the above create barriers to entry, causing monopolies to arise. “The fundamental cause of monopoly is barriers to entry” (Mankiw, 2004, p.314). Although government-mandated monopolies have had some negative effects on the economy, the government grants them because doing so is viewed to be in the public interest.
When a pharmaceutical company discovers a new drug, it can apply to the government for a patent. If the government approves the patent, the company has an exclusive right to manufacture and sell the drug for 20 years. The drug cannot be copied due to the protection of a patent (“Monopoly: A Brief Introduction”, n.d.). Many drug companies have been allowed to establish and maintain monopolies (patents) on drugs. These government-mandated monopolies have created obstacles for other pharmaceutical companies to enter the market and compete. For example, Pfizer has patents on many drugs including Quinapril, Atorvastatin, and Sildenafil. Until these patents expire, no other company is allowed to produce the same drugs. This gives a company strong monopoly power, allowing it to set higher prices and lower levels of production than under competition, which is considered harmful to the economy.

However, monopolists argue that granting patents is in the public interest because it allows them to spend more money on research and development in order to develop new and improved products. “It has long been recognized that government-granted monopolies (i.e., patents, copyrights, trademarks and franchises) can benefit society as a whole by providing financial incentives to inventors, artists, composers, writers, entrepreneurs and others to innovate and produce creative works” (“Monopoly: A Brief Introduction”, n.d.). In fact, the importance of establishing monopolies of limited duration for this purpose is even mentioned in Article I, Section 8 of the U.S. Constitution, which states that “The Congress shall have Power . . . To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries” (“Monopoly: A Brief Introduction”, n.d.).
Thus, “The government grants the monopoly because doing so is viewed to be in public interests” because “granting patents for discovered drugs encourages research and development” (Mankiw, 2004, p.316).

To summarize, the laws governing patent have benefits and costs. “The benefits of the patent and copyright laws are the increased incentive for creative activity. These benefits are offset, to some extent, by the costs of monopoly pricing” (Mankiw, 2004, p.316).

B) There are intensive discussions on whether drug companies should have this government-mandated monopoly. Supporters of such monopolies argue that even if patent laws do impose costs in the form of higher prices and lower availability for consumers, under patent laws more innovation will occur, which is beneficial for society as a whole (“Patent Laws and the War on Good Drugs”, 2001). On the other hand, opponents state that the law encourages drug monopolies to create artificial scarcity of some drugs in order to charge a higher price for their products (Boldrin & Levine, Chapter 4). A further argument states that the law deprives the poor of affordable drugs and blocks the rights of developing nations under TRIPS (Agreement on Trade-Related Aspects of Intellectual Property Rights).

Despite the clear need for developing countries to exercise their rights to compulsory licensing and parallel imports to enable their people to have access to affordable medicines, a major and perhaps the most disturbing aspect of the crisis of patents and drugs is that obstacles have been and are being put in the way of developing countries seeking to make use of TRIPS provisions on compulsory licensing or parallel imports in order to buy or produce drugs at more affordable prices. (“Patents and monopoly prices”).

To continue, they argue that the high prices cannot be justified by large expenses on Research and Development (R&D), since often most of the profits go to cover marketing expenses rather than R&D: “Pfizer says this pricing is necessary to fund new drug research, but 35 percent of its profits drain into marketing and only 15 percent support R&D, according to the Securities and Exchange Commission in 2002…” (“Gov’t should use power to make drugs affordable”). Opponents also state that due to the patent law, the pharmaceutical companies are getting less efficient.


… Another major problem with pharmaceuticals today: The pharmaceutical companies are getting less efficient. They are increasingly turning out drugs that are less important to public health because they’re not as profitable. For example, roughly 70% of new FDA approved drugs are copycats or “me too” drugs which are small variations on existing drugs, usually done to reduce R&D costs and extend the patent life of an existing drug. (“Prescription Drugs”).

Finally, opponents conclude that “Patent protection is the most effective tool for drug manufacturers to keep out competition from generic producers and thus maintain monopoly control over the production, marketing and pricing of medicines” (“Patents and monopoly prices”, Third World Network). They state that “The net loss to society from this policy is real and enormous” (Boldrin & Levine, Against Intellectual Monopoly, Chapter 4).

I am inclined to support those opposing the law. However, I also understand that the protection of intellectual property rights is important, and that its violation “not only harms those innovators, such as the drug companies, who would be directly affected, it also does great damage to innovative activity, and indeed all types of capital” (“Patent Laws and the War on Good Drugs”, 2001). I do not think I am ready to take one side or the other at this point, since this is a complex issue and I do not want to jump to a conclusion ahead of time. I would like to investigate it more thoroughly.


Boldrin & Levine: Against Intellectual Monopoly, Chapter 4: The Evil of Intellectual Monopoly. Retrieved on October 21, 2007 from http://www.micheleboldrin.com/research/aim/anew04.pdf

“Gov’t should use power to make drugs affordable” (October 17, 2007).

Mankiw, G. (2004). Principles of Economics. Mason, OH: Thomson South-Western.

“Monopoly: A Brief Introduction”. Retrieved on October 20, 2007.

Morgan Rose (2001). “Patent Laws and the War on Good Drugs”. Retrieved on October 21, 2007 from http://www.econlib.org/library/Columns/Teachers/patent.html

“Novartis lawsuit threatens access to medicines for millions” (January 20, 2007). Retrieved on October 19, 2007 from http://www.oxfam.org.uk/applications/blogs/pressoffice/2007/01/novartis_lawsuit_threatens_acc.html

Oxfam Press Release (12 December 2006): “India, Thailand and Philippines must face down conflicts to guarantee affordable medicines”. Retrieved on October 21, 2007 from http://www.oxfam.org/en/news/pressreleases2006/pr061212_affordable_medicines6

“Patents and monopoly prices”: Third World Network. Retrieved on October 19, 2007 from http://www.twnside.org.sg/title/twr131b.htm

“Pfizer, Novartis flayed for blocking new drugs to poor nations”. Retrieved on October 21, 2007 from http://www.dancewithshadows.com/pharma2/pfizer-novartis.asp

“Prescription Drugs” (April 2006). Retrieved on October 20, 2007 from http://www.kucinichforcongress.com/issues/prescriptiondrugs.php

B2B Social Media Websites Integrations

External Blog RSS URL: https://infotechusa.wordpress.com
Twitter Account URL: http://twitter.com/infotechusa
Facebook Account URL: http://facebook.com/infotechusa
LinkedIn Account URL: http://www.linkedin.com/in/webdesignerny
YouTube: http://www.youtube.com/1webmaker
For more information about social media networking, SEO tips and tricks, social media good practices, online tools, and how to market your site, or if you are looking for a web site design and development agency, visit the New York Web Design Agency Website.

What is Google PageRank?

Google PageRank and Link Analysis

PageRank is a link analysis algorithm used by the Google Internet search engine that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of “measuring” its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is also called the PageRank of E and denoted by PR(E).
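The idea above can be sketched in a few lines of Python. This is a simplified illustration of the published PageRank formulation, not Google's actual implementation; the page names and the damping factor of 0.85 are assumptions for the example:

```python
def pagerank(links, d=0.85, iterations=50):
    """Simplified PageRank by power iteration.

    'links' maps each page to the list of pages it links to;
    'd' is the usual damping factor. Assumes every page has
    at least one outgoing link.
    """
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start with equal rank for every page
    for _ in range(iterations):
        new_pr = {}
        for p in pages:
            # A page's rank is a base share plus contributions from the
            # pages linking to it, each contribution divided by the number
            # of outgoing links on the linking page (a split "vote").
            incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new_pr[p] = (1 - d) / n + d * incoming
        pr = new_pr
    return pr

# Tiny made-up web: B and C both link to A, while A links to B and C.
ranks = pagerank({"A": ["B", "C"], "B": ["A"], "C": ["A"]})
```

In this toy graph, page A receives two "votes" and ends up with the highest rank, while B and C split A's single vote equally.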

Link Analysis and Google PageRank

No single element is more important to a Web page’s Google ranking than the perceived quality of the links that point to the page — the so-called back links. It is crucial to understand that Google cares less about the quantity of the back links than about the quality of each individual link. The link quality is determined by reviewing the importance of the site that contains the link.
Google does not disclose the specifics of its ranking algorithm to the public. However, the Google Web site states the following:

“PageRank relies on the uniquely democratic nature of the Web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves ‘important’ weigh more heavily and help to make other pages ‘important.’”

Thus: as you get more quality links pointing to your site, your PageRank will increase.
The bottom line is that effective link building is critical to gaining an opportune Google ranking. Convincing a number of “important,” topic-similar Web sites to link to yours ultimately will prove more useful than any other technique to optimize your site. The tricky part, of course, is to figure out which sites you would like to link to yours and, more important, how to convince their owners to do so.
Effective link building requires patience and persistence. Lots of it. The first step in the link building process is to find out which links are currently pointing to your site. To do so:

  • Go to the Google Web site.
  • Type in “link:” followed by your Web site’s URL.
  • Click “Google Search.”
  • Google will list your Web site’s back links.
If you are using the Google browser toolbar, you can check the back links by pointing your browser to your Web site; then select the “Backward Links” option from the site information drop-down menu.
You should then visit and review the content and PageRank of some of the sites that point to yours. That will enable you to determine whether or not the current back links are beneficial to your site’s Google ranking.
The next step in the link building process is to build more of them…

To learn more about linking strategy and how to improve your PageRank, visit the Link Analysis and Google PageRank web page.