1) Check for canonical domain issues: Do the www and non-www versions of the homepage resolve to the same url? If not, look at www.mattcutts.com/blog/seo-advice-url-canonicalization/ for an understanding of how duplicate content should resolve to a single page. If it's a PHP site on Apache, use mod_rewrite; if ASP on IIS, use ISAPI_Rewrite. If you're giving an SEO presentation, talk about permanent (say 301 to sound fancy) redirects. We need server access to fix this; see the sketch below.
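For the Apache/mod_rewrite case, a minimal .htaccess sketch might look like this (assuming www is the preferred version; example.com is a placeholder, and the pattern flips if non-www is canonical):

    # Permanently (301) redirect non-www requests to the www version
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]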
2) Dynamic-looking urls: Closely related to the url rewriting above, the site should not pass parameters in the url. For example, mysite.com/?page_id=34&section=2&session_id=123abc is no good. Question marks and equals signs in the url mean that parameters are being passed. Instead, the url should look like mysite.com/large_blue_widgets, which is more descriptive from a user standpoint, plus helps robots index more easily. That is a static-looking url, but it could still be dynamically generated, as the sketch below shows.
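On Apache, one way to get that static-looking url is another mod_rewrite rule that quietly maps the clean path onto the real dynamic script (the page name and parameter here are hypothetical; adjust them to the actual CMS):

    # Internally rewrite the clean url to the dynamic script.
    # No [R] flag, so the visitor never sees the parameterized url.
    RewriteEngine On
    RewriteRule ^large_blue_widgets/?$ /index.php?page_id=34 [L]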
3) Look for text versus images/flash across the site: Do a control-A (select all) on the homepage. Is it just a gigantic image or a big Flash movie? Search engines cannot read images or Flash; they need text. Is the navigation built from images instead of text? How much text is on the homepage? There should be at least a few sentences about what the business does. When you right-click on images and view properties, is the alt text blank instead of filled with a few good keywords? Is the phone number in text, and large, in the upper corner?
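As a sketch of what you want to find in the markup, text links and descriptive alt text read like this (the keywords are illustrative):

    <!-- A text link a robot can read, instead of image-only navigation -->
    <a href="/large_blue_widgets">Large Blue Widgets</a>

    <!-- Alt text with a few good keywords instead of a blank attribute -->
    <img src="/images/blue-widget.jpg" alt="Large blue widget for industrial piping">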
4) Unique page titles: Starting from the homepage, check that page titles are just a few words long (65 characters or less, if possible) and begin with the most important terms. The important terms are the ones we deem both relevant and high in search volume. Do not start the page title with the name of the site unless it's a strong brand. Make sure each page title reflects what that particular page is about.
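For example, a title tag written this way might look like the following (the product term and brand name are placeholders):

    <!-- Most important term first; brand name last, if included at all -->
    <title>PeopleSoft Testing Software - Acme Corp</title>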
5) Sitemaps: There should be a link to "sitemap" on the home page, and that sitemap should have links with anchor text (the clickable words of the link) that are key search terms. For example, "our products" is terrible, while "peoplesoft testing software" is relevant. You should also be able to grab mydomain.com/sitemap.xml. See more at http://www.sitemaps.org/; you can then use Google Webmaster Central to tell Google what pages you want crawled, plus check for errors.
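A bare-bones sitemap.xml following the sitemaps.org protocol looks like this (the url and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want crawled -->
      <url>
        <loc>http://www.mydomain.com/large_blue_widgets</loc>
        <lastmod>2008-06-01</lastmod>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>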
6) Run PPC competitor reports: Go to Spyfu.com and do a few searches. Put in a few keywords that you think are important to the site, plus the site url itself. If you're looking at a keyword, it will come back with who is buying it on PPC, what it costs, how many clicks are available, and what terms are related. If you're looking at a url, it will tell you what terms (if any) they are buying on Google, what terms they rank on naturally, and who the related competitors are. Remember that this covers just Google; this company specializes in writing scrapers, robots that run millions of searches and record the results to see who comes up.
7) Source code: Do a view > page source from your browser, starting from the home page. Do a find (control-F) for H1 and see if any come up. If so, are they using keywords that they want folks to find them on? H1 tags carry more weight than regular text, but are not as strong as the page title. Do you see frames on the site (look for "frame")? Perhaps 1 out of 15 sites you see will be just one page with the rest of the content inside a frame, so search engines see only 1 page. At the top of the page, do you see a meta description with good keywords woven into a sentence? The meta description should mimic ad copy you'd have in AdWords.
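Put together, the head and heading markup you're hoping to find looks roughly like this (the keywords are illustrative):

    <head>
      <!-- Meta description reads like AdWords ad copy with keywords woven in -->
      <meta name="description" content="PeopleSoft testing software that automates QA and cuts testing time. Free demo.">
    </head>
    <body>
      <!-- H1 carries more weight than body text, less than the page title -->
      <h1>PeopleSoft Testing Software</h1>
    </body>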
8) Analytics code: While still in that view of the code (mentioned in step 7), find instances of "google", "analytics", or "urchin". You may also see other tracking software toward the bottom of the body. The majority of sites will have Google Analytics installed; if they are small businesses, then we recommend that as a cheap, effective solution. Large sites may include Omniture SiteCatalyst, ClickTracks, or WebSideStory, all easy enough to spot.
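What you're scanning for is something like the classic urchin.js snippet below, usually just before the closing body tag (the account number is a placeholder; newer installs may use a different loader):

    <script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
    <script type="text/javascript">
    _uacct = "UA-XXXXXXX-1";  // the site's Google Analytics account number
    urchinTracker();
    </script>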
9) Validation: Go to validator.w3.org and put in the url. Like the other tools, it will usually return a few dozen errors with explanations. Most often you'll see tons of missing-attribute errors, which can mostly be ignored.
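A typical missing-attribute complaint, and what the fix looks like, is shown below (a missing alt attribute is one of the most common; the image name is made up):

    <!-- Validator error: required attribute "alt" not specified -->
    <img src="logo.gif">

    <!-- The fix -->
    <img src="logo.gif" alt="Acme Corp logo">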
10) seoquake.com: Another great Firefox browser plug-in, this one showing Google PR, pages indexed by Google, pages indexed by Yahoo, backlinks (how many sites link to you), age of the site, and Alexa rank (popularity). As you browse sites, it's handy to see these stats. Glaring problems: fewer than a few dozen pages in the Google index, fewer than a few dozen backlinks, a Google PR of n/a on many pages (especially bad if true on the homepage), or an Alexa rank greater than 500,000 (lower is better), which means almost no traffic.