Heading tags (not to be confused with the HTML <head> tag or HTTP headers) are used to present structure on the page to users. There are six sizes of heading tags, beginning with <h1>, the most important, and ending with <h6>, the least important (1).
Since heading tags typically make text contained in them larger than
normal text on the page, this is a visual cue to users that this text
is important and could help them understand something about
the type of content underneath the heading text. Multiple heading
sizes used in order create a hierarchical structure for your content,
making it easier for users to navigate through your document.
<h1>Brandon's Baseball Cards</h1>
<h2>News - Treasure Trove of Baseball Cards Found in Old Barn</h2>
<p>A man who recently purchased a farm house was pleasantly surprised ... dollars worth of vintage baseball cards in the barn. The cards were ... in newspapers and were thought to be in near-mint condition. After ... the cards to his grandson instead of selling them.</p>
(1) On a page containing a news story, we might put the name of our site into an <h1> tag and the topic of the story into an <h2> tag.
Heading tags are an important
website component for
catching the user's eye, so be
careful how you use them!
Best Practices
Imagine you're writing an outline
Similar to writing an outline for a large paper, put some thought into what the main points and sub-points of the content on the page will be and decide where to use heading tags appropriately; a short sketch follows the list below.
Avoid:
placing text in heading tags that wouldn't be helpful in defining the structure of the page
using heading tags where other tags like <em> and <strong> may be more appropriate
erratically moving from one heading tag size to another
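As a rough sketch of that outline idea (the section names below are hypothetical, reusing the guide's baseball-card example):

<!-- one <h1> for the page's main topic, <h2>/<h3> for main points and sub-points -->
<h1>Brandon's Baseball Cards</h1>
  <h2>News</h2>
    <h3>Treasure Trove of Baseball Cards Found in Old Barn</h3>
  <h2>Card Prices</h2>
    <h3>Vintage cards</h3>
    <h3>Modern cards</h3>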
Use headings sparingly across the page
Use heading tags where it makes sense. Too many heading tags on a page can make it hard for users
to scan the content and determine where one topic ends and another begins.
Avoid:
excessively using heading tags throughout the page
putting all of the page's text into a heading tag
using heading tags only for styling text and not presenting structure
Glossary
HTTP headers
In HTTP (HyperText Transfer Protocol), different types of data that are sent off before
the actual data itself.
<em>
An HTML tag denoting emphasis. According to the standard, it indicates emphasis through the use of italics.
<strong>
An HTML tag denoting strong emphasis. According to the standard, it indicates emphasis through the use of bold print.
Wildcard
A character (*) that takes the place of any other character or string of characters.
.htaccess
Hypertext access file, a file that allows you to manage web server configuration.
Referrer log
Referrer information that is written into the access log. When it is traced, one can find
out from which sites visitors arrived.
Dealing with Crawlers

Make effective use of robots.txt

Restrict crawling where it's not needed with robots.txt
A "robots.txt" file tells search engines whether they can access
and therefore crawl parts of your site (1). This file, which must be
named "robots.txt", is placed in the root directory of your site (2).
There are a handful of other ways to prevent content appearing in
search results, such as adding "NOINDEX" to your robots meta tag,
using .htaccess to password protect directories, and using Google
Webmaster Tools to remove content that has already been crawled.
Google engineer Matt Cutts walks through the caveats of each URL
blocking method in a helpful video.
User-agent: *
Disallow: /images/
Disallow: /search
(1) All compliant search engine bots (denoted by the wildcard * symbol) shouldn't access and crawl the content under /images/ or any URL whose path begins with /search.
(2) The address of our robots.txt file.
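For reference, the "NOINDEX" robots meta tag mentioned above is a single line placed in the <head> of the page you want kept out of search results; a minimal sketch:

<!-- inside the <head> of the page that shouldn't be indexed -->
<meta name="robots" content="noindex">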
Keep a firm grasp on
managing exactly what
information you do and don't
want being crawled!
Best Practices

Use more secure methods for sensitive content

You shouldn't feel comfortable using robots.txt to block sensitive or confidential material. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen. Encrypting the content or password-protecting it with .htaccess are more secure alternatives (a sketch follows the list below).
Avoid:
allowing search result-like pages to be crawled
- users dislike leaving one search result page and landing on another search result page that doesn't
add significant value for them
allowing URLs created as a result of proxy services to be crawled
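As a hedged sketch of the .htaccess password protection mentioned above (the realm name and the .htpasswd path are hypothetical, and the .htpasswd file must be created separately, for example with Apache's htpasswd utility):

# .htaccess in the directory to protect: require a valid login before anything is served
AuthType Basic
AuthName "Members only"
AuthUserFile /home/brandon/.htpasswd
Require valid-user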
Links

robots.txt generator
http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-ofrobots.html
Using robots.txt files
http://www.google.com/support/webmasters/bin/answer.py?answer=156449
Caveats of each URL blocking method
http://googlewebmastercentral.blogspot.com/2008/01/remove-your-contentfrom-google.html
Glossary
Robots Exclusion Standard
A convention to prevent cooperating web spiders/crawlers, such as Googlebot, from
accessing all or part of a website which is otherwise publicly viewable.
Proxy service
A computer that substitutes the connection in cases where an internal network and
external network are connecting, or software that possesses a function for this
purpose.
Dealing with Crawlers
Be aware of rel="nofollow" for links
Combat comment spam with "nofollow"
Setting the value of the "rel" attribute of a link to "nofollow" will
tell Google that certain links on your site shouldn't be followed
or pass your page's reputation to the pages linked to.
Nofollowing a link is adding rel="nofollow" inside of the link's anchor
tag (1).
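For illustration, a nofollowed link looks like this (the URL and anchor text are placeholders):

<a href="http://www.example.com/" rel="nofollow">Anchor text here</a>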
When would this be useful? If your site has a blog with public
commenting turned on, links within those comments could pass your
reputation to pages that you may not be comfortable vouching for.
Blog comment areas on pages are highly susceptible to comment
spam (2). Nofollowing these user-added links ensures that you're not
giving your page's hard-earned reputation to a spammy site.
(1) If you or your site's users link to a site that you don't trust and/or you don't want
to pass your site's reputation, use nofollow.
(2) A comment spammer leaves a message on one of our blog posts, hoping to
get some of our site's reputation.
Automatically add "nofollow" to comment
columns and message boards
Many blogging software packages automatically nofollow user
comments, but those that don't can most likely be manually edited to
do this. This advice also goes for other areas of your site that may
involve user-generated content, such as guestbooks, forums, shout boards, referrer listings, etc. If you're willing to vouch for links added
by third parties (e.g. if a commenter is trusted on your site), then
there's no need to use nofollow on links; however, linking to sites
that Google considers spammy can affect the reputation of your
own site. The Webmaster Help Center has more tips on avoiding
comment spam, like using CAPTCHAs and turning on comment
moderation (3).
(3) An example of a CAPTCHA used on Google's blog service, Blogger. It can
present a challenge to try to ensure an actual person is leaving the comment.
Glossary
Comment spamming
Refers to indiscriminate postings, on blog comment columns or message boards, of
advertisements, etc. that bear no connection to the contents of said pages.
CAPTCHA
Completely Automated Public Turing test to tell Computers and Humans Apart.
About using "nofollow" for individual
contents, whole pages, etc.
(4) This nofollows all of the links on a page.
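As a sketch of what this looks like, the tag sits in the page's <head> (the title reuses the guide's running example):

<html>
<head>
<title>Brandon's Baseball Cards - Buy Cards, Baseball News, Card Prices</title>
<meta name="robots" content="nofollow">
</head>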
Lastly, if you're interested in nofollowing all of the links on a page, you can use "nofollow" in your robots meta tag, which is placed inside the <head> tag of that page's HTML (4). The Webmaster Central Blog provides a helpful post on using the robots meta tag. This method is written as <meta name="robots" content="nofollow">.
Make sure you have solid
measures in place to deal
with comment spam!
Links
Avoiding comment spam
http://www.google.com/support/webmasters/bin/answer.py?answer=81749
Using the robots meta tag
http://googlewebmastercentral.blogspot.com/2007/03/using-robots-meta-tag.html
SEO for Mobile Phones
Notify Google of mobile sites
Configure mobile sites so that they can be
indexed accurately
It seems the world is going mobile, with many people using mobile
phones on a daily basis, and a large user base searching on Google’s
mobile search page. However, as a webmaster, running a mobile site
and tapping into the mobile search audience isn't easy. Mobile sites
not only use a different format from normal desktop sites, but
the management methods and expertise required are also quite
different. This results in a variety of new challenges. While many
mobile sites were designed with mobile viewing in mind, they weren’t
designed to be search friendly.
Here are troubleshooting tips to help ensure that your site is properly
crawled and indexed:
(1) Example of a search for [baseball cards] on Google’s
desktop search (above) and mobile search (left). Mobile
search results are built for mobile devices and are
different from "standard" desktop results.
Verify that your mobile site is indexed by
Google
If your web site doesn't show up in the results of a Google mobile
search even using the site: operator, it may be that your site has one
or both of the following issues:
1. Googlebot may not be able to find your site
Googlebot must crawl your site before it can be included in our search
index. If you just created the site, we may not yet be aware of it. If
that's the case, create a Mobile Sitemap and submit it to Google to
inform us of the site’s existence. A Mobile Sitemap can be submitted
using Google Webmaster Tools, just like a standard Sitemap.
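A Mobile Sitemap follows the standard Sitemap format, with a mobile namespace and an empty <mobile:mobile/> tag marking each mobile URL; a minimal sketch (the URL is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://mobile.example.com/article100.html</loc>
    <mobile:mobile/>
  </url>
</urlset>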
Make sure your mobile site is
properly recognized by Google
so that searchers can find it.
Glossary
Mobile Sitemap
An XML Sitemap that contains URLs of web pages designed for mobile phones.
Submitting the URLs of mobile phone web content to Google notifies us of the
existence of those pages and allows us to crawl them.
User-agent
Software and hardware utilized by the user when said user is accessing a website.
XHTML Mobile
XHTML, a markup language redefined via adaptation of HTML to XML, and then
expanded for use with mobile phones.
Compact HTML
Markup language resembling HTML; it is used when creating web pages that can be
displayed on mobile phones and with PHS and PDA.
SetEnvIf User-Agent "Googlebot-Mobile" allow_ua
SetEnvIf User-Agent "Android" allow_ua
SetEnvIf User-Agent "BlackBerry" allow_ua
SetEnvIf User-Agent "iPhone" allow_ua
SetEnvIf User-Agent "NetFront" allow_ua
SetEnvIf User-Agent "Symbian OS" allow_ua
SetEnvIf User-Agent "Windows Phone" allow_ua
Order deny,allow
deny from all
allow from env=allow_ua
(2) An example of a mobile site restricting any access from non-mobile devices.
Please remember to allow access from user agents including “Googlebot-Mobile”.
(3) An example of DTD for mobile devices.
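For instance, an XHTML Mobile Profile page typically declares a DTD along these lines (shown as an illustration; Compact HTML pages use their own declaration):

<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
  "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">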
Verify that Google can recognize your mobile URLs
Once Googlebot-Mobile crawls your URLs, we then check for whether
each URL is viewable on a mobile device. Pages we determine
aren't viewable on a mobile phone won't be included in our
mobile site index (although they may be included in the regular web
index). This determination is based on a variety of factors, one of
which is the "DTD (Doc Type Definition)" declaration. Check that your
mobile-friendly URLs' DTD declaration is in an appropriate mobile
format such as XHTML Mobile or Compact HTML (3). If it's in a
compatible format, the page is eligible for the mobile search index.
For more information, see the Mobile Webmaster Guidelines.
2. Googlebot may not be able to access your site
Some mobile sites refuse access to anything but mobile phones,
making it impossible for Googlebot to access the site, and therefore
making the site unsearchable. Our crawler for mobile sites is
"Googlebot-Mobile". If you'd like your site crawled, please allow
any User-agent including "Googlebot-Mobile" to access your
site (2). You should also be aware that Google may change its User-agent information at any time without notice, so we don't recommend checking whether the User-agent exactly matches "Googlebot-Mobile" (the current User-agent). Instead, check whether the User-agent header contains the string "Googlebot-Mobile". You can also
use DNS Lookups to verify Googlebot.
Links
Google’s mobile search page
http://www.google.com/m/
site: operator
http://www.google.com/support/webmasters/bin/answer.py?answer=35256
Mobile Sitemap
http://www.google.com/support/webmasters/bin/topic.py?topic=8493
Submitted using Google Webmaster Tools
http://www.google.com/support/webmasters/bin/answer.py?answer=156184
Use DNS Lookups to verify Googlebot
http://googlewebmastercentral.blogspot.com/2006/09/how-to-verify-googlebot.html
Mobile Webmaster Guidelines
http://www.google.com/support/webmasters/bin/answer.py?answer=72462
SEO for Mobile Phones
Guide mobile users accurately
Running desktop and mobile versions of
your site
One of the most common problems for webmasters who run
both mobile and desktop versions of a site is that the mobile
version of the site appears for users on a desktop computer, or
that the desktop version of the site appears when someone
accesses it on a mobile device. In dealing with this scenario, here
are two viable options:
Redirect mobile users to the correct
version
When a mobile user or crawler (like Googlebot-Mobile) accesses the
desktop version of a URL, you can redirect them to the corresponding
mobile version of the same page. Google notices the relationship
between the two versions of the URL and displays the standard
version for searches from desktops and the mobile version for
mobile searches.
If you redirect users, please make sure that the content on the
corresponding mobile/desktop URL matches as closely as possible
(1). For example, if you run a shopping site and there's an access from
a mobile phone to a desktop-version URL, make sure that the user
is redirected to the mobile version of the page for the same
product, and not to the homepage of the mobile version of the
site. We occasionally find sites using this kind of redirect in an
attempt to boost their search rankings, but this practice only results
in a negative user experience, and so should be avoided at all costs.
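As a hedged .htaccess sketch of this kind of redirect (the host names, path pattern, and User-agent list are hypothetical; adjust them to your own site):

# Send mobile User-agents (and Googlebot-Mobile) to the matching page on the mobile host
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "Googlebot-Mobile|iPhone|Android|BlackBerry" [NC]
RewriteRule ^products/([0-9]+)\.html$ http://m.example.com/products/$1.html [R=302,L]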
Glossary
Redirect
Being automatically transported from one specified web page to another specified
web page when browsing a website.
On the other hand, when there's an access to a mobile-version URL
from a desktop browser or by our web crawler, Googlebot, it's not
necessary to redirect them to the desktop-version. For instance,
Google doesn't automatically redirect desktop users from their mobile
site to their desktop site; instead they include a link on the mobile-version page to the desktop version. These links are especially helpful
when a mobile site doesn't provide the full functionality of the desktop
version—users can easily navigate to the desktop-version if they
prefer.
(1) An example of redirecting a user to the
mobile version of the URL when it's accessed
from a mobile device. In this case, the content
on both URLs needs to be as similar as possible.
Switch content based on User-agent
(2) Example of changing the format of a page based on the User-agent. In this case,
the desktop user is supposed to see what Googlebot sees and the mobile user is
supposed to see what Googlebot-mobile sees.
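One hedged way to switch formats at the same URL is an internal rewrite keyed off the User-agent (the file names and User-agent list are hypothetical):

# Serve a mobile-formatted file for the same URL when the User-agent looks mobile (no redirect)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "Googlebot-Mobile|iPhone|Android" [NC]
RewriteRule ^index\.html$ /index.mobile.html [L]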
Be sure to guide the user
to the right site for their
device!
Some sites have the same URL for both desktop and mobile content, but change their format according to User-agent. In other words, both mobile users and desktop users access the same URL (i.e. no redirects), but the content/format changes slightly according to the User-agent. In this case, the same URL will appear for both mobile search and desktop search, and desktop users can see a desktop version of the content while mobile users can see a mobile version of the content (2).

However, note that if you fail to configure your site correctly, your site could be considered to be cloaking, which can lead to your site disappearing from our search results. Cloaking refers to an attempt to boost search result rankings by serving different content to Googlebot than to regular users. This causes problems such as less relevant results (pages appear in search results even though their content is actually unrelated to what users see/want), so we take cloaking very seriously.

So what does "the page that the user sees" mean if you provide both versions with a URL? As mentioned above, Google uses "Googlebot" for web search and "Googlebot-Mobile" for mobile search. To remain within our guidelines, you should serve the same content to Googlebot as a typical desktop user would see, and the same content to Googlebot-Mobile as you would to the browser on a typical mobile device. It's fine if the contents for Googlebot are different from those for Googlebot-Mobile.

One example of how you could be unintentionally detected as cloaking is if your site returns a message like "Please access from mobile phones" to desktop browsers, but then returns a full mobile version to both crawlers (so Googlebot receives the mobile version). In this case, the page which web search users see (e.g. "Please access from mobile phones") is different from the page which Googlebot crawls (e.g. "Welcome to my site"). Again, we detect cloaking because we want to serve users the same relevant content that Googlebot or Googlebot-Mobile crawled.

Links

Google mobile
http://www.google.com/m/
Cloaking
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Promotions and Analysis
Promote your website in the right ways
About increasing backlinks with an
intention to increase the value of the site
While most of the links to your site will be gained gradually, as people
discover your content through search or other ways and link to it,
Google understands that you'd like to let others know about the hard
work you've put into your content. Effectively promoting your new
content will lead to faster discovery by those who are interested
in the same subject (1). As with most points covered in this
document, taking these recommendations to an extreme could
actually harm the reputation of your site.
Master making announcements via blogs
and being recognized online
A blog post on your own site letting your visitor base know that you
added something new is a great way to get the word out about new
content or services. Other webmasters who follow your site or
RSS feed could pick the story up as well.
Putting effort into the offline promotion of your company or site can
also be rewarding. For example, if you have a business site, make sure
its URL is listed on your business cards, letterhead, posters, etc. You
could also send out recurring newsletters to clients through the mail
letting them know about new content on the company's website.
(1) Promoting your site and having quality links could lead to increasing your site’s
reputation.
If you run a local business, adding its information to Google Places
will help you reach customers on Google Maps and web search.
The Webmaster Help Center has more tips on promoting your local
business.
(2) By having your business registered for Google Places, you can promote your
site through Google Maps and Web searches.
Glossary
RSS feed
Data including full or summarized text describing an update to a site/blog. RSS is an
abbreviation for RDF Site Summary; a service using a similar data format is Atom.
Best Practices
Know about social media sites

Sites built around user interaction and sharing have made it easier to match interested groups of people up with relevant content.
Avoid:
attempting to promote each new, small piece of content you create; go for big, interesting items
involving your site in schemes where your content is artificially promoted to the top of these services
Reach out to those in your site's related community

Chances are, there are a number of sites that cover topic areas similar to yours. Opening up communication with these sites is usually beneficial. Hot topics in your niche or community could spark additional ideas for content or building a good community resource.
Avoid:
spamming link requests out to all sites related to your topic area
purchasing links from another site with the aim of getting PageRank instead of traffic
Is your site doing OK?
Links
Google Places
http://www.google.com/local/add/
Promoting your local business
http://www.google.com/support/webmasters/bin/answer.py?answer=92319
Promotions and Analysis
Make use of free webmaster tools
Make Googlebot crawling smoother by using Webmaster Tools
Major search engines, including Google, provide free tools for
webmasters. Google's Webmaster Tools help webmasters better
control how Google interacts with their websites and get useful
information from Google about their site. Using Webmaster Tools won't help your site get preferential treatment; however, it can help you identify issues that, if addressed, can help your site perform better in search results. With the service, webmasters can:
see which parts of a site Googlebot had problems crawling
notify us of an XML Sitemap file
analyze and generate robots.txt files
remove URLs already crawled by Googlebot
specify your preferred domain
identify issues with title and description meta tags
understand the top searches used to reach a site
get a glimpse at how Googlebot sees pages
remove unwanted sitelinks that Google may use in results
receive notification of quality guideline violations and request a site
reconsideration
Yahoo! (Yahoo! Site Explorer) and Microsoft (Bing Webmaster Tools)
also offer free tools for webmasters.
High-level analysis is possible via Google Analytics and Website Optimizer
If you've improved the crawling and indexing of your site using Google
Webmaster Tools or other services, you're probably curious about
the traffic coming to your site. Web analytics programs like Google
Analytics are a valuable source of insight for this. You can use these
to:
get insight into how users reach and behave on your site
discover the most popular content on your site
measure the impact of optimizations you make to your site
- e.g. did changing those title and description meta tags improve traffic from search engines?
For advanced users, the information an analytics package provides,
combined with data from your server log files, can provide even more
comprehensive information about how visitors are interacting with
your documents (such as additional keywords that searchers might
use to find your site).
Lastly, Google offers another tool called Google Website Optimizer
that allows you to run experiments to find what on-page changes will
produce the best conversion rates with visitors. This, in combination
with Google Analytics and Google Webmaster Tools (see our video on
using the "Google Trifecta"), is a powerful way to begin improving your
site.
Links

Google Webmaster Help Forum
http://www.google.com/support/forum/p/webmasters/
Have questions or feedback on our guide? Let us know.

Google Webmaster Central Blog
http://googlewebmastercentral.blogspot.com/
Frequent posts by Googlers on how to improve your website.

Google Webmaster Help Center
http://www.google.com/support/webmasters/
Filled with in-depth documentation on webmaster-related issues.

Google Webmaster Tools
https://www.google.com/webmasters/tools/
Optimize how Google interacts with your website.

Google Webmaster Guidelines
http://www.google.com/webmasters/guidelines.html
Design, content, technical, and quality guidelines from Google.

Google Analytics
http://www.google.com/analytics/
Find the source of your visitors, what they're viewing, and benchmark changes.

Google Website Optimizer
http://www.google.com/websiteoptimizer/
Run experiments on your pages to see what will work and what won't.

Tips on Hiring an SEO
http://www.google.com/support/webmasters/bin/answer.py?answer=35291
If you don't want to go at it alone, these tips should help you choose an SEO company.

Google Trifecta
http://www.youtube.com/watch?v=9yKjrdcC8wA

Make the most of useful tools and information!
This booklet is also available in PDF format. You can download the PDF version at ...
http://www.google.co.jp/intl/en/webmasters/docs/search-engine-optimization-starter-guide.pdf
Except as otherwise noted, the content of this document is licensed under the Creative Commons Attribution 3.0 License.
Check out Google's SEO
resources and tools.
Google Webmaster Central
http://www.google.com/webmasters/
© Copyright 2010. Google is a trademark of Google Inc. All other company and product names may be trademarks of the respective companies with which they are associated.