Virtual Solutions, LLC v. Microsoft Corporation
OPINION AND ORDER re: 43 MOTION for Summary Judgment of Invalidity for Indefiniteness, filed by Microsoft Corporation. Also, before the Court are the parties' submissions related to the Markman hearing that took place on January 22, 2013. At this hearing, the Court also heard oral arguments on the present motion. For the following reasons, the motion is granted. Microsoft's motion for summary judgment is granted. The Clerk of the Court is directed to close the motion (Doc. No. 43) and the case. (Signed by Judge Shira A. Scheindlin on 2/14/2013) (ja)
UNITED STATES DISTRICT COURT
SOUTHERN DISTRICT OF NEW YORK
VIRTUAL SOLUTIONS, LLC,
Plaintiff,
- against -
MICROSOFT CORPORATION,
Defendant.
OPINION AND ORDER
12 Civ. 1118 (SAS)
SHIRA A. SCHEINDLIN, U.S.D.J.:
Virtual Solutions, LLC ("Virtual") brings this action against
Microsoft Corporation ("Microsoft"). Virtual claims that Microsoft has infringed
on claims 1-3, 5, 7, 8-9, and 22 of U.S. Patent No. 6,507,353 ("the '353 Patent"),
of which Virtual is the exclusive licensee. Microsoft now moves for summary
judgment on the grounds that claims 1 and 8 of the '353 Patent are invalid for indefiniteness. Also before the Court are the parties’ submissions related to the Markman hearing that took place on January 22, 2013. At this hearing, the Court also heard oral arguments on the present motion. For the following reasons, the motion is granted.

This Court has jurisdiction over this patent infringement action under 28 U.S.C. § 1338(a).
The ‘353 patent was filed on December 10, 1999 and was issued by
the Patent and Trademark Office (“PTO”) on January 14, 2003.3 It is entitled
“Influencing Virtual Actors in an Interactive Environment.”4 The patent claims
“[a] method for generating a behavior vector for a virtual actor in an interactive
theat[er] by interpreting stimuli from visitors . . . .”5 The specification states that
The following facts are drawn from Microsoft’s Rule 56.1 Statement
of Undisputed Material Facts (“Microsoft 56.1”); Plaintiff Virtual Solutions LLC’s
Response to Microsoft Corporation’s Rule 56.1 Statement of Undisputed Material
Facts and Virtual Solutions LLC’s Statement of Additional Undisputed Material
Facts (“Virtual 56.1”); and Microsoft Corporation’s Response to Virtual Solutions’
Rule 56.1 Statement (“Microsoft 56.1 Resp.”), as well as the parties’ memoranda
and other submissions relating to the present motion.
See ‘353 Patent, Ex. 1 to Declaration of Robert Courtney in Support
of Microsoft Corporation’s Motion for Summary Judgment of Invalidity for
Indefiniteness, at 1.
Id. at col. 16:5-7.
the object of the patent is “to provide a method for interacting with virtual actors in
an interactive environment[,]” and “to simulate ‘real-life’ behaviors of the virtual actors.”6
The specification teaches that “[c]ombining the security of a
controlled environment with the possibility to interact almost naturally with animal
kind has always been a dream[,]”7 and that viewing animals in captivity does not
live up to this dream, because animals in captivity alter their behaviors.8 The
specification further teaches that, prior to the ‘353 Patent, virtual reality had
showed promise in bringing this dream to life, but that the interactivity of virtual
reality at the time was limited by its use of scripted scenarios.9
The specification describes a preferred embodiment consisting of a
dome-shaped theater into which images are projected for viewing by an audience,
and notes that the projection of images could be replaced by holography or any
other type of presentation.10 In a preferred embodiment of the main modules
Id. at col. 2:14-18.
Id. at col. 1:28-30.
See id. at col. 1:35-37.
See id. at col. 2:1-10.
See Microsoft 56.1 ¶ 2 (citing ‘353 Patent at col. 3:3-8); Virtual 56.1 ¶
2 (citing ‘353 Patent at col. 3:1-17).
described in the patent, these modules are to be implemented via software.11
The ‘353 patent also describes sensors in the theater area that detect
physical information about audience members and “Stimulus Generators” that
analyze that information.12 The patent states that a system could feed this sensor
data into the “behavioral module” of a “virtual actor,” which is “likely [t]o be [a]
virtual animal or [a] virtual physical actor which ha[s] behaviors that are easier
to simulate than those of humans.”13
The “behavioral module” of the virtual actor would then calculate the
reaction of the actor.14 One component of the “behavioral module” would be the
“behavioral model” of a virtual actor, a set of factors specific to a virtual actor that
would mediate its response to the data provided to it by the Stimulus Generator.15
For example, the specification describes the age of a virtual actor as a possible
factor within that actor’s behavioral model, and states that “[a]n older animal will
See Microsoft 56.1 ¶ 15 (citing ‘353 Patent at col. 10:44-48).
See id. ¶ 3 (citing ‘353 Patent at col. 3:22-29). Virtual disputes this
statement, but not with the specificity required by Local Rule 56.1. See Virtual
56.1 ¶ 3.
‘353 Patent at col. 4:57-59.
See id. at col. 4:55-56 (“The behavioral module 28 will calculate,
from the data collected at the user interface, the reaction of the actor.”).
See id. at col. 11:61-12:13.
more likely be passive with respect to the stimuli of the visitors.”16
In sum, the system would collect and analyze physical information
from visitors and feed that information to virtual actors, whose behavioral models
would mediate a response (or non-response).17 In this way the system would
allow the virtual actors to respond to the physical information collected from
viewers in real time.18
The Parties’ Experts19
Both parties have offered expert testimony in connection with this
motion. Microsoft’s expert is Aaron T. Bobick, Ph.D. (“Bobick”). Bobick
received bachelor of science degrees in mathematics and computer science from
the Massachusetts Institute of Technology (“MIT”) in 1981, and a Ph.D. in
See id. at col. 5:1-3.
This sentence is not meant to be a complete summary of the claimed
invention. A more complete statement of the Court’s understanding of the portions
of the ‘353 Patent that are relevant to the parties’ contentions is provided below.
See id. at col 2:14-19.
The facts presented in this section are drawn from the Curriculum Vita
of Microsoft’s Expert Aaron T. Bobick, Ph.D., Ex. A to 10/5/12 Declaration of
Aaron T. Bobick (“Bobick Decl.”); the List of Testimony for Aaron Bobick, Ex. B
to Bobick Decl.; and the Curriculum Vita of Vyacheslav Zavadsky, Ph.D., Ex. A to
Declaration of Vyacheslav (“Slava”) Zavadsky, Ph.D. (“Zavadsky Decl.”); Bobick
Decl.; and Zavadsky Decl.
cognitive science from MIT in 1987. He has been in the academic field since
receiving his Ph.D., with stints at MIT and Stanford. Since 2003, he has been
employed as full professor at the Georgia Institute of Technology’s College of
Computing, where he was the founding chair of the School of Interactive
Computing. Additionally, from 1993-1996, he served as the Chief Technology
Officer of Cybergear, a company that he founded. In connection with Cybergear,
Bobick received several patents for an “interactive exercise apparatus.”
Over the course of his career, Bobick has published twenty-two
articles in peer-refereed journals on topics related to machine perception and
virtual reality. He has also received nine grants, as principal investigator, on the
same topics. Finally, Bobick has provided expert testimony in eight prior cases,
mostly in the field of computer vision.
Virtual’s expert is Vyacheslav Zavadsky, Ph.D. (“Zavadsky”).
Zavadsky received a Master’s in Computer Science in 1994 from Belarusian State
University, and a Ph.D. in the same field from Belarusian State University in 1998.
He is currently employed as the Principal of Zavadsky Technologies, where his
duties include patent assessment, software project management, and software
development. From 2003 to 2011, he worked at UBM TechInsights, where he was
engaged in patent analysis and reverse engineering. During his tenure at UBM,
Zavadsky reviewed hundreds of patents.
Zavadsky is the named inventor on twelve issued United States
patents, at least five of which pertain to image processing and computer vision. He
has ten additional patents pending.
Indefiniteness is an issue that is amenable to summary judgment.20
Summary judgment is appropriate “if the movant shows that there is no genuine
dispute as to any material fact and the movant is entitled to judgment as a matter of
law.”21 “‘An issue of fact is genuine if the evidence is such that a reasonable jury
could return a verdict for the nonmoving party. A fact is material if it might affect
the outcome of the suit under the governing law.’”22 “The moving party bears the
burden of establishing the absence of any genuine issue of material fact.”23 “When
the burden of proof at trial would fall on the nonmoving party, it ordinarily is
See Net MoneyIN, Inc. v. VeriSign, Inc., 545 F.3d 1359, 1364, 1367
(Fed. Cir. 2008).
Fed. R. Civ. P. 56(a).
Fincher v. Depository Trust & Clearing Corp., 604 F.3d 712, 720 (2d
Cir. 2010) (quoting Roe v. City of Waterbury, 542 F.3d 31, 35 (2d Cir. 2008)).
Zalaski v. City of Bridgeport Police Dep’t, 613 F.3d 336, 340 (2d Cir. 2010).
sufficient for the movant to point to a lack of evidence . . . on an essential element
of the nonmovant’s claim.”24 In turn, to defeat a motion for summary judgment,
the non-moving party must raise a genuine issue of material fact. To do so, the
non-moving party “‘must do more than simply show that there is some
metaphysical doubt as to the material facts,’”25 and “‘may not rely on conclusory
allegations or unsubstantiated speculation.’”26
In deciding a motion for summary judgment, a court must “‘construe
the facts in the light most favorable to the non-moving party and must resolve all
ambiguities and draw all reasonable inferences against the movant.’”27 However,
“‘[c]redibility determinations, the weighing of the evidence, and the drawing of
legitimate inferences from the facts are jury functions, not those of a judge.’”28
“‘The role of the court is not to resolve disputed issues of fact but to assess
Cordiano v. Metacon Gun Club, Inc., 575 F.3d 199, 204 (2d Cir. 2009).
Brown v. Eli Lilly and Co., 654 F.3d 347, 358 (2d Cir. 2011).
Id. (quoting Federal Deposit Ins. Corp. v. Great Am. Ins. Co., 607
F.3d 288, 292 (2d Cir. 2010)).
Brod v. Omya, Inc., 653 F.3d 156, 164 (2d Cir. 2011) (quoting
Williams v. R.H. Donnelley, Corp., 368 F.3d 123, 126 (2d Cir. 2004)).
Kaytor v. Electric Boat Corp., 609 F.3d 537, 545 (2d Cir. 2010)
(quoting Reeves v. Sanderson Plumbing Prods., Inc., 530 U.S. 133, 150 (2000)).
whether there are any factual issues to be tried.’”29
“A determination of indefiniteness is a legal conclusion that is drawn
from the court’s performance of its duty as the construer of patent claims . . . .”30 I
will therefore lay out the law of claim construction that is applicable to this case
prior to stating the law applicable to indefiniteness.
“Analysis of patent infringement starts with ‘construction’ of the
claim, whereby the court establishes the scope and limits of the claim, interprets
any technical or other terms whose meaning is at issue, and thereby defines the
claim with greater precision than had the patentee.”31 Claim construction is a
question of law.32 Judges construe claims with the goal of “‘elaborating the
normally terse claim language in order to understand and explain, but not to
Brod, 653 F.3d at 164 (quoting Wilson v. Northwestern Mut. Ins. Co.,
625 F.3d 54, 60 (2d Cir. 2010)).
Atmel Corp. v. Information Storage Devices, Inc., 198 F.3d 1374,
1378 (Fed. Cir. 1999) (quotation marks and citation omitted).
Pall Corp. v. Hemasure Inc., 181 F.3d 1305, 1308 (Fed. Cir. 1999).
See Markman v. Westview Instruments, Inc., 517 U.S. 370, 384, 390-91 (1996).
change, the scope of the claims.’”33
The claim construction inquiry begins from the “objective baseline”
of the ordinary and customary meaning that a claim term would have to a person of
ordinary skill in the art at the time of the effective filing date of the invention.34 An
accused infringer faces a “heavy presumption” that the ordinary meaning of claim
terms applies, and a claim term may not be narrowed merely by pointing out that
the preferred embodiment of a patent is narrower than the ordinary meaning of its claim terms.35
With this being said, the ordinary and customary meaning of a term to
a skilled artisan is only a baseline. A deeper inquiry is needed when a claim term
is ambiguous or when “reliance on a term’s ‘ordinary meaning’ does not resolve
DeMarini Sports, Inc. v. Worth, Inc., 239 F.3d 1314, 1322 (Fed. Cir.
2001) (quoting Embrex, Inc. v. Services Eng’g Corp., 216 F.3d 1343, 1347 (Fed. Cir. 2000)).
Phillips v. AWH Corp., 415 F.3d 1303, 1313 (Fed. Cir. 2005)
(citations omitted). It is a condition precedent to receiving patent protection that
the subject matter of the invention not be obvious to “a person having ordinary
skill in the art,” in light of the prior art at the time the invention was made. 35
U.S.C. § 103(a).
SunRace Roots Enter. Co., Ltd. v. SRAM Corp., 336 F.3d 1298, 1305
(Fed. Cir. 2003) (quotation marks and citations omitted).
the parties’ dispute.”36 As such, a judge may not leave questions of claim
construction to the finder of fact by instructing a jury that an ambiguous or
disputed term is to be given its ordinary meaning.37
“A person having ordinary skill in the art” is a legal fiction, like tort
law’s “reasonable person.” She is a hypothetical person who knows and
understands all of the relevant art within the field of invention and any related
technical fields, but is utterly uncreative.38 To determine how such a hypothetical
person would have understood the meaning of a disputed claim term, courts look to
publicly available sources, including both intrinsic evidence and extrinsic
Because intrinsic evidence provides “a more reliable guide to the
O2 Micro Int’l Ltd. v. Beyond Innovation Tech. Co., 521 F.3d 1351,
1361 (Fed. Cir. 2008) (“A determination that a claim term ‘needs no
construction’ or has the ‘plain and ordinary meaning’ may be inadequate when a
term has more than one ‘ordinary’ meaning or when reliance on a term’s ‘ordinary’
meaning does not resolve the parties’ dispute.”).
See Creative Internet Advertising Corp. v. Yahoo!, Inc., 476 Fed.
App’x 724, 728 (Fed. Cir. 2011) (citing O2 Micro Int’l Ltd., 521 F.3d at 1360–62).
See Standard Oil Co. v. American Cyanamid Co., 774 F.2d 448, 454
(Fed. Cir. 1985).
See Phillips, 415 F.3d at 1314.
meaning of a claim term than extrinsic sources[,]”40 and because a person having
ordinary skill in the art would use it to understand a patent,41 district courts use the
intrinsic record as “the primary tool to supply the context for interpretation of
disputed claim terms.”42 The intrinsic record consists of information found within
the public record of a patent, such as its specification and prosecution history.43
The most important source of intrinsic evidence is the language of the
claims, because a patent’s claims define the scope of the patentee’s right to
exclude.44 Sometimes “the ordinary meaning of claim language as understood by a
Chamberlain Group, Inc. v. Lear Corp., 516 F.3d 1331, 1335 (Fed.
Cir. 2008) (citing Phillips, 415 F.3d at 1318-19).
See Multiform Desiccants, Inc. v. Medzam, Ltd., 133 F.3d 1473, 1477
(Fed. Cir. 1998). Accord Phillips, 415 F.3d at 1311-14.
V-Formation, Inc. v. Benetton Group SpA, 401 F.3d 1307, 1310 (Fed.
Cir. 2005) (citing Vitronics Corp. v. Conceptronic, Inc., 90 F.3d 1576, 1582 (Fed. Cir. 1996)).
See Innova/Pure Water, Inc. v. Safari Water Filtration Sys., Inc., 381
F.3d 1111, 1116 (Fed. Cir. 2004) (“Those sources [relevant to claim construction]
include the words of the claims themselves, the remainder of the specification, the
prosecution history, and extrinsic evidence concerning relevant scientific
principles, the meaning of technical terms, and the state of the art.”).
See Haemonetics Corp. v. Baxter Healthcare Corp., 607 F.3d 776,
783 (Fed. Cir. 2010) (“[T]he claims perform the fundamental function of
delineating the scope of the invention . . . .”). See also Investment Tech. Group,
Inc. v. Liquidnet Holdings, Inc., No. 07 Civ. 510, 2010 WL 199912, at *2
(S.D.N.Y. Jan. 10, 2010) (“Judicial interpretation must begin with and remain
focused upon the ‘words of the claims themselves . . . to define the scope of the
person of skill in the art” will be so apparent from the claim language itself that no
further inquiry is needed.45 Even if claim language is not self-explanatory, the
context in which a claim term is used will often be highly instructive as to its meaning.46
The second most important source of intrinsic evidence is provided by
the patent’s specification,47 which typically includes: “an abstract of the invention;
a description of the invention’s background; a summary of the invention; patent
drawings; and a detailed description that discusses preferred embodiments of the
invention.”48 A patent’s specification is “always highly relevant to the claim
patented invention.’”) (quoting Vitronics, 90 F.3d at 1582).
Phillips, 415 F.3d at 1314.
See, e.g., id. (“The context in which a term is used in the asserted
claim can be highly instructive. To take a simple example, [the use of the term]
‘steel baffles’ . . . strongly implies that the term ‘baffles’ does not inherently mean
objects made of steel.”).
Technically, the specification of a patent includes its claims, but in
common parlance, the specification of a patent is exclusive of its claims. I use the
second definition here.
Investment Tech. Group, Inc., 2010 WL 199912, at *3 (citing 35
U.S.C. § 112 (“The specification shall contain a written description of the
invention, and of the manner and process of making and using it, in such full, clear,
concise, and exact terms as to enable any person skilled in the art to which it
pertains, or with which it is most nearly connected, to make and use the same, and
shall set forth the best mode contemplated by the inventor of carrying out his invention.”)).
construction analysis[,]” and therefore “claims must be read in view of the
specification . . . .”49
While it is permissible (and often necessary) to “us[e] the
specification to interpret the meaning of a claim[,]” it is generally impermissible to
“import limitations from the specification into the claim.”50 This is so because
the words of the claim “define the scope of the right to exclude . . . .”51 For the
purposes of claim construction, this rule defines the boundary between a patent’s
claims and its specification.52 In light of this rule, the specification may be used to
limit a claim only if:
(1)  the claim “explicitly recite[s] a term in need of definition”;
or (2)  the specification unambiguously defines a term, i.e., if “a
patent applicant has elected to be a lexicographer by providing an
explicit definition in the specification for a claim term.”53
Phillips, 415 F.3d at 1315 (quotation marks and citations omitted). In
fact, a claim interpretation that excludes a preferred embodiment described in the
specification is “rarely, if ever, correct.” Vitronics, 90 F.3d at 1583.
Phillips, 415 F.3d at 1323.
Renishaw PLC v. Marposs Societa’ Per Azioni, 158 F.3d 1243, 1248
(Fed. Cir. 1998).
See id. (“These two rules lay out the general relationship between the
claims and the written description.”).
Medisim v. BestMed, No. 10 Civ. 2463, 2011 WL 2693896, at *4
(S.D.N.Y. July 8, 2011) (quoting Renishaw PLC, 158 F.3d at 1248-49).
In drawing the distinction between using the specification to interpret
a claim term, and improperly importing a limitation from the specification to
construe a claim term, a court should bear in mind “that the purposes of the
specification are to teach and enable those of skill in the art to make and use the
invention and to provide a best mode for doing so.”54 In most cases, a person
having ordinary skill in the art would appreciate the difference between a preferred
embodiment presented as an example, and one intended as a limitation on the claims.55
The extrinsic record “‘consists of all evidence external to the patent
and prosecution history, including expert and inventor testimony, dictionaries, and
learned treatises.’”56 Although “district courts [may] rely on extrinsic evidence, . . . extrinsic evidence in general [is] less reliable than the patent and its
Phillips, 415 F.3d at 1323 (citation omitted).
Id. (quoting Markman v. Westview Instruments, Inc., 52 F.3d 967, 980
(Fed. Cir. 1995)). Previous judicial opinions related to the subject matter of the
patent are also considered extrinsic evidence. See MEMS Tech. Berhad v. ITC,
447 Fed. App’x 142, 153 (Fed. Cir. 2011) (“Related judicial holdings can be an
appropriate form of non-binding extrinsic evidence in a claim construction analysis.”).
prosecution history in determining how to read claim terms.”57 This is so for many
reasons, including: (1) extrinsic evidence was not created at the time of the patent
for the purpose of explaining the patent’s scope and meaning; (2) external
publications “may not reflect the understanding of a skilled artisan in the field of
the patent”; (3) expert reports and testimony are created at the time and for the
purpose of litigation and may suffer from bias; (4) “there is a virtually unbounded
universe of potential extrinsic evidence of some marginal relevance that could be
brought to bear on any claim construction question”; and (5) unlike intrinsic
evidence, extrinsic evidence is not necessarily part of the public record, and
therefore undue reliance on it undermines the public notice function of patents.58
Expert reports and expert testimony are forms of extrinsic evidence.
They can be helpful to a court by:
[providing] background on the technology at issue, [explaining]
how an invention works, [ensuring] that the court’s understanding
of the technical aspects of the patent is consistent with that of a
person of skill in the art, or [establishing] that a particular term in
the patent or the prior art has a particular meaning in the pertinent field.59
Phillips, 415 F.3d at 1317-18.
Id. at 1318-19.
Id. at 1318.
However, it is unhelpful to a court if an expert “simply recites how [she] would
construe [a disputed term] based on [her] own reading of the specification.”60
Further, “a court should discount any expert testimony ‘that is clearly at odds with
the claim construction mandated by’” the intrinsic evidence.61
Canons of Construction
In construing the meaning of claim terms, courts may employ
interpretive aids known as canons of construction. Because they are merely
interpretive aids, they cannot displace contrary intrinsic evidence. Indeed, “no
canon of construction is absolute in its application. . . .”62 This section discusses
several canons that are pertinent to this dispute.
Because “claims are interpreted with an eye toward giving effect to all
terms in the claim[,]”63 courts strive to avoid constructions that render a term or
Symantec Corp. v. Computer Assocs. Int’l, Inc., 522 F.3d 1279, 1291
(Fed. Cir. 2008).
Phillips, 415 F.3d at 1318 (quoting Key Pharms. v. Hercon Labs.
Corp., 161 F.3d 709, 716 (Fed. Cir. 1998)).
Renishaw PLC, 158 F.3d at 1248. Indeed, a famous law review article
undertook to demonstrate that for every canon of construction, there is a canon of
opposite effect. See id. at n.2 (citing Karl N. Llewellyn, Remarks on the Theory of
Appellate Decision and the Rules or Canons About How Statutes Are to be
Construed, 3 Vand. L.Rev. 395, 401-06 (1950)).
Bicon, Inc. v. Straumann Co., 441 F.3d 945, 950 (Fed. Cir. 2006).
phrase superfluous.64 This is known as the rule against surplusage. For the same
reason, “different claim terms are presumed to have different meanings . . . .”65
This presumption is rebuttable when the intrinsic evidence weighs against it.66
And just as distinct words are presumed to have distinct meanings, the same words
are presumed to have the same meaning each time they appear.67 This presumption
is also rebuttable by the intrinsic evidence,68 at least insofar as the same phrase may
warrant a different meaning when it appears in different claims (as opposed to the same claim).69
See Phillips, 415 F.3d at 1314.
Helmsderfer v. Bobrick Washroom Equip., Inc., 527 F.3d 1379, 1382
(Fed. Cir. 2008). Accord Applied Med. Res. Corp. v. U.S. Surgical Corp., 448 F.3d
1324, 1333 n.3 (Fed. Cir. 2006) (“[U]se of two terms in a claim requires that they
connote different meanings.”).
See Nystrom v. TREX Co., Inc., 424 F.3d 1136, 1143 (Fed. Cir. 2005)
(“Different terms or phrases in separate claims may be construed to cover the
same subject matter where the written description and prosecution history indicate
that such a reading of the terms or phrases is proper.”).
See Rexnord Corp. v. Laitram Corp., 274 F.3d 1336, 1342 (Fed. Cir.
2001) (“[A] claim term should be construed consistently with its appearance in
other places in the same claim or in other claims of the same patent.”). See also
Phonometrics, Inc. v. Northern Telecom Inc., 133 F.3d 1459, 1465 (Fed. Cir. 1998)
(“A word or phrase used consistently throughout a claim should be interpreted consistently.”).
See Felix v. American Honda Motor Co., Inc., 562 F.3d 1167, 1177-78 (Fed. Cir. 2009).
35 U.S.C. § 112(b) (“Section 112, ¶ 2”) states:
The specification shall conclude with one or more claims
particularly pointing out and distinctly claiming the subject matter
which the inventor or a joint inventor regards as the invention.
Under this section, the inventor has a duty to “inform the public of the bounds of
the protected invention, i.e., what subject matter is covered by the exclusive rights
of the patent.”70 An inventor’s failure to “particularly point out and distinctly
claim” the subject matter of her invention provides a basis for invalidating a claim.71
Whether a claim is indefinite is a question of law.72 Patents issued by
See Haemonetics, 607 F.3d at 783 (holding that a disputed phrase
consistently had the same meaning in one claim, and a different meaning in a separate claim).
Halliburton Energy Svcs., Inc. v. M-I LLC, 514 F.3d 1244, 1249 (Fed.
Cir. 2008) (citation omitted).
IPXL Holdings, LLC v. Amazon.com, Inc., 430 F.3d 1377, 1384 (Fed. Cir. 2005).
See Biomedino, LLC v. Waters Techs. Corp., 490 F.3d 946, 949 (Fed. Cir. 2007).
the PTO enjoy a presumption of validity,73 and so indefiniteness must be proved by
clear and convincing evidence.74 “A claim is definite if one skilled in the art would
understand the bounds of the claim when read in light of the specification[,]”75 and
“[a] claim is . . . indefinite only where it is not amenable to construction or is
insolubly ambiguous.”76 If a narrowing construction can properly be adopted, then
a claim is not indefinite.77
The Parties’ Contentions Relating to Claim 1, “physical characteristic signal”
Claim 1 is the only independent claim of the ‘353 patent. It includes
the phrase “physical characteristic signal.” Microsoft argues that this phrase is
See 35 U.S.C. § 282.
See Schumer v. Laboratory Comp. Sys., Inc., 308 F.3d 1304, 1315
(Fed. Cir. 2002) (collecting cases). See also Microsoft Corp. v. i4i Ltd. P’ship, —
U.S. —, 131 S.Ct. 2238, 2252 (2011).
IGT v. Bally Gaming Intern., Inc., 659 F.3d 1109, 1119 (Fed. Cir. 2011).
Deere & Co. v. Bush Hog, LLC, Nos. 2011–1629, 2011–1630,
2011–1631, — F.3d —, 2012 WL 6013405, at *7 (Fed. Cir. Dec. 4, 2012)
(quotation marks and citation omitted).
See Exxon Research & Eng’g Co. v. United States, 265 F.3d 1371,
1375 (Fed. Cir. 2001).
indefinite, and that as a consequence Claim 1 and its dependents are invalid.78
Claim 1 is reproduced below, with the phrases relevant to Microsoft’s contentions
appearing in bold:
1. A method for generating a behavior vector for a virtual actor
in an interactive theatre by interpreting stimuli from visitors, the method comprising the steps of:
[a] providing a plurality of sensors detecting and sensing at
least one physical characteristic at a plurality of positions
within a theatre area within which a number of visitors are
free to move about, said sensors generating sensor signals;
[b] interpreting said sensor signals to provide at least one
physical characteristic signal including position
information, wherein said physical characteristic signal
provides information on visitor activity and location within
said theater area;
[c] providing a behavior model for at least one virtual actor;
[d] analyzing said at least one physical characteristic signal,
a change over time of said physical characteristic signal and
said behavior model for said at least one virtual actor to
generate a behavior vector of said at least one virtual actor
using said position information and said at least one
physical characteristic signal, said behavior vector being
generated in real-time;
See Memorandum of Law in Support of Microsoft Corporation’s
Motion for Summary Judgment of Invalidity for Indefiniteness (“Microsoft
Mem.”) at 6-7 (citations omitted).
[e] whereby a virtual actor reacts and interacts, in real-time,
with visitors depending on the visitors’ actions and
evolution of said actions.79
Microsoft argues that Claim 1 is invalid because it requires that
“physical characteristic signal” both include and exclude “position information,”
which, Microsoft argues, is logically impossible.80 Microsoft further argues that
neither the patent prosecution history nor the written description of the ‘353 patent
shed light on the relationship between “position information” and “physical
characteristic signal.”81 Bobick, Microsoft’s expert, declares that “one of ordinary
skill in the art would view it as critical to know the relationship between these two
claim elements [i.e. position information and the physical characteristic signal].”82
Bobick further states that, because “[t]he claims simultaneous[ly] require two
contradictory elements . . . it [is] impossible to provide a meaningful interpretation
to these claim terms.”83
In its claim-construction brief, Virtual argued that the phrase “a
‘353 Patent, col. 16:5-28 (emphasis added).
See Microsoft Mem. at 7-8.
See id. at 8-9.
Bobick Decl. ¶ 13.
plurality of sensors detecting and sensing at least one physical characteristic signal
at a plurality of positions within a theatre area” did not require construction, as it
could be given its plain and ordinary meaning.84 However, in the same brief
Virtual reserved the right to respond to Microsoft’s indefiniteness contentions until
after Microsoft filed the present motion.85
In its opposition to this motion, Virtual argues that “[p]osition
information is merely information, or data, that pertains to the location of visitors
in the theater area.”86 Virtual further argues that “[a]s described in claim 1, the
‘physical characteristic signal’ is a signal that includes, inter alia, the position
information.”87 On Virtual’s construction, because “position information” is
merely data, the fact that it may be used in different ways does not render Claim 1 indefinite.88
See Plaintiff Virtual Solutions, LLC’s Opening Claim Construction
Brief at 3.
See id. at 2 n.1 (“Virtual will respond to Microsoft’s indefiniteness
contentions in Virtual[’s] response to Microsoft’s summary judgment motion . . . .”).
Plaintiff Virtual Solutions, LLC’s Response in Opposition to
Defendant Microsoft Corporation’s Motion for Summary Judgment of Invalidity
for Indefiniteness (“Opp. Mem.”) at 7 (citation omitted).
Id. at 8 (citation omitted).
Microsoft’s response is that it does not matter whether it may make
sense to use position information in different ways, because claim 1 “requires that
two specific elements (‘said physical characteristic signal’ and ‘said position
information’) are two inputs to one function (‘generat[ing] a behavior vector’) . . .
.”89 In other words, Microsoft clarifies that it is not arguing as a general matter that
data cannot be used in two different ways. Instead, Microsoft is arguing that the
‘353 Patent requires that two elements be used in a seemingly contradictory way,
without disclosing to the public their relationship to one another.
The problem is particularly acute in the embodiment of the ‘353
Patent where the physical characteristic signal is only position information. The
‘353 Patent clearly encompasses this embodiment, because limitation 1[b] recites
the phrase “physical characteristic signal including position information.”90 In this
See id. at 11 (“The ability to use ‘position information,’ which is data,
in multiple manners is contemplated by the ‘353 patent. It does not render the
claim indefinite.”) (citations omitted).
Reply Memorandum of Law in Support of Microsoft Corporation’s
Motion for Summary Judgment of Invalidity for Indefiniteness at 2 (emphasis
added) (citing ‘353 Patent, col. 16:19-25).
See id. at 3 (emphasis added) (citation omitted). At oral argument,
counsel for Virtual acknowledged that the ‘353 Patent requires an embodiment
where the physical characteristic signal is comprised solely of position
case, limitation 1[d] would read: “to generate a behavior vector of said at least one
virtual actor using said position information and said [position information].”
It does appear to be both surplusage and contradictory for claim 1 to
state “using said position information and said at least one physical characteristic
signal [which may just be position information].” However, Virtual argues that
when claim 1 is read in light of the specification, the apparent contradiction
dissolves. Before addressing Virtual’s arguments, I will provide my interpretation
of the ‘353 Patent’s teachings about the use of “position information.”
The Specification’s Discussion of “Position Information”
“Position information” in the ‘353 Patent refers to positional data
collected from the users of the virtual reality system disclosed in the ‘353 Patent.91
In the preferred embodiment disclosed in the specification, position information is
information. See 1/22/13 Hearing Transcript at 101:8-11 (“[Mr. Grochocinski]:
And yes, in this . . . potential hypothetical, you end up in a scenario where you
have . . . only position information on both sides. But the fact is that the claim
allows for that.”). For example, this embodiment could be a virtual game of
dodgeball in which only the position data about the user was used by the system.
See id. at 78:3-7 (“[Mr. Cordell]: Okay. Well, then we can take a simpler game
where it is dodgeball and the question is whether you are being hit by the ball or
not. [The Court]: And that is only position. [Mr. Cordell] Only position, right.”).
See Microsoft Mem. at 7 (citations omitted); Opp. Mem. at 7.
collected by sensors situated at a plurality of positions throughout the dome.92 The
raw data collected from these sensors is then fed into an “interpreter,” which
“analyzes the raw signals from these sensors a [sic] produces a physical
characteristic signal . . . .”93 For example, the interpreter would take the first
derivative of the raw position information (rate of change of the user’s position) in
order to determine the visitor’s displacement, and it would take the second
derivative of this information (rate of the rate of change) to determine the visitor’s acceleration.94
The filtered positional data, which is termed the “physical
characteristic signal,” is then fed into the “behavioral module” of the virtual
actor.95 The behavioral module processes this input in light of the virtual actor’s
characteristics (e.g., age, position within the simulation, sleepiness) to plan an
action for the virtual actor.96 (It is important to note that the position of the virtual
See ‘353 Patent, col. 4:29-31.
Id. at col. 7:6-7.
See id. at col. 4:41-44.
See id. at col. 3:29-31.
See id. at col. 3:31-33 (“The behavioral module 28 determines,
according to the behavioral characteristics of the actor, an action to undertake.”).
actor, as opposed to the user, is an independent input to the behavioral module.97
This positional data pertaining to virtual actors should not be confused with the
position information pertaining to users that is at issue here.)
The planned action decided upon by the behavioral module, which is
sometimes called a “behavior vector,”98 is then fed into the “biophysical model
action generator[,]” which calculates the physics of the planned action in light of
the virtual actor’s physical characteristics, e.g. its gross anatomy, as well as the
physical situation within the simulation.99 The biophysical model action
generator’s output is a “rendering vector,” which it sends to a “rendering module,”
a component that determines how the planned action is to be displayed in light of
the physical constraints imposed by the biophysical model action generator.100
The sum total of the planned actions of all of the virtual actors in the
See id. at col. 8:50-53 (“Also, the calculation of the position of the
actor will be given as input to the behavior module and will be used as a
supplementary virtual stimulus.”). At two points, the ‘353 Patent refers to a
“behavior module.” See id.; id. at col. 5:48-50. I interpret these references as
pertaining to the “behavioral module” under discussion.
Id. at fig. 3.
Id. at col. 3:33-36 (“This planned reaction of the actor is fed into the
biophysical model action generator 30 which calculates the physical reaction of the
actor with respect to its bodily limitations and characteristics.”).
See id. at col. 3:44-57.
simulation is computed and then sent to the “Overall Rendering Module,” which
determines how the actions of all of the virtual actors in the simulation are to be
displayed, generating an “overall rendering effect.”101 This overall rendering effect
is then decomposed into haptic, sonic, and visual elements by a “haptic effect
generator,” a “sound effect generator,” and an “image generator.”102 Finally, each
of these generators sends outputs into a corresponding module, e.g., “the haptic
module,” and these modules “generate what the visitor will hear, feel and see.”103
Data from the “biophysical model action generator” is also sent to the
“virtual environment database,” which stores the locations of all of the virtual
actors at any given point in time.104 The “virtual environment stimulus generator”
reads from this database and, based on the information received, decides whether
to generate random events (e.g. creating a new virtual actor).105
The specification further states that the biophysical model action
Id. at col. 3:56-57.
See id. at col. 3:58-61 (“The biophysical model action generator 30
also sends a virtual environment update to the virtual environment database 26.
This database comprises all data concerning all actors at any point in time.”).
See id. at col. 3:61-64.
generator also reads from the virtual environment database in the process of
computing the virtual actor’s planned action. It is claimed that this information
from the virtual environment database could, for example, enable a virtual actor to
avoid a collision with another virtual actor.106 However, it is unclear whether it is
the biophysical model action generator, or the behavioral module, that reads from
the virtual environment database. Logically, one would expect that it would be the
behavioral module reading from the database, because that module plans the virtual
actors’ actions. The specification, though, is unclear on the point.107
In sum, “position information” is data about the position of the user of
the system that is collected by sensors. After being filtered, this data is transmitted
to the “behavioral module” of a virtual actor, which plans a reaction. This planned
reaction is refined in light of the virtual actor’s physical characteristics, and the
See id. at col. 4:1-11.
Compare id. at col. 4:1-2 (“The biophysical model action generator 30
also [i.e., in addition to sending information to the database] reads information
from the virtual environment database 26 in order to decide if an interaction with
another actor is necessary.”) with id. at col. 8:50-53 (“Also, the calculation of the
position of the actor will be given as input to the behavior module and will be used
as a supplementary virtual stimulus.”). Compare also id. at fig. 2 (depicting a
flowchart in which the biophysical model action generator sends input to and
receives output from the virtual environment database, and in which the behavior
module is unconnected to the virtual environment database) with id. at fig. 3
(depicting a flowchart in which the behavioral module receives input from the
virtual environment database, but in which the biophysical model action generator
is unconnected to that database).
prevailing conditions within the simulation, by the “biophysical model action
generator.” After this refinement, the planned reaction is transmitted to a rendering
module where, after being combined with the actions of the other virtual actors in
the system, it is displayed to the user of the system.
The Specification Does Not Resolve the Contradiction in Claim 1
It Is Not Inherently Paradoxical to Use a Data Structure in
Two Different Ways in a Computer Program
With this understanding in mind, I now turn to Virtual’s arguments
that the specification resolves the apparent contradiction in claim 1. As an initial
matter, I agree with Virtual’s expert, Zavadsky, that it is not inherently
contradictory for a computer system to use data, such as “position information,” in
two different ways.108 For the purposes of this motion, I also agree with Zavadsky
that, because the specification of the ‘353 Patent indicates that object-oriented
programming techniques were to be used in implementing the claimed invention,
there are a number of reasons that data might be used in different ways within the claimed invention.109
See Zavadsky Decl. ¶ 26.
See id. ¶ 25 (“Having many years of experience implementing [Object
Oriented Programming] systems, I know that the data . . . are normally copied in
many objects and intermediate data. The reasons for such copying may include:
establishing a local copy for faster access, providing standardized ways to access
However, Zavadsky’s observation that there might be a reason for
position information to be both included within and separately added to the
physical characteristic signal to generate a behavior vector misses the mark.
Bobick’s real point is that because “one of ordinary skill in the art would view it as
critical to know the relationship between [position information and the physical
characteristic signal][,]”110 and because the ‘353 Patent fails to disclose this
relationship, the inventor failed to distinctly claim what he regarded as his invention.111
Microsoft has unearthed an apparent logical contradiction within
claim 1, and it will have met its burden of proving invalidity by clear and
convincing evidence unless Virtual resolves this contradiction. To do that, Virtual
must demonstrate that the claim’s apparent contradiction is resolved by the
specification, as read by a skilled artisan. More specifically, Virtual must show
that the specification discloses a way in which position information may be used
alongside a physical characteristic signal (which might contain only position
information) to generate a behavior vector. Virtual makes two arguments on this
data (see e.g. method getData 88 in Figure 10), scaling sensor data to a common
calibrated scale (see, e.g. column 11, lines 20-30), [et. cetera.] . . . .”).
Bobick Decl. ¶ 13.
See Section 112, ¶ 2.
score, both of which are addressed below.
Figures Three and Four of the ‘353 Patent Provide No
Support for Virtual’s Position
First, Virtual argues that figures three and four of the ‘353 Patent, and
their accompanying text, demonstrate how position information and a physical
characteristic signal containing position information may be used together to
generate a behavior vector.112 Figure three, “a block diagram of the details of the
behavioral module[,]”113 is a flowchart that shows that positional information is fed
into the behavioral module, which generates a behavior vector.
Figure four is described only as “a flow chart of the general steps
involved . . . .”114 It depicts a “sensor signal” being sent to an “interpreter,” which
in turn sends a “physical characteristic signal” to an “analyzer.” The analyzer is
depicted as also receiving input from a “behavior model,” which is not networked
with anything else on the flowchart. Finally, the analyzer’s output is termed a “behavior vector.”
See Opp. Mem. at 8 (“The mere fact that the physical characteristic
signal includes the position information does not mandate that the position
information, which is data, cannot be used independent of the physical
characteristic signal. This is highlighted by the intrinsic evidence and, in
particular, viewing Figure 3, and its corresponding description, in conjunction with
Figure 4, and its corresponding description.”) (citation omitted).
‘353 Patent at col. 2:46-47.
Id. at col. 2:48.
According to Virtual, the fact that figure four shows the analyzer
receiving input from the “behavior model” as well as the physical characteristic
signal demonstrates that position information is both a part of, and distinct from,
the physical characteristic signal.115 This is so because, in Virtual’s view, the
“behavior model” depicted in figure four is the same as the “behavioral module”
depicted in figure three, and the behavioral module contains the position information.
Virtual supports this view by quoting the description accompanying
figure four. It states: “[a]n interpreter 66 filters and analyzes the raw signals from
these sensors and produces a physical characteristic signal . . . . The analyzer 67
reads from the behavioral module 68 and produces a rendering vector which will
be used to display the environment.”117
There are two discrepancies between figure four and its accompanying
text. First, the text refers to “behavioral module 68,” while figure four depicts
“behavior model” 68. Second, the text also states that the output of the “analyzer”
See Opp. Mem. at 9-10.
Id. at 10 (quoting ‘353 Patent, col. 7:5-10) (emphasis added).
is a “rendering vector,” while figure four depicts the output of the analyzer as a
“behavior vector.” Virtual does not identify, much less attempt to justify, these
discrepancies. Instead, it implicitly asserts that the text’s label for the component
(“behavioral module”), and the figure’s label for the output of the analyzer
(“behavior vector”), should be credited.118
The tension between figure four and its text may be resolved with
reference to the inputs and outputs of the figure’s “analyzer.” The input of the
“analyzer” is a “physical characteristic signal,” which implies that the “analyzer” is
related (if not identical) to the “behavioral module,” because the behavioral module
has the same input.119
The text states that the output of the “analyzer” is a “rendering
vector.” As discussed above, a “rendering vector” is the output that the
“biophysical model action generator” produces after it has received information
from the “behavioral module” (which has in turn processed a “physical
characteristic signal” containing position information). If the output of the
See id. (“As a result, Figure 4 depicts exactly what is claimed: the
generation of a behavior vector for at least one virtual actor using: (1) position
information (from the Behavioral Module); and (2) the physical characteristic
signal that also includes the position information.”) (emphasis added).
See ‘353 Patent at col. 3:29-31.
analyzer is a rendering vector, then the component that the text accompanying
figure four calls “behavioral module 68” must be the biophysical model action
generator. But this reading makes no sense, because the biophysical model action
generator receives inputs from the behavioral module, and not vice versa.120
Figure four – as opposed to the text – identifies the output of the
“analyzer” as a “behavior vector.” This is also the output of the behavioral
module, according to figure three, which reinforces my earlier conclusion that the
analyzer is related to the behavioral module.
This still leaves the question of what the “behavior model” referred to
by figure four is. Whatever it is, it is plainly distinct from both “position
information” and the “physical characteristic signal.” The summary section of the
‘353 patent states that one of the functions of the claimed invention is: “analyzing
the at least one physical characteristic signal and the behavior model for said at
least one virtual actor to generate a behavior vector.”121 Likewise, claim 13 states
“wherein said generating a behavior vector comprises adding a reaction for each
physical characteristic signal, using said behavior model and said position
See id. at col. 3:33-36.
Id. at col. 2:31-34 (emphasis added).
information to an overall reaction to generate a behavior vector.”122
The nature of the “behavior model” is revealed by claim 19, which
states: “[a] method as claimed in claim 18, wherein said behavior model comprises
psychological factors, wherein said psychological factors are at least one of age
factor, hunger, thirst, sleepiness, attention span and disability. . . .”123 In other
words, the “behavior model” of a virtual actor relates to its disposition and psychological state.
The most plausible reading of figure four, then, is that it means what it
says. The “behavior model” in figure four, which is fed into the “analyzer,” really
is the “behavior model” of the virtual actor. It does not contain a “physical
characteristic signal,” or position information about the user. And the “analyzer”
takes input from it, and takes a separate input from the “physical characteristic
signal,” in generating a “behavior vector.”
Figure four does not show, as Virtual had hoped, that position
information and the physical characteristic signal (containing, or consisting wholly
of, position information) are both used jointly to generate a behavior vector. And
even if it did, it is doubtful that a figure with such a muddled explanation would
Id. at col. 16:62-67.
Id. at col. 18:1-4.
resolve the ambiguity in claim 1 to a skilled artisan. In short, Virtual’s attempt to
clarify ambiguous claim language with a confusing reference to two figures and
their accompanying text fails.
The Claims of the ‘353 Patent Do Not Indicate How Position
Information May Be Both Part of and Distinct from the
“Physical Characteristic Signal”
Virtual’s final argument is that certain other claims – namely 10, 13,
and 15 – indicate that position information is to be used in different ways.124 The
central flaw in this argument is that it does not respond to Microsoft’s point that
claim 1 requires that two elements (position information and the physical
characteristic signal, which may just be position information) generate one
function (a behavior vector), but does not disclose the relationship between these
elements. Moreover, the claims that Virtual points to in order to resolve the
ambiguity in claim 1 are, themselves, ambiguous.
Claim 10 discloses a “new actor creation module” that creates a new
virtual actor using the physical characteristic signal, the behavior model, and
position information.125 As with claim 1, it is unclear why claim 10 requires that
See Opp. Mem. at 10-11.
Claim 10 discloses “[a] method as claimed in claim 9, further
comprising a step of providing a new actor creation module, wherein said module
creates a new actor in said interactive theater using said at least one physical
position information be used twice to generate its output. It is possible that the
position information contained in the physical characteristic signal might be
distinct from the explicitly referenced position information, in that the latter term
might refer to the position of the virtual actors within the simulation.126 But this is
just conjecture based on a poorly written specification. Virtual is adamant that
“[p]osition information is . . . data that pertains to the location of visitors in the
theater area[,]”127 and the weight of the specification supports this view, as does the
consistency canon.128 Furthermore, even if “position information” in claim 10 may
be re-written as “position information [about the virtual actor(s)],” there is nothing
in the specification to indicate that the same holds true for limitation 1[d].
Limitation 1[d] relates to generating a behavioral vector, not a new actor.
Claim 13 relates to generating a behavior vector, but it does not
illuminate the relationship between the “physical characteristic signal” and
characteristic signal, said behavior model and said position information.” ‘353
Patent at col. 16:53-56.
See id. at col. 3:64-67 (“Once the Virtual Environment Stimulus
Generator 27 decides that a new actor should be created, a signal is sent to the new
actor creation module 29.”). As I stated above, the “virtual environment stimulus
generator” is connected to a component that tracks the positions of the virtual
actors within the simulation.
Opp. Mem. at 7.
See Rexnord Corp., 274 F.3d at 1342.
“position information” in limitation 1[d]. It states: “[a] method as claimed in claim
12, wherein said generating a behavior vector comprises adding a reaction for each
physical characteristic signal, using said behavior model and said position
information to an overall reaction to generate a behavior vector.”129
The specification provides some help in understanding claim 13. As
previously stated, the behavior module generates a behavior vector, and takes input
from the physical characteristic signal. And the specification teaches that the
“overall reaction generator” is the final component of the behavioral module,130 the
function of which is to sum the reactions of all virtual actors prior to sending a
behavior vector to the biophysical model action generator.131 Presumably, then, the
“overall reaction” of claim 13 is the output of the “overall reaction generator.”
Even with this understanding, claim 13 makes no sense. Like claim
10, it might relate to the position information of virtual actors, because it depends
on claim 11, “wherein said virtual environment stimulus is added to said at least
‘353 Patent at col. 16:63-67.
See id. at fig. 3.
See id. at col. 6:28-32 (“All the reactions are then combined together
57 in an overall reaction by the overall reaction generator 59 and will be fed to the
Biophysical Model action generator module 61.”).
one physical characteristic signal.”132 But this only heightens the ambiguity of
claim 13: if position information about both users and virtual actors is contained
within the physical characteristic signal, it does not make sense to “add a reaction
for each physical characteristic signal,” and also, separately, “us[e] . . . said
position information[,]”133 as claim 13 requires. And nothing in the specification
resolves this problem. For example, the Patent gives no indication as to what
“position information to an overall reaction” may refer to.
In sum, claim 13 suffers from the same deficiency as claim 1: it
requires that a function be generated by two apparently identical elements, without
describing the relationship between those elements. Claim 15 also suffers from
this deficiency, and likewise does not support Virtual’s argument.134 As such, none
of the claims cited by Virtual succeed in resolving claim 1’s ambiguity.
Microsoft has the burden of demonstrating, by clear and convincing
Id. at col. 16:57-59.
Id. at col. 16:63-67.
See id. at col. 17:6-11 (“A method as claimed in claim 12, wherein
each physical characteristic signal of a similar type is summed into a physical
signal for said type and wherein a reaction for said physical signal for said type is
calculated, using said behavior model and said position information to an overall
reaction to generate a behavior vector.”).
evidence, that claim 1 is invalid.135 In light of the foregoing discussion, I conclude
that it has met that burden. Section 112, ¶ 2 requires that the inventor disclose
what she regards to be her invention with sufficient definiteness that “one skilled in
the art would understand the bounds of the claim when read in light of the
specification.”136 The inventor of the ‘353 Patent failed to make this disclosure.
As Microsoft points out, claim 1 requires that two apparently
contradictory statements hold true: “position information” must be simultaneously
part of, and used separately from, the “physical characteristic signal” to generate a
behavior vector. Bobick, a person having ordinary skill in the relevant field at the
relevant time, has stated that, in the context of claim 1, “one of ordinary skill in the
art would view it as critical to know the relationship between [position information
and the physical characteristic signal].”137 Virtual has offered no evidence or
argument to refute Microsoft’s contention that the ‘353 Patent does not disclose this relationship.
A court may not “re-write claims to preserve their validity[,]” even if
See i4i Ltd. P’ship, 131 S.Ct. at 2252.
Personalized Media Commc’ns, LLC v. International Trade Comm’n,
161 F.3d 696, 705 (Fed. Cir. 1998) (citations omitted).
Bobick Decl. ¶ 13.
it is plain that the inventor did not mean what she wrote.138 Virtual’s expert has
identified a plethora of reasons why, in a hypothetical patent, it might make sense
to use position information in different ways. The ‘353 Patent, though, does not
disclose why position information must be used twice to generate a behavior
vector. Claim 1 simply states that position information and the physical
characteristic signal (which might contain only position information) are both to be
used to generate a behavior vector; and the balance of the patent does nothing to
resolve the paradox. The public would be ill-served if Virtual were allowed to
retain a monopoly on a patent that fails to disclose its scope. I therefore hold that
claim 1 and all of its dependent claims are invalid.
Because claim 1 is the only independent claim of the ‘353 Patent, I
need not consider the parties’ contentions relating to claim 8. The same holds true
for the parties’ claim construction contentions.
For the reasons stated above, Microsoft’s motion for summary
judgment is granted. The Clerk of the Court is directed to close the motion (Doc.
Allen Eng’g Corp. v. Bartell Indus., Inc., 299 F.3d 1336, 1349 (Fed.
Cir. 2002) (finding claim invalid when it used the word “perpendicular” to describe
an element that could only logically be parallel, despite the fact that a person
having ordinary skill in the art would understand the inventor to mean “parallel”).
No. 43) and the case.
Dated: New York, New York
February 14, 2013
- Appearances -
For Virtual Solutions, LLC:
Timothy E. Grochocinski, Esq.
1900 Ravinia Place
Orland Park, IL 60462
Anthony G. Simon, Esq.
Michael P. Kella, Esq.
THE SIMON LAW FIRM, P.C.
800 Market Street, Suite 1700
Saint Louis, MO 63101
Harold Y. MacCartney, Jr., Esq.
MacCARTNEY, MacCARTNEY, KERRIGAN & MacCARTNEY
13 North Broadway, P.O. Box 350
Nyack, NY 10960
For Microsoft Corp.:
Ruffin B. Cordell, Esq.
Lauren A. Degnan, Esq.
Cherylyn Esoy Mizzo, Esq.
Robert Courtney, Esq.
FISH & RICHARDSON P.C.
1425 K Street N.W., 11th Floor
Washington, DC 20005
Jonathan A. Marshall, Esq.
Leah A. Edelman, Esq.
FISH & RICHARDSON P.C.
601 Lexington Avenue, 52nd Floor
New York, NY 10022