Apple Inc. v. Samsung Electronics Co. Ltd. et al
Filing
1183
Administrative Motion to File Under Seal Samsung's Claim Construction Brief filed by Samsung Electronics Co. Ltd.. (Attachments: #1 Trac Declaration in Support of Motion to Seal, #2 Proposed Order, #3 Samsung's Claim Construction Brief, #4 Schmidt Declaration in Support of Claim Construction Brief, #5 Exhibit 1, #6 Exhibit 2, #7 Exhibit 3, #8 Exhibit 4, #9 Exhibit 5, #10 Exhibit 6, #11 Exhibit 7, #12 Exhibit 8, #13 Exhibit 9, #14 Exhibit 10, #15 Exhibit 11, #16 Exhibit 12, #17 Exhibit 13, #18 Exhibit 14)(Maroulis, Victoria) (Filed on 7/5/2012)
EXHIBIT 7
U 7302677
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
June 28, 2011
THIS IS TO CERTIFY THAT ANNEXED HERETO IS A TRUE COPY FROM
THE RECORDS OF THIS OFFICE OF:
U.S. PATENT: 7,864,163
ISSUE DATE: January 04, 2011
By Authority of the
Under Secretary of Commerce for Intellectual Property
and Director of the United States Patent and Trademark Office
LLIAMS
Certifying Officer
(12) United States Patent
Ording et al.

(10) Patent No.: US 7,864,163 B2
(45) Date of Patent: Jan. 4, 2011

(54) PORTABLE ELECTRONIC DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR DISPLAYING STRUCTURED ELECTRONIC DOCUMENTS

(75) Inventors: Bas Ording, San Francisco, CA (US); Scott Forstall, Mountain View, CA (US); Greg Christie, San Jose, CA (US); Stephen O. Lemay, San Francisco, CA (US); Imran Chaudhri, San Francisco, CA (US); Richard Williamson, Los Gatos, CA (US); Chris Blumenberg, San Francisco, CA (US); Marcel Van Os, San Francisco, CA (US)

(73) Assignee: Apple Inc., Cupertino, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 688 days.

(21) Appl. No.: 11/850,013

(22) Filed: Sep. 4, 2007

(65) Prior Publication Data: US 2008/0094368 A1, Apr. 24, 2008

Related U.S. Application Data
(60) Provisional application No. 60/937,993, filed on Jun. 29, 2007, provisional application No. 60/946,715, filed on Jun. 27, 2007, provisional application No. 60/879,469, filed on Jan. 8, 2007, provisional application No. 60/879,253, filed on Jan. 7, 2007, provisional application No. 60/824,769, filed on Sep. 6, 2006.

(51) Int. Cl.: G06F 3/041 (2006.01)
(52) U.S. Cl.: 345/173; 715/234; 715/781
(58) Field of Classification Search: 345/173-178; 178/18.01-18.09, 18.11; 715/810, 828-831, 715/234, 781, 700. See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
6,025,842 A    2/2000   Filetto et al. ............... 345/345
(Continued)

FOREIGN PATENT DOCUMENTS
EP    0476972 A2    3/1992
(Continued)

OTHER PUBLICATIONS
Milic-Frayling, N. et al., "Smartview: Enhanced Document Viewer for Mobile Devices," Microsoft Technical Report, Nov. 15, 2002, URL: ftp://ftp.research.microsoft.com/pub/tr/tr-2002-114.pdf, retrieved Dec. 17, 2007.
(Continued)

Primary Examiner-Stephen G Sherman
(74) Attorney, Agent, or Firm-Morgan, Lewis & Bockius LLP

(57) ABSTRACT
A computer-implemented method, for use in conjunction with a portable electronic device with a touch screen display, comprises displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content, and detecting a first gesture at a location on the displayed portion of the structured electronic document. A first box in the plurality of boxes at the location of the first gesture is determined. The first box on the touch screen display is enlarged and substantially centered.

61 Claims, 29 Drawing Sheets

[Representative drawing: browser user interface showing a web page (http://www.company.com/start) with content blocks labeled Block 1 through Block 8.]
U.S. PATENT DOCUMENTS

6,073,036 A     6/2000   Heikkinen et al. ............ 455/575
6,177,936 B1    1/2001   Cragun ...................... 345/340
6,199,082 B1    3/2001   Ferrel et al. ............... 707/522
6,243,080 B1    6/2001   Molne ....................... 345/173
6,262,732 B1    7/2001   Coleman et al. .............. 345/348
6,326,970 B1   12/2001   Mott et al. ................. 345/439
6,349,410 B1    2/2002   Lortz ....................... 725/110
6,359,615 B1    3/2002   Singh ....................... 345/173
6,411,283 B1    6/2002   Murphy ...................... 345/173
6,466,198 B1   10/2002   Feinstein ................... 345/158
6,466,203 B2   10/2002   Van Ee ...................... 345/173
6,489,975 B1   12/2002   Patil et al. ................ 345/781
6,570,583 B1    5/2003   Kung et al. ................. 345/661
6,613,100 B2    9/2003   Miller ...................... 715/526
6,639,584 B1   10/2003   Li .......................... 345/173
6,771,250 B1    8/2004   Oh .......................... 345/156
6,928,461 B2    8/2005   Tuli ........................ 709/203
7,054,965 B2    5/2006   Bell et al. ................. 710/72
7,075,512 B1    7/2006   Fabre et al. ................ 345/156
7,149,549 B1   12/2006   Ortiz et al. ................ 455/566
7,166,791 B2    1/2007   Robbin et al. ............... 84/477
7,346,855 B2    3/2008   Hellyar et al. .............. 715/783
7,461,353 B2   12/2008   Rohrabaugh et al. ........... 715/815
2002/0152283 A1   10/2002   Dutta et al. .............. 709/218
2003/0033331 A1    2/2003   Sena et al. ............... 707/513
2003/0095135 A1    5/2003   Kaasila et al. ............ 345/613
2003/0095155 A1    5/2003   Johnson ................... 345/864
2003/0164861 A1    9/2003   Barbanson et al. .......... 345/815
2003/0193524 A1   10/2003   Bates et al. .............. 345/786
2004/0103371 A1    5/2004   Chen et al. ............... 715/513
2004/0143796 A1    7/2004   Lerner et al. ............. 715/538
2004/0169674 A1    9/2004   Linjama ................... 345/702
2004/0201595 A1   10/2004   Manchester ................ 345/649
2005/0071364 A1    3/2005   Xie et al. ................ 707/102
2005/0071778 A1    3/2005   Tokkonen .................. 715/822
2005/0079896 A1    4/2005   Kokko et al. .............. 345/702
2005/0093826 A1    5/2005   Huh ....................... 345/168
2005/0114788 A1    5/2005   Fabritius ................. 715/767
2005/0166232 A1    7/2005   Lamkin et al. ............. 725/43
2005/0177783 A1    8/2005   Agrawala et al. ........... 715/512
2005/0285880 A1   12/2005   Lai et al. ................ 345/660
2006/0026521 A1    2/2006   Hotelling et al. .......... 715/702
2006/0033761 A1    2/2006   Suen et al. ............... 345/660
2006/0064647 A1    3/2006   Tapuska et al. ............ 715/800
2006/0085743 A1    4/2006   Baudisch et al. ........... 715/526
2006/0097991 A1    5/2006   Hotelling et al. .......... 345/173
2006/0101354 A1    5/2006   Hashimoto et al. .......... 715/863
2006/0125799 A1*   6/2006   Hillis et al. ............. 345/173
2006/0143574 A1    6/2006   Ito et al. ................ 715/800
2006/0146016 A1    7/2006   Chan et al. ............... 345/156
2006/0146038 A1    7/2006   Park et al. ............... 345/173
2006/0197753 A1    9/2006   Hotelling ................. 345/173
2006/0277588 A1   12/2006   Harrington et al. ......... 725/ 35
2006/0284852 A1*  12/2006   Hofmeister et al. ......... 345/173
2007/0038612 A1    2/2007   Sull et al. ............... 707/3
2007/0152984 A1    7/2007   Ording et al. ............. 345/173
2007/0155434 A1    7/2007   Jobs et al. ............... 455/ 65
2007/0157228 A1    7/2007   Bayer et al. .............. 725/34
2007/0250768 A1*  10/2007   Funakami et al. ........... 715/ 21
FOREIGN PATENT DOCUMENTS

EP    0 651 544    A2     5/1995
EP    0 701 220    A1     3/1996
EP    0 880 090    A2    11/1998
EP    0 990 202    A1     4/2000
EP    1 049 305    A1    11/2000
EP    1 517 228    A2       2005
EP    1 632 874    A2       2006
EP    1 752 880    A1       2007
JP    11 143604           5/1999
JP    2000181436          6/2000
JP    2005 267049         9/2005
WO    WO 99/54807      A1    10/1999
WO    WO 02/46903      A1     6/2002
WO    WO 02/082418     A2    10/2002
WO    WO 02/093542     A1    11/2002
WO    WO 02093542      A1*   11/2002
WO    WO 03/052626     A1     6/2003
WO    WO 2004/021166   A1     3/2004
WO    WO 2004/040481   A1     5/2004
WO    WO 2005/036416   A2     4/2005
WO    WO 2005/074268   A1     8/2005
WO    WO 2005/106684   A1    11/2005
WO    WO 2006/003591   A2     1/2006

OTHER PUBLICATIONS
Holmquist, L., "The Zoom Browser Showing Simultaneous Detail and Overview in Large Documents," Human IT, 1998, URL: http://www.hb.se/bhs/ith/3-98/leh.htm, retrieved Dec. 17, 2007.
Khella, A. et al., "Pocket PhotoMesa: A Zoomable Image Browser for PDAs," Proceedings of the 3rd International Conference on Mobile and Ubiquitous Multimedia, Oct. 29, 2004, pp. 19-24, URL: http://delivery.acm.org/10.1145/1060000/1052384/p19-khella.pdf?key1=1052384&key2=2419987911&coll=GUIDE&dl=GUIDE&CFID=47073625&CFTOKEN=65767142, retrieved Dec. 17, 2007.
Invitation to Pay Additional Fees for International Application PCT/
US2007/077644, dated Jan. 23, 2008.
Cooper, A., "The Inmates Are Running the Asylum," Sams Publishing, 1999, pp. 138-147.
European Search Report dated Jan. 26, 2010, received in European
Application No. 09171787.6-2212, which corresponds to U.S. Appl.
No. 11/850,013.
Versiontracker, "Photogather-7.2.6. Hi-res Image Viewer & Editor
for Palm," printed Jun. 12, 2006, 5 pages, http://www.versiontracker.
com/dyn/moreinfo/palm/4624.
Office Action dated Sep. 28, 2009, received in Australian Patent
Application No. 2009100760, which corresponds to U.S. Appl. No.
11/850,013.
Agarwal, A., "iTunesInlineVideo," Digital Inspiration-The Tech Guide, 27 pages, http://labnol.blogspot.com/2006_09_17_labnol_archive.html.
Ahmad I. et al., "Content-Based Image Retrieval on Mobile
Devices," Proc. Of SPIE-IS&T Electronic Imaging, vol. 5684, 2005,
10 pages.
Alam, H. et al., "Web Document Manipulation for Small Screen
Devices: A Review," 4 pages, http://www.csc.liv.ac.uk/~wda2003/
Papers/Section_II/Paper_8.pdf, 2003.
Apparao, V. et al., "Level 1 Document Object Model Specification,"
W3C Working Draft Jul. 20, 1998, 3 pages, http://www.w3.org/TR/
WD-DOM/.
Bos, B. et al., "3 Conformance: Requirements and Recommendations," Cascading Style Sheets, level 2 CSS2 Specification, W3C
Recommendation, May 12, 1998, 6 pages, http://www.w3.org/TR/
CSS21/conform.html#doctree.
Chen, L. et al., "DRESS: A Slicing Tree Based Web Representation for Various Display Sizes," Microsoft Research, Technical Report MSR-TR-2002-126, Nov. 16, 2002, 9 pages.
Eyemodule, "Turn Your Handspring Handheld into a Digital Camera," User's Manual, www.eyemodule.com, 9 pages, 2000.
Fling, B., "Designing for Mobile, Bringing Design Down to Size," Copyright © 2006 Blue Flavor.
Hart, K., "Rewriting the Web for Mobile Phones," washingtonpost.com, Jul. 26, 2006, 2 pages, http://www.washingtonpost.com/wp-dyn/content/article/2006/07/25/AR2006072501517_pf.html.
Hinckley et al., "Input/Output Devices and Interaction Techniques," Microsoft Research, 79 pages, 2003.
Laakko, T. et al., "Adapting Web Content to Mobile User Agents," IEEE Internet Computing, vol. 9, Issue 2, Mar./Apr. 2005, 8 pages.
Opera Software, "Download the Opera Mobile Browser," 5 pages, http://www.opera.com/products/mobile/products/, 2006.
Opera Software, "Opera7.60 for Series 60 Mobile," http://jp.opera.
com/support/tutorials/s60/760/O760manual.pdf 2009.
Palme, J. et al., "MIME Encapsulation of Aggregate Documents,
such as HTML," The Internet Society, 1999, 24 pages.
Raman, B. et al., "Application-specific Workload Shaping in Multimedia-enabled Personal Mobile Devices," CODES+ ISSS' 06, Oct.
22-25, 2006, Seoul, Korea, Copyright 2006 ACM, 6 pages.
Robie, J., "What is the Document Object Model?" Texcel Research,
5 pages, http://www.w3.orgTR-DOM/introduction.html,2006.
Rohrer, T., "Metaphors We Compute by: Bringing Magic into Interface Design," http://www.uoregon.edu/-uophillmetaphor/gui4web.
htm, printed Jun. 13, 2006, 7 pages.
Roto, V. et al. "Minimap-A Web Page Visualization Method for
Mobile Phones," CHI 2006, Nokia Research Center, Apr. 22-27,
2006, 10 pages.
Salmre, I., "Chapter 2, Characteristics of Mobile Applications,"
Salme_02.fm, pp. 19-36, Dec. 20, 2004.
Schreiner, T., "High DPI in IE: Tip & Mystery Solved " Tony
Schreiner's Weblog, May 2004, 2 pages, http://blogs.msdn.com/
tonyschr/archive/2004/05/05/126305.aspx.
Stampfli, T., "Exploring Full-Screen Mode in Flash Player 9," Jan. 5,
2007,
http://web.archive.org/web20070105231635/http://www.
adobe.com/devnet/flashplayer/articles/fullis screen_mode.html.
Stanek, W. et al., "Chapter 9, Video and Animation Plug-Ins," Web
Publishing Professional Reference Edition, copyright 1997 by Sams.
net Publishing, http://www.ssuet edu.pk/taimoor/books/1-57521198-X/index.htm.
Stanek, W. et al., "Chapter 22, Adding MultimediatoYourWeb Site,"
Web Publishing Professional Reference Edition, copyright 1997 by
Sams.net Publishing, http://www.ssuet.edu.pk/taimoor/books/157521-198-X/index.htm.
Surfin'Safari, "XUL," 7 pages, Oct. 2003, http://weblogs.mozillazine.org/hyatt/archives/2003_10.html.
w3 schools.com, "Multimedia Video Formats," www.w3sschools.
com/medialmedia videoformats.asp?output=print,2006.
w3schools.com, "Playing QuickTime Movies;' http://www.
w3schools.com/media/media quicktime.asp?output=print,2006.
w3schools.com, "Playing Videos On The Web," www.w3schools.
com/medialmedia_browservideos.asp?out=print, 2006.
Wave Technologies, "Certified Internet Webmaster Foundations
Study Guide," Wave Technologies Internation, Inc., a Thomson
Learning company, copyright 1988-2000.
Warabino, T. et al., "Video Transcoding Proxy for 3GwirelessMobile
InternetAccess," IEEE CommunicationsMagazine,vol. 38, Issue 10,
Oct. 2000, 6 pages.
weblogs, "Chapter 1: Downloading and Building WebCore,"
WebCore documentation, 2 pages, http://weblogs.mozillazine.org/
hyatt/WebCore/chapter1.html, 2006.
weblogs, "Chapter 2: An Overview of WebCore," WebCore documentation, 3 pages, http://weblogs.mozillazine.org/hyatt/WebCore/
chapter2.html, 2006.
webmasterworld.com, "Page Zooming with IE," webmasterworld.
com, Jul. 2004, 7 pages, http://www.webmasterworld.com/forum83/
4179.htm.
Wikipedia, "Comparison of Layout Engines," Wikipedia, the free
encyclopedia, 3 pages, http://en.wikipedia.org/wiki/Comparison
of_loyout_engines, 2005.
Wikipedia, "KDE,"Wikipedia, the free encyclopedia, 9 pages, http://
en.wikipedia.org/wiki/KDE, 2004.
Wikipedia, "
," Wikipedia, the free encyclopedia, 3 pages,
http://en.wikipedia.org/wiki/KDHTML,2004.
Wikipedia, "List ofLayout Engines," 1 page, http://en.wikipedia.org/
wiki/List_of_layout_engines, 2005.
Wobbrock, J. et al., "WebThumb: Interaction Techniques for SmallScreen Browsers," Human Computer Interaction Institute and School
of Design, Carnegie Mellon University, {jrock, forlizzi, scott.
hudson, bam}@cs.cmu.edu, Oct. 27-30, 2002, 4 pages.
Yin, X. et al., "Using Link Analysis to Improve Layout on Mobile
Devices," ww w2004, May 17-22, 2004, 7 pages, http://www.
iw3c2.org/WWW2004/docs/1p338.pdf.
Xiao, X. et al., "Slicing*-Tree Based Web Page Transformation for
Small Displays," CIKM'05, Oct. 31-Nov. 5, 2005, Bremen, Gero
many, 2 pages.
Xie, X. et al., "Efficient Browsing ofWeb Search Results on Mobile
Devices Based on Block Importance Model;' Microsoft Research
Asia, mgx03@ mails.tsinghua.edu.cn, 10 pages, 2005.
Zhiwei et al., "Zoom Selector: A Pen-basedInteractionTechnique for
Small Target Selection," Transactions of the Information Processing
Society of Japan, Aug. 2004, vol. 45, No. 8, pp. 2087-2097, Inf.
Process. Soc. Japan, ISSN 0387-5806.
International Search Report and Written Opinion for International
Application No. PCT/lJS2007/088893, mailed Jul. 11, 2008.
Designing Interfaces.com, "Animated Transition," http://
designinginterfaces.com/Animated_Transition, printed Oct. 16,
2006, 2 pages.
Alejandre, S., "Graphing Linear Equations," http://mathforum.org/
alejandre/palm/times.palm.html, printed Jun. 12, 2006, 3 pages.
Baudisch, P., "Collapse-to-Zoom: Viewing Web Pages on Small
Screen Devices by Interactively Removing Irrelevant Content," Oct.
24-27, 2004, 4 pages.
Bitstream®, "ThunderHawk Pocket PC Edition for End Users," http://www.bitstream.com/wireless/products/pocketpc/faq_using.html, printed Jun. 12, 2006, 4 pages.
Buyukkokten, O. et al., "Power Browser: Efficient Web Browsing for
PDAs," Digital Libraries Lab (InfoLab), Stanford University,
Stanford, CA, 8 pages, 2002.
Chen, H. et al, "A Novel Navigation and Transmission Technique for
Mobile Handheld Devices," 8 pages, 2002.
Chen, Y., "Detecting Web Pages Structure for Adaptive Viewing on
Small
Form
Factor
Devices,"
Microsoft
Research,
i-yuchen@microsoft.com, May 20-24, 2003, 9 pages.
Coolsmartphone, "Orange SPV C600 Review," http://www
coolsmartphone.com/articleS69.html, Apr. 14, 2006, 58 pages.
Getgreg, "JeffHan's Multiple Touch Point Display, the StuffDreams
are
Made
of,"
http://www.theyshoulddothat.com2006/08/
jeff_hanns multiple_touch_poin.html>,Aug. 16, 2006, 2 pages.
Han, J., "Talks Jeff Han: Unveiling the Genius of Multi-touch Interface Design," Ted Ideas Worth Spreading, http://www.ted.com/
inhttp://www.ted.com/index.php/talks/view/id/65>,Aug. 6, 2006, 1
page.
Karlson et al., "AppLens and Lunch Tile: Two Designs for Onehanded Thumb Use on Small Devices," http://heil.cs.umd.edu/trs/
2004-37.html, printed Jun. 12, 2006, 11 pages.
Landragin, F., "The Role of Gesture in Multimodal Referring
Actions," Proceedings ofthe Fourth IEEE Internaational Conference
on Multimodal Intèrfaces, http://ieeexplore.iee.org/ie15/8346il
26309/01166988pdf?arnumber=116i6988>,2002, 6 pages.I.
Lie, H., "Cascading Style Sheets," http://people.opera.com/
howcome/2006/phd/css.pdf 2005, pp. 243-247.
Milic-Frayling, N. et al., "Smartview: FlexibleViewing ofWeb Page
Contents," The Eleventh International World Wide Web Conference,
http://www2002.org/CDROM/poster/172/>,May 11, 2002, 4 pages.
Opera Software, "Opera for Mobile, The Full Web Anytime, Anywhere," www.opera.com/mobile, Jan. 2006, 7 pages.
Opera Software, "Opera for S60 Tutorial," http://www.opera.com/
support/tutorials/260/, Apr. 5, 2006, 5 pages.
Opera-Press Releases Database, "The New Opera Browser for
Series 60 Features Zoom and Password Manager," Nov. 14, 2005, 3
pages.
Opera Software, "Opera for Windows Mobile Smartphone 2003
Tutorial," http://www.opera.com/support/tutorials/winmobile, Apr.
5, 2005, 4 pages.
Opera Software, "Opera 8.5 Beta2 forWindows Mobile, Pocket PC,"
http://www.opera.com/products/mobile/products/winmobileppc,
Apr. 5, 2006, 2 pages.
Opera, "Opera 8.5 for S60 Phones-Get the Full Internet Experience
onYourMobilePhone,"http://www.symbian-freak.com/news/1105/
opera.htm, Apr. 5, 2006, 3 pages.
International Search Report and Written Opinion for International
Application No. PCT/US2007/077644, mailed May 30, 2008.
International Search Report and Written Opinion dated Jun. 30, 2008, received in International Application PCT/US2007/088879 which corresponds to U.S. Appl. No. 11/620,647.
Office Action dated Nov. 17, 2009, received in U.S. Appl. No.
11/620,647.
Office Action dated Jun. 24, 2010, received in U.S. Appl. No.
11/620,647.
Examiner's Report dated Mar. 24, 2010, received in Australian Patent
Application No. 2007292383, which corresponds to U.S. Appl. No.
11/850,013, 2 pages.
Office Action dated Jun. 7, 2010, received in German Patent Application No. 11 2007 002 107.1-53, which corresponds to U.S. Appl.
No. 11/850,013.
Office Action dated Jun. 21, 2010, received in European Application
No. 07 814 690.9-2212, which corresponds to U.S. Appl. No.
11/850,013.
Office Action dated Oct. 13, 2010, received in Chinese Patent Application No. 20078004122.6, which corresponds to U.S. Appl. No.
11/850,013.
Office Action dated Oct. 19, 2010, received in European Application
No. 07 814 690.9, which corresponds to U.S. Appl. No. 11/850,013.
* cited by examiner
[Sheet 1 of 29, Figure 1A: block diagram of portable multifunction device 100 with touch-sensitive display system 112. Memory 102 holds the operating system 126, communication module 128, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and applications 136, including a contacts module 137, telephone module 138, video conference module 139, e-mail client module 140, instant messaging module 141, blogging module 142, camera module 143, image management module 144, video player module 145, music player module 146, browsing module 147, calendar module 148, widget modules 149 (weather 149-1, stocks 149-2, calculator 149-3, alarm clock 149-4, dictionary 149-5, and user-created widgets 149-6), widget creator module 150, and search module 151. Also shown: memory controller 122, processor(s) 120, peripherals interface 118, RF circuitry 108, audio circuitry with speaker 111 and microphone 113, proximity sensor 166, accelerometer(s) 168, power system 162, external port 124, and I/O subsystem 106 with display controller 156, optical sensor(s) controller 158, and other input controller(s) 160, coupled to the touch-sensitive display system 112, optical sensor(s) 164, and other input control devices 116 over buses/signal lines 103.]

[Sheet 2 of 29, Figure 1B: block diagram of an alternative embodiment of portable multifunction device 100, as in Figure 1A, in which the applications also include a video and music player module 152, notes module 153, and map module 154.]

[Sheet 3 of 29, Figure 2: portable multifunction device 100 with touch screen 112, home button 204, button 206, SIM card slot 210, headphone jack 212, speaker 111, microphone 113, optical sensor 164, proximity sensor 166, accelerometer(s) 168, and external port 124.]

[Sheet 4 of 29, Figure 3: exemplary user interface 300 for unlocking the device, with a "slide to unlock" unlock image and channel (302, 304, 306), current time 308, day and date 310, and wallpaper image 312.]

[Sheet 5 of 29, Figure 4A: exemplary user interface 400A for a menu of applications, with icons for IM 141, Photos 144, Camera 143, Videos 145, Weather 149-1, Stocks 149-2, Blog 142, Calendar, Calculator 149-3, Alarm 149-4, Dictionary 149-5, and a User-Created Widget 149-6, and a tray with Phone 138, Mail 140, Browser 147, and Music 146 icons.]

[Sheet 6 of 29, Figure 4B: exemplary user interface 400B for a menu of applications, with icons for Text, Calendar 148, Photos 144, Camera, Calculator, Stocks, Map 154, Weather, Notes 153, Clock, and Settings, and a tray with Phone, Mail, Browser, and iPod icons.]
[Sheet 7 of 29, Figure 5A: browser user interface 3900A showing web page 3912 (URL 3908: http://www.company.com/start) with content blocks 3914-1 through 3914-8 (Block 1 through Block 8) and interface elements 3902, 3906, 3910, 3916, 3918, 3920, 3922.]

[Sheet 8 of 29, Figure 5B: browser user interface 3900B (Welcome 3904) with a soft keyboard for entering a web address (http://www.company.com 3926) and Clear 3928, Search 3930, and Go to URL 3932 options.]

[Sheet 9 of 29, Figure 5C: browser user interface 3900C showing a portion of the web page with Block 4 (3914-4), Block 5 (3914-5), and Block 6 (3914-6), and gesture indicators 3929, 3939, 3941, 3943 at angles of less than and greater than 27 degrees.]

[Sheet 10 of 29, Figure 5D: browser user interface 3900D (further state of the web page display).]

[Sheet 11 of 29, Figure 5E: browser user interface 3900E (further state of the web page display).]

[Sheet 12 of 29, Figure 5F: browser user interface 3900F (web page content shown rotated to landscape orientation).]

[Sheet 13 of 29, Figure 5G: browser user interface 3900G (web page content shown rotated to landscape orientation).]

[Sheet 14 of 29, Figure 5H: browser user interface 3900H showing web page 3912 with blocks 3914-1 through 3914-8 and additional gesture indicators 3958, 3962, 3963, 3964.]

[Sheet 15 of 29, Figure 5I: browser user interface 3900I for entering a web address, with Share 3966 and Cancel options, a partially typed URL (http://www.nyt...), suggested completions, a soft keyboard, and elements 3970, 3976, 3980, 3982.]

[Sheet 16 of 29, Figure 5J: browser user interface 3900J for entering a search term, with Share 3966 and Cancel options, element 3984, and a soft keyboard with space and search keys.]

[Sheet 17 of 29, Figure 5K: browser user interface 3900K with Email Link, Email Content 3988, SMS Link 3990, and Cancel 3992 options.]

[Sheet 18 of 29, Figure 5L: browser user interface 3900L (further state of the web page display).]

[Sheet 19 of 29, Figure 5M: browser user interface 3900M (further state of the web page display).]
[Sheet 20 of 29, Figure 6A: flow diagram of process 6000 for displaying structured electronic documents.]

6002: Determine borders, margins, and/or paddings for a plurality of boxes that are specified in a structured electronic document (e.g., a web page) (e.g., an HTML or XML document).
6004: Adjust the borders, margins, and/or paddings for the plurality of boxes for display on a touch screen display of a portable electronic device.
6006: Display, on the touch screen display of the portable electronic device, at least a portion of the structured electronic document comprising the plurality of boxes of content.
6008: Scale the document width to fit within the touch screen display width independent of the document length.
6010: Detect a first gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) at a location on the displayed portion of the structured electronic document.
6012: Determine a first box in the plurality of boxes at the location of the first gesture.
6014: In a render tree associated with the structured electronic document, traverse down the render tree to determine a first node that corresponds to the detected location of the first gesture.
6016: Traverse up the render tree from the first node to a closest parent node that contains a logical grouping of content.
6018: Identify content corresponding to the closest parent node as the first box.
6020: Enlarge and substantially center the first box on the touch screen display.
6022: Simultaneously zoom and translate the first box.
6024: Expand the first box so that its width is substantially the same as the width of the touch screen display.
(Connector A, continued in Figure 6B.)
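To make the traversal and centering steps above concrete, here is a minimal TypeScript sketch. It is an illustration, not code from the patent: the Rect, RenderNode, and Viewport types, the isLogicalGroup flag, and the function names are all assumptions. It walks down a render tree to the deepest node under the tap (block 6014), walks back up to the closest node that groups content logically (blocks 6016-6018), and computes the scale and translation that expand that box to the display width and substantially center it (blocks 6020-6024).

```typescript
interface Rect { x: number; y: number; width: number; height: number; }

interface RenderNode {
  bounds: Rect;             // node bounds in document coordinates
  children: RenderNode[];
  isLogicalGroup: boolean;  // true for a block-level grouping of content
}

interface Viewport { width: number; height: number; }

function contains(r: Rect, x: number, y: number): boolean {
  return x >= r.x && x < r.x + r.width && y >= r.y && y < r.y + r.height;
}

// Traverse down the render tree: collect the chain of nodes containing the tap point (block 6014).
function hitPath(root: RenderNode, x: number, y: number): RenderNode[] {
  const path: RenderNode[] = [];
  let node: RenderNode | undefined = contains(root.bounds, x, y) ? root : undefined;
  while (node) {
    path.push(node);
    node = node.children.find((child) => contains(child.bounds, x, y));
  }
  return path;
}

// Traverse back up to the closest node that is a logical grouping of content (blocks 6016-6018).
function firstBoxAt(root: RenderNode, x: number, y: number): RenderNode | null {
  const path = hitPath(root, x, y);
  for (let i = path.length - 1; i >= 0; i--) {
    if (path[i].isLogicalGroup) return path[i];
  }
  return path.length > 0 ? path[0] : null;
}

// Compute the zoom and translation that expand the box to the display width and
// substantially center it (blocks 6020-6024).
function zoomToBox(box: Rect, view: Viewport): { scale: number; tx: number; ty: number } {
  const scale = view.width / box.width;
  const tx = view.width / 2 - (box.x + box.width / 2) * scale;
  const ty = view.height / 2 - (box.y + box.height / 2) * scale;
  return { scale, tx, ty };
}
```

Animating the returned scale and translation together would give the simultaneous zoom-and-translate of block 6022.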
[Sheet 21 of 29, Figure 6B: continuation of process 6000, from connector A.]

6026: Resize text in the enlarged first box or in the structured electronic document to meet or exceed a predetermined minimum text size on the touch screen display.
6028: Determine a scale factor.
6030: Divide the predetermined minimum text size by the scaling factor to determine a minimum text size.
6032: If a text size is less than the determined minimum text size, increase the text size to at least the determined minimum text size.
6034: Detect a second gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) on the enlarged first box.
6036: In response to detecting the second gesture, reduce in size the displayed portion of the structured electronic document.
6038: Return the first box to its size prior to being enlarged.
6040: While the first box is enlarged, detect a third gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) on a second box other than the first box.
6042: In response to detecting the third gesture, substantially center the second box on the touch screen display.
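A short sketch of the text-resizing rule in blocks 6026-6032, in the same illustrative TypeScript style (the TextRun type and the resizeText name are assumptions, not from the patent): the predetermined minimum is divided by the scale factor applied to the enlarged box, and any text below the result is increased to at least that size.

```typescript
interface TextRun { fontSize: number; }  // font size in document units

// Blocks 6026-6032: after enlarging the first box by `scaleFactor`, make sure no text
// renders below a predetermined minimum size on the touch screen display.
function resizeText(runs: TextRun[], scaleFactor: number, predeterminedMinimum: number): void {
  // Divide the predetermined minimum text size by the scale factor to get the
  // minimum text size in document units (blocks 6028-6030).
  const minimumTextSize = predeterminedMinimum / scaleFactor;
  for (const run of runs) {
    if (run.fontSize < minimumTextSize) {
      run.fontSize = minimumTextSize;  // block 6032: increase to at least the minimum
    }
  }
}
```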
[Sheet 22 of 29, Figure 6C: continuation of process 6000.]

6044: Detect a swipe gesture (e.g., a finger or stylus gesture) on the touch screen display.
6046: In response to detecting the swipe gesture, translate the displayed portion of the structured electronic document on the touch screen display.
6048: Move the structured electronic document vertically, horizontally, or diagonally on the touch screen display.
6050: Detect a fifth gesture (e.g., a finger or multifinger gesture) (e.g., a twisting gesture) on the touch screen display.
6052: In response to detecting the fifth gesture, rotate the displayed portion of the structured electronic document on the touch screen display by 90°.
6054: Detect a change in orientation of the device.
6056: In response to detecting the change in orientation of the device, rotate the displayed portion of the structured electronic document on the touch screen display by 90°.
6058: Detect a multi-finger de-pinch gesture on the touch screen display.
6060: In response to detecting the multi-finger de-pinch gesture, enlarge a portion of the displayed portion of the structured electronic document on the touch screen display in accordance with a position of the multi-finger de-pinch gesture and an amount of finger movement in the multi-finger de-pinch gesture.
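The gesture responses in Figure 6C can likewise be sketched. This is an illustrative TypeScript model, not the patent's implementation; the DocumentView type, the handler names, and the spreadRatio parameter are assumptions. It translates the displayed portion on a swipe, rotates it by 90 degrees on a twisting gesture or an orientation change, and enlarges it about the de-pinch position in proportion to the finger movement.

```typescript
type Point = { x: number; y: number };

interface DocumentView {
  offset: Point;     // current translation of the displayed portion
  rotation: number;  // rotation in degrees
  scale: number;     // current zoom level
}

// Swipe: translate the displayed portion vertically, horizontally, or diagonally (blocks 6046-6048).
function onSwipe(view: DocumentView, delta: Point): void {
  view.offset = { x: view.offset.x + delta.x, y: view.offset.y + delta.y };
}

// Twisting gesture or device orientation change: rotate the displayed portion by 90° (blocks 6052, 6056).
function onRotate(view: DocumentView): void {
  view.rotation = (view.rotation + 90) % 360;
}

// Multi-finger de-pinch: enlarge in accordance with the gesture position and the
// amount of finger movement (block 6060). spreadRatio > 1 means the fingers moved apart.
function onDePinch(view: DocumentView, focus: Point, spreadRatio: number): void {
  const newScale = view.scale * spreadRatio;
  // Keep the content under the gesture focus stationary while zooming.
  view.offset = {
    x: focus.x - (focus.x - view.offset.x) * (newScale / view.scale),
    y: focus.y - (focus.y - view.offset.y) * (newScale / view.scale),
  };
  view.scale = newScale;
}
```

Keeping the content under the gesture focus stationary while scaling is one way to honor "in accordance with a position of the multi-finger de-pinch gesture" in block 6060.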
[Sheet 23 of 29, Figure 7A: browser user interface 4000A showing web page 3912 (http://www.company.com/start) with inline multimedia content items 4002-1, 4002-2, and 4002-3 and other content 4004-1, 4004-2, and 4004-3.]

[Sheet 24 of 29, Figure 7B: user interface 4000B (further state of the inline multimedia display).]

[Sheet 25 of 29, Figure 7C: user interface 4000C (further state of the inline multimedia display).]

[Sheet 26 of 29, Figure 7D: user interface 4000D (further state of the inline multimedia display).]

[Sheet 27 of 29, Figure 7E: user interface 4000E (further state of the inline multimedia display).]

[Sheet 28 of 29, Figure 7F: user interface 4000F showing enlarged inline multimedia content 4002-1 with playback controls 4006 through 4014 (Done, elapsed time 0:00, remaining time -1:08).]
[Sheet 29 of 29, Figure 8: flow diagram of process 8000 for displaying inline multimedia content.]

8002: Display, on a touch screen display of a portable electronic device, at least a portion of a structured electronic document (e.g., a web page) (e.g., an HTML or XML document) comprising content.
8004: Detect a first gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) on an item of inline multimedia content (e.g., video and/or audio content) in the displayed portion of the structured electronic document.
8006: In response to detecting the first gesture, enlarge the item of inline multimedia content on the touch screen display. Cease to display other content in the structured electronic document besides the enlarged item of inline multimedia content.
8008: While the enlarged item of inline multimedia content is displayed, detect a second gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) on the touch screen display.
8010: In response to detecting the second gesture, display one or more playback controls (e.g., play, pause, sound volume, and/or playback progress bar icons) for playing the enlarged item of inline multimedia content.
8012: Detect a third gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) on one of the playback controls.
8014: In response to detecting the third gesture, play the enlarged item of inline multimedia content.
8016: Detect a fourth gesture (e.g., a tap gesture on a playback completion icon) on the touch screen display.
8018: Display, again, at least the portion of the structured electronic document.
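The interaction sequence of Figure 8 amounts to a small state machine. The sketch below is illustrative TypeScript only (the state names and the class are assumptions, not from the patent): a first tap enlarges the inline multimedia item and hides the rest of the document, a second tap reveals playback controls, a tap on a control starts playback, and a tap on a completion icon returns to the structured document.

```typescript
type PlayerState = "inline" | "enlarged" | "controlsVisible" | "playing";

// Hypothetical controller tracing the gesture sequence of Figure 8.
class InlineMediaController {
  state: PlayerState = "inline";

  onTapItem(): void {          // blocks 8004-8006: enlarge, hide other content
    if (this.state === "inline") this.state = "enlarged";
  }
  onTapScreen(): void {        // blocks 8008-8010: show playback controls
    if (this.state === "enlarged") this.state = "controlsVisible";
  }
  onTapPlayControl(): void {   // blocks 8012-8014: play the enlarged item
    if (this.state === "controlsVisible") this.state = "playing";
  }
  onTapDone(): void {          // blocks 8016-8018: redisplay the structured document
    this.state = "inline";
  }
}
```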
PORTABLE ELECTRONIC DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR DISPLAYING STRUCTURED ELECTRONIC DOCUMENTS

RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Application Nos. 60/937,993, "Portable Multifunction Device," filed Jun. 29, 2007; 60/946,715, "Portable Electronic Device, Method, And Graphical User Interface For Displaying Structured Electronic Documents," filed Jun. 27, 2007; 60/879,469, "Portable Multifunction Device," filed Jan. 8, 2007; 60/879,253, "Portable Multifunction Device," filed Jan. 7, 2007; and 60/824,769, "Portable Multifunction Device," filed Sep. 6, 2006. All of these applications are incorporated by reference herein in their entirety.

This application is related to the following applications: (1) U.S. patent application Ser. No. 10/188,182, "Touch Pad For Handheld Device," filed Jul. 1, 2002; (2) U.S. patent application Ser. No. 10/722,948, "Touch Pad For Handheld Device," filed Nov. 25, 2003; (3) U.S. patent application Ser. No. 10/643,256, "Movable Touch Pad With Added Functionality," filed Aug. 18, 2003; (4) U.S. patent application Ser. No. 10/654,108, "Ambidextrous Mouse," filed Sep. 2, 2003; (5) U.S. patent application Ser. No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004; (6) U.S. patent application Ser. No. 10/903,964, "Gestures For Touch Sensitive Input Devices," filed Jul. 30, 2004; (7) U.S. patent application Ser. No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed Jan. 18, 2005; (8) U.S. patent application Ser. No. 11/057,050, "Display Actuator," filed Feb. 11, 2005; (9) U.S. Provisional Patent Application No. 60/658,777, "Multi-Functional Hand-Held Device," filed Mar. 4, 2005; (10) U.S. patent application Ser. No. 11/367,749, "Multi-Functional Hand-Held Device," filed Mar. 3, 2006; and (11) U.S. Provisional Patent Application No. 60/947,155, "Portable Electronic Device, Method, And Graphical User Interface For Displaying Inline Multimedia Content", filed Jun. 29, 2007. All of these applications are incorporated by reference herein in their entirety.

TECHNICAL FIELD

The disclosed embodiments relate generally to portable electronic devices, and more particularly, to portable electronic devices that display structured electronic documents such as web pages on a touch screen display.

BACKGROUND

As portable electronic devices become more compact, and the number of functions performed by a given device increase, it has become a significant challenge to design a user interface that allows users to easily interact with a multifunction device. This challenge is particularly significant for handheld portable devices, which have much smaller screens than desktop or laptop computers. This situation is unfortunate because the user interface is the gateway through which users receive not only content but also responses to user actions or behaviors, including user attempts to access a device's features, tools, and functions. Some portable communication devices (e.g., mobile telephones, sometimes called mobile phones, cell phones, cellular telephones, and the like) have resorted to adding more pushbuttons, increasing the density of pushbuttons, overloading the functions of pushbuttons, or using complex menu systems to allow a user to access, store and manipulate data. These conventional user interfaces often result in complicated key sequences and menu hierarchies that must be memorized by the user.

Many conventional user interfaces, such as those that include physical pushbuttons, are also inflexible. This may prevent a user interface from being configured and/or adapted by either an application running on the portable device or by users. When coupled with the time consuming requirement to memorize multiple key sequences and menu hierarchies, and the difficulty in activating a desired pushbutton, such inflexibility is frustrating to most users.

In particular, it is slow and tedious to navigate in structured electronic documents (e.g., web pages) in portable electronic devices with small screens using conventional input devices (e.g., 5-way toggle switches). Moreover, it is cumbersome to control and view multimedia content within such documents on portable electronic devices.

Accordingly, there is a need for portable electronic devices with more transparent and intuitive user interfaces for viewing and navigating structured electronic documents and multimedia content within such documents. Such interfaces increase the effectiveness, efficiency and user satisfaction with activities like web browsing on portable electronic devices.

SUMMARY

The above deficiencies and other problems associated with user interfaces for portable devices are reduced or eliminated by the disclosed portable multifunction device. In some embodiments, the device has a touch-sensitive display (also known as a "touch screen") with a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive display. In some embodiments, the functions may include telephoning, video conferencing, e-mailing, instant messaging, blogging, digital photographing, digital videoing, web browsing, digital music playing, and/or digital video playing. Instructions for performing these functions may be included in a computer readable storage medium or other computer program product configured for execution by one or more processors.

In one aspect of the invention, a computer-implemented method, for use in conjunction with a portable electronic device with a touch screen display, comprises: displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content; detecting a first gesture at a location on the displayed portion of the structured electronic document; determining a first box in the plurality of boxes at the location of the first gesture; and enlarging and substantially centering the first box on the touch screen display.

In another aspect of the invention, a graphical user interface on a portable electronic device with a touch screen display comprises: at least a portion of a structured electronic document, wherein the structured electronic document comprises a plurality of boxes of content. In response to detecting a first gesture at a location on the portion of the structured electronic document, a first box in the plurality of boxes at the location of the first gesture is determined and the first box is enlarged and substantially centered on the touch screen display.

In another aspect of the invention, a portable electronic device comprises: a touch screen display, one or more processors, memory, and one or more programs.
The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content. The one or more programs also include: instructions for detecting a first gesture at a location on the displayed portion of the structured electronic document; instructions for determining a first box in the plurality of boxes at the location of the first gesture; and instructions for enlarging and substantially centering the first box on the touch screen display.

In another aspect of the invention, a computer-program product comprises a computer readable storage medium and a computer program mechanism (e.g., one or more computer programs) embedded therein. The computer program mechanism comprises instructions, which when executed by a portable electronic device with a touch screen display, cause the device: to display at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content; to detect a first gesture at a location on the displayed portion of the structured electronic document; to determine a first box in the plurality of boxes at the location of the first gesture; and to enlarge and substantially center the first box on the touch screen display.

In another aspect of the invention, a portable electronic device with a touch screen display comprises: means for displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content; means for detecting a first gesture at a location on the displayed portion of the structured electronic document; means for determining a first box in the plurality of boxes at the location of the first gesture; and means for enlarging and substantially centering the first box on the touch screen display.

The disclosed embodiments allow users to more easily view and navigate structured electronic documents and multimedia content within such documents on portable electronic devices.

BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.

FIGS. 1A and 1B are block diagrams illustrating portable multifunction devices with touch-sensitive displays in accordance with some embodiments.

FIG. 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.

FIG. 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments.

FIGS. 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments.

FIGS. 5A-5M illustrate exemplary user interfaces for a browser in accordance with some embodiments.

FIGS. 6A-6C are flow diagrams illustrating a process for displaying structured electronic documents such as web pages on a portable electronic device with a touch screen display in accordance with some embodiments.

FIGS. 7A-7F illustrate exemplary user interfaces for playing an item of inline multimedia content in accordance with some embodiments.

FIG. 8 is a flow diagram illustrating a process for displaying inline multimedia content on a portable electronic device with a touch screen display in accordance with some embodiments.

DESCRIPTION OF EMBODIMENTS

Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.

It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first gesture could be termed a second gesture, and, similarly, a second gesture could be termed a first gesture, without departing from the scope of the present invention.

The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

Embodiments of a portable multifunction device, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device such as a mobile telephone that also contains other functions, such as PDA and/or music player functions.

The user interface may include a physical click wheel in addition to a touch screen or a virtual click wheel displayed on the touch screen. A click wheel is a user-interface device that may provide navigation commands based on an angular displacement of the wheel or a point of contact with the wheel by a user of the device. A click wheel may also be used to provide a user command corresponding to selection of one or more items, for example, when the user of the device presses down on at least a portion of the wheel or the center of the wheel. Alternatively, breaking contact with a click wheel image on a touch screen surface may indicate a user command corresponding to selection. For simplicity, in the discussion that follows, a portable multifunction device that includes a touch screen is used as an exemplary embodiment. It should be understood, however, that some of the user interfaces and associated processes may be applied to other devices, such as
personal computers and laptop computers, which may include one or more other physical user-interface devices, such as a physical click wheel, a physical keyboard, a mouse and/or a joystick.

The device supports a variety of applications, such as one or more of the following: a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.

The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch screen. One or more functions of the touch screen as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch screen) of the device may support the variety of applications with user interfaces that are intuitive and transparent.

The user interfaces may include one or more soft keyboard embodiments. The soft keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard, such as those described in U.S. patent application Ser. Nos. 11/459,606, "Keyboards For Portable Electronic Devices," filed Jul. 24, 2006, and 11/459,615, "Touch Screen Keyboards For Portable Electronic Devices," filed Jul. 24, 2006, the contents of which are hereby incorporated by reference. The keyboard embodiments may include a reduced number of icons (or soft keys) relative to the number of keys in existing physical keyboards, such as that for a typewriter. This may make it easier for users to select one or more icons in the keyboard, and thus, one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the portable device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user. For example, one or more keyboard embodiments may be tailored to a respective user based on a word usage history (lexicography, slang, individual usage) of the respective user. Some of the keyboard embodiments may be adjusted to reduce a probability of a user error when selecting one or more icons, and thus one or more symbols, when using the soft keyboard embodiments.

Attention is now directed towards embodiments of the device. FIGS. 1A and 1B are block diagrams illustrating portable multifunction devices 100 with touch-sensitive displays 112 in accordance with some embodiments. The touch-sensitive display 112 is sometimes called a "touch screen" for convenience, and may also be known as or called a touch-sensitive display system. The device 100 may include a memory 102 (which may include one or more computer readable storage mediums), a memory controller 122, one or more processing units (CPU's) 120, a peripherals interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103.

It should be appreciated that the device 100 is only one example of a portable multifunction device 100, and that the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in FIGS. 1A and 1B may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.

Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100, such as the CPU 120 and the peripherals interface 118, may be controlled by the memory controller 122.

The peripherals interface 118 couples the input and output peripherals of the device to the CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data.

In some embodiments, the peripherals interface 118, the CPU 120, and the memory controller 122 may be implemented on a single chip, such as a chip 104. In some other embodiments, they may be implemented on separate chips.

The RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. The RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS)), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.

The audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the device 100. The audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal to human-audible sound waves. The audio circuitry 110 also
receives electrical signals converted by the microphone 113 from sound waves. The audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or the RF circuitry 108 by the peripherals interface 118. In some embodiments, the audio circuitry 110 also includes a headset jack (e.g. 212, FIG. 2). The headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).

The I/O subsystem 106 couples input/output peripherals on the device 100, such as the touch screen 112 and other input/control devices 116, to the peripherals interface 118. The I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208, FIG. 2) may include an up/down button for volume control of the speaker 111 and/or the microphone 113. The one or more buttons may include a push button (e.g., 206, FIG. 2). A quick press of the push button may disengage a lock of the touch screen 112 or begin a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, "Unlocking a Device by Performing Gestures on an Unlock Image," filed Dec. 23, 2005, which is hereby incorporated by reference. A longer press of the push button (e.g., 206) may turn power to the device 100 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.

The touch-sensitive touch screen 112 provides an input interface and an output interface between the device and a user. The display controller 156 receives and/or sends electrical signals from/to the touch screen 112. The touch screen 112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.

A touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. The touch screen 112 and the display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the touch screen 112 and converts the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. In an exemplary embodiment, a point of contact between a touch screen 112 and the user corresponds to a finger of the user.

The touch screen 112 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 112.

A touch-sensitive display in some embodiments of the touch screen 112 may be analogous to the multi-touch sensitive tablets described in the following U.S. Pat. Nos. 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference. However, a touch screen 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output.

A touch-sensitive display in some embodiments of the touch screen 112 may be as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, "Multipoint Touch Surface Controller," filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, "Gestures For Touch Sensitive Input Devices," filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, "Gestures For Touch Sensitive Input Devices," filed Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, "Virtual Input Device Placement On A Touch Screen User Interface," filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, "Operation Of A Computer With A Touch Screen Interface," filed Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, "Activating Virtual Keys Of A Touch-Screen Virtual Keyboard," filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, "Multi-Functional Hand-Held Device," filed Mar. 3, 2006. All of these applications are incorporated by reference herein.

The touch screen 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen has a resolution of approximately 160 dpi. The user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.

In some embodiments, in addition to the touch screen, the device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.

In some embodiments, the device 100 may include a physical or virtual click wheel as an input control device 116. A user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button. User commands and navigation commands
provided by the user via the click wheel may be processed by an input controller 160 as well as one or more of the modules and/or sets of instructions in memory 102. For a virtual click wheel, the click wheel and click wheel controller may be part of the touch screen 112 and the display controller 156, respectively. For a virtual click wheel, the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen.

The device 100 also includes a power system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.

The device 100 may also include one or more optical sensors 164. FIGS. 1A and 1B show an optical sensor coupled to an optical sensor controller 158 in I/O subsystem 106. The optical sensor 164 may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. The optical sensor 164 receives light from the environment, projected through one or more lens, and converts the light to data representing an image. In conjunction with an imaging module 143 (also called a camera module), the optical sensor 164 may capture still images or video. In some embodiments, an optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display may be used as a viewfinder for either still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the user's image may be obtained for videoconferencing while the user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 may be used along with the touch screen display for both video conferencing and still and/or video image acquisition.

The device 100 may also include one or more proximity sensors 166. FIGS. 1A and 1B show a proximity sensor 166 coupled to the peripherals interface 118. Alternately, the proximity sensor 166 may be coupled to an input controller 160 in the I/O subsystem 106. The proximity sensor 166 may perform as described in U.S. patent application Ser. No. 11/241,839, "Proximity Detector In Handheld Device," filed Sep. 30, 2005; Ser. No. 11/240,788, "Proximity Detector In Handheld Device," filed Sep. 30, 2005; Ser. No. 11/620,702, filed Jan. 7, 2007, "Using Ambient Light Sensor To Augment Proximity Sensor Output," Ser. No. 11/586,862, filed Oct. 24, 2006, "Automated Response To And Sensing Of User Activity In Portable Devices," and Ser. No. 11/638,251, filed Dec. 12, 2006, "Methods And Systems For Automatic Configuration Of Peripherals," which are hereby incorporated by reference. In some embodiments, the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). In some embodiments, the proximity sensor keeps the screen off when the device is in the user's pocket, purse, or other dark area to prevent unnecessary battery drainage when the device is in a locked state.

The device 100 may also include one or more accelerometers 168. FIGS. 1A and 1B show an accelerometer 168 coupled to the peripherals interface 118. Alternately, the accelerometer 168 may be coupled to an input controller 160 in the I/O subsystem 106. The accelerometer 168 may perform as described in U.S. Patent Publication No. 20050190059, "Acceleration-based Theft Detection System for Portable Electronic Devices," and U.S. Patent Publication No. 20060017692, "Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer," both of which are incorporated herein by reference. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers.

In some embodiments, the software components stored in memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and applications (or set of instructions) 136.

The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.

The communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124. The external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.

The contact/motion module 130 may detect contact with the touch screen 112 (in conjunction with the display controller 156) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 112, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., "multitouch"/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 also detects contact on a touchpad. In some embodiments, the contact/motion module 130 and the controller 160 detects contact on a click wheel.
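[Editor's illustrative sketch.] The speed, velocity, and acceleration determinations described for the contact/motion module 130 can be pictured with a minimal Python sketch using finite differences between consecutive touch samples. The TouchSample structure and the function name are assumptions introduced for illustration; this is not code from the patent or from any Apple implementation.

    from dataclasses import dataclass

    @dataclass
    class TouchSample:
        """One sampled point of contact: position in pixels, timestamp in seconds."""
        x: float
        y: float
        t: float

    def motion_parameters(prev, curr, prev_velocity=(0.0, 0.0)):
        """Estimate speed (magnitude), velocity (magnitude and direction), and
        acceleration (change in magnitude and/or direction) of the point of
        contact from two consecutive samples."""
        dt = curr.t - prev.t
        if dt <= 0:
            return {"speed": 0.0, "velocity": (0.0, 0.0), "acceleration": (0.0, 0.0)}
        vx, vy = (curr.x - prev.x) / dt, (curr.y - prev.y) / dt
        ax, ay = (vx - prev_velocity[0]) / dt, (vy - prev_velocity[1]) / dt
        speed = (vx ** 2 + vy ** 2) ** 0.5
        return {"speed": speed, "velocity": (vx, vy), "acceleration": (ax, ay)}

    # Example: a contact that moves 30 px right and 40 px down over 0.1 s.
    print(motion_parameters(TouchSample(0, 0, 0.0), TouchSample(30, 40, 0.1)))

The same per-sample calculation applies unchanged to each finger of a multi-touch contact.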
The graphics module 132 includes various known software components for rendering and displaying graphics on the touch screen 112, including components for changing the intensity of graphics that are displayed. As used herein, the term "graphics" includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.

The text input module 134, which may be a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, blogging 142, browser 147, and any other application that needs text input).

The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 and/or blogger 142 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).

The applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof:
  a contacts module 137 (sometimes called an address book or contact list);
  a telephone module 138;
  a video conferencing module 139;
  an e-mail client module 140;
  an instant messaging (IM) module 141;
  a blogging module 142;
  a camera module 143 for still and/or video images;
  an image management module 144;
  a video player module 145;
  a music player module 146;
  a browser module 147;
  a calendar module 148;
  widget modules 149, which may include weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
  widget creator module 150 for making user-created widgets 149-6;
  search module 151;
  video and music player module 152, which merges video player module 145 and music player module 146;
  notes module 153; and/or
  map module 154.

Examples of other applications 136 that may be stored in memory 102 include other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.

In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the contacts module 137 may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.

In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the telephone module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed. As noted above, the wireless communication may use any of a plurality of communications standards, protocols and technologies.

In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, the videoconferencing module 139 may be used to initiate, conduct, and terminate a video conference between a user and one or more other participants.

In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the e-mail client module 140 may be used to create, send, receive, and manage e-mail. In conjunction with image management module 144, the e-mail module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.

In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging module 141 may be used to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages and to view received instant messages. In some embodiments, transmitted and/or received instant messages may include graphics, photos, audio files, video files and/or other attachments as are supported in a MMS and/or an Enhanced Messaging Service (EMS). As used herein, "instant messaging" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).

In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, image management module 144, and browsing module 147, the blogging module 142 may be used to send text, still images, video, and/or other graphics to a blog (e.g., the user's blog).

In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, the camera module 143 may be used to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.

In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, the image management module 144 may be used to arrange, modify or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.

In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, and speaker 111, the video player module 145 may be used to display, present or otherwise play back videos (e.g., on the touch screen or on an external, connected display via external port 124).

In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, the music player module 146 allows the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files.
In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.).

In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, the browser module 147 may be used to browse the Internet, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages. Embodiments of user interfaces and associated processes using browser module 147 are described further below.

In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail module 140, and browser module 147, the calendar module 148 may be used to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.).

In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget modules 149 are mini-applications that may be downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).

In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 may be used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).

In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, the search module 151 may be used to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms).

In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the notes module 153 may be used to create and manage notes, to do lists, and the like.

In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, the map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data).

Each of the above identified modules and applications correspond to a set of instructions for performing one or more functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. For example, video player module 145 may be combined with music player module 146 into a single module (e.g., video and music player module 152, FIG. 1B). In some embodiments, memory 102 may store a subset of the modules and data structures identified above. Furthermore, memory 102 may store additional modules and data structures not described above.

In some embodiments, the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen 112 and/or a touchpad. By using a touch screen and/or a touchpad as the primary input/control device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced.

The predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a "menu button." In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a touchpad.

FIG. 2 illustrates a portable multifunction device 100 having a touch screen 112 in accordance with some embodiments. The touch screen may display one or more graphics within user interface (UI) 200. In this embodiment, as well as others described below, a user may select one or more of the graphics by making contact or touching the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the contact may include a gesture, such as one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with the device 100. In some embodiments, inadvertent contact with a graphic may not select the graphic. For example, a swipe gesture that sweeps over an application icon may not select the corresponding application when the gesture corresponding to selection is a tap.

The device 100 may also include one or more physical buttons, such as "home" or menu button 204. As described previously, the menu button 204 may be used to navigate to any application 136 in a set of applications that may be executed on the device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI in touch screen 112.

In one embodiment, the device 100 includes a touch screen 112, a menu button 204, a push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, a Subscriber Identity Module (SIM) card slot 210, a head set jack 212, and a docking/charging external port 124. The push button 206 may be used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, the device 100 also may accept verbal input for activation or deactivation of some functions through the microphone 113.

Attention is now directed towards embodiments of user interfaces ("UI") and associated processes that may be implemented on a portable multifunction device 100.

FIG. 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments. In some embodiments, user interface 300 includes the following elements, or a subset or superset thereof:
  Unlock image 302 that is moved with a finger gesture to unlock the device;
  Arrow 304 that provides a visual cue to the unlock gesture;
  Channel 306 that provides additional cues to the unlock gesture;
  Time 308;
  Day 310;
  Date 312; and
  Wallpaper image 314.

In some embodiments, the device detects contact with the touch-sensitive display (e.g., a user's finger making contact on or near the unlock image 302) while the device is in a user-interface lock state. The device moves the unlock image 302 in accordance with the contact. The device transitions to a user-interface unlock state if the detected contact corresponds to a predefined gesture, such as moving the unlock image across channel 306. Conversely, the device maintains the user-interface lock state if the detected contact does not correspond to the predefined gesture. As noted above, processes that use gestures on the touch screen to unlock the device are described in U.S. patent application Ser. Nos. 11/322,549, "Unlocking A Device By Performing Gestures On An Unlock Image," filed Dec. 23, 2005, and 11/322,550, "Indication Of Progress Towards Satisfaction Of A User Input Condition," filed Dec. 23, 2005, which are hereby incorporated by reference.
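[Editor's illustrative sketch.] The lock-state decision just described can be modeled in a few lines of Python: the unlock image follows the contact but stays inside the channel, and the device unlocks only if the gesture carries the image all the way across. The function name, the clamping behavior, and the example coordinates are assumptions for illustration only, not the patent's implementation.

    def track_unlock_gesture(channel_start_x, channel_end_x, contact_positions):
        """Move the unlock image with the contact; return True (transition to the
        user-interface unlock state) only if the drag carries the image across
        the channel, otherwise keep the lock state."""
        image_x = channel_start_x
        for x in contact_positions:
            # The unlock image follows the finger but stays inside the channel.
            image_x = max(channel_start_x, min(x, channel_end_x))
        return image_x >= channel_end_x  # predefined gesture: moved fully across

    # A drag that stops two-thirds of the way across does not unlock the device.
    print(track_unlock_gesture(0, 300, [10, 80, 150, 200]))   # False -> stay locked
    print(track_unlock_gesture(0, 300, [40, 180, 290, 305]))  # True  -> unlock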
FIGS. 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments. In some embodiments, user interface 400A includes the following elements, or a subset or superset thereof:
  Signal strength indicator(s) 402 for wireless communication(s), such as cellular and Wi-Fi signals;
  Time 404;
  Battery status indicator 406;
  Tray 408 with icons for frequently used applications, such as one or more of the following:
    Phone 138, which may include an indicator 414 of the number of missed calls or voicemail messages;
    E-mail client 140, which may include an indicator 410 of the number of unread e-mails;
    Browser 147; and
    Music player 146; and
  Icons for other applications, such as one or more of the following:
    IM 141;
    Image management 144;
    Camera 143;
    Video player 145;
    Weather 149-1;
    Stocks 149-2;
    Blog 142;
    Calendar 148;
    Calculator 149-3;
    Alarm clock 149-4;
    Dictionary 149-5; and
    User-created widget 149-6.

In some embodiments, user interface 400B includes the following elements, or a subset or superset thereof:
  402, 404, 406, 141, 148, 144, 143, 149-3, 149-2, 149-1, 149-4, 410, 414, 138, 140, and 147, as described above;
  Map 154;
  Notes 153;
  Settings 412, which provides access to settings for the device 100 and its various applications 136, as described further below; and
  Video and music player module 152, also referred to as iPod (trademark of Apple Computer, Inc.) module 152.

In some embodiments, UI 400A or 400B displays all of the available applications 136 on one screen so that there is no need to scroll through a list of applications (e.g., via a scroll bar). In some embodiments, as the number of applications increase, the icons corresponding to the applications may decrease in size so that all applications may be displayed on a single screen without scrolling. In some embodiments, having all applications on one screen and a menu button enables a user to access any desired application with at most two inputs, such as activating the menu button 204 and then activating the desired application (e.g., by a tap or other finger gesture on the icon corresponding to the application).

FIGS. 5A-5M illustrate exemplary user interfaces for a browser in accordance with some embodiments.

In some embodiments, user interfaces 3900A-3900M (in FIGS. 5A-5M, respectively) include the following elements, or a subset or superset thereof:
  402, 404, and 406, as described above;
  Previous page icon 3902 that when activated (e.g., by a finger tap on the icon) initiates display of the previous web page;
  Web page name 3904;
  Next page icon 3906 that when activated (e.g., by a finger tap on the icon) initiates display of the next web page;
  URL (Uniform Resource Locator) entry box 3908 for inputting URLs of web pages;
  Refresh icon 3910 that when activated (e.g., by a finger tap on the icon) initiates a refresh of the web page;
  Web page 3912 or other structured document, which is made of blocks 3914 of text content and other graphics (e.g., images and inline multimedia);
  Settings icon 3916 that when activated (e.g., by a finger tap on the icon) initiates display of a settings menu for the browser;
  Bookmarks icon 3918 that when activated (e.g., by a finger tap on the icon) initiates display of a bookmarks list or menu for the browser;
  Add bookmark icon 3920 that when activated (e.g., by a finger tap on the icon) initiates display of a UI for adding bookmarks (e.g., UI 3900F, FIG. 5F, which like other UIs and pages, can be displayed in either portrait or landscape view);
  New window icon 3922 that when activated (e.g., by a finger tap on the icon) initiates display of a UI for adding new windows to the browser (e.g., UI 3900G, FIG. 5G);
  Vertical bar 3962 (FIG. 5H) for the web page 3912 or other structured document that helps a user understand what portion of the web page 3912 or other structured document is being displayed;
  Horizontal bar 3964 (FIG. 5H) for the web page 3912 or other structured document that helps a user understand what portion of the web page 3912 or other structured document is being displayed;
  Share icon 3966 that when activated (e.g., by a finger tap on the icon) initiates display of a UI for sharing information with other users (e.g., UI 3900K, FIG. 5K);
  URL clear icon 3970 (FIG. 5I) that when activated (e.g., by a finger tap on the icon) clears any input in URL entry box 3908;
  Search term entry box 3972 (FIG. 5I) for inputting search terms for web searches;
  URL suggestion list 3974 that displays URLs that match the input in URL entry box 3908 (FIG. 5I), wherein activation of a suggested URL (e.g., by a finger tap on the suggested URL) initiates retrieval of the corresponding web page;
  URL input keyboard 3976 (FIGS. 5I and 5M) with period key 3978, backslash key 3980, and ".com" key 3982 that make it easier to enter common characters in URLs;
  Search term clear icon 3984 that when activated (e.g., by a finger tap on the icon) clears any input in search term entry box 3972;
  Email link icon 3986 (FIG. 5K) that when activated (e.g., by a finger tap or other gesture on the icon) prepares an email that contains a link to be shared with one or more other users;
  Email content icon 3988 (FIG. 5K) that when activated (e.g., by a finger tap or other gesture on the icon) prepares an email that contains content to be shared with one or more other users;
  IM link icon 3990 (FIG. 5K) that when activated (e.g., by a finger tap or other gesture on the icon) prepares an IM that contains a link to be shared with one or more other users; and
  Cancel icon 3992 (FIG. 5K) that when activated (e.g., by a finger tap or other gesture on the icon) cancels the sharing UI (e.g., UI 3900K, FIG. 5K) and displays the previous UI.

In some embodiments, in response to a predefined gesture by the user on a block 3914 (e.g., a single tap gesture or a double tap gesture), the block is enlarged and centered (or substantially centered) in the web page display. For example, in response to a single tap gesture 3923 on block 3914-5, block 3914-5 may be enlarged and centered in the display, as shown in UI 3900C, FIG. 5C. In some embodiments, the width of the block is scaled to fill the touch screen display. In some embodiments, the width of the block is scaled to fill the touch screen display with a predefined amount of padding along the sides of the display. In some embodiments, a zooming animation of the block is displayed during enlargement of the block. Similarly, in response to a single tap gesture 3925 on block 3914-2, block 3914-2 may be enlarged with a zooming animation and two-dimensionally scrolled to the center of the display (not shown).

In some embodiments, the device analyzes the render tree of the web page 3912 to determine the blocks 3914 in the web page. In some embodiments, a block 3914 corresponds to a render node that is: a replaced inline; a block; an inline block; or an inline table.

In some embodiments, in response to the same predefined gesture by the user on a block 3914 (e.g., a single tap gesture or a double tap gesture) that is already enlarged and centered, the enlargement and/or centering is substantially or completely reversed. For example, in response to a single tap gesture 3929 (FIG. 5C) on block 3914-5, the web page image may zoom out and return to UI 3900A, FIG. 5A.

In some embodiments, in response to a predefined gesture (e.g., a single tap gesture or a double tap gesture) by the user on a block 3914 that is already enlarged but not centered, the block is centered (or substantially centered) in the web page display. For example, in response to a single tap gesture 3927 (FIG. 5C) on block 3914-4, block 3914-4 may be centered (or substantially centered) in the web page display. Similarly, in response to a single tap gesture 3935 (FIG. 5C) on block 3914-6, block 3914-6 may be centered (or substantially centered) in the web page display. Thus, for a web page display that is already enlarged, in response to a predefined gesture, the device may display in an intuitive manner a series of blocks that the user wants to view. This same gesture may initiate different actions in different contexts (e.g., (1) zooming and/or enlarging in combination with scrolling when the web page is reduced in size, UI 3900A and (2) reversing the enlargement and/or centering if the block is already centered and enlarged).

In some embodiments, in response to a multi-touch 3931 and 3933 de-pinching gesture by the user (FIG. 5C), the web page may be enlarged. Conversely, in response to a multi-touch pinching gesture by the user, the web page may be reduced.

In some embodiments, in response to a substantially vertical upward (or downward) swipe gesture by the user, the web page (or, more generally, other electronic documents) may scroll one-dimensionally upward (or downward) in the vertical direction. For example, in response to an upward swipe gesture 3937 by the user that is within a predetermined angle (e.g., 27°) of being perfectly vertical, the web page may scroll one-dimensionally upward in the vertical direction.

Conversely, in some embodiments, in response to a swipe gesture that is not within a predetermined angle (e.g., 27°) of being perfectly vertical, the web page may scroll two-dimensionally (i.e., with simultaneous movement in both the vertical and horizontal directions). For example, in response to an upward swipe gesture 3939 (FIG. 5C) by the user that is not within a predetermined angle (e.g., 27°) of being perfectly vertical, the web page may scroll two-dimensionally along the direction of the swipe 3939.

In some embodiments, in response to a multi-touch 3941 and 3943 rotation gesture by the user (FIG. 5C), the web page may be rotated exactly 90° (UI 3900D, FIG. 5D) for landscape viewing, even if the amount of rotation in the multi-touch 3941 and 3943 rotation gesture is substantially different from 90°. Similarly, in response to a multi-touch 3945 and 3947 rotation gesture by the user (UI 3900D, FIG. 5D), the web page may be rotated exactly 90° for portrait viewing, even if the amount of rotation in the multi-touch 3945 and 3947 rotation gesture is substantially different from 90°.

Thus, in response to imprecise gestures by the user, precise movements of graphics occur. The device behaves in the manner desired by the user despite inaccurate input by the user. Also, note that the gestures described for UI 3900C, which has a portrait view, are also applicable to UIs with a landscape view (e.g., UI 3900D, FIG. 5D) so that the user can choose whichever view the user prefers for web browsing.
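[Editor's illustrative sketch.] The two "imprecise gesture, precise result" behaviors described above (the swipe-angle test against a predetermined angle such as 27°, and the snapping of a rough two-finger rotation to exactly 90°) can be expressed as a short Python sketch. The 27° tolerance is taken from the example in the text; the function names and return conventions are assumptions for illustration, not the patent's implementation.

    import math

    VERTICAL_TOLERANCE_DEG = 27.0  # predetermined angle from the example above

    def scroll_translation(dx, dy):
        """Return the (dx, dy) translation applied for a swipe gesture.
        Swipes within the tolerance of perfectly vertical scroll one-dimensionally
        in the vertical direction; other swipes scroll two-dimensionally along
        the direction of the swipe."""
        if dx == 0 and dy == 0:
            return (0.0, 0.0)
        angle_from_vertical = math.degrees(math.atan2(abs(dx), abs(dy)))
        if angle_from_vertical <= VERTICAL_TOLERANCE_DEG:
            return (0.0, dy)   # one-dimensional vertical scroll
        return (dx, dy)        # two-dimensional scroll along the swipe

    def snapped_rotation(rotation_deg):
        """Snap an imprecise two-finger rotation to exactly +/-90 degrees."""
        return 90.0 if rotation_deg >= 0 else -90.0

    print(scroll_translation(10, -200))   # within 27 deg of vertical -> (0.0, -200)
    print(scroll_translation(150, -200))  # not substantially vertical -> (150, -200)
    print(snapped_rotation(62.5))         # rotated exactly 90.0 for landscape viewing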
Conv provided bv USPTO from the PIRS Imaae Database on " """"11
APLNDC00027912
US 7,864,163 B2
19
20
FIGS. 6A-6C are flow diagrams illustrating a process 6000 for displaying structured electronic documents such as web pages on a portable electronic device with a touch screen display (e.g., device 100) in accordance with some embodiments. The portable electronic device displays at least a portion of a structured electronic document on the touch screen display. The structured electronic document comprises a plurality of boxes of content (e.g., blocks 3914, FIG. 5A) (6006).

In some embodiments, the plurality of boxes is defined by a style sheet language. In some embodiments, the style sheet language is a cascading style sheet language. In some embodiments, the structured electronic document is a web page (e.g., web page 3912, FIG. 5A). In some embodiments, the structured electronic document is an HTML or XML document.

In some embodiments, displaying at least a portion of the structured electronic document comprises scaling the document width to fit within the touch screen display width independent of the document length (6008).

In some embodiments, the touch screen display is rectangular with a short axis and a long axis (also called the minor axis and major axis); the display width corresponds to the short axis (or minor axis) when the structured electronic document is seen in portrait view (e.g., FIG. 5C); and the display width corresponds to the long axis (or major axis) when the structured electronic document is seen in landscape view (e.g., FIG. 5D).

In some embodiments, prior to displaying at least a portion of a structured electronic document, borders, margins, and/or paddings are determined for the plurality of boxes (6002) and adjusted for display on the touch screen display (6004). In some embodiments, all boxes in the plurality of boxes are adjusted. In some embodiments, just the first box is adjusted. In some embodiments, just the first box and boxes adjacent to the first box are adjusted.

A first gesture is detected at a location on the displayed portion of the structured electronic document (e.g., gesture 3923, FIG. 5A) (6010). In some embodiments, the first gesture is a finger gesture. In some embodiments, the first gesture is a stylus gesture.

In some embodiments, the first gesture is a tap gesture. In some embodiments, the first gesture is a double tap with a single finger, a double tap with two fingers, a single tap with a single finger, or a single tap with two fingers.

A first box (e.g., Block 5 3914-5, FIG. 5A) in the plurality of boxes is determined at the location of the first gesture (6012). In some embodiments, the structured electronic document has an associated render tree with a plurality of nodes and determining the first box at the location of the first gesture comprises: traversing down the render tree to determine a first node in the plurality of nodes that corresponds to the detected location of the first gesture (6014); traversing up the render tree from the first node to a closest parent node that contains a logical grouping of content (6016); and identifying content corresponding to the closest parent node as the first box (6018). In some embodiments, the logical grouping of content comprises a paragraph, an image, a plugin object, or a table. In some embodiments, the closest parent node is a replaced inline, a block, an inline block, or an inline table.
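[Editor's illustrative sketch.] Steps 6014-6018 amount to hit-testing down a render tree and then walking back up to the closest parent that is a logical grouping of content. The following minimal Python sketch assumes a simple RenderNode structure with axis-aligned rectangles; it is an illustration of the described traversal, not the patent's (or any browser engine's) actual implementation.

    LOGICAL_GROUPINGS = {"replaced-inline", "block", "inline-block", "inline-table"}

    class RenderNode:
        def __init__(self, kind, rect, children=()):
            self.kind = kind        # e.g., "text", "block", "inline-table", ...
            self.rect = rect        # (x, y, width, height) in document coordinates
            self.children = list(children)
            self.parent = None
            for child in self.children:
                child.parent = self

        def contains(self, x, y):
            rx, ry, rw, rh = self.rect
            return rx <= x <= rx + rw and ry <= y <= ry + rh

    def first_box_at(root, x, y):
        """Traverse down the render tree to the deepest node at the gesture
        location (6014), traverse up to the closest parent node that contains a
        logical grouping of content (6016), and identify that node as the first
        box (6018)."""
        node = root
        descended = True
        while descended:
            descended = False
            for child in node.children:
                if child.contains(x, y):
                    node, descended = child, True
                    break
        while node is not None and node.kind not in LOGICAL_GROUPINGS:
            node = node.parent
        return node

For example, a tap that lands on a text node inside a paragraph would descend to that text node and then climb to the enclosing block, which becomes the first box to be enlarged and centered.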
The first box is enlarged and substantially centered on the touch screen display (e.g., Block 5 3914-5, FIG. 5C) (6020). In some embodiments, enlarging and substantially centering comprises simultaneously zooming and translating the first box on the touch screen display (6022). In some embodiments, enlarging comprises expanding the first box so that the width of the first box is substantially the same as the width of the touch screen display (6024).

In some embodiments, text in the enlarged first box is resized to meet or exceed a predetermined minimum text size on the touch screen display (6026). In some embodiments, the text resizing comprises: determining a scale factor by which the first box will be enlarged (6028); dividing the predetermined minimum text size on the touch screen display by the scaling factor to determine a minimum text size for text in the first box (6030); and if a text size for text in the first box is less than the determined minimum text size, increasing the text size for text in the first box to at least the determined minimum text size (6032). In some embodiments, the first box has a width; the display has a display width; and the scale factor is the display width divided by the width of the first box prior to enlarging. In some embodiments, the resizing occurs during the enlarging. In some embodiments, the resizing occurs after the enlarging.

For example, suppose the predetermined minimum text size is an 18-point font and the scale factor is determined to be two. In that case, the minimum text size for text in the first box is 18 divided by 2, or 9. If text in the first box is in a 10-point font, its text size is not increased, because 10 is greater than the 9-point minimum. Once the scale factor is applied, the text will be displayed in a 20-point font, which is greater than the predetermined minimum text size of 18. If, however, text in the first box is in an 8-point font, application of the scale factor would cause the text to be displayed in a 16-point font, which is less than the predetermined minimum text size of 18. Therefore, since 8 is less than 9, the text size is increased to at least a 9-point font and displayed in at least an 18-point font after application of the scale factor.
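[Editor's illustrative sketch.] The scale-factor and minimum-text-size rule of steps 6028-6032, together with the 18-point worked example above, can be captured in a few lines of Python. The function names are assumptions for illustration; the arithmetic simply mirrors the description in the text.

    def scale_factor(display_width, box_width):
        """Scale factor by which the first box will be enlarged (6028):
        the display width divided by the width of the first box prior to enlarging."""
        return display_width / box_width

    def resized_text_size(original_size, predetermined_minimum, factor):
        """Steps 6030-6032: divide the predetermined minimum text size by the
        scale factor, and raise the text size to that threshold if it falls short."""
        minimum_in_box = predetermined_minimum / factor
        return max(original_size, minimum_in_box)

    # Worked example from the text: 18-point minimum, scale factor of two.
    factor = 2.0
    for size in (10, 8):
        before = resized_text_size(size, 18, factor)
        displayed = before * factor
        print(size, "->", before, "-> displayed at", displayed, "points")
    # 10 -> 10.0 -> displayed at 20.0 points (already above the 9-point threshold)
    # 8  -> 9.0  -> displayed at 18.0 points (raised to the 9-point threshold)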
embodiments, the fifth gesture is a multifinger gesture. In
In response to detecting a gesture on the touch screen
some embodiments, the fifth gesture is a twisting gesture.
display, a displayed window in the application is moved off
the display and a hidden window is moved onto the display.
In some embodiments, a change in orientationofthe device
For example, in response to detecting a tap gesture 3949 on
is detected (6054). For example, the one or more accelerometers 168 (FIGS. 1A-1B) detect a change in orientation ofthe 5 the left side of the screen, the window with web page 3912-2
is moved partially or fully off-screen to the right, the window
device. In response to detecting the change in orientation of
with web page 3912-3 is moved completely off-screen, parthe device, the displayed portion of the structured electronic
tially hidden window with web page 3912-1 is moved to the
document is rotated on the touch screen display by 90°
center ofthe display, and another completely hidden window
(6056).
In some embodiments, a multi-finger de-pinch gesture 10 with a web page (e.g., 3912-0) may be moved partially onto
the display. Alternatively, detection of a left-to-right swipe
(e.g., multi-touch gesture 3931/3933, FIG. 5C) is detected on
gesture 3951 may achieve the same effect.
the touch screen display (6058). In response to detecting the
Conversely, in response to detecting a tap gesture 3953 on
multi-finger de-pinch gesture, a portion ofthe displayed porthe right side ofthe screen, the windowwithweb page 3912-2
tion of the structured electronic document is enlarged on the
touch screen display in accordance with a position of the 15 is moved partially or fully off-screen to the left, the window
with web page 3912-1 is moved completely off-screen, parmulti-finger de-pinch gesture and an amount of finger move-
tially hidden window with web page 3912-3 is moved to the
center ofthe display, and -A- completely hidden window
with a web page (e.g., 3912-4) may be moved partially onto
specific order, it should be apparent that the process 6000 can 20 the display. Alternatively, detection of a right-to-left swipe
gesture 3951 may achieve the same effect.
include more or fewer operations, which can be executed
In some embodiments, in response to a tap or other preserially or in parallel (e.g., using parallel processors or a
defmed gesture on a delete icon 3934, the corresponding
multi-threadingenvironment), an order oftwo ormore operament in the multi-finger de-pinch gesture (6060).
While the content display process 6000 described above includes a number of operations that appear to occur in a specific order, it should be apparent that the process 6000 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), an order of two or more operations may be changed and/or two or more operations may be combined into a single operation.

A graphical user interface (e.g., UI 3900A, FIG. 5A) on a portable electronic device with a touch screen display comprises at least a portion of a structured electronic document (e.g., web page 3912, FIG. 5A). The structured electronic document comprises a plurality of boxes of content (e.g., blocks 3914, FIG. 5A). In response to detecting a first gesture (e.g., gesture 3923, FIG. 5A) at a location on the portion of the structured electronic document, a first box (e.g., Block 5 3914-5, FIG. 5A) in the plurality of boxes at the location of the first gesture is determined and the first box is enlarged and substantially centered on the touch screen display (e.g., Block 5 3914-5, FIG. 5C).

In some embodiments, in response to a tap or other predefined user gesture on URL entry box 3908, the touch screen displays an enlarged entry box 3926 and a keyboard 616 (e.g., UI 3900B, FIG. 5B in portrait viewing and UI 3900E, FIG. 5E in landscape viewing). In some embodiments, the touch screen also displays:

Contextual clear icon 3928 that when activated (e.g., by a finger tap on the icon) initiates deletion of all text in entry box 3926;

a search icon 3930 that when activated (e.g., by a finger tap on the icon) initiates an Internet search using the search terms input in box 3926; and

Go to URL icon 3932 that when activated (e.g., by a finger tap on the icon) initiates acquisition of the web page with the URL input in box 3926;

Thus, the same entry box 3926 may be used for inputting both search terms and URLs. In some embodiments, whether or not clear icon 3928 is displayed depends on the context.

UI 3900G (FIG. 5G) is a UI for adding new windows to an application, such as the browser 147. UI 3900G displays an application (e.g., the browser 147), which includes a displayed window (e.g., web page 3912-2) and at least one hidden window (e.g., web pages 3912-1 and 3934-3 and possibly other web pages that are completely hidden off-screen). UI 3900G also displays an icon for adding windows to the application (e.g., new window or new page icon 3936). In response to detecting activation of the icon 3936 for adding windows, the browser adds a window to the application (e.g., a new window for a new web page 3912).

In response to detecting a gesture on the touch screen display, a displayed window in the application is moved off the display and a hidden window is moved onto the display. For example, in response to detecting a tap gesture 3949 on the left side of the screen, the window with web page 3912-2 is moved partially or fully off-screen to the right, the window with web page 3912-3 is moved completely off-screen, partially hidden window with web page 3912-1 is moved to the center of the display, and another completely hidden window with a web page (e.g., 3912-0) may be moved partially onto the display. Alternatively, detection of a left-to-right swipe gesture 3951 may achieve the same effect.

Conversely, in response to detecting a tap gesture 3953 on the right side of the screen, the window with web page 3912-2 is moved partially or fully off-screen to the left, the window with web page 3912-1 is moved completely off-screen, partially hidden window with web page 3912-3 is moved to the center of the display, and another completely hidden window with a web page (e.g., 3912-4) may be moved partially onto the display. Alternatively, detection of a right-to-left swipe gesture 3951 may achieve the same effect.

In some embodiments, in response to a tap or other predefined gesture on a delete icon 3934, the corresponding window 3912 is deleted. In some embodiments, in response to a tap or other predefined gesture on Done icon 3938, the window in the center of the display (e.g., 3912-2) is enlarged to fill the screen.
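The window-switching behavior described above can be sketched, for illustration only, as an ordered list of windows with an index for the one centered on screen; the type and names below are assumptions, not the patent's implementation.

```swift
// Minimal sketch (assumed structure, not the patented code) of the window switching
// described above: a tap on the left side or a left-to-right swipe centers the previous
// window; a tap on the right side or a right-to-left swipe centers the next one.
enum WindowGesture { case tapLeftSide, tapRightSide, leftToRightSwipe, rightToLeftSwipe }

struct WindowCarousel {
    var pages: [String]          // window identifiers in left-to-right order
    var centeredIndex: Int       // index of the window currently centered on the display

    mutating func handle(_ gesture: WindowGesture) {
        switch gesture {
        case .tapLeftSide, .leftToRightSwipe:
            // Windows slide right; the partially hidden window on the left is centered.
            if centeredIndex > 0 { centeredIndex -= 1 }
        case .tapRightSide, .rightToLeftSwipe:
            // Windows slide left; the partially hidden window on the right is centered.
            if centeredIndex < pages.count - 1 { centeredIndex += 1 }
        }
    }
}

var carousel = WindowCarousel(pages: ["3912-0", "3912-1", "3912-2", "3912-3", "3912-4"],
                              centeredIndex: 2)
carousel.handle(.tapLeftSide)
print(carousel.pages[carousel.centeredIndex])   // "3912-1", matching the example above
```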
Additional description of adding windows to an application can be found in U.S. patent application Ser. No. 11/620,647, "Method, System, And Graphical User Interface For Viewing Multiple Application Windows," filed Jan. 5, 2007, the content of which is hereby incorporated by reference.

FIGS. 7A-7F illustrate exemplary user interfaces for playing an item of inline multimedia content in accordance with some embodiments.

In some embodiments, user interfaces 4000A-4000F (in FIGS. 7A-7F, respectively) include the following elements, or a subset or superset thereof:

402, 404, 406, 3902, 3906, 3910, 3912, 3918, 3920, 3922, as described above;

inline multimedia content 4002, such as QuickTime content (4002-1), Windows Media content (4002-2), or Flash content (4002-3);

other types of content 4004 in the structured document, such as text;

Exit icon 4006 that when activated (e.g., by a finger tap on the icon) initiates exiting the inline multimedia content player UI (e.g., UI 4000B or 4000F) and returning to another UI (e.g., UI 4000A, FIG. 7A);

Lapsed time 4008 that shows how much of the inline multimedia content 4002 has been played, in units of time;

Progress bar 4010 that indicates what fraction of the inline multimedia content 4002 has been played and that may be used to help scroll through the inline multimedia content in response to a user gesture;

Remaining time 4012 that shows how much of the inline multimedia content 4002 remains to be played, in units of time;

Downloading icon 4014 that indicates when inline multimedia content 4002 is being downloaded or streamed to the device;

Fast Reverse/Skip Backwards icon 4016 that when activated (e.g., by a finger tap on the icon) initiates reversing or skipping backwards through the inline multimedia content 4002;

Play icon 4018 that when activated (e.g., by a finger tap 4026 (FIG. 7C) on the icon) initiates playing the inline
multimedia content 4002, either from the beginning or from where the inline multimedia content was paused;

Fast Forward/Skip Forward icon 4020 that initiates forwarding or skipping forwards through the inline multimedia content 4002;

Volume adjustment slider icon 4022 that when activated (e.g., by a finger tap on the icon) initiates adjustment of the volume of the inline multimedia content 4002; and

Pause icon 4024 that when activated (e.g., by a finger tap on the icon) initiates pausing the inline multimedia content 4002.

FIG. 8 is a flow diagram illustrating a process 8000 for displaying inline multimedia content on a portable electronic device with a touch screen display (e.g., device 100) in accordance with some embodiments. The portable electronic device displays at least a portion of a structured electronic document on the touch screen display (8002). The structured electronic document comprises content (e.g., content 4002 and 4004, FIG. 7A). In some embodiments, the structured electronic document is a web page (e.g., web page 3912). In some embodiments, the structured electronic document is an HTML or XML document.

A first gesture (e.g., gesture 4028, FIG. 7A) is detected on an item of inline multimedia content (e.g., content 4002-1, FIG. 7A) in the displayed portion of the structured electronic document (8004). In some embodiments, the inline multimedia content comprises video and/or audio content. In some embodiments, the content can be played with a QuickTime, Windows Media, or Flash plugin.

In response to detecting the first gesture, the item of inline multimedia content is enlarged on the touch screen display and other content (e.g., content 4004 and other content 4002 besides 4002-1, FIG. 7A) in the structured electronic document besides the enlarged item of inline multimedia content ceases to be displayed (e.g., UI 4000B, FIG. 7B or UI 4000F, FIG. 7F) (8006).

In some embodiments, enlarging the item of inline multimedia content comprises animated zooming in on the item. In some embodiments, enlarging the item of inline multimedia content comprises simultaneously zooming and translating the item of inline multimedia content on the touch screen display. In some embodiments, enlarging the item of inline multimedia content comprises rotating the item of inline multimedia content by 90° (e.g., from UI 4000A, FIG. 7A to UI 4000B, FIG. 7B).

In some embodiments, the item of inline multimedia content has a full size; the touch screen display has a size; and enlarging the item of inline multimedia content comprises enlarging the item of inline multimedia content to the smaller of the full size of the item and the size of the touch screen display.

In some embodiments, enlarging the item of inline multimedia content comprises expanding the item of inline multimedia content so that the width of the item of inline multimedia content is substantially the same as the width of the touch screen display (e.g., UI 4000B, FIG. 7B or UI 4000F, FIG. 7F).
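The two enlargement rules just described can be sketched as simple size computations; the struct and function names below are assumptions for illustration, not the patent's code.

```swift
// Illustrative sketch of the two enlargement rules above; not the patented implementation.
struct Size { var width: Double; var height: Double }

/// Enlarge to the smaller of the item's full size and the touch screen's size.
func enlargedSizeCapped(fullSize: Size, screen: Size) -> Size {
    Size(width: min(fullSize.width, screen.width),
         height: min(fullSize.height, screen.height))
}

/// Expand so that the item's width is substantially the same as the display width,
/// preserving the item's aspect ratio.
func enlargedSizeFitWidth(itemSize: Size, screen: Size) -> Size {
    let scale = screen.width / itemSize.width
    return Size(width: screen.width, height: itemSize.height * scale)
}
```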
In some embodiments, ceasing to display other content in the structured electronic document besides the item of inline multimedia content comprises fading out the other content in the structured electronic document besides the item of inline multimedia content.

While the enlarged item of inline multimedia content is displayed, a second gesture is detected on the touch screen display (e.g., gesture 4030, FIG. 7B) (8008).

In response to detecting the second gesture, one or more playback controls for playing the enlarged item of inline multimedia content are displayed (8010). In some embodiments, the one or more playback controls comprise a play icon (e.g., icon 4018, FIG. 7C), a pause icon (e.g., icon 4024, FIG. 7E), a sound volume icon (e.g., icon 4022), and/or a playback progress bar icon (e.g., icon 4010).

In some embodiments, displaying one or more playback controls comprises displaying one or more playback controls on top of the enlarged item of inline multimedia content (e.g., playback controls 4016, 4018, 4020, and 4022 are on top of enlarged inline multimedia content 4002-1 in FIG. 7C). In some embodiments, the one or more playback controls are superimposed on top of the enlarged item of inline multimedia content. In some embodiments, the one or more playback controls are semitransparent.

In some embodiments, an instruction in the structured electronic document to automatically start playing the item of inline multimedia content is overridden, which gives the device time to download more of the selected inline multimedia content prior to starting playback.

A third gesture is detected on one of the playback controls (e.g., gesture 4026 on play icon 4018, FIG. 7C) (8012).

In response to detecting the third gesture, the enlarged item of inline multimedia content is played (8014). In some embodiments, playing the enlarged item of inline multimedia content comprises playing the enlarged item of inline multimedia content with a plugin for a content type associated with the item of inline multimedia content.

In some embodiments, while the enlarged item of inline multimedia content is played, the one or more playback controls cease to be displayed (e.g., FIG. 7D, which no longer displays playback controls 4016, 4018, 4020, and 4022, but still shows 4006, 4008, 4010, and 4012). In some embodiments, all of the playback controls cease to be displayed. In some embodiments, ceasing to display the one or more playback controls comprises fading out the one or more playback controls. In some embodiments, the display of the one or more playback controls is ceased after a predetermined time. In some embodiments, the display of the one or more playback controls is ceased after no contact is detected with the touch screen display for a predetermined time.
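One behavior described above, hiding the playback controls after no contact is detected for a predetermined time, can be sketched as a simple idle timer. The class shape, method names, and the 3-second default below are assumptions for illustration, not values or code from the patent.

```swift
import Foundation

// Sketch of auto-hiding playback controls after a predetermined idle time;
// illustrative only, not the patented implementation.
final class PlaybackControlsAutoHide {
    private(set) var controlsVisible = true
    private var idleTime: TimeInterval = 0
    let predeterminedTime: TimeInterval   // assumed default; the patent does not specify a value

    init(predeterminedTime: TimeInterval = 3.0) {
        self.predeterminedTime = predeterminedTime
    }

    /// Call periodically while the enlarged item of inline multimedia content is playing.
    func tick(elapsed: TimeInterval, contactDetected: Bool) {
        if contactDetected {
            idleTime = 0
            controlsVisible = true          // contact brings the controls back
        } else {
            idleTime += elapsed
            if idleTime >= predeterminedTime {
                controlsVisible = false     // e.g., fade the controls out here
            }
        }
    }
}
```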
In some embodiments, a fourth gesture is detected on the touch screen display (8016). In response to detecting the fourth gesture, at least the portion of the structured electronic document is displayed again (e.g., FIG. 7A) (8018). In some embodiments, the fourth gesture comprises a tap gesture on a playback completion icon, such as a done icon (e.g., gesture 4032 on done icon 4006, FIG. 7D). In some embodiments, the item of inline multimedia content returns to its size prior to being enlarged.

In some embodiments, the first, second, and third gestures are finger gestures. In some embodiments, the first, second, and third gestures are stylus gestures.

In some embodiments, the first, second, and third gestures are tap gestures. In some embodiments, the tap gesture is a double tap with a single finger, a double tap with two fingers, a single tap with a single finger, or a single tap with two fingers.

While the multimedia display process 8000 described above includes a number of operations that appear to occur in a specific order, it should be apparent that the process 8000 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), an order of two or more operations may be changed and/or two or more operations may be combined into a single operation.
A graphical user interface on a portable electronic device with a touch screen display comprises: at least a portion of a structured electronic document, wherein the structured electronic document comprises content; an item of inline multimedia content in the portion of the structured electronic document; and one or more playback controls. In response to detecting a first gesture on the item of inline multimedia content, the item of inline multimedia content on the touch screen display is enlarged, and display of other content in the structured electronic document besides the enlarged item of inline multimedia content is ceased. In response to detecting a second gesture on the touch screen display while the enlarged item of inline multimedia content is displayed, the one or more playback controls for playing the enlarged item of inline multimedia content are displayed. In response to detecting a third gesture on one of the playback controls, the enlarged item of inline multimedia content is played.
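As an illustration only, the gesture sequence summarized above (first gesture enlarges the item and ceases display of other content, second gesture displays the playback controls, third gesture plays the item) can be sketched as a small state transition; the state and gesture names are assumptions, not terms from the patent.

```swift
// Minimal state sketch of the gesture flow summarized above; not the patented code.
enum PlayerState {
    case browsing        // structured electronic document shown with the inline item in place
    case enlarged        // item enlarged; other content no longer displayed
    case controlsShown   // playback controls displayed over the enlarged item
    case playing         // enlarged item of inline multimedia content playing
}

enum PlayerGesture { case onInlineItem, onTouchScreen, onPlaybackControl }

func nextState(_ state: PlayerState, _ gesture: PlayerGesture) -> PlayerState {
    switch (state, gesture) {
    case (.browsing, .onInlineItem):           return .enlarged       // first gesture
    case (.enlarged, .onTouchScreen):          return .controlsShown  // second gesture
    case (.controlsShown, .onPlaybackControl): return .playing        // third gesture
    default:                                   return state
    }
}
```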
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

What is claimed is:

1. A computer-implemented method, comprising:
at a portable electronic device with a touch screen display;
displaying at least a portion of a web page on the touch screen display, wherein the web page comprises a plurality of boxes of content;
detecting a first finger tap gesture at a location on the displayed portion of the web page;
determining a first box in the plurality of boxes at the location of the first finger tap gesture; and
enlarging and translating the web page so as to substantially center the first box on the touch screen display, wherein enlarging comprises expanding the first box so that the width of the first box is substantially the same as the width of the touch screen display;
resizing text in the enlarged first box to meet or exceed a predetermined minimum text size on the touch screen display;
while the first box is enlarged, detecting a second finger tap gesture on a second box other than the first box; and
in response to detecting the second finger tap gesture, translating the web page so as to substantially center the second box on the touch screen display.

2. A computer-implemented method, comprising:
at a portable electronic device with a touch screen display;
displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content;
detecting a first gesture at a location on the displayed portion of the structured electronic document;
determining a first box in the plurality of boxes at the location of the first gesture;
enlarging and translating the structured electronic document so that the first box is substantially centered on the touch screen display;
while the first box is enlarged, a second gesture is detected on a second box other than the first box; and
in response to detecting the second gesture, the structured electronic document is translated so that the second box is substantially centered on the touch screen display.

3. The method of claim 2, including: prior to displaying at least a portion of a structured electronic document,
determining borders, margins, and/or paddings for the plurality of boxes that are specified in the structured electronic document; and
adjusting the borders, margins, and/or paddings for the plurality of boxes for display on the touch screen display.

4. The method of claim 2, wherein the structured electronic document is a web page.

5. The method of claim 2, wherein the structured electronic document is an HTML or XML document.

6. The method of claim 2, wherein:
the structured electronic document has a document width and a document length;
the touch screen display has a display width; and
displaying at least a portion of the structured electronic document comprises scaling the document width to fit within the display width independent of the document length.

7. The method of claim 6, wherein:
the touch screen display is rectangular with a short axis and a long axis;
the display width corresponds to the short axis when the structured electronic document is seen in portrait view; and
the display width corresponds to the long axis when the structured electronic document is seen in landscape view.

8. The method of claim 2, wherein the plurality of boxes are defined by a style sheet language.

9. The method of claim 8, wherein the style sheet language is a cascading style sheet language.

10. The method of claim 2, wherein the first gesture is a finger gesture.

11. The method of claim 2, wherein the first gesture is a stylus gesture.

12. The method of claim 2, wherein the first gesture is a tap gesture.

13. The method of claim 12, wherein the first gesture is a double tap with a single finger, a double tap with two fingers, a single tap with a single finger, or a single tap with two fingers.

14. The method of claim 2, wherein:
the structured electronic document has an associated render tree with a plurality of nodes; and
determining the first box at the location of the first gesture comprises:
traversing down the render tree to determine a first node in the plurality of nodes that corresponds to the detected location of the first gesture;
traversing up the render tree from the first node to a closest parent node that contains a logical grouping of content; and
identifying content corresponding to the closest parent node as the first box.

15. The method of claim 14, wherein the logical grouping of content comprises a paragraph, an image, a plugin object, or a table.

16. The method of claim 14, wherein the closest parent node is a replaced inline, a block, an inline block, or an inline table.

17. The method of claim 2, wherein enlarging and translating the structured electronic document comprises
displaying at least a portion of the second box of the plurality of boxes of content on the touch screen display.

18. The method of claim 2, wherein enlarging comprises expanding the first box so that the width of the first box is substantially the same as the width of the touch screen display.

19. The method of claim 2, including resizing text in the enlarged first box to meet or exceed a predetermined minimum text size on the touch screen display.

20. The method of claim 19, wherein the text resizing comprises:
determining a scale factor by which the first box will be enlarged;
dividing the predetermined minimum text size on the touch screen display by the scaling factor to determine a minimum text size for text in the first box; and
if a text size for text in the first box is less than the determined minimum text size, increasing the text size for text in the first box to at least the determined minimum text size.

21. The method of claim 20, wherein: the first box has a width; the display has a display width; and the scale factor is the display width divided by the width of the first box prior to enlarging.

22. The method of claim 19, wherein the resizing occurs during the enlarging.

23. The method of claim 19, wherein the resizing occurs after the enlarging.

24. The method of claim 2, including resizing text in the structured electronic document to meet or exceed a predetermined minimum text size on the touch screen display.

25. The method of claim 24, wherein the text resizing comprises:
determining a scale factor by which the first box will be enlarged;
dividing the predetermined minimum text size on the touch screen display by the scaling factor to determine a minimum text size for text in the structured electronic document; and
if a text size for text in the structured electronic document is less than the determined minimum text size, increasing the text size for text in the structured electronic document to at least the determined minimum text size.

26. The method of claim 24, wherein the text resizing comprises:
identifying boxes containing text in the plurality of boxes;
determining a scale factor by which the first box will be enlarged;
dividing the predetermined minimum text size on the touch screen display by the scaling factor to determine a minimum text size for text in the structured electronic document; and
for each identified box containing text, if a text size for text in the identified box is less than the determined minimum text size, increasing the text size for text in the identified box to at least the determined minimum text size and adjusting the size of the identified box.

27. The method of claim 2, including:
detecting a third gesture on the enlarged second box; and
in response to detecting the third gesture, reducing in size the displayed portion of the structured electronic document.

28. The method of claim 27, wherein the first box returns to its size prior to being enlarged.

29. The method of claim 27, wherein the third gesture and the first gesture are the same type of gesture.

30. The method of claim 27, wherein the third gesture is a finger gesture.

31. The method of claim 27, wherein the third gesture is a stylus gesture.

32. The method of claim 27, wherein the third gesture is a tap gesture.

33. The method of claim 32, wherein the third gesture is a double tap with a single finger, a double tap with two fingers, a single tap with a single finger, or a single tap with two fingers.

34. The method of claim 2, wherein the second gesture and the first gesture are the same type of gesture.

35. The method of claim 2, wherein the second gesture is a finger gesture.

36. The method of claim 2, wherein the second gesture is a stylus gesture.

37. The method of claim 2, wherein the second gesture is a tap gesture.

38. The method of claim 37, wherein the second gesture is a double tap with a single finger, a double tap with two fingers, a single tap with a single finger, or a single tap with two fingers.

39. The method of claim 2, including:
detecting a swipe gesture on the touch screen display; and
in response to detecting the swipe gesture, translating the displayed portion of the structured electronic document on the touch screen display.

40. The method of claim 39, wherein translating comprises vertical, horizontal, or diagonal movement of the structured electronic document on the touch screen display.

41. The method of claim 39, wherein the swipe gesture is a finger gesture.

42. The method of claim 39, wherein the swipe gesture is a stylus gesture.

43. The method of claim 2, including:
detecting a fifth gesture on the touch screen display,
in response to detecting the fifth gesture, rotating the displayed portion of the structured electronic document on the touch screen display by 90°.

44. The method of claim 43, wherein the fifth gesture is a finger gesture.

45. The method of claim 44, wherein the fifth gesture is a multifinger gesture.

46. The method of claim 45, wherein the fifth gesture is a twisting gesture.

47. The method of claim 2, including:
detecting a change in orientation of the device,
in response to detecting the change in orientation of the device, rotating the displayed portion of the structured electronic document on the touch screen display by 90°.

48. The method of claim 2, including:
detecting a multi-finger de-pinch gesture on the touch screen display,
in response to detecting the multi-finger de-pinch gesture, enlarging a portion of the displayed portion of the structured electronic document on the touch screen display in accordance with a position of the multi-finger de-pinch gesture and an amount of finger movement in the multi-finger de-pinch gesture.

49. A graphical user interface on a portable electronic device with a touch screen display, comprising:
at least a portion of a structured electronic document, wherein the structured electronic document comprises a plurality of boxes of content;
wherein:
in response to detecting a first gesture at a location on the portion of the structured electronic document:
a first box in the plurality of boxes at the location of the first gesture is determined;
the structured electronic document is enlarged and translated so that the first box is substantially centered on the touch screen display;
while the first box is enlarged, a second gesture is detected on a second box other than the first box; and
in response to detecting the second gesture, the structured electronic document is translated so that the second box is substantially centered on the touch screen display.

50. A portable electronic device, comprising:
a touch screen display;
one or more processors;
memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including:
instructions for displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content;
instructions for detecting a first gesture at a location on the displayed portion of the structured electronic document;
instructions for determining a first box in the plurality of boxes at the location of the first gesture;
instructions for enlarging and translating the structured electronic document so that the first box is substantially centered on the touch screen display;
instruction for, while the first box is enlarged, a second gesture is detected on a second box other than the first box; and
instructions for, in response to detecting the second gesture, the structured electronic document is translated so that the second box is substantially centered on the touch screen display.

51. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device with a touch screen display, cause the device to:
display at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content;
detect a first gesture at a location on the displayed portion of the structured electronic document;
determine a first box in the plurality of boxes at the location of the first gesture;
enlarge and translate the structured electronic document so that the first box is substantially centered on the touch screen display;
while the first box is enlarged, detect a second gesture on a second box other than the first box; and
in response to detecting the second gesture, translate the structured electronic document so that the second box is substantially centered on the touch screen display.

52. A portable electronic device with a touch screen display, comprising:
means for displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content;
means for detecting a first gesture at a location on the displayed portion of the structured electronic document;
means for determining a first box in the plurality of boxes at the location of the first gesture;
means for enlarging and translating the structured electronic document so that the first box is substantially centered on the touch screen display;
means for, while the first box is enlarged, a second gesture is detected on a second box other than the first box; and
means for, in response to detecting the second gesture, the structured electronic document is translated so that the second box is substantially centered on the touch screen display.

53. A portable electronic device, comprising:
a touch screen display;
one or more processors;
memory; and
a program, wherein the program is stored in the memory and configured to be executed by the one or more processors, the program including instructions for:
displaying a user interface on the touch screen display, wherein the user interface includes:
a displayed window of an application, the displayed window being in full view on the touch screen display, and
one or more partially hidden windows of the application;
while displaying the displayed window and the one or more partially hidden windows, detecting a gesture on the touch screen display;
in response to detecting the gesture,
moving the displayed window partially or fully off the touch screen display, and
moving a first partially hidden window into full view on the touch screen display; and
in response to detecting a gesture on an icon, a window of the application in the center of the touch screen display is enlarged.

54. The portable electronic device of claim 53, wherein the detected gesture is a swipe gesture.

55. The portable electronic device of claim 53, including:
in response to detecting the gesture, moving a second partially hidden window off the touch screen display.

56. The portable electronic device of claim 53, wherein the detected gesture is a left-to-right swipe gesture, including:
in response to detecting the left-to-right swipe gesture:
moving the displayed window partially off-screen to the right,
moving the first partially hidden window into full view on the touch screen display, and
moving a second partially hidden window completely off-screen.

57. The portable electronic device of claim 53, wherein the user interface is displayed in response to activation of an icon for initiating display of the user interface, and wherein the icon for initiating display of the user interface indicates the number of windows in the application.

58. The portable electronic device of claim 53, wherein the displayed window and the one or more partially hidden windows are displayed prior to detecting the gesture.

59. A computer-implemented method, comprising:
at a portable electronic device with a touch screen display:
displaying a user interface on the touch screen display, wherein the user interface includes:
a displayed window of an application, the displayed window being in full view on the touch screen display, and
one or more partially hidden windows of the application;
while displaying the displayed window and the one or more partially hidden windows, detecting a gesture on the touch screen display;
in response to detecting the gesture,
moving the displayed window partially or fully off the touch screen display, and
moving a first partially hidden window into full view on the touch screen display; and,
in response to detecting a gesture on an icon, a window of the application in the center of the touch screen display is enlarged.

60. A graphical user interface on a portable electronic device with a touch screen display, the graphical user interface comprising:
a displayed window of an application, the displayed window being in full view on the touch screen display, and
one or more partially hidden windows of the application;
wherein:
while displaying the displayed window and the one or more partially hidden windows, a gesture is detected on the touch screen display;
in response to detecting the gesture:
the displayed window is moved partially or fully off the touch screen display, and
a first partially hidden window is moved into full view on the touch screen display; and,
in response to detecting a gesture on an icon, a window of the application in the center of the touch screen display is enlarged.

61. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device with a touch screen display, cause the device to:
display a user interface on the touch screen display, wherein the user interface includes:
a displayed window of an application, the displayed window being in full view on the touch screen display, and
one or more partially hidden windows of the application;
while displaying the displayed window and the one or more partially hidden windows, detect a gesture on the touch screen display;
in response to detecting the gesture,
move the displayed window partially or fully off the touch screen display, and
move a first partially hidden window into full view on the touch screen display; and
in response to detecting a gesture on an icon, enlarge a window of the application in the center of the touch screen display.

* * * * *
UNITED STATES PATENT AND TRADEMARK OFFICE
CERTIFICATE OF CORRECTION

PATENT NO.      : 7,864,163 B2
APPLICATION NO. : 11/850013
DATED           : January 4, 2011
INVENTOR(S)     : Ording et al.

Page 1 of 2
It is certified that error appears in the above-identified patent and that said Letters Patent is hereby corrected as shown below:
Claim 2, column 25, line 66, between the words "enlarged," and "a second" insert
--detecting--, and at the end of the line, delete "is detected".
Claim 2, column 26, line 1, between the words "gesture," and "the structured" insert
--translated--.
Claim 2, column 26, line 2, delete "is translated".
Claim 50, column 29, line 34, between the words "enlarged," and "a second" insert
--detecting--.
Claim 50, column 29, line 35, delete "is detected".
Claim 50, column 29, line 38, between the words "gesture," and "the structured" insert
--translating--, and at the end of the line delete "is translated".
Claim 52, column 30, line 8, between the words "enlarged" and "a second" insert
--detecting--.
Claim 52, column 30, line 9, delete "is detected".
Claim 52, column 30, line 10, between the words "gesture," and "the structured" insert
--translating--.
Claim 52, column 30, line 11, delete "is translated".
Claim 53, column 30, line 35, between the words "icon," and "a window" insert --enlarging--.
Claim 53, column 30, line 37, delete "is enlarged".
Signed and Sealed this
Fifteenth Day of March, 2011
David J. Kappos
Director of the United States Patent and Trademark Office
CERTIFICATE OF CORRECTION (continued)
U.S. Pat. No. 7,864,163 B2
Page 2 of 2
Claim 59, column 31, line 9, between the words "icon," and "a window" insert --enlarging--.
Claim 59, column 31, line 11, delete "is enlarged".