Apple Inc. v. Samsung Electronics Co. Ltd. et al

Filing 946

EXHIBITS re #943 Declaration in Support, Exhibits to Arnold Declaration (Ex. 84 (Part 1)) filed by Samsung Electronics America, Inc., Samsung Electronics Co. Ltd., Samsung Telecommunications America, LLC. (Attachments: #1 Ex. 84 (Part 2), #2 Ex. 84 (Part 3), #3 Ex. 84 (Part 4), #4 Ex. 84 (Part 5), #5 Ex. 85 (Part 1), #6 Ex. 85 (Part 2), #7 Ex. 85 (Part 3), #8 Ex. 86 (Part 1), #9 Ex. 86 (Part 2), #10 Ex. 86 (Part 3), #11 Ex. 86 (Part 4)) (Related document(s) #943) (Maroulis, Victoria) (Filed on 5/18/2012)

U.S. Patent    Jan. 4, 2011    Sheet 29 of 29    US 7,864,163 B2

Figure 8 (process 8000):

8002: Display, on a touch screen display of a portable electronic device, at least a portion of a structured electronic document (e.g., a web page) (e.g., an HTML or XML document) comprising content.

8004: Detect a first gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) on an item of inline multimedia content (e.g., video and/or audio content) in the displayed portion of the structured electronic document.

8006: In response to detecting the first gesture, enlarge the item of inline multimedia content on the touch screen display. Cease to display other content in the structured electronic document besides the enlarged item of inline multimedia content.

8008: While the enlarged item of inline multimedia content is displayed, detect a second gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) on the touch screen display.

8010: In response to detecting the second gesture, display one or more playback controls (e.g., play, pause, sound volume, and/or playback progress bar icons) for playing the enlarged item of inline multimedia content.

8012: Detect a third gesture (e.g., a finger or stylus gesture) (e.g., a tap gesture) on one of the playback controls.

8014: In response to detecting the third gesture, play the enlarged item of inline multimedia content.

8016: Detect a fourth gesture (e.g., a tap gesture on a playback completion icon) on the touch screen display.

8018: Display, again, at least the portion of the structured electronic document.
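The Figure 8 sheet reads as a linear, gesture-driven flow. The following is a minimal, illustrative sketch of that flow as a small state machine. It is not code from the patent; the class, method, state, and target names (InlineMediaFlow, tap, "inline_media", and so on) are hypothetical labels chosen here for readability.

```python
# Illustrative sketch only: a minimal state machine mirroring the Figure 8 flow
# (steps 8002-8018). All names are hypothetical, not taken from the patent.

class InlineMediaFlow:
    """Tracks which step of the Figure 8 process the UI is currently in."""

    def __init__(self):
        self.state = "document"        # step 8002: document portion displayed
        self.controls_visible = False
        self.playing = False

    def tap(self, target):
        """Advance the flow in response to a tap gesture on `target`."""
        if self.state == "document" and target == "inline_media":
            # Steps 8004-8006: enlarge the item and hide the rest of the document.
            self.state = "enlarged"
        elif self.state == "enlarged" and target == "screen":
            # Steps 8008-8010: show playback controls (play, pause, volume, progress).
            self.controls_visible = True
            self.state = "controls"
        elif self.state == "controls" and target == "play_control":
            # Steps 8012-8014: play the enlarged item of inline multimedia content.
            self.playing = True
            self.state = "playing"
        elif self.state == "playing" and target == "playback_done":
            # Steps 8016-8018: return to the structured electronic document.
            self.playing = False
            self.controls_visible = False
            self.state = "document"
        return self.state


if __name__ == "__main__":
    flow = InlineMediaFlow()
    for gesture in ("inline_media", "screen", "play_control", "playback_done"):
        print(gesture, "->", flow.tap(gesture))
```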
PORTABLE ELECTRONIC DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR DISPLAYING STRUCTURED ELECTRONIC DOCUMENTS

RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Application Nos. 60/937,993, "Portable Multifunction Device," filed Jun. 29, 2007; 60/946,715, "Portable Electronic Device, Method, And Graphical User Interface For Displaying Structured Electronic Documents," filed Jun. 27, 2007; 60/879,469, "Portable Multifunction Device," filed Jan. 8, 2007; 60/879,253, "Portable Multifunction Device," filed Jan. 7, 2007; and 60/824,769, "Portable Multifunction Device," filed Sep. 6, 2006. All of these applications are incorporated by reference herein in their entirety.

This application is related to the following applications: (1) U.S. patent application Ser. No. 10/188,182, "Touch Pad For Handheld Device," filed Jul. 1, 2002; (2) U.S. patent application Ser. No. 10/722,948, "Touch Pad For Handheld Device," filed Nov. 25, 2003; (3) U.S. patent application Ser. No. 10/643,256, "Movable Touch Pad With Added Functionality," filed Aug. 18, 2003; (4) U.S. patent application Ser. No. 10/654,108, "Ambidextrous Mouse," filed Sep. 2, 2003; (5) U.S. patent application Ser. No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004; (6) U.S. patent application Ser. No. 10/903,964, "Gestures For Touch Sensitive Input Devices," filed Jul. 30, 2004; (7) U.S. patent application Ser. No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed Jan. 18, 2005; (8) U.S. patent application Ser. No. 11/057,050, "Display Actuator," filed Feb. 11, 2005; (9) U.S. Provisional Patent Application No. 60/658,777, "Multi-Functional Hand-Held Device," filed Mar. 4, 2005; (10) U.S. patent application Ser. No. 11/367,749, "Multi-Functional Hand-Held Device," filed Mar. 3, 2006; and (11) U.S. Provisional Patent Application No. 60/947,155, "Portable Electronic Device, Method, And Graphical User Interface For Displaying Inline Multimedia Content," filed Jun. 29, 2007. All of these applications are incorporated by reference herein in their entirety.

TECHNICAL FIELD

The disclosed embodiments relate generally to portable electronic devices, and more particularly, to portable electronic devices that display structured electronic documents such as web pages on a touch screen display.

BACKGROUND

As portable electronic devices become more compact, and the number of functions performed by a given device increase, it has become a significant challenge to design a user interface that allows users to easily interact with a multifunction device. This challenge is particularly significant for handheld portable devices, which have much smaller screens than desktop or laptop computers. This situation is unfortunate because the user interface is the gateway through which users receive not only content but also responses to user actions or behaviors, including user attempts to access a device's features, tools, and functions. Some portable communication devices (e.g., mobile telephones, sometimes called mobile phones, cell phones, cellular telephones, and the like) have resorted to adding more pushbuttons, increasing the density of push buttons, overloading the functions of pushbuttons, or using complex menu systems to allow a user to access, store and manipulate data. These conventional user interfaces often result in complicated key sequences and menu hierarchies that must be memorized by the user.

Many conventional user interfaces, such as those that include physical pushbuttons, are also inflexible. This may prevent a user interface from being configured and/or adapted by either an application running on the portable device or by users. When coupled with the time consuming requirement to memorize multiple key sequences and menu hierarchies, and the difficulty in activating a desired pushbutton, such inflexibility is frustrating to most users.

In particular, it is slow and tedious to navigate in structured electronic documents (e.g., web pages) in portable electronic devices with small screens using conventional input devices (e.g., 5-way toggle switches). Moreover, it is cumbersome to control and view multimedia content within such documents on portable electronic devices.

Accordingly, there is a need for portable electronic devices with more transparent and intuitive user interfaces for viewing and navigating structured electronic documents and multimedia content within such documents. Such interfaces increase the effectiveness, efficiency and user satisfaction with activities like web browsing on portable electronic devices.
SUMMARY

The above deficiencies and other problems associated with user interfaces for portable devices are reduced or eliminated by the disclosed portable multifunction device. In some embodiments, the device has a touch-sensitive display (also known as a "touch screen") with a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive display. In some embodiments, the functions may include telephoning, video conferencing, e-mailing, instant messaging, blogging, digital photographing, digital videoing, web browsing, digital music playing, and/or digital video playing. Instructions for performing these functions may be included in a computer readable storage medium or other computer program product configured for execution by one or more processors.

In one aspect of the invention, a computer-implemented method, for use in conjunction with a portable electronic device with a touch screen display, comprises: displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content; detecting a first gesture at a location on the displayed portion of the structured electronic document; determining a first box in the plurality of boxes at the location of the first gesture; and enlarging and substantially centering the first box on the touch screen display.
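The method recited in this aspect (determine the box at the gesture location, then enlarge and substantially center it) can be illustrated with a short sketch. This is an assumed, simplified model only: boxes are treated as axis-aligned rectangles in document coordinates, the display as a fixed-size viewport, and "enlarge" as scaling the box width to fill the viewport less some padding. Every name in it (Box, box_at_gesture, zoom_to_box, padding) is an illustrative invention, not terminology from the patent or any Apple implementation.

```python
# Minimal sketch of the claimed method under the assumptions stated above.

from dataclasses import dataclass


@dataclass
class Box:
    x: float   # left edge in document coordinates
    y: float   # top edge
    w: float   # width
    h: float   # height

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def box_at_gesture(boxes, px, py):
    """Determine the first box in the plurality of boxes at the gesture location."""
    for box in boxes:
        if box.contains(px, py):
            return box
    return None


def zoom_to_box(box, screen_w, screen_h, padding=8.0):
    """Return (scale, offset_x, offset_y) that enlarges the box to fill the display
    width (less padding) and substantially centers it on the screen."""
    scale = (screen_w - 2 * padding) / box.w
    box_cx = box.x + box.w / 2
    box_cy = box.y + box.h / 2
    # Translate so the box center maps to the screen center at the new scale.
    offset_x = screen_w / 2 - box_cx * scale
    offset_y = screen_h / 2 - box_cy * scale
    return scale, offset_x, offset_y


if __name__ == "__main__":
    blocks = [Box(0, 0, 320, 100), Box(0, 110, 150, 400), Box(160, 110, 160, 400)]
    tapped = box_at_gesture(blocks, 200, 300)   # location of the first gesture
    print(zoom_to_box(tapped, screen_w=320, screen_h=480))
```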
In another aspect of the invention, a graphical user interface on a portable electronic device with a touch screen display comprises: at least a portion of a structured electronic document, wherein the structured electronic document comprises a plurality of boxes of content. In response to detecting a first gesture at a location on the portion of the structured electronic document, a first box in the plurality of boxes at the location of the first gesture is determined and the first box is enlarged and substantially centered on the touch screen display.

In another aspect of the invention, a portable electronic device comprises: a touch screen display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content. The one or more programs also include: instructions for detecting a first gesture at a location on the displayed portion of the structured electronic document; instructions for determining a first box in the plurality of boxes at the location of the first gesture; and instructions for enlarging and substantially centering the first box on the touch screen display.

In another aspect of the invention, a computer-program product comprises a computer readable storage medium and a computer program mechanism (e.g., one or more computer programs) embedded therein. The computer program mechanism comprises instructions, which when executed by a portable electronic device with a touch screen display, cause the device: to display at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content; to detect a first gesture at a location on the displayed portion of the structured electronic document; to determine a first box in the plurality of boxes at the location of the first gesture; and to enlarge and substantially center the first box on the touch screen display.

In another aspect of the invention, a portable electronic device with a touch screen display comprises: means for displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content; means for detecting a first gesture at a location on the displayed portion of the structured electronic document; means for determining a first box in the plurality of boxes at the location of the first gesture; and means for enlarging and substantially centering the first box on the touch screen display.

The disclosed embodiments allow users to more easily view and navigate structured electronic documents and multimedia content within such documents on portable electronic devices.

BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.

FIGS. 1A and 1B are block diagrams illustrating portable multifunction devices with touch-sensitive displays in accordance with some embodiments.

FIG. 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.

FIG. 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments.

FIGS. 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments.

FIGS. 5A-5M illustrate exemplary user interfaces for a browser in accordance with some embodiments.

FIGS. 6A-6C are flow diagrams illustrating a process for displaying structured electronic documents such as web pages on a portable electronic device with a touch screen display in accordance with some embodiments.

FIGS. 7A-7F illustrate exemplary user interfaces for playing an item of inline multimedia content in accordance with some embodiments.

FIG. 8 is a flow diagram illustrating a process for displaying inline multimedia content on a portable electronic device with a touch screen display in accordance with some embodiments.

DESCRIPTION OF EMBODIMENTS

Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.

It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first gesture could be termed a second gesture, and, similarly, a second gesture could be termed a first gesture, without departing from the scope of the present invention.

The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

Embodiments of a portable multifunction device, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device such as a mobile telephone that also contains other functions, such as PDA and/or music player functions.

The user interface may include a physical click wheel in addition to a touch screen or a virtual click wheel displayed on the touch screen. A click wheel is a user-interface device that may provide navigation commands based on an angular displacement of the wheel or a point of contact with the wheel by a user of the device. A click wheel may also be used to provide a user command corresponding to selection of one or more items, for example, when the user of the device presses down on at least a portion of the wheel or the center of the wheel. Alternatively, breaking contact with a click wheel image on a touch screen surface may indicate a user command corresponding to selection. For simplicity, in the discussion that follows, a portable multifunction device that includes a touch screen is used as an exemplary embodiment. It should be understood, however, that some of the user interfaces and associated processes may be applied to other devices, such as personal computers and laptop computers, which may include one or more other physical user-interface devices, such as a physical click wheel, a physical keyboard, a mouse and/or a joystick.
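As a rough illustration of the click-wheel behavior described in the preceding paragraph, the sketch below maps an angular displacement of the contact point around the wheel center to discrete navigation steps, and treats a press (or, for a virtual wheel, lifting off the wheel image) as a selection. This is only an assumed model; the 30-degrees-per-step granularity and all function names here are invented for the example and do not come from the patent.

```python
# Hedged sketch of click-wheel navigation and selection; values and names are assumed.
import math


def angular_displacement(cx, cy, p0, p1):
    """Degrees swept by the contact point around the wheel center (cx, cy)."""
    a0 = math.atan2(p0[1] - cy, p0[0] - cx)
    a1 = math.atan2(p1[1] - cy, p1[0] - cx)
    delta = math.degrees(a1 - a0)
    # Wrap into (-180, 180] so a small backward turn is not read as a large forward one.
    return (delta + 180.0) % 360.0 - 180.0


def navigation_steps(cx, cy, p0, p1, degrees_per_step=30.0):
    """Map the swept angle to whole navigation steps; the sign gives the direction."""
    return int(angular_displacement(cx, cy, p0, p1) / degrees_per_step)


def is_selection(pressed_down, broke_contact_over_wheel_image):
    """A press on the wheel, or lifting off a virtual wheel image, selects the item."""
    return pressed_down or broke_contact_over_wheel_image


if __name__ == "__main__":
    center = (100.0, 100.0)
    start, end = (160.0, 100.0), (100.0, 160.0)   # roughly a quarter turn
    print("steps:", navigation_steps(*center, start, end))   # -> 3
    print("select:", is_selection(False, True))              # -> True
```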
It should be appreciated that the device 100 is only one example of a portable multifunction device 100, and that the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in FIGS. 1A and 1B may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.

The device supports a variety of applications, such as one or more of the following: a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a photo man-
agement application, a digital camera application, a digital 10 Memory 102 may include high-speed random access video camera application, a web browsing application, a digimemory and may also include non-volatile memory, such as tal music player application, and/or a digital video player one or more magnetic disk storage devices, flash memory application. devices, or other non-volatile solid-state memory devices. The various applications that may be executed on the Access to memory 102 by other components of the device device may use at least one --- physical user-interface 15 100, such as the CPU 120 and the peripherals interface 118, device, such as the touch screen. One or more functions ofthe may be controlled by the memory controller 122. touch screen as well as corresponding information displayed The peripherals interface 118 couples the input and output on the device may be adjusted and/or varied from one appliperipherals of the device to the CPU 120 and memory 102. cation to the next and/or within a respective application. In The one or more processors 120 run or execute various softthis way, a common physical architecture (such as the touch 20 ware programs and/or sets of instructions stored in memory screen) of the device may support the variety of applications 102 to perform various functions for the device 100 and to with user interfaces that are intuitive and transparent. process data. The user interfaces may include one or more soft keyboard In some embodiments, the peripherals interface 118, the embodiments. The soft keyboard embodiments may include CPU 120, and the memory controller 122 may be implestandard (QWERTY) and/or non-standard configurations of 25 mented on a single chip, such as a chip 104. In some other symbols on the displayed icons ofthe keyboard, such as those embodiments, they may be implemented on separate chips. described in U.S. patent application Ser. Nos. 11/459,606, The RF (radio frequency) circuitry 108 receives and sends "Keyboards For Portable Electronic Devices," filed Jul. 24, RF signals, also called electromagnetic signals. The RF cir2006, and 11/459,615, "Touch Screen Keyboards For Porcuitry 108 converts electrical signals to/fromelectromagnetic table Electronic Devices," filed Jul. 24, 2006, the contents of 30 signals and co-acates with so-Ecations networks which are hereby incorporated by reference. The keyboard and other co-Acations devices via the electromagnetic embodiments may include a reducednumber of icons (or soft signals. The RF circuitry 108 may include well-known cirkeys) relative to the number of keys in existing physical cuitry for performing these functions, including but not limkeyboards, such as that for a typewriter. This may make it ited to an antenna system, an RF transceiver, one or more easier for users to select one or more icons in the keyboard, 35 amplifiers, a tuner, one or more oscillators, a digital signal and thus, one or more corresponding symbols. The keyboard processor, a CODEC chipset, a subscriber identity module embodiments may be adaptive. For example, displayed icons (SIM) card, memory, and so forth. The RF circuitry 108 may may be modified in accordance with user actions, such as c -Acate with networks, such as the Internet, also selecting one or more icons and/or one or more corresponding referred to as the World Wide Web (WWW), an intranet symbols. 
One or more applications on the portable device 40 and/or a wireless network, such as a cellular telephone netmay utilize ---and/or different keyboardembodiments. work, a wireless local area network (LAN) and/or a metroThus, the keyboard embodiment used may be tailored to at politan area network (MAN), and other devices by wireless least some ofthe applications. In some embodiments, one or communication. The wireless communication may use any of more keyboard embodiments may be tailored to a respective a plurality of communications standards, protocols and techuser. For example, one or more keyboard embodiments may 45 nologies, including but not limited to Global System for be tailored to a respective user based on a word usage history Mobile Co cations (GSM), Enhanced Data GSM Envi(lexicography, slang, individual usage) ofthe respective user. ronment (EDGE), wideband code division multiple access Some of the keyboard embodiments may be adjusted to (W-CDMA), code division multiple access (CDMA), time reduce a probability of a user error when selecting one or division multiple access (TDMA), Bluetooth, Wireless Fidelmore icons, and thus one or more symbols, when using the so ity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g soft keyboard embodiments. and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Attention is now directed towards embodiments of the Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant device. FIGS. 1A and 1B are block diagrams illustrating portable multifunction devices 100 with touch-sensitive dismessaging (e.g., extensible messaging and presence protocol ' plays 112 in accordance with some embodiments. The touch- 55 (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or sensitive display 112 is sometimes called a "touch screen" for Instant Messaging and Presence Service (IMPS)), and/or convenience, and may also be known as or called a touchsensitive display system. The device 100 may include a Short Message Service (SMS)), or any other suitable commemory 102 (whichmay include one or more computer readmunicationprotocol, including communicationprotocols not able storage mediums), a memory controller 122, one ormore 60, yet developed as of the filing date of this document. processing units (CPU's) 120, a peripherals interface 118, RF The audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (1/O) subsystem 106, other input device 100. The audio circuitry 110 receives audio data from or control devices 116, and an external port 124. The device the peripherals interface 118, converts the audio data to an 100 may include one or more optical sensors 164. These 65 electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal to components may communicate over one or more commuhication buses or signal lines 103. human-andible sound waves. The audio circuitry 110 also Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027906 US 7,864, 163 B2 7 8 receives electrical signals converted by the microphone 113 from sound waves. 
The audio circuitry 110 converts the elec- capacitive, resistive, infrared, and surface acoustic wavetechnologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with trical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be a touch screen 112. retrieved from and/or transmitted to memory 102 and/or the 5 A touch-sensitive display in some embodiments of the RF circuitry 108 by the peripherals interface 118. In some touch screen 112 may be analogous to the multi-touch sensiembodiments, the audio circuitry 110 also includes a headset tive tablets described in the following U.S. Pat. Nos. 6,323, jack (e.g. 212, FIG. 2). The headsetjack provides an interface between the audio circuitry 110 and ---ble audio input/ 846 (Wesk-- et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/ outputperipherals, such as output-onlyheadphonesora head- 10 0015024A1, each of which is hereby incorporated by referset with both output (e.g., a headphone for one or both ears) ence. However, a touch screen 112 displays visual output and input (e.g., a microphone). from the portable device 100, whereas touch sensitive tablets The I/O subsystem106 couples input/output peripherals on do not provide visual output. the device 100, such as the touch screen 112 and other input/ A touch-sensitive display in some embodiments of the control devices 116, to the peripherals interface 118. The I/O 15 touch screen 112 may be as described in the following applisubsystem 106 may include a display controller 156 and one cations: (1) U.S. patent application Ser. No. 11/381,313, or more input controllers 160 for other input or control "Multipoint Touch Surface Controller," filed May 2, 2006; (2) devices. The one or more input controllers 160 receivelsend U.S. patent application Ser. No. 10/840,862, "Multipoint electrical signals from/to other input or control devices 116. Touchscreen," filed May 6, 2004; (3) U.S. patent application The other input/control devices 116 may include physical 20 Ser. No. 10/903,964, "Gestures For Touch Sensitive Input buttons (e.g., push buttons, rocker buttons, etc.), dials, slider Devices," filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, "Gestures For Touch Sensitive Input switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 may be coupled to Devices," filed Jan. 31, 2005; (5) U.S. patent application Ser. any (ornone) ofthe following: a keyboard, infrared port, USB No. 11/038,590, "Mode-Based Graphical User Interfaces For port, and a pointer device such as a mouse. The one or more 25 Touch Sensitive Input Devices," filed Jan. 18, 2005; (6) U.S. buttons (e.g., 208, FIG. 2) may include anup/downbutton for volume control of the speaker 111 and/or the microphone 113. The one or more buttons may include a push button (e.g., 206, FIG. 2). A quick press ofthe push button may disengage patent application Ser. No. 11/228,758, "Virtual Input Device Placement On A Touch Screen User Interface," filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, "Operation OfA Computer With A Touch Screen Interface," a lock of the touch screen 112 or begin a process that uses 30 filed Sep. 16, 2005; (8) U.S. patent application Ser. No. gestures onthetouch screen to unlockthe device, as described 11/228,737, "ActivatingVirtual Keys OfA Touch-ScreenVir- in U.S. 
patent application Ser. No. 11/322,549, "Unlocking a Device by Performing Gestures on an Unlock Image," filed tual Keyboard," filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, "Multi-Functional Hand-Held Dec. 23, 2005, which is hereby incorporated by reference. A Device," filed Mar. 3, 2006. All of these applications are longer press of the push button (e.g., 206) may turn power to 35 incorporated by reference herein. the device 100 on or off. The user may be able to customize a The touch screen 112 may have a resolution in excess of functionality of one or more of the buttons. The touch screen 100 dpi. In an exemplary embodiment, the touch screen has a 112 is used to implement virtual or soft buttons and one or resolution of approximately 160 dpi. The user may make more soft keyboards. contact with the touch screen 112 using any suitable object or The touch-sensitive touch screen 112 provides an input 40 appendage, such as a stylus, a finger, and so forth. In some interface and an output interface between the device and a user. The display controller 156 receives and/or sends electrical signals from/to the touch screen 112. The touch screen 112 displays visual output to the user. The visual output may embodiments, the user interface is designed to workprimarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact ofa finger on the touch screen. In some embodiments, the include graphics, text, icons, video, and any combination 45 device translates the rough finger-based input into a precise thereof (collectively termed "graphics"). In some embodipointer/cursor position or command for performing the ments, some or all of the visual output may correspond to actions desired by the user. user-interface objects, further details of which are described In some embodiments, in addition to the touch screen, the below. device 100 may include a touchpad (not shown) for activating A touch screen112 has a touch-sensitive surface, sensor or 50 or deactivating particular functions. In some embodiments, set of sensors that accepts input from the user based on haptic the touchpad is a touch-sensitive area of the device that, and/or tactile contact. The touch screen 112 and the display unlike the touch screen, does not display visual output. The controller 156 (along with any associatedmodules and/orsets touchpad may be a touch-sensitive surface that is separate ofinstructions in memory 102) detect contact (and any movefrom the touch screen 112 or an extension of the touchment or breaking ofthe contact) on the touch screen 112 and 55 sensitive surface formed by the touch converts the detected contact into interaction with user-interIn some embodiments, the device 100 may include a physiface objects (e.g., one or more soft keys, icons, web pages or cal or virtual click wheel as an input control device 116. A images) that are displayed on the touch ..... . In an exemuser may navigate among and interact with one or more plary embodiment, a point of contact between a touch screen graphical objects (henceforth referred to as icons) displayed 112 and the user corresponds to a finger of the user. 
60 in the touch screen 112 by rotating the click wheel or by The touch screen 112 may use LCD (liquid crystal display) moving a point ofcontact withthe click wheel (e.g., wherethe technology, or LPD (light emitting polymer display) technolamount ofmovement ofthe point ofcontact is measuredby its ogy, althoughother display technologies may beused in other angular displacement with respect to a center point of the embodiments. The touch screen 112 and the display controlclick wheel). The click wheel may also be used to select one ler 156 may detect contact and any movement or breaking 65 or more of the displayed icons. For example, the user may thereofusing any of a plurality oftouch sensing technologies " press down on at least a portion of the click wheel or an now known or later developed, including but not limited to associatedbutton.Userw--dsandnavigationcommands Copy provided bv USPTO from the PIRS Imaae Database on 06/20/2011 APLNDC00027907 US 7,864 ,163 B2 9 10 provided by the user via the click wheel may be processed by The device 100 may also include one or more acceleroman input controller 160 as well as one or more ofthe modules eters 168. FIGS. 1A and 1B show an accelerometer 168 coupled to the peripherals interface 118. Alternately, the and/or sets of instructions in memory 102. For a virtual click accelerometer 168 may be coupled to an input controller 160 wheel, the click wheel and click wheel controller may be part ofthe touch screen 112 and the display controller 156, respec- 5 in the I/O subsystem 106. The accelerometer 168 may perform as described in U.S. Patent Publication No. tively. For a virtual click wheel, the click wheel may be either 20050190059, "Acceleration-based Theft Detection System an opaque or semitransparent object that appears and disapfor Portable Electronic Devices," and U.S. Patent Publication pears on the touch screen display in response to user interacNo. 20060017692, "MethodsAndApparatuses For Operating tion with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multi- 10 A Portable Device Based On An Accelerometer," both of which are which are incorporated herein by reference. In function device and operated by user contact with the touch some embodiments, information is displayed on the touch screen. screen display in a portrait view or a landscape view based on The device 100 also includes a power system 162 for powan analysis of data received from the one or more acceleromering the various components. The power system 162 may 15 eters. include a power management system, one or more power In some embodiments, the software components stored in sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converteror inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices. memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/ motion module (or set ofinstructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of The device 100 may also include one or more optical instructions) 134, a Global Positioning System (GPS) module (or set ofinstructions) 135, and applications (or set ofinstructions) 136. sensors 164. 
FIGS.1A and 1B show an optical sensor coupled The operating system 126 (e.g., Darwin, RTXC, LINUX, to an optical sensor controller 158 in I/O subsystem 106. The 2s UNIX, OS X, WINDOWS, or an embedded operating system optical sensor 164 may include charge-coupleddevice (CCD) such as VxWorks) includes various software components or complementary metal-oxide semiconductor (CMOS) phoand/or drivers for controlling and managing general system totransistors. The optical sensor 164 receives light from the tasks (e.g., memory management, storage device control, -L -ent, projected through one or more lens, and conpower management, etc.) and facilitates v--un cation verts the light to data representing an image. In conjunction 3e between various hardware and software components. with an imaging module 143 (also called a camera module), The co--- cation module 128 facilitates communicathe optical sensor 164 may capture still images or video. In tion with other devices over one or more external ports 124 some embodiments, an optical sensor is locatedon the backof and also includes various software components for handling the device 100, opposite the touch screen display 112 on the data received by the RF circuitry 108 and/or the external port front of the device, so that the touch screen display may be 35 124. The external port 124 (e.g., Universal Serial Bus (USB), used as a viewfmder for either still and/or video image acquiFIREWIRE, etc.) is adapted for coupling directly to other sition. In some embodiments, an optical sensor is located on devices or indirectly over a network (e.g., the Internet, wirethe front of the device so that the user's image may be less LAN, etc.). In some embodiments, the external port is a obtainedfor videoconferencingwhile the user views the other multi-pin (e.g., 30-pin) connector that is the same as, or video conference participants on the touch screen display. In 40 similar to and/or compatible with the 30-pia connector used some embodiments, the position ofthe optical sensor 164 can be changed by the user (e.g., by rotating the lens and the on iPod (trademark ofApple Computer, Inc.) devices. sensor in the device housing) so that a single optical sensor the touch screen 112 (in conjunctionwith the display control- The contact/motion module 130 may detect contact with 164 may be used along with the touch screen display for both ler 156) and other touch sensitive devices (e.g., a touchpad or video conferencing and still and/or video image acquisition· 45 physical click wheel). The contact/motion module 130 The device 100 may also include one or more proximity includes various software components for performingvarious 166. FIGS. 1A and 1B show a proximity sensor 166 operationsrelatedto detectionofcontact, such as determining coupled to the peripherals interface 118. Alternately, the if contact has occurred, determining if there is movement of proximity sensor 166 may be coupled to an input controller the contact and tracking the movement across the touch 160 in the I/O subsystem106. The proximity sensor 166 may 50 screen 112, and determining if the contact has been broken perform as described in U.S. patent application Ser. No. (i.e., ifthe contact has ceased). Determiningmovement ofthe 11/241,839, "Proximity Detector In Handheld Device," filed point ofcontact may include determining speed (magnitude), Sep. 30, 2005; Ser. No. H/240,788, "Proximity Detector In velocity (magnitude and direction), and/or an acceleration (a Handheld Device," filed Sep. 30, 2005; Ser. No. 
11/620,702, change in magnitude and/or direction) ofthe point ofcontact. filed Jan. 7, 2007, "Using Ambient Light Sensor To Augment 55 These operations may be applied to single contacts (e.g., one Proximity Sensor Output," Ser. No. 11/586,862, filed Oct.24, finger contacts) or to multiple simultaneous contacts (e.g., 2006, "Automated Response To And Sensing Of UserActiv"multitouch"/multiple finger contacts). In some embodiity In Portable Devices," and Ser. No. 11/638,251, filed Dec. ments, the contact/motion module 130 and the display con12, 2006, "Methods And Systems ForAutomatic Configuratroller 156 also detects contact on a touchpad. In some tion Of Peripherals," which are hereby incorporated by ref- 60 embodiments, the contact/motion module 130 and the conerence. In some embodiments, the proximity sensor turns off troller 160 detects contact on a click wheel. and disables the touch screen 112 when the multifunction The graphics module 132 includes various known software components for rendering and displaying graphics on the making a phone call). In some embodiments, the proximity touch screen 112, including components for changing the sensor keeps the screen off when the device is in the user's 65 intensity of graphics that are displayed. As used herein, the pocket, purse, or other dark area to prevent -----sary bitterm "graphics" includes any object that can be displayed to tery drainage when the device is a locked state. a user, including without limitation text, web pages, icons device is placed near the user's ear (e.g., when the user is Gopy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027908 US 7,864,163 B2 11 12 (such as user-interface objects including soft keys), digital images, videos, animations and the like. The text input module 134, which may be a component of In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, graphics module 132, provides soft keyboards for entering contact module 130, graphics module 132, text input module text invarious applications (e.g., contacts137, e-mail140, IM 5 134, contact list 137, and telephone module 138, the video141, blogging 142, browser 147, and any other application conferencing module 139 may be used to initiate, conduct, that needs text input). and terminate a video conference between a user and one or The GPS module 135 determines the location ofthe device more other participants. and provides this information for use in various applications In conjunction with RF circuitry 108, touch screen 112, (e.g., to telephone 138 for use in location-based dialing, to 10 display controller 156, contact module 130, graphics module . 143 and/or blogger 142 as picture/video metadata, 132, and text input module 134, the e-mail client module 140 and to applications that provide location-based services such may be used to create, send, receive, and manage e-mail. In conjunctionwith image managementmodule 144, the e-mail The applications 136 may include the following modules 15 module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143. 
(or sets of instructions), or a subset or superset thereof: a contacts module 137 (sometimes called an address book In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module or contact list); 132, andtext input module 134, the instant messaging module a telephone module 138; a video conferencing module 139; 20 141 may be used to enter a sequence of characters corresponding to an instant message, to modify previously entered an e-mail client module 140; characters, to transmit a respective instant message (for an instant messaging (IM) module 141; as weather widgets, local yellow page widgets, and map/ navigation widgets). example, using a Short Message Service (SMS) or Multime- a blogging module 142; a -2222-2. module 143 for still and/or video images; an image management module 144; a video player module 145; a music player module 146; a browser module 147; a calendar module 148; widget modules 149, which may include weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as usercreated widgets 149-6; dia Message Service (MMS) protocol for telephony-based 25 instant messages or using XMPP, - E, or IMPS for Internet-based instant messages), to receive instant messages and to view receivedinstant messages. In some embodiments, transmitted and/or received instant messages may include graphics, photos, audio files, video files and/or other attach30 ments as are supported in a MMS and/or an Enhanced Messaging Service (EMS). As used herein, "instant messaging" refers to both telephony-basedmessages(e.g., messages sent using SMS or MMS) and Internet-basedmessages (e.g., mes- sages sent using XMPP, SIMPLE, or IMPS). In conjunction with RF circuitry 108, touch screen 112, gets 149-6; display controller 156, contact module 130, graphics module search module 151; 132, text input module 134, image management module 144, video and music player module 152, which merges video and browsing module 147, the blogging module 142 may be player module 145 and music player module 146; used to send text, still images, video, and/or other graphics to notes module 153; and/or map module 154. 40 a blog (e.g., the user's blog). widget creator module 150 for making user-created wid- 35 Examples of other applications 136 that may be stored in memory 102 include other word processing applications, In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, conJAVA-enabled applications, encryption, digital rights mantact module 130, graphics module 132, and image manageagement, voice recognition, and voice replication. ment module 144, the ---- module 143 may be used to In conjunction with touch screen 112, display controller 45 capture still images or video (including a video stream) and 156, contact module130, graphics module 132, and text input store them into memory 102, modify characteristics of a still module 134, the contacts module 137 may be used to manage image or video, or delete a still image or video from memory an address book or contact list, including: adding name(s) to 102. 
the address book; deleting name(s) from the address book; 50 In conjunction with touch screen 112, display controller associating telephone number(s), e-mail address(es), physi156, contact module 130, graphics module 132, text input cal address(es) or other information with a name; associating module 134, and ---module 143, the image management an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/ or facilitate communications by telephone 138, video confer- module 144 may be used to arrange, modify or otherwise manipulate, label, delete, present (e.g., in a digital slide show number, access one ormore telephonenumbers inthe address external port 124). In conjunction with touch screen 112, display system con- 55 or album), and store still and/or video images. ence 139, e-mail 140, or IM 141; and so forth. In conjunction with touch screen 112, display controller In conjunction with RF circuitry 108, audio circuitry 110, 156, contact module 130, graphics module 132, audio cirspeaker 111, microphone 113, touch screen 112, display concuitry 110, and speaker 111, the video playermodule 145 may troller 156, contact module 130, graphics module 132, and be usedto display, present or otherwiseplay backvideos (e.g., text input module 134, the telephonemodule 138 maybe used to enter a sequence ofcharacters correspondingto a telephone 60 on the touch screen or on an external, connected display via book 137, modify a telephone number that has been entered, troller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser and disconnect or hang up when the conversation is completed. As noted above, the wireless --acation may use 65 module 147, the music player module 146 allows the user to download and play back recorded music and other sound files any of a plurality ofcommunications standards, protocols and dial a respective telephone number, conduct a conversation technologies. stored in one or more file formats, such as MP3 orAAC files. Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027909 US 7,864,163 B2 13 14 In some embodiments, the device 100 is a device where In some embodiments, the device 100 may include the funcoperation of a predefined set of functions on the device is tionality of an MP3 player, such as an iPod (trademark of performed exclusively through a touch screen 112 and/or a Apple Computer, Inc.). touchpad. By using a touch screen and/or a touchpad as the In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics 5 primary input/control device for operation of the device 100, the number of physical input/control devices (such as push module 132, and text input module 134, the browser module buttons, dials, and the like) on the device 100 may bereduced. 147 may be used to browse the Internet, including searching, The predefined set of functions that may be performed linking to, receiving, and displaying web pages or portions exclusively through a touch screen and/or a touchpad include thereof, as well as attachments and other files linked to web pages. Embodiments of user interfaces and associated pro- 10 navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device cesses using browsermodule 147 are describedfurther below. 
100 to a main, home, or root menu from any user interface that In conjunction with RF circuitry 108, touch screen 112, may be displayed on the device 100. In such embodiments, display system controller 156, contact module 130, graphics the touchpad may be referred to as a "menu button." In some module 132, text input module 134, e-mail module 140, and browsermodule147, the calendarmodule148 may be usedto 15 other embodiments, the menu button may be a physical push button or other physical input/control device instead of a create, display, modify, and store calendars and data associtouchpad. ated with calendars (e.g., calendar entries, to do lists, etc.). FIG. 2 illustrates a portable multifunction device 100 havIn conjunction with RF circuitry 108, touch screen 112, ing a touch screen 112 in accordance with some embodidisplay system controller 156, contact module 130, graphics module 132, text input module 134, and browsermodule 147, 20 ments. The touch screen may display one or more graphics within user interface (UI) 200. Inthis embodiment, as well as the widget modules 149 are mini-applications that may be others described below, a user may select one or more of the downloaded and used by a user (e.g., weather widget 149-1, graphics by making contact or touching the graphics, for stocks widget 149-2, calculator widget 149-3, alarm clock example, with one or more fingers 202 (not drawn to scale in widget 149-4, and dictionary widget 149-5) or createdby the user (e.g., user-createdwidget 149-6). In some embodiments, 25 the figure). In some embodiments, selection of one or more a widget includes an - - (Hypertext Markup Language) graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the contact may file, a CSS (Cascading Style Sheets) file, anda JavaScript file. include a gesture, such as one or more taps, one or more In some embodiments, a widget includes anXML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! swipes (from left to right, right to left, upward and/or down- 30 ward) and/or a rolling of a finger (from right to left, left to Widgets). rigbt, upward and/or downward) that has made contact with In conjunction with RF circuitry 108, touch screen 112, the device 100. In some embodiments, inadvertent contact display system controller 156, contact module 130, graphics with a graphic may not select the graphic. For example, a module 132, text input module 134, and browsermodule 147, the widget creatormodule 150 maybe used by a user to create swipe gesture that sweeps over an application icon may not widgets (e.g., turning a user-specified portion of a web page 35 select the corresponding application when the gesture corresponding to selection is a tap. into a widget). The device 100 may also include one or more physical In conjunctionwith touch screen 112, display system controller 156, contact module 130, graphics module 132, and buttons, such as "home" or menu button 204. As described text input module 134, the search module 151 may be used to previously, the menu button 204 may be used to navigate to search for text, music, sound, image, video, and/or other files 40 any application 136 in a set of applications that may be executed on the device 100. Alternatively, in some embodiin memory 102 that match one or more search criteria (e.g., ments, the menu button is implemented as a soft key in a GUI one or more user-specified search terms). In conjunction with touch screen 112, display controller in touch screen 112. 
156, contact module130, graphics module 132, and text input In one embodiment, the device 100 includes a touch screen module 134, the notes module 153 may be used to create and 45 112, a menu button 204, a push button 206 for powering the manage notes, to do lists, and the like. In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics device on/off and locking the device, volume adjustment button(s) 208, a Subscriber Identity Module (SIM) card slot 210, a head setjack 212, and a docking/charging external port 124. The push button 206 may be used to turn the power module 132, text input module 134, GPS module 135, and b....-- module 147, the map module 154 may be used to 50 on/offon the device by depressing the button and holding the button in the depressed state for a predefined time interval; to receive, display, modify, and store maps and data associated lock the device by depressing the button and releasing the with maps (e.g., driving directions; data on stores and other button before the predefmed time interval has elapsed; and/or points of interest at or near a particular location; and other to unlock the device or initiate an unlock process. In an location-based data). y Each ofthe above identified modules and applications cor- 55 alternative embodiment, the device 100 also may accept verbal input for activation or deactivation of some functions respond to a set of instructions for performing one or more through the microphone 113. functions described above. These modules (i.e., sets of Attention is now directed towards embodiments of user instructions) need not be implemented as separate software interfaces ("UI") and associatedprocesses that may be impleprograms, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in 60 , mented on a portable multifunction device 100. various embodiments. For example, video playermodule145 FIG.3illustratesanexemplaryuserinterfaceforunlocking a portable electronic device in accordancewith some embodimay be combined with music playermodule 146 into a single ments. In some embodiments, user interface 300 includes the module (e.g., video and music player module 152, FIG.1B). following elements, or a subset or superset thereof: In some embodiments, memory 102 may store a subset ofthe Unlock image 302 that is moved with a finger gesture to modules and data structures identified above. Furthermore, 65 unlock the device; memory 102 may store additional modules and data strubArrow 304 that provides a visual cue to the unlock gesture; tures not described above. Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027910 US 7,864,163 B2 15 16 need to scroll through a list of applications (e.g., via a scroll bar). In some embodiments, as the number of applications increase, the icons corresponding to the applications may Time 308; decrease in size so that all applications may be displayedon a Day 310; 5 single screen without scrolling. In some embodiments, havDate 312; and ing all applications on one screen and a menu button enables Wallpaper image 314. 
a user to access any desired application with at most two In some embodiments, the device detects contact with the touch-sensitive display (e.g., a user's finger making contact inputs, such as activating the menu button 204 and then activating the desired application (e.g., by a tap or other finger on or near the unlock image 302) while the device is in a user-interface lock state. The device moves the unlock image 10 gesture on the icon corresponding to the application). 302 in accordance with the contact. The device transitions to FIGS. 5A-5M illustrate exemplary user interfaces for a Channel 306 that provides additional cues to the unlock gesture; a user-interface unlock state if the detected contact corresponds to a predefined gesture, such as moving the unlock image across channe1306. browser in accordance with some embodiments. In some embodiments, user interfaces 3900A-3900M (in FIGS. 5A-5M, respectively) include the following elements, Conversely, the device maintains the user-interface lock 15 or a subset or superset thereof: 402, 404, and 406, as described above; state if the detected contact does not correspond to the prePrevious page icon 3902 that when activated (e.g., by a defined gesture. As noted above, processes that use gestures finger tap on the icon) initiates display of the previous on the touch screen to unlock the device are described in U.S. web page; patent application Ser. Nos. 11/322,549, "Unlocking A Web page name 3904; Device By Performing Gestures On An Unlock Image," filed 20 Next page icon 3906 that when activated (e.g., by a finger Dec. 23, 2005, and 11/322,550, "Indication Of Progress tap on the icon) initiates display of the next web page; Towards Satisfaction OfA User Input Condition," filed Dec. 23, 2005, which are hereby incorporated by refe--. FIGS. 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in 25 accordance with some embodiments. In some embodiments, user interface 400A includes the following elements, or a subset or superset thereof: Signal strength indicator(s) 402 for wireless --Aca30 tion(s), such as cellular and Wi-Fi signals; Time 404; Battery status indicator 406; Tray 408 with icons for frequently used applications, such as one or more of the following: Phone 138, which may include an indicator 414 of the 35 number of missed calls or voicemail messages; E-mail client 140, whichmay include an indicator410 of the number of unread e-mails; Browser 147; and 40 Music player 146; and Icons for other applications, such as one or more of the following: IM 141; Image management 144; 45 Camera 143; Video player 145; Weather 149-1; Stocks 149-2; Blog 142; so Calendar 148; Calculator 149-3; Alarm clock 149-4; Dictionary 149-5; and User-created widget 149-6. In some embodiments, user interface 400B includes the ss following elements, or a subset or superset thereof: 402, 404, 406, 141, 148, 144, 143, 149-3, 149-2, 149-1, 149-4, 410, 414, 138, 140, and 147, as described above; Map 154; 60, Notes 153; Settings 412, which provides access to settings for the device 100 and its various applications 136, as described further below; and Video and music player module 152, also referred to as iPod (trademark ofApple Computer, Inc.) module 152. 
65 In some embodiments, UI 400A or 400B displays all ofthe available applications 136 on one screen so that there is no URL (Uniform Resource Locator) entry box 3908 for inputting URLs of web pages; Refresh icon 3910 that when activated (e.g., by a finger tap on the icon) initiates a refresh of the web page; Web page 3912 or other structured document, which is made of blocks 3914 of text content and other graphics (e.g., images and inline multimedia); Settings icon 3916 that when activated (e.g., by a finger tap on the icon) initiates display of a settings menu for the browser; Bookmarks icon 3918 that when activated (e.g., by a finger tap on the icon) initiates display of a bookmarks list or menu for the browser; Add bookmark icon 3920 that when activated (e.g., by a finger tap on the icon) initiates display ofa UI for adding bookmarks (e.g., UI 3900F, FIG. 5F, which like other UIs and pages, can be displayed in either portrait or landscape view); New window icon 3922 that when activated (e.g., by a finger tap on the icon) initiates display ofa UI for adding new windows to the browser (e.g., UI 3900G, FIG. SG); Vertical bar 3962 (FIG. 5H) for the web page 3912 or other structured document that helps a user understand what portion of the web page 3912 or other structured docu- ment is being displayed; Horizontal bar 3964 (FIG. 5H) for the web page 3912 or other structured document that helps a user understand what portion of the web page 3912 or other structured document is being displayed; Share icon 3966 that when activated (e.g., by a fingertap on the icon) initiates display ofa UI for sharinginformation with other users (e.g., UI 3900K, FIG. 5K); URL clear icon 3970 (FIG. 51) that whenactivated (e.g., by a finger tap on the icon) clears any input in URL entry box 3908; Search term entry box 3972 (FIG. SI) for inputting search terms for web searches; URL suggestion list 3974 that displays URLs that match the input in URL entry box 3908 (FIG. SI), wherein activation of a suggested URL (e.g., by a finger tap on the suggestedURL) initiates retrieval ofthe corresponding web page; URL input keyboard 3976 (FIGS. 5I and SM) with period key 3978, backslash key 3980, and ".com"key 3982 that make it easier to enter ca....2 characters in URLs; Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027911 US 7,864, 163 B2 17 18 Search term clear icon 3984 that when activated (e.g., by a fmger tap on the icon) clears any input in search term entry box 3972; Email link icon 3986 (FIG. 5K) that when activated (e.g., page may be enlarged. Conversely, in response to a multitouch pinching gesture by the user, the web page may be reduced. In some embodiments, in response to a substantially vertiby a finger tap or other gesture on the icon) prepares an 5 cal upward (or downward) swipe gesture by the user, the web email that contains a link to be shared with one or more page (or, more generally, other electronic documents) may other users; scroll one-dimensionally upward (or downward) in the vertiEmail content icon 3988 (FIG. 5K) that when activated cal direction. For example, in response to an upward swipe (e.g., by a finger tap or other gesture on the icon) pregesture 3937 by the user that is within a predetermined angle pares an email that contains content to be shared with 10 (e.g., 27°) ofbeing perfectly vertical, the web page may scroll one or more other users; one-dimensionally upward in the vertical direction. IM link icon 3990 (FIG. 
5K) that when activated (e.g., by a Conversely, in some embodiments, in response to a swipe finger tap or other gesture on the icon) prepares an IM gesture that is not within a predetermined angle (e.g., 27°) of that contains a link to be shared with one or more other being perfectly vertical, the web page may scroll two-dimenusers; and 15 sionally (i.e., with simultaneous movement in both the vertiCancel icon 3992 (FIG. 5K) that when activated (e.g., by a finger tap or other gesture on the icon) cancels the shar- cal and horizontal directions). For example, in response to an upward swipe gesture 3939 (FIG. 5C) by the user that is not ing UI (e.g., UI 3900K, FIG. 5K) and displays the previous UI. within a predetermined angle (e.g., 27°) of being perfectly vertical, the web page may scroll two-dimensionally along In some embodiments, in response to a predefined gesture 20 the direction of the swipe 3939. by the user on a block 3914 (e.g., a single tap gesture or a In some embodiments, in response to a multi-touch 3941 double tap gesture), the block is enlarged and centered (or and 3943 rotation gesture by the user (FIG. 5C), the web page substantially centered) in the web page display. For example, may be rotated exactly 90° (UI 3900D, FIG. 5D) for landin response to a single tap gesture 3923 on block 3914-5, scape viewmg, even if the amount of rotation in the multiblock 3914-5 may be enlarged and centered in the display, as 25 touch 3941 and 3943 rotation gesture is substantially differshown in UI 3900C, FIG. 5C. In some embodiments, the ent from90°. Similarly, in response to a multi-touch 3945 and width ofthe block is scaled to fill the touch screen display. In 3947 rotation gesture by the user (UI 3900D, FIG. 5D), the some embodiments, the width ofthe block is scaled to fill the web page may be rotated exactly 90° for portrait viewing, touch screen display with a predefined amount of padding even if the amount of rotation in the multi-touch 3945 and along the sides ofthe display. In some embodiments, a zoom- 3o 3947 rotation gesture is substantially different from 90°. ing animation ofthe block is displayedduring enlargement of Thus, in response to imprecise gestures by the user, precise the block. Similarly, in response to a single tap gesture 3925 movements of graphics occur. The device behaves in the --, desired by the user despite is.-te input by the on block 3914-2, block 3914-2 may be enlargedwith a zooming animationand two-dimensionallyscrolled to the center of user. Also, note that the gestures described for UI 3900C, the display (not shown). 35 which has a portrait view, are also applicable to UIs with a In some embodiments, the device analyzes the render tree landscape view (e.g., UI 3900D, FIG. 5D) so that the user can ofthe web page 3912 to determinethe blocks 3914 in the web choose whichever view the user prefers for web browsing. page. In some embodiments, a block 3914 corresponds to a FIGS. 6A-6C are flow diagrams illustrating a process 6000 rendernode that is: a replaced inline; a block; an inline block; for displaying structured electronic documents such as web or an inline table. 40 pages on a portable electronic device with a touch screen In some embodiments, in response to the same predefined display (e.g., device 100) in accordance with some embodigesture by the user on a block 3914 (e.g., a single tap gesture ments. 
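The scrolling behavior described above, where a swipe within a predetermined angle (e.g., 27°) of perfectly vertical scrolls the page one-dimensionally and any other swipe scrolls it two-dimensionally along the swipe direction, reduces to a simple angle test. The following is a minimal illustrative sketch only, not code from the patent; the type and function names are invented, and the 27° default merely echoes the example threshold given in the text.

```swift
import Foundation

/// Result of classifying a swipe on the touch screen.
enum ScrollMode {
    case vertical(dy: Double)                      // one-dimensional vertical scroll
    case twoDimensional(dx: Double, dy: Double)    // scroll along the swipe direction
}

/// Classifies a swipe by its angle from vertical: swipes within
/// `thresholdDegrees` of vertical scroll the document one-dimensionally;
/// all other swipes scroll it two-dimensionally.
func classifySwipe(dx: Double, dy: Double, thresholdDegrees: Double = 27.0) -> ScrollMode {
    // Angle between the swipe vector and the vertical axis, in degrees.
    let angleFromVertical = atan2(abs(dx), abs(dy)) * 180.0 / .pi
    if angleFromVertical <= thresholdDegrees {
        return .vertical(dy: dy)
    } else {
        return .twoDimensional(dx: dx, dy: dy)
    }
}

// Example: a nearly vertical upward swipe scrolls only vertically.
print(classifySwipe(dx: 10, dy: -120))   // vertical
print(classifySwipe(dx: 90, dy: -100))   // two-dimensional
```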
The portable electronic device displays at least a portion of a structured electronic document on the touch screen display. The structured electronic document comprises a plurality of boxes of content (e.g., blocks 3914, FIG. 5A) (6006). In some embodiments, the plurality of boxes is defined by a style sheet language. In some embodiments, the style sheet language is a cascading style sheet language. In some embodiments, the structured electronic document is a web page (e.g., web page 3912, FIG. 5A). In some embodiments, the structured electronic document is an HTML or XML document.

In some embodiments, displaying at least a portion of the structured electronic document comprises scaling the document width to fit within the touch screen display width independent of the document length (6008). In some embodiments, the touch screen display is rectangular with a short axis and a long axis (also called the minor axis and major axis); the display width corresponds to the short axis (or minor axis) when the structured electronic document is seen in portrait view (e.g., FIG. 5C); and the display width corresponds to the long axis (or major axis) when the structured electronic document is seen in landscape view (e.g., FIG. 5D).

In some embodiments, prior to displaying at least a portion of a structured electronic document, borders, margins, and/or paddings are determined for the plurality of boxes (6002) and adjusted for display on the touch screen display (6004). In some embodiments, all boxes in the plurality of boxes are adjusted. In some embodiments, just the first box is adjusted. In some embodiments, just the first box and boxes adjacent to the first box are adjusted.
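As a rough illustration of the width-fit rule above (scale the document so its width fits the display width, regardless of document length, where the display width is the short axis in portrait view and the long axis in landscape view), consider the following sketch. It is a hedged example; the names and sample dimensions are invented, not taken from the patent.

```swift
import Foundation

struct Size { var width: Double; var height: Double }

/// Returns the zoom scale that fits the document's width to the display's
/// width, independent of the document's length.
func widthFitScale(document: Size, display: Size, isPortrait: Bool) -> Double {
    // The display "width" is the short axis in portrait view and the long
    // axis in landscape view.
    let displayWidth = isPortrait ? min(display.width, display.height)
                                  : max(display.width, display.height)
    return displayWidth / document.width
}

// Example with made-up dimensions: a 980-pt-wide page on a 320x480 display.
let page = Size(width: 980, height: 6000)
let screen = Size(width: 320, height: 480)
print(widthFitScale(document: page, display: screen, isPortrait: true))    // ~0.327
print(widthFitScale(document: page, display: screen, isPortrait: false))   // ~0.49
```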
A first gesture is detected at a location on the displayed portion of the structured electronic document (e.g., gesture 3923, FIG. 5A) (6010). In some embodiments, the first gesture is a finger gesture. In some embodiments, the first gesture is a stylus gesture. In some embodiments, the first gesture is a tap gesture. In some embodiments, the first gesture is a double tap with a single finger, a double tap with two fingers, a single tap with a single finger, or a single tap with two fingers.

A first box (e.g., Block 5 3914-5, FIG. 5A) in the plurality of boxes is determined at the location of the first gesture (6012). In some embodiments, the structured electronic document has an associated render tree with a plurality of nodes and determining the first box at the location of the first gesture comprises: traversing down the render tree to determine a first node in the plurality of nodes that corresponds to the detected location of the first gesture (6014); traversing up the render tree from the first node to a closest parent node that contains a logical grouping of content (6016); and identifying content corresponding to the closest parent node as the first box (6018). In some embodiments, the logical grouping of content comprises a paragraph, an image, a plugin object, or a table. In some embodiments, the closest parent node is a replaced inline, a block, an inline block, or an inline table.

The first box is enlarged and substantially centered on the touch screen display (e.g., Block 5 3914-5, FIG. 5C) (6020). In some embodiments, enlarging and substantially centering comprises simultaneously zooming and translating the first box on the touch screen display (6022). In some embodiments, enlarging comprises expanding the first box so that the width of the first box is substantially the same as the width of the touch screen display (6024).
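The box-determination steps above (hit-test down the render tree to the node under the gesture, then walk back up to the closest parent that represents a logical grouping of content) can be sketched as follows. This is an illustrative model only, assuming a simplified node type; it is not WebKit's actual render tree or the device's implementation.

```swift
import Foundation

struct Point { var x: Double; var y: Double }
struct Rect {
    var x, y, width, height: Double
    func contains(_ p: Point) -> Bool {
        p.x >= x && p.x <= x + width && p.y >= y && p.y <= y + height
    }
}

/// Simplified stand-in for a render-tree node.
final class RenderNode {
    enum Kind { case text, inlineSpan, replacedInline, block, inlineBlock, inlineTable }
    let kind: Kind
    let frame: Rect
    weak var parent: RenderNode?
    var children: [RenderNode] = []

    init(kind: Kind, frame: Rect) { self.kind = kind; self.frame = frame }

    /// A node counts as a "logical grouping of content" (a first-box candidate)
    /// if it is a replaced inline, a block, an inline block, or an inline table.
    var isLogicalGrouping: Bool {
        switch kind {
        case .replacedInline, .block, .inlineBlock, .inlineTable: return true
        default: return false
        }
    }
}

/// Traverse down the tree to the deepest node containing the gesture location,
/// then traverse up to the closest ancestor that is a logical grouping.
func firstBox(for gesture: Point, in root: RenderNode) -> RenderNode? {
    guard root.frame.contains(gesture) else { return nil }
    var node = root
    while let child = node.children.first(where: { $0.frame.contains(gesture) }) {
        node = child
    }
    var candidate: RenderNode? = node
    while let current = candidate, !current.isLogicalGrouping {
        candidate = current.parent
    }
    return candidate
}
```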
In some embodiments, text in the enlarged first box is resized to meet or exceed a predetermined minimum text size on the touch screen display (6026). In some embodiments, the text resizing comprises: determining a scale factor by which the first box will be enlarged (6028); dividing the predetermined minimum text size on the touch screen display by the scaling factor to determine a minimum text size for text in the first box (6030); and if a text size for text in the first box is less than the determined minimum text size, increasing the text size for text in the first box to at least the determined minimum text size (6032). In some embodiments, the first box has a width; the display has a display width; and the scale factor is the display width divided by the width of the first box prior to enlarging. In some embodiments, the resizing occurs during the enlarging. In some embodiments, the resizing occurs after the enlarging.

For example, suppose the predetermined minimum text size is an 18-point font and the scale factor is determined to be two. In that case, the minimum text size for text in the first box is 18 divided by 2, or 9. If text in the first box is in a 10-point font, its text size is not increased, because 10 is greater than the 9-point minimum. Once the scale factor is applied, the text will be displayed in a 20-point font, which is greater than the predetermined minimum text size of 18. If, however, text in the first box is in an 8-point font, application of the scale factor would cause the text to be displayed in a 16-point font, which is less than the predetermined minimum text size of 18. Therefore, since 8 is less than 9, the text size is increased to at least a 9-point font and displayed in at least an 18-point font after application of the scale factor.

In some embodiments, text in the structured electronic document is resized to meet or exceed a predetermined minimum text size on the touch screen display (6026; FIG. 6B). In some embodiments, the text resizing comprises: determining a scale factor by which the first box will be enlarged (6028); dividing the predetermined minimum text size on the touch screen display by the scaling factor to determine a minimum text size for text in the structured electronic document (6030); and if a text size for text in the structured electronic document is less than the determined minimum text size, increasing the text size for text in the structured electronic document to at least the determined minimum text size (6032). In some embodiments, the text resizing comprises: identifying boxes containing text in the plurality of boxes; determining a scale factor by which the first box will be enlarged; dividing the predetermined minimum text size on the touch screen display by the scaling factor to determine a minimum text size for text in the structured electronic document; and for each identified box containing text, if a text size for text in the identified box is less than the determined minimum text size, increasing the text size for text in the identified box to at least the determined minimum text size and adjusting the size of the identified box to accommodate the resized text.

In some embodiments, a second gesture (e.g., gesture 3929, FIG. 5C) is detected on the enlarged first box (6034). In response to detecting the second gesture, the displayed portion of the structured electronic document is reduced in size (6036). In some embodiments, the first box returns to its size prior to being enlarged (6038). In some embodiments, the second gesture and the first gesture are the same type of gesture. In some embodiments, the second gesture is a finger gesture. In some embodiments, the second gesture is a stylus gesture. In some embodiments, the second gesture is a tap gesture. In some embodiments, the second gesture is a double tap with a single finger, a double tap with two fingers, a single tap with a single finger, or a single tap with two fingers.

In some embodiments, while the first box is enlarged, a third gesture (e.g., gesture 3927 or gesture 3935, FIG. 5C) is detected on a second box other than the first box (6040). In response to detecting the third gesture, the second box is substantially centered on the touch screen display (6042). In some embodiments, the third gesture and the first gesture are the same type of gesture. In some embodiments, the third gesture is a finger gesture. In some embodiments, the third gesture is a stylus gesture. In some embodiments, the third gesture is a tap gesture. In some embodiments, the third gesture is a double tap with a single finger, a double tap with two fingers, a single tap with a single finger, or a single tap with two fingers.

In some embodiments, a swipe gesture (e.g., gesture 3937 or gesture 3939, FIG. 5C) is detected on the touch screen display (6044; FIG. 6C). In response to detecting the swipe gesture, the displayed portion of the structured electronic document is translated on the touch screen display (6046). In some embodiments, the translating comprises vertical, horizontal, or diagonal movement of the structured electronic document on the touch screen display (6048). In some embodiments, the swipe gesture is a finger gesture. In some embodiments, the swipe gesture is a stylus gesture.

In some embodiments, a fifth gesture (e.g., multi-touch gesture 3941/3943, FIG. 5C) is detected on the touch screen display (6050). In response to detecting the fifth gesture, the displayed portion of the structured electronic document is rotated on the touch screen display by 90° (6052).
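The text-resizing rule and the 18-point example above reduce to a small calculation: the per-box minimum equals the predetermined minimum divided by the scale factor, and any text below that per-box minimum is raised to it before the scale factor is applied. A sketch with invented names follows; it reproduces the arithmetic of the example (scale factor 2, 18-point minimum: 10-point text is left alone and displays at 20 points, while 8-point text is raised to 9 points and displays at 18 points).

```swift
import Foundation

/// Returns the text size to use inside a box that will be enlarged by
/// `scaleFactor`, so that the displayed size meets or exceeds
/// `predeterminedMinimum` (all sizes in points).
func resizedTextSize(currentSize: Double,
                     scaleFactor: Double,
                     predeterminedMinimum: Double = 18.0) -> Double {
    // Minimum size the text must have *before* enlargement.
    let minimumInBox = predeterminedMinimum / scaleFactor
    return max(currentSize, minimumInBox)
}

// Worked example from the text: minimum 18 pt, scale factor 2.
let tenPoint = resizedTextSize(currentSize: 10, scaleFactor: 2)    // 10 (unchanged)
let eightPoint = resizedTextSize(currentSize: 8, scaleFactor: 2)   // 9 (raised)
print(tenPoint * 2, eightPoint * 2)                                // 20.0 18.0
```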
Therefore, since 8 is less than 9, thetext size is increasedto at 65 displayed portion of the structured electronic document is rotated on the touch screen display by 90° (6052). In some least a 9-point font and displayed in at least an 18-point foilt embodiments, the fifth gesture is a finger gesture. In some after application of the scale factor. Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027913 US 7,864,163 B2 21 22 embodiments, the fifth gesture is a multifinger gesture. In In response to detecting a gesture on the touch screen some embodiments, the fifth gesture is a twisting gesture. display, a displayed window in the application is moved off the display and a hidden window is moved onto the display. In some embodiments, a change in orientationofthe device For example, in response to detecting a tap gesture 3949 on is detected (6054). For example, the one or more accelerometers 168 (FIGS. 1A-1B) detect a change in orientation ofthe 5 the left side of the screen, the window with web page 3912-2 is moved partially or fully off-screen to the right, the window device. In response to detecting the change in orientation of with web page 3912-3 is moved completely off-screen, parthe device, the displayed portion of the structured electronic tially hidden window with web page 3912-1 is moved to the document is rotated on the touch screen display by 90° center ofthe display, and another completely hidden window (6056). In some embodiments, a multi-finger de-pinch gesture 10 with a web page (e.g., 3912-0) may be moved partially onto the display. Alternatively, detection of a left-to-right swipe (e.g., multi-touch gesture 3931/3933, FIG. 5C) is detected on gesture 3951 may achieve the same effect. the touch screen display (6058). In response to detecting the Conversely, in response to detecting a tap gesture 3953 on multi-finger de-pinch gesture, a portion ofthe displayed porthe right side ofthe screen, the windowwithweb page 3912-2 tion of the structured electronic document is enlarged on the touch screen display in accordance with a position of the 15 is moved partially or fully off-screen to the left, the window with web page 3912-1 is moved completely off-screen, parmulti-finger de-pinch gesture and an amount of finger move- tially hidden window with web page 3912-3 is moved to the center ofthe display, and -A- completely hidden window with a web page (e.g., 3912-4) may be moved partially onto specific order, it should be apparent that the process 6000 can 20 the display. Alternatively, detection of a right-to-left swipe gesture 3951 may achieve the same effect. include more or fewer operations, which can be executed In some embodiments, in response to a tap or other preserially or in parallel (e.g., using parallel processors or a defmed gesture on a delete icon 3934, the corresponding multi-threadingenvironment), an order oftwo ormore operament in the multi-finger de-pinch gesture (6060). While the content display process 6000 described above includes a number of operations that appear to occur in a window3912 is deleted. In some embodiments, inresponseto tions may be changed and/or two or more operations may be 25 a tap or other predefined gesture on Done icon 3938, the combined into a single operation. window in the center ofthe display (e.g., 3912-2) is enlarged A graphical user interface (e.g., UI 3900A, FIG. 5A) on a to fill the screen. 
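The multi-finger de-pinch behavior described above, which enlarges a portion of the displayed document in accordance with the position of the gesture and the amount of finger movement, can be sketched as follows. This is an illustrative sketch with invented names and a simple midpoint anchor, assuming the zoom factor tracks the change in finger separation; it is not the device's actual implementation.

```swift
import Foundation

struct Touch { var x: Double; var y: Double }

/// Zoom parameters derived from a two-finger de-pinch: the zoom amount
/// follows the change in finger separation, and the zoom is anchored at the
/// gesture's position (midpoint of the two touches).
func dePinchZoom(start: (Touch, Touch), end: (Touch, Touch))
    -> (scale: Double, anchorX: Double, anchorY: Double) {
    func distance(_ a: Touch, _ b: Touch) -> Double {
        ((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y)).squareRoot()
    }
    let scale = distance(end.0, end.1) / distance(start.0, start.1)
    let anchorX = (end.0.x + end.1.x) / 2
    let anchorY = (end.0.y + end.1.y) / 2
    return (scale, anchorX, anchorY)
}

// Example: fingers move apart from 100 pt to 150 pt of separation -> 1.5x zoom.
let zoom = dePinchZoom(start: (Touch(x: 100, y: 200), Touch(x: 200, y: 200)),
                       end:   (Touch(x: 75,  y: 200), Touch(x: 225, y: 200)))
print(zoom)   // (scale: 1.5, anchorX: 150.0, anchorY: 200.0)
```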
portable electronic device with a touch screen display comAdditional description of adding windows to an applicaprises at least a portion of a structured electronic document tion can be found in U.S. patent application Ser. No. 11/620, (e.g., web page 3912, FIG. 5A). The structured electronic document comprises a plurality of boxes of content (e.g., 30 647, "Method, System, And Graphical User Interface For Viewing Multiple Application Windows," filed Jan. 5, 2007, blocks 3914, FIG. 5A). In response to detecting a first gesture the content of which is hereby incorporated by reference. (e.g., gesture 3923, FIG. 5A) at a location on the portionofthe FIGS. 7A-7F illustrate exemplary user interfaces for playstructured electronic document, a first box (e.g., Block 5 ing an item of inline multimedia content in accordance with 3914-5, FIG. 5A) in the plurality of boxes at the location of the first gesture is determinedand the first box is enlarged and 35 some embodiments. I In some embodiments, user interfaces 4000A-4000F (in substantially centered onthe touch screen display (e.g., Block FIGS. 7A-7F, respectively) include the following elements, 5 3914-5, FIG. 5C). or a subset or superset thereof: In some embodiments, in response to a tap or other pre402, 404, 406, 3902, 3906, 3910, 3912, 3918, 3920, 3922, defmed user gesture on URL entry box 3908, the touch screen as described above; displays an enlarged entry box 3926 and a keyboard 616 (e.g., 40 inline multimedia content 4002, such as QuickTime conUI3900B, FIG.5BinportraitviewingandUI3900E, FIG.5E tent (4002-1), Windows Media content (4002-2), or in landscape viewing). In some embodiments, the touch Flash content (4002-3); screen also displays: Contextual clear icon 3928 that when activated (e.g., by a finger tap on the icon) initiates deletion of all text in entry box 3926; 45 a search icon 3930 that when activated (e.g., by a fmger tap on the icon) initiates an Internet search using the search terms input in box 3926; and 50 Go to URL icon 3932 that when activated (e.g., by a finger tap on the icon) initiates acquisitionofthe web page with the URL input in box 3926; Thus, the same entry box 3926 may be used for inputting both search terms and URLs. In some embodiments, whether 55 or not clear icon 3928 is displayed depends on the context. UI 3900G (FIG. 5G) is a UI for adding new windows to an application, such as the b-,,-, 147. UI 3900G displays an other types of content 4004 in the structured document, such as text; Exit icon 4006 that when activated (e.g., by a fmger tap on the icon) initiates exiting the inline multimedia content player UI (e.g., UI 4000B or 4000F) and returning to another UI (e.g., UI 4000A, FIG. 
7A); Lapsed time 4008 that shows how much of the inline multimedia content 4002 has been played, in units of time; Progress bar 4010 that indicates what fraction of the inline multimedia content 4002 has been played and that may be used to help scroll through the inline multimedia content in response to a user gesture; Remaining time 4012 that shows how much of the inline multimedia content 4002 remains to be played, in units of time; application (e.g., the browser 147), which includes a disDownloading icon 4014 that indicates when inline multimedia content 4002 is being downloaded or streamed to played window (e.g., web page 3912-2) and at least one 60 , hidden window (e.g., web pages 3912-1 and 3934-3 and the device; possibly other web pages that are completely hidden offFast Reverse/Skip Backwards icon 4016 that when actiscreen). UI 3900G also displays an icon for adding windows vated (e.g., by a finger tap on the icon) initiates reversing to the application (e.g., new window or new page icon 3936). or skipping backwards through the inline multimedia In responseto detecting activation ofthe icon3936 for adding 65 content 4002; windows, the browser adds a window to the application (e.g a new window for a new web page 3912). Play icon 4018 that when activated (e.g., by a fmger tap 4026 (FIG. 7C) on the icon) initiates playing the inline Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027914 US 7,864,163 B2 23 24 multimedia content 4002, either from the beginning or from where the inline multimedia content was paused; In response to detecting the second gesture, one or more playback controls for playing the enlarged item of inline multimedia content are displayed (8010). In some embodiFast Forward/Skip Forward icon 4020 that initiates forments, the one or more playback controls comprise a play warding or skipping forwards through the inline multimedia content 4002; 5 icon (e.g., icon 4018, FIG. 7C), a pause icon (e.g., icon 4024, FIG. 7E), a sound volume icon (e.g., icon 4022), and/or a Volume adjustment slider icon 4022 that that when actiplayback progress bar icon (e.g., icon 4010). vated (e.g., by a finger tap on the icon) initiates adjustIn some embodiments, displaying one or more playback ment of the volume of the inline multimedia content controls comprises displaying one or more playback controls 4002; and Pause icon 4024 that when activated (e.g., by a fingertap on 10 on top ofthe enlarged item ofinline multimedia content (e.g., playback controls 4016, 4018, 4020, and 4022 are on top of the icon) initiates pausing the inline multimedia content enlarged inline multimedia content 4002-1 in FIG. 7C). In 4002. FIG. 8 is a flow diagram illustrating a process 8000 for some embodiments, the one or more playback controls are superimposed on top of the enlarged item of inline multime- displaying inline multimedia content on a portable electronic device with a touch screen display (e.g., device100) in accor- 15 dia content. In some embodiments, the one or more playback controls are semitransparent. dance with some embodiments. The portable electronic In some embodiments, an instructioninthe structured electronic document to automatically start playing the item of inline mul"-, content is overridden, which gives the and 4004, FIG. 7A). In some embodiments, the structured 20 device time to downloadmore ofthe selected inline multimedia content prior to starting playback. electronic document is a web page (e.g., web page 3912). 
In A third gesture is detected on one ofthe playback controls some embodiments, the structured electronic document is an (e.g., gesture 4026 on play icon 4018, FIG. 7C) (8012). - - or XML document. In response to detecting the third gesture, the enlarged item A first gesture (e.g., gesture 4028, FIG. 7A) is detected on device displays at least a portion of a structured electronic document on the touch screen display (8002). The structured electronic document comprises content (e.g., content 4002 an item of inline multimedia content (e.g., content 4002-1, 26 of inline multimedia content is played (8014). In some embodiments, playing the enlarged item of inline multimedia FIG. 7A) in the displayedportion ofthe structured electronic content comprises playing the enlarged item of inline multidocument (8004). In some embodiments, the inline multimemedia content with a plugin for a content type associatedwith dia content comprises video and/or audio content. In some In response to detecting the first gesture, the item of inline multimedia content is enlarged on the touch screen display the item of inline multimedia content. In some embodiments, while the enlarged item of inline multimedia content is played, the one or more playback controls cease to be displayed (e.g., FIG. 7D, which no longer and other content (e.g., content 4004 and other content 4002 besides 4002-1, FIG. 7A) in the structured electronic docu- ments, all of the playback controls cease to be displayed. In embodiments, the content can be played with a QuickTime, Windows Media, or Flash plugin. ment besides the enlarged item of inline multimedia content 30 displays playback controls 4016, 4018, 4020, and 4022, but still shows 4006, 4008, 4010, and 4012). In some embodi- some embodiments, ceasing to display the one or more playceases to be displayed (e.g., UI 4000B, FIG. 7B orUI 4000F, back controls comprises fading out the one or more playback FIG. 7F) (8006). controls. In some embodiments, the display of the one or In some embodiments, enlarging the item of inline multimore playback controls is ceased after a predeterminedtime. media content comprises animated zooming in on the item. In 40 In some embodiments, the display of the one or more playsome embodiments, enlarging the item of inline multimedia back controls is ceased after no contact is detected with the content comprises simultaneously zooming and translating touch screen display for a predetermined time. the item of inline multimedia content on the touch screen In some embodiments, a fourth gesture is detected on the display. In some embodiments, enlarging the item of inline touch screen display (8016). In response to detecting the multimedia content comprises rotating the item ofinline mul- 45 fourth gesture, at least the portion ofthe structured electronic timedia content by 90° (e.g., from UI 4000A, FIG. 7A to UI do---i is displayed again (e.g., FIG. 7A) (8018). In some 4000B, FIG. 7B). embodiments, the fourth gesture comprises a tap gesture on a In some embodiments, the item of inline multimedia conplayback completion icon, such as a done icon (e.g., gesture tent has a full size; the touch screen display has a size; and 4032 on done icon 4006, FIG.7D). In some embodiments, the enlarging the item of inline multimedia content comprises se item of inline multimedia content returns to its size prior to enlarging the item ofinline multimedia content to the smaller being enlarged. 
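The playback-control behavior described above, where the controls are shown superimposed (possibly semitransparent) over the enlarged inline multimedia item and then cease to be displayed after a predetermined time, or after no contact is detected with the touch screen for a predetermined time, can be modeled as a small timer-driven state machine. The sketch below uses invented names and an assumed three-second delay; it is illustrative only, not the device's implementation.

```swift
import Foundation

/// Minimal model of playback controls that are shown over enlarged inline
/// multimedia content and hidden again after a period with no touch contact.
final class PlaybackControlsModel {
    private(set) var controlsVisible = false
    private var lastContact = Date()
    let hideDelay: TimeInterval

    init(hideDelay: TimeInterval = 3.0) { self.hideDelay = hideDelay }

    /// A gesture on the enlarged item shows the controls and restarts the timer.
    func registerContact(at time: Date = Date()) {
        controlsVisible = true
        lastContact = time
    }

    /// Called periodically (e.g., from a display timer): hide the controls
    /// once no contact has been detected for `hideDelay` seconds.
    func tick(now: Date = Date()) {
        if controlsVisible && now.timeIntervalSince(lastContact) >= hideDelay {
            controlsVisible = false   // in the UI this would be a fade-out
        }
    }
}

// Example: controls appear on a tap and disappear after the delay elapses.
let controls = PlaybackControlsModel(hideDelay: 3.0)
controls.registerContact()
controls.tick(now: Date().addingTimeInterval(4))
print(controls.controlsVisible)   // false
```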
of the full size of the item and the size of the touch screen In some embodiments, the first, second, and third gestures display. are fmger gestures. In some embodiments, the first, second, In some embodiments, enlarging the item of inline multiand third gestures are stylus gestures. media content comprises expanding the item of inline multi- ss In some embodiments, the first, second, and third gestures media content so that the width of the item of inline multiare tap gestures. In some embodiments, the tap gesture is a media content is substantially the same as the width of the double tap with a single finger, a double tap with two fingers, touch screen display (e.g., UI 4000B, FIG. 7B or UI 4000F, a single tap with a single finger, or a single tap with two FIG. 7F). fingers. In some embodiments, ceasing to display other content in 60 While the multimedia display process 8000 described the structured electronic document besides the item of inline above includes a number ofoperations that appear to occur in multimedia content comprises fading out the other content in a specific order, it should be apparent that the process 8000 the structured electronic document besides the item of inline can include more or fewer operations, which can be executed multimedia content. serially or in parallel (e.g., using parallel processors or a While the enlarged item of inline multimedia content iß 65 multi-threading environment), an order oftwo or more operadisplayed, a second gesture is detected on the touch screeri tions may be changed and/or two or more operations may be display (e.g., gesture 4030, FIG. 7B) (8008). combined into a single operation. Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027915 US 7,864,163 B2 25 A graphical user interface on a portable electronic device with a touch screen display comprises: at least a portion of a structured electronic document, wherein the structured electronic document comprises content; an item of inline multi- 26 in response to detecting the second gesture, the structured electronic document is translated so that the second box is substantially centered on the touch screen display. 3. The method of claim 2, including: prior to displaying at media content in the portion ofthe structured electronic docu- s least a portion of a structured electronic document, determining borders, margins, and/or paddings for the plument; and one or more playback controls. In response to detecting a first gesture on the item of inline multimedia rality of boxes that are specified in the structured electronic document; and content, the item of inline multimedia content on the touch adjusting the borders, margins, and/or paddings for the screen display is enlarged, and display of other content in the plurality of boxes for display on the touch screen disstructured electronic document besides the enlarged item of 10 play. inline multimedia content is ceased. In response to detecting 4. The method ofclaim 2, wherein the structuredelectronic a second gesture on the touch screen display while the enlarged item of inline multimedia content is displayed, the one or more playback controls for playing the enlarged item document is a web page. 5. The method ofclaim 2, wherein the structured electronic of inline multimedia content are displayed. In response to 15 document is an - - or XML document. 6. 
The method of claim 2, wherein: detecting a third gesture on one of the playback controls, the the structured electronic document has a document width enlarged item of inline multimedia content is played. The foregoing description, for purpose of explanation, has beendescribed with reference to specific embodiments. However, the illustrative discussions above are not intended to be 20 exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the 1-tion and its practical applications, to thereby enable 25 others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. and a document length; the touch screen display has a display width; and displaying at least a portion of the structured electronic document comprises scaling the document width to fit within the display width independent of the document length. 7. The method of claim 6, wherein: the touch screen display is rectangularwith a short axis and a long axis; the display width corresponds to the short axis when the structured electronic document is seen in portrait view; and the display width corresponds to the long axis when the structured electronic document is seen in landscape view. What is claimed is: 30 1. A computer-implementedmethod, comprising: at a portable electronic device with a touch screen display; displaying at least a portion of a web page on the touch 8. The method ofclaim 2, whereinthe plurality ofboxes are screen display, wherein the web page comprises a pludefined by a style sheet language. rality of boxes of content; 9. The method of claim 8, wherein the style sheet language detecting a first finger tap gesture at a location on the 35 is a cascading style sheet language. displayed portion of the web page; 10. The method of claim 2, wherein the first gesture is a determining a first box in the plurality of boxes at the finger gesture. location of the first finger tap gesture; and 11. The method of claim 2, wherein the first gesture is a enlarging and translating the web page so as to substan40 stylus gesture. tially center the first box on the touch screen display, 12. The method ofclaim 2, wherein the first gesture is a tap wherein enlarging comprises expanding the first box so that the width ofthe first box is substantially the same as gesture. the width of the touch screen display; 13. The method of claim 12, wherein the first gesture is a resizing text in the enlarged first box to meet or exceed a 45 double tap with a single finger, a double tap with two fingers, predetermined 21- text size on the touch screen a single tap with a single finger, or a single tap with two display; fingers. while the first box is enlarged, detecting a second finger tap 14. The method of claim 2, wherein: gesture on a second box other than the first box; and the structured electronic document has an associated renin response to detecting the second finger tap gesture, 50 der tree with a plurality of nodes; and determining the translating the web page so as to substantially center the first box at the location of the first gesture comprises: second box on the touch screen display. traversing down the render tree to determine a first node in the plurality of nodes that corresponds to the detected 2. 
A computer-implemented method, comprising: location of the first gesture; at a portable electronic device with a touch screen display; traversing up the render tree from the first node to a displayingat least a portionofa structured electronic docu- ss closest parent node that contains a logical grouping of ment on the touch screen display, whereinthe structured content; and electronic document comprises a plurality of boxes of identifying content corresponding to the closest parent content; node as the first box. detecting a first gesture at a location on the displayed portion of the structured electronic document; 60 15. The method of claim 14, wherein the logical grouping of content comprises a paragraph, an image, a plugin object, determining a first box in the plurality of boxes at the or a table. location of the first gesture; 16. The method of claim 14, wherein the closest parent enlarging and translating the structured electronic docunode is a replaced inline, a block, an inline block, or an inline ment so that the first box is substantially centered on the 65 table. touch screen display; 17. The method of claim 2, wherein enlarging and transwhile the first box is enlarged, a second gesture is detected on a second box other than the first box; and lating the structured electronic document comprises display- Conv provided bv USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027916 US 7,864,163 B2 27 28 ing at least a portion of the second box of the plurality of boxes of content on the touch screen display. 30. The method of claim 27, wherein the third gesture is a finger gesture. 18. The method of claim 2, wherein enlarging comprises 31. The method of claim 27, wherein the third gesture is a expanding the first box so that the width of the first box is stylus gesture. substantially the same as the width of the touch screen dis- 5 32. The method of claim 27, wherein the third gesture is a play. tap gesture. 19. The method of claim 2, including resizing text in the 33. The method of claim 32, wherein the third gesture is a enlarged first box to meet or exceed a predetermined minidouble tap witha single finger, a double tap with two fingers, mum text size on the touch screen display. a single tap with a single finger, or a single tap with two 20. The method of claim 19, wherein the text resizing 10 fmgers. comprises: 34. The method of claim 2, whereinthe second gesture and determining a scale factor by which the first box will be the first gesture are the same type of gesture. enlarged; 35. The method of claim 2, whereinthe second gesture is a dividing the predetermined ...:..:...-text size on the touch fmger gesture. screen display by the scaling factor to determine a mini- 15 36. The method of claim 2, whereinthe second gesture is a mum text size for text in the first box; and stylus gesture. if a text size for text in the first box is less than the deter37. The method ofclaim 2, wherein the second gesture is a mined text size, mcreasmg the text size for tap gesture. text in the first box to at least the determined minimum 38. The method of claim 37, whereinthe second gesture is text size. 20 a double tap with a single finger, a double tap with two fingers, 21. The method of claim 20, wherein: the first box has a a single tap with a single finger, or a single tap with two width; the display has a display width; and the scale factor is fingers. the display width divided by the width ofthe first box prior to enlarging. 22. 
The method of claim 19, wherein the resizing occurs 25 during the enlarging. 23. The method of claim 19, wherein the resizmg occurs after the enlarging. 39. The method of claim 2, including: detecting a swipe gesture on the touch screen display; and in response to detecting the swipe gesture, translating the displayed portion ofthe structured electronic document on the touch screen display. 40. The method of claim 39, whereintranslating comprises 24. The method of claim 2, including resizing text in the 30 vertical, horizontal, or diagonal movement of the structured structured electronic document to meet or exceed a predetermined minimum text size on the touch screen display. 25. The method of claim 24, wherein the text resizing comprises: electronic document on the touch screen display. 41. The method of claim 39, wherein the swipe gesture is a finger gesture. 42. The method of claim 39, wherein the swipe gesture is a determining a scale factor by which the first box will be 35 stylus gesture. enlarged; 43. The method of claim 2, including: dividing the predeterminedminimumtext size on the touch detecting a fifth gesture on the touch screen display, screen display by the scaling factor to determine a miniin response to detecting the fifth gesture, rotating the dismum text size for text in the structured electronic docuplayed portion ofthe structured electronic document on ment; and 40 the touch screen display by 90° . if a text size for text in the structured electronic document 44. The method of claim 43, wherein the fifth gesture is a is less than the determined =2- text size, L -a finger gesture. ing the text size for text in the structured electronic 45. The method of claim 44, wherein the fifth gesture is a document to at least the determined ...:..:...- text size. multifinger gesture. 26. The method of claim 24, wherein the text resizing 45 46. The method of claim 45, wherein the fifth gesture is a comprises: twisting gesture. identifying boxes containing text in the plurality ofboxes; 47. The method of claim 2, including: determining a scale factor by which the first box will be enlarged; dividing the predeterminedminimumtext size on the touch 50 screen display by the scaling factor to determine a minimum text size for text in the structured electronic document; and for each identified box containing text, ifa text size for text in the identified box is less than the determined mini- 55 mum text size, increasing the text size for text in the identified box to at least the determined =&- text size and adjusting the size of the identified box. 27. The method of claim 2, including: detecting a third gesture on the enlarged second box; and 60 , in response to detecting the third gesture, reducing in size the displayed portion of the structured electronic document. detecting a change in orientation of the device, in response to detecting the change in orientation of the device, rotating the displayed portion of the structured electronic document on the touch screen display by 90° 48. The method of claim 2, including: detecting a multi-finger de-pinch gesture on the touch screen display, in response to detecting the multi-finger de-pinch gesture, enlarging a portion ofthe displayed portion ofthe structured electronic documenton the touch screen display in accordance with a position of the multi-finger de-pinch gesture and an amount of finger movement in the multifinger de-pinch gesture. 49. 
A graphical user interface on a portable electronic device with a touch screen display, comprising: 28. The method ofclaim 27, whereinthe first box returns to 65 at least a portion of a structured electronic document, its size prior to being enlarged. whereinthe structured electronic document comprises a 29. The method of claim 27, wherein the third gesture and plurality of boxes of content; the first gesture are the same type of gesture. Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027917 US 7,864,163 B2 29 wherein: in response to detecting a first gesture at a location on the portion of the structured electronic document: a first box in the plurality of boxes at the location of the first gesture is determined; the structured electronic document is enlargedandtranslated so that the first box is substantially centered on the touch screen display; while the first box is enlarged, a .._1gesture is detected on a second box other than the first box; and in response to detecting the second gesture, the structured electronic document is translated so that the second box is substantially centered on the touch screen display. 50. A portable electronic device, comprising: a touch screen display; one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including: instructions for displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content; instructions for detecting a first gesture at a location on the displayed portion of the structured electronic document; instructions for determining a first box in the plurality of boxes at the location of the first gesture; instructions for enlarging and translating the structured electronic document so that the first box is substantially centered on the touch screen display; instruction for, while the first box is enlarged, a second gesture is detected on a secondbox other than the first box; and instructions for, in response to detecting the second gesture, the structured electronic document is translated so that the second box is substantially centered on the touch screen display. 51. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device with a touch screen display, cause the device to: 30 5 means for detecting a first gesture at a location on the displayedportion ofthe structured electronic document; means for determining a first box in the plurality ofboxes at the location of the first gesture; means for enlarging and translating the structured electronic document so that the first box is substantially centered on the touch screen display; 10 means for, while the first box is enlarged, a second gesture is detected on a second box other thanthe first box; and means for, in response to detecting the second gesture, the structured electronic document is translated so that the second box is substantially centered on the touch screen display. 53. 
A portable electronic device, comprising: 15 a touch screen display; one or more processors; memory; and a program, wherein the program is stored in the memory and configured to be executed by the one or more processors, the 20 program including instructions for: displaying a user interface on the touch screen display, wherein the user interface includes: a displayed window of an application, the displayed windowbeing in full view on the touch screen display, 25 and one or more partially hidden windows ofthe application; while displaying the displayedwindow andthe one ormore 30 partially hidden windows, detecting a gesture on the touch screen display; in response to detecting the gesture, moving the displayed window partially or fully off the touch screen display, and moving a first partially hidden window into full view on 35 the touch screen display; and in response to detecting a gesture on an icon, a window of the application in the center of the touch screen display is enlarged. 54. The portable electronic device ofclaim 53, whereinthe detected gesture is a swipe gesture. 40 55. The portable electronic device of claim 53, including: in response to detecting the gesture, moving a second par- tially hidden window off the touch screen display. 56. The portable electronic device ofclaim 53, whereinthe 45 display at least a portion of a structured electronic document on the touch screen display, wherein the struc- tured electronic document comprises a plurality of boxes of content; detect a first gesture at a location on the displayed por- 50 tion of the structured electronic document; det.. . a first box in the plurality of boxes at the location of the first gesture; enlarge and translate the structured electronic document detected gesture is a left-to-right swipe gesture, including: in response to detecting the left-to-right swipe gesture: moving the displayed window partially off-screen to the right, moving the first partially hidden window into full view on the touch screen display, and moving a second partially hidden window completely off-screen. 57. The portable electronic device ofclaim 53, whereinthe user interface is displayedin response to activation ofan icon for initiating display of the user interface, and wherein the so that the first box is substantially centered on the 55 icon for initiating display of the user interface indicates the number of windows in the application. touch screen display; 58. The portable electronic device ofclaim 53, whereinthe while the first box is enlarged, detect a second gesture on a displayed window and the one or more partially hidden winsecond box other than the first box; and dows are displayed prior to detecting the gesture. in response to detecting the second gesture, translate the 59. A computer-implementedmethod, comprising: structured electronic document so that the second box is 60 at a portable electronic device with a touch screen display: substantially centered on the touch screen display. displaying a user interface on the touch screen display, 52. 
A portable electronic device with a touch screen diswherein the user interface includes: play, comprising: means for displaying at least a portion of a structured electronic document on the touch screen display, 65 whereinthe structured electronic document comprises'a plurality of boxes of content; a displayed window of an application, the displayed window being in full view on thetouch screen display, and one or more partially hiddenwindows ofthe application; Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027918 US 7,864,163 B2 31 32 while displayingthe displayedwindow andthe one ormore partially hidden windows, detecting a gesture on the in response to detecting a gesture on an icon, a window of the application in the center of the touch screen display touch screen display; is enlarged. in response to detecting the gesture, 61. A non-transitory computer readable storage medium moving the displayed window partially or fully off the 5 storing one or more programs, the one or more programs comprising instructions, which when executed by a portable touch screen display, and electronic device with atouch screen display, causethe device moving a first partially hidden window into full view on to: the touch screen display; and, display a user interface on the touch screen display, in response to detecting a gesture on an icon, a window of wherein the user interface includes: the application in the center of the touch screen display 10 is enlarged. 60. A graphical user interface on a portable electronic device with a touch screen display, the graphical user interface comprising: a displayed window of an application, the displayed win- 15 dow being in full view on the touch screen display, and one or more partially hidden windows of the application; wherein: while displaying the displayed window and the one or more partially hidden windows, a gesture is detected 20 on the touch screen display; in response to detecting the gesture: the displayed window is moved partially or fully off the touch screen display, and 25 a first partially hidden window is moved into full view on the touch screen display; and, a displayed window of an application, the displayed window being in full view on the touch screen display, and one or morepartially hiddenwindows ofthe application; while displaying the displayedwindow andthe one ormore partially hidden windows, detect a gesture on the touch screen display; in response to detecting the gesture, move the displayed window partially or fully off the touch screen display, and move a first partially hidden window into full view on the touch screen display; and in response to detecting a gesture on an icon, enlarge a window of the application in the center of the touch screen display. ***** Copy provided by USPTO from the PIRS Image Database on 06/20/2011 APLNDC00027919
