Motorola Mobility, Inc. v. Apple, Inc.

Filing 94

NOTICE by Motorola Mobility, Inc. of Filing Brief on Claim Construction (Attachments: # 1 Exhibit, # 2 Exhibit, # 3 Exhibit, # 4 Exhibit, # 5 Exhibit, # 6 Exhibit, # 7 Exhibit, # 8 Exhibit, # 9 Exhibit, # 10 Exhibit, # 11 Exhibit, # 12 Exhibit, # 13 Exhibit, # 14 Exhibit, # 15 Exhibit, # 16 Exhibit, # 17 Exhibit, # 18 Exhibit, # 19 Exhibit, # 20 Exhibit, # 21 Exhibit, # 22 Exhibit, # 23 Exhibit, # 24 Exhibit, # 25 Exhibit, # 26 Exhibit, # 27 Exhibit, # 28 Exhibit, # 29 Exhibit, # 30 Exhibit, # 31 Affidavit)(Giuliano, Douglas)

Exhibit 1 to Motorola's Opening Claim Construction Brief
July 28, 2011

(12) United States Patent: Chaudhri et al.
(10) Patent No.: US 7,657,849 B2
(45) Date of Patent: Feb. 2, 2010

(54) UNLOCKING A DEVICE BY PERFORMING GESTURES ON AN UNLOCK IMAGE

(75) Inventors: Imran Chaudhri, San Francisco, CA (US); Bas Ording, San Francisco, CA (US); Freddy Allen Anzures, San Francisco, CA (US); Marcel Van Os, San Francisco, CA (US); Stephen O. Lemay, San Francisco, CA (US); Scott Forstall, Mountain View, CA (US); Greg Christie, San Jose, CA (US)

(73) Assignee: Apple Inc., Cupertino, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 394 days.

(21) Appl. No.: 11/322,549
(22) Filed: Dec. 23, 2005
(65) Prior Publication Data: US 2007/0150842 A1, Jun. 28, 2007

(51) Int. Cl.: G06F 3/033 (2006.01)
(52) U.S. Cl.: 715/863; 345/173; 345/179
(58) Field of Classification Search: 713/154, 156, 182; 715/853, 863; 345/173, 179, 156. See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
5,465,084 A 11/1995 Cottrell 340/825.31
5,559,961 A 9/1996 Blonder 395/188.01
5,677,710 A 10/1997 Thompson-Rohrlich 345/173
5,821,933 A * 10/1998 Keller et al. 715/741
5,907,327 A 5/1999 Ogura et al. 345/339
6,151,208 A 11/2000 Bartlett 361/683
6,160,555 A 12/2000 Kang et al. 345/358
6,192,478 B1 2/2001 Elledge 713/202
6,249,606 B1 6/2001 Kiraly et al. 702/195
6,323,846 B1 11/2001 Westerman et al. 345/173
6,347,290 B1 2/2002 Bartlett 702/150
(Continued)

FOREIGN PATENT DOCUMENTS
EP 1 284 450 A2 2/2003
(Continued)

OTHER PUBLICATIONS
GridLock 1.32, Oct. 8, 2003, pp. 1-2, http://gridlocksen.softonic.com/palm.*
(Continued)

Primary Examiner: Dennis-Doon Chow
Assistant Examiner: Andres E Gutierrez
(74) Attorney, Agent, or Firm: Morgan, Lewis & Bockius LLP

(57) ABSTRACT
A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture.

[Front-page representative drawing: Device 1000]

23 Claims, 15 Drawing Sheets
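The abstract above describes the claimed behavior only in functional terms: the device stays locked unless a detected contact matches a predefined gesture performed with respect to an unlock image, for example dragging the image to a predefined location. Purely as an illustration of that decision logic, and not as Apple's actual implementation, the following Python sketch checks a recorded drag against a hypothetical unlock image; the class name, the 20-point touch tolerance, and the target region are all invented for the example.

from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in touch-screen coordinates

@dataclass
class UnlockImage:
    """Hypothetical unlock image with a start position and a target region."""
    start: Point
    target_min_x: float  # left edge of the predefined target region

def ends_in_target(trace: List[Point], image: UnlockImage) -> bool:
    """True if the drag began on the unlock image and ended inside the
    predefined target region (e.g., the right end of a channel)."""
    if not trace:
        return False
    began_on_image = (abs(trace[0][0] - image.start[0]) < 20 and
                      abs(trace[0][1] - image.start[1]) < 20)
    ended_in_target = trace[-1][0] >= image.target_min_x
    return began_on_image and ended_in_target

def handle_contact(trace: List[Point], image: UnlockImage, locked: bool) -> bool:
    """Return the new lock state: unlock only if the gesture succeeds."""
    if locked and ends_in_target(trace, image):
        return False  # transition to the user-interface unlock state
    return locked     # otherwise remain in the user-interface lock state

if __name__ == "__main__":
    img = UnlockImage(start=(10.0, 300.0), target_min_x=290.0)
    drag = [(10.0, 300.0), (150.0, 302.0), (295.0, 301.0)]
    print(handle_contact(drag, img, locked=True))  # False -> device unlocked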
U.S. PATENT DOCUMENTS (Continued)
6,421,453 B1 7/2002 Kanevsky et al. 382/115
6,570,557 B1 5/2003 Westerman et al. 345/173
6,573,883 B1 6/2003 Bartlett 345/156
6,633,310 B1 10/2003 Andrew et al. 345/728
6,677,932 B1 1/2004 Westerman 345/173
6,720,860 B1 4/2004 Narayanaswami 340/5.54
6,735,695 B1 5/2004 Gopalakrishnan et al. 713/186
7,124,433 B2 10/2006 Little 726/2
7,151,843 B2 12/2006 Rui et al. 382/103
7,174,462 B2 2/2007 Pering et al. 713/182
7,263,670 B2 8/2007 Rekimoto 715/837
2001/0011308 A1 8/2001 Clark et al. 710/20
2001/0012022 A1 8/2001 Smith 345/768
2002/0015024 A1 2/2002 Westerman et al. 345/173
2002/0196274 A1 12/2002 Comfort et al. 345/741
2003/0142138 A1 7/2003 Brown et al. 345/797
2004/0030934 A1 2/2004 Mizoguchi et al. 713/202
2004/0034801 A1 * 2/2004 Jaeger 713/202
2004/0085351 A1 5/2004 Tokkonen 345/741
2004/0088568 A1 5/2004 Tokkonen 713/200
2004/0230843 A1 11/2004 Jansen 713/202
2004/0250138 A1 12/2004 Schneider 713/202
2004/0260955 A1 12/2004 Mantyla 713/202
2004/0268267 A1 12/2004 Moravcsik 715/821
2005/0050477 A1 3/2005 Robertson et al. 715/853
2005/0060554 A1 3/2005 O'Donoghue 713/183
2005/0079896 A1 4/2005 Kokko et al. 455/566
2005/0212760 A1 * 9/2005 Marvit et al. 345/156
2005/0216862 A1 9/2005 Shinohara et al. 715/825
2005/0248542 A1 11/2005 Sawanobori 345/173
2005/0253817 A1 * 11/2005 Rytivaara et al. 345/173
2005/0264833 A1 12/2005 Hiraoka et al. 358/1.9
2006/0174339 A1 8/2006 Tao 726/18
2006/0267955 A1 11/2006 Hino 345/173
2008/0034292 A1 2/2008 Brunner et al. 715/700
2008/0072172 A1 3/2008 Shinohara et al. 715/772

FOREIGN PATENT DOCUMENTS (Continued)
GB 2 313 460 A 11/1997
JP 60 171560 9/1985
JP 2 249062 10/1990
JP 6 214954 8/1994
WO WO 02/33882 A1 4/2002
WO WO 04/001560 A1 12/2003
WO WO 2004/021108 A2 3/2004

OTHER PUBLICATIONS (Continued)
Horry, Y. et al., "A Passive-Style Buttonless Mobile Terminal," IEEE Transactions on Consumer Electronics, vol. 49, No. 3, Aug. 2003, pp. 530-535.*
PCT International Search Report for International Application No. PCT/US2006/061370, mailed May 25, 2007.
IBM Corp., "Access/Control Icons (Icon Keys)," IBM Technical Disclosure Bulletin, vol. 38, No. 4, Apr. 1995, pp. 407-409.
"Touch Password Protection," JGUI Professional, http://www.jgui.net/touch/index.html, Dec. 8, 2003.
Office Action dated Oct. 31, 2007 for related U.S. Appl. No. 11/322,550, filed Dec. 23, 2005, pp. 1-26.
Office Action dated Apr. 21, 2008 for related U.S. Appl. No. 11/322,550, filed Dec. 23, 2005, pp. 1-22.
International Search Report for International Application No. PCT/US2006/061380, mailed Apr. 23, 2007.
Notice of Allowance dated Sep. 19, 2008, for related U.S. Appl. No. 11/322,550.
Bardram, J., "The Trouble with Login: on usability and computer security in Ubiquitous Computing," Centre for Pervasive Healthcare, Department of Computer Science, University of Aarhus, published online Jul. 23, 2005.
Fitzpatrick, G. et al., "Method for Access Control via Gestural Verification," IBM Technical Disclosure Bulletin, vol. 36, No. 09B, Sep. 1993, 2 pages.
IBM, "Touch Pad Authentication," Sep. 21, 2004, 2 pages.
Jansen, W., "Authenticating Users on Handheld Devices," The National Institute of Standards and Technology, Gaithersburg, Maryland, 2003, 13 pages.
Jermyn, I. et al., "The Design and Analysis of Graphical Passwords," Submission to the 8th USENIX Security Symposium, Aug. 1999, 15 pages.
Monrose, N., "Towards Stronger User Authentication," Ph.D. dissertation, New York University, 1999, vol. 60/05-B of Dissertation Abstracts International, Order No. AAD99-30229, 115 pages.
Najjar, L., "Graphical Passwords," International Technology Disclosures, vol. 10, No. 1, Jan. 25, 1992, 1 page.
Renaud, K. et al., "My Password is Here! An Investigation into Visuo-Spatial Authentication Mechanisms," 2004, 25 pages, www.sciencedirect.com.
Wiedenbeck, S. et al., "PassPoints: Design and Longitudinal Evaluation of a Graphical Password System," International Journal of Human-Computer Studies, vol. 63, Issues 1-2, Jul. 2005, pp. 102-127.
Office Action dated Feb. 4, 2009, received in the German patent application which corresponds to U.S. Appl. No. 11/322,549.
Office Action dated Mar. 25, 2009, received in the European patent application which corresponds to U.S. Appl. No. 11/322,549.
* cited by examiner

[Drawing sheets 1-8 of 15 (drawing images not reproduced in this extraction; recoverable captions and labels follow):
Sheet 1, Figure 1: block diagram of portable electronic device 100: memory 102 (operating system 132, communication module 134, contact/motion module 138, graphics module 140 with optical intensity module 142, user interface state module 144 with lock module 150 and unlock module 152, applications 146), memory controller 104, CPU 106, peripherals interface 108, RF circuitry 112, external port 148, audio circuitry 114, speaker 116, microphone 118, I/O subsystem 120 (touch-screen controller 122, other input controllers 124), touch screen 126, other input/control devices 128, power system.
Sheet 2, Figure 2: flow diagram 200: device set to the user interface lock state (202); display visual cue(s) of the unlock action (204); user contacts the touch-sensitive display (206); detect contact (208); if the contact corresponds to the unlock gesture (210), transition to the user interface unlock state (214), otherwise maintain the lock state (212).
Sheet 3, Figure 3: flow diagram 300: as in Figure 2, but an unlock image and visual cues of the unlock action using the image are displayed (304), and the decision (310) is whether the contact corresponds to the unlock gesture using the image.
Sheet 4, Figures 4A-4B: device 400 with touch screen 408 and menu button 410 in the user-interface lock state.
Sheets 5-6, Figures 5A-5D: device 400 at various points of the unlock gesture: unlock image 402, channel 404, movement 504, and menu 506 after unlocking.
Sheet 7, Figure 6: flow diagram 600: while in a first user-interface state, detect progress towards satisfaction of a user input condition needed to transition to a second user-interface state (602); indicate progress by transitioning an optical intensity of one or more user interface objects associated with the second state (604); transition to the second state if the condition is satisfied (606).
Sheet 8, Figures 7A-7B: device 700 with touch screen 714 showing "Incoming call from: John Doe mobile," Accept and Decline buttons 708, unlock image 702, channel 704, and movement 712.]
[Drawing sheets 9-15 of 15 (drawing images not reproduced; recoverable captions and labels follow):
Sheet 9, Figures 7C-7D: device 700 during the incoming call from "John Doe mobile," completing the unlock movement 712.
Sheet 10, Figures 8A-8C: graphs of optical intensity as a function of completion of the user input condition.
Sheet 11, Figure 9: flow diagram 900: display two (or more) unlock images and visual cues of the manner of transitioning to an active state using the images (904); detect contact (908); if the contact corresponds to the unlock gesture using an image (910), transition to the active state corresponding to the image used (914), e.g. first image -> first active state, second image -> second active state; otherwise maintain the lock state (912).
Sheet 12, Figure 10: device 1000 in a user-interface lock state displaying a plurality of unlock images.
Sheets 13-15, Figures 11A-11F: device 1000 at various points of the unlock gesture, ending at a music player screen ("Song A," "Artist X," 2:05/4:21) on touch screen 1014, with label 1108.]

US 7,657,849 B2

UNLOCKING A DEVICE BY PERFORMING GESTURES ON AN UNLOCK IMAGE

SUMMARY

In some embodiments, a method of controlling an electronic device with a touch-sensitive display includes: detecting contact with the touch-sensitive display while the device is in a user-interface lock state; moving an image corresponding to a user-interface unlock state of the device in accordance with the contact; transitioning the device to the user-interface unlock state if the detected contact corresponds to a predefined gesture; and maintaining the device in the user-interface lock state if the detected contact does not correspond to the predefined gesture.

In some embodiments, a method of controlling a device with a touch-sensitive display includes: displaying an image on the touch-sensitive display while the device is in a user-interface lock state; detecting contact with the touch-sensitive display; transitioning the device to a user-interface unlock state if the detected contact corresponds to moving the image to a predefined location on the touch-sensitive display; and maintaining the device in the user-interface lock state if the detected contact does not correspond to moving the image to the predefined location.

In some embodiments, a method of controlling a device with a touch-sensitive display includes: displaying an image on the touch-sensitive display while the device is in a user-interface lock state; detecting contact with the touch-sensitive display; transitioning the device to a user-interface unlock state if the detected contact corresponds to moving the image on the touch-sensitive display according to a predefined path on the touch-sensitive display; and maintaining the device in the user-interface lock state if the detected contact does not correspond to moving the image according to the predefined path.
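The two preceding paragraphs distinguish two criteria for an image-based unlock: the image may have to reach a predefined location, or it may have to be moved along a predefined path. As a rough sketch only (the helper names, tolerances, and point-sampled path are assumptions, not anything the patent specifies), the two checks might differ as follows: a location check looks only at where the drag ends, while a path check constrains every sampled point of the drag.

from math import hypot
from typing import List, Tuple

Point = Tuple[float, float]

def reached_location(trace: List[Point], target: Point, radius: float = 25.0) -> bool:
    """Predefined-location criterion: only the final position matters."""
    if not trace:
        return False
    end_x, end_y = trace[-1]
    return hypot(end_x - target[0], end_y - target[1]) <= radius

def followed_path(trace: List[Point], path: List[Point], tolerance: float = 25.0) -> bool:
    """Predefined-path criterion: every sampled point must stay near the path.
    (Simplified: distance is measured to sampled path points, not segments.)"""
    if not trace or not path:
        return False
    def dist_to_path(p: Point) -> float:
        return min(hypot(p[0] - q[0], p[1] - q[1]) for q in path)
    return all(dist_to_path(p) <= tolerance for p in trace)

if __name__ == "__main__":
    drag = [(0.0, 100.0), (150.0, 102.0), (300.0, 99.0)]
    print(reached_location(drag, target=(300.0, 100.0)))              # True
    print(followed_path(drag, path=[(0.0, 100.0), (300.0, 100.0)]))   # True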
In some embodiments, a method of controlling a device with a touch-sensitive display includes: displaying first and second images on the touch-sensitive display while the device is in a user-interface lock state; detecting contact with the touch-sensitive display; transitioning the device to a first active state corresponding to the first image if the detected contact corresponds to a predefined gesture with respect to the first image; and transitioning the device to a second active state distinct from the first active state if the detected contact corresponds to a predefined gesture with respect to the second image.

The aforementioned methods may be performed by a portable electronic device having a touch-sensitive display with a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing these methods. In some embodiments, the portable electronic device provides a plurality of functions, including wireless communication.

Instructions for performing the aforementioned methods may be included in a computer program product configured for execution by one or more processors. In some embodiments, the executable computer program product includes a computer readable storage medium (e.g., one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state memory devices) and an executable computer program mechanism embedded therein.

RELATED APPLICATIONS

This application is related to U.S. patent application Ser. No. 11/322,550, titled "Indication of Progress Towards Satisfaction of a User Input Condition," filed Dec. 23, 2005, which application is incorporated by reference herein in its entirety.

TECHNICAL FIELD

The disclosed embodiments relate generally to user interfaces that employ touch-sensitive displays, and more particularly, to the unlocking of user interfaces on portable electronic devices.

BACKGROUND

Touch-sensitive displays (also known as "touch screens" or "touchscreens") are well known in the art. Touch screens are used in many electronic devices to display graphics and text, and to provide a user interface through which a user may interact with the devices. A touch screen detects and responds to contact on the touch screen. A device may display one or more soft keys, menus, and other user-interface objects on the touch screen. A user may interact with the device by contacting the touch screen at locations corresponding to the user-interface objects with which she wishes to interact.

Touch screens are becoming more popular for use as displays and as user input devices on portable devices, such as mobile telephones and personal digital assistants (PDAs). One problem associated with using touch screens on portable devices is the unintentional activation or deactivation of functions due to unintentional contact with the touch screen. Thus, portable devices, touch screens on such devices, and/or applications running on such devices may be locked upon satisfaction of predefined lock conditions, such as upon entering an active call, after a predetermined time of idleness has elapsed, or upon manual locking by a user.

Devices with touch screens and/or applications running on such devices may be unlocked by any of several well-known unlocking procedures, such as pressing a predefined set of buttons (simultaneously or sequentially) or entering a code or password. These unlock procedures, however, have drawbacks.
The button combinations may be hard to perform. Creating, memorizing, and recalling passwords, codes, and the like can be quite burdensome. These drawbacks may reduce the ease of use of the unlocking process and, as a consequence, the ease of use of the device in general.

Accordingly, there is a need for more efficient, user-friendly procedures for unlocking such devices, touch screens, and/or applications. More generally, there is a need for more efficient, user-friendly procedures for transitioning such devices, touch screens, and/or applications between user interface states (e.g., from a user interface state for a first application to a user interface state for a second application, between user interface states in the same application, or between locked and unlocked states). In addition, there is a need for sensory feedback to the user regarding progress towards satisfaction of a user input condition that is required for the transition to occur.

BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.

FIG. 1 is a block diagram illustrating a portable electronic device, according to some embodiments of the invention.

...devices, or other non-volatile solid state memory devices. In some embodiments, the memory 102 may further include storage remotely located from the one or more processors 106, for instance network attached storage accessed via the RF circuitry 112 or external port 148 and a communications network (not shown) such as the Internet, intranet(s), Local Area Networks (LANs), Wide Local Area Networks (WLANs), Storage Area Networks (SANs) and the like, or any suitable combination thereof. Access to the memory 102 by other components of the device 100, such as the CPU 106 and the peripherals interface 108, may be controlled by the memory controller 104.

The peripherals interface 108 couples the input and output peripherals of the device to the CPU 106 and the memory 102. The one or more processors 106 run various software programs and/or sets of instructions stored in the memory 102 to perform various functions for the device 100 and to process data. In some embodiments, the peripherals interface 108, the CPU 106, and the memory controller 104 may be implemented on a single chip, such as a chip 111. In some other embodiments, they may be implemented on separate chips.

The RF (radio frequency) circuitry 112 receives and sends electromagnetic waves. The RF circuitry 112 converts electrical signals to/from electromagnetic waves and communicates with communications networks and other communications devices via the electromagnetic waves. The RF circuitry 112 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth.

DESCRIPTION OF EMBODIMENTS
The RF circuitry 112 may communicate with the networks, Reference will now be made in detail to embodiments, such as the Internet, also referred to as the World Wide Web examples of which are illustrated in the accompanying draw- 35 (WWW), an Intranet and/or a wireless network, such as a ings. In the following detailed description, numerous specific cellular telephone network, a wireless local area network details are set forth in order to provide a thorough understand(LAN) and/or a metropolitan area network (MAN), and other ing of the present invention. However, it will be apparent to devices by wireless communication. The wireless communione of ordinary skill in the art that the present invention may cation may use any of a plurality of communications stanbe practiced without these specific details. In other instances, 40 dards, protocols and technologies, including but not limited well-known methods, procedures, components, and circuits to Global System for Mobile Communications (GSM), have not been described in detail so as not to unnecessarily Enhanced Data GSM Environment (EDGE), wideband code obscure aspects of the embodiments. division multiple access (W-CDMA), code division multiple FIG. 1 illustrates a portable electronic device, according to access (CDMA), time division multiple access (TDMA), some embodiments of the invention. The device 100 includes 45 Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, a memory 102, a memory controller 104, one or more proIEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice cessing units (CPU's) 106, a peripherals interface 108, RF over Internet Protocol (VoIP), Wi-MAX, a protocol for email, circuitry 112, audio circuitry 114, a speaker 116, a microinstant messaging, and/or Short Message Service (SMS)), or phone 118, an input/output (I/O) subsystem 120, a touch any other suitable communication protocol, including comscreen 126, other input or control devices 128, and an external so munication protocols not yet developed as of the filing date of port 148. These components communicate over the one or this document. more communication buses or signal lines 110. The device The audio circuitry 114, the speaker 116, and the micro100 can be any portable electronic device, including but not phone 118 provide an audio interface between a user and the limited to a handheld computer, a tablet computer, a mobile device 100. The audio circuitry 114 receives audio data from phone, a media player, a personal digital assistant (PDA), or 55 the peripherals interface 108, converts the audio data to an the like, including a combination of two or more of these electrical signal, and transmits the electrical signal to the items. It should be appreciated that the device 100 is only one speaker 116. The speaker converts the electrical signal to example of a portable electronic device 100, and that the human-audible sound waves. The audio circuitry 114 also device 100 may have more or fewer components than shown, receives electrical signals converted by the microphone 116 or a different configuration of components. The various com- 60 from sound waves. The audio circuitry 114 converts the elecponents shown in FIG. 1 may be implemented in hardware, trical signal to audio data and transmits the audio data to the software or a combination of both hardware and software, peripherals interface 108 for processing. 
Audio data may be including one or more signal processing and/or application may be retrieved from and/or transmitted to the memory 102 specific integrated circuits. and/or the RF circuitry 112 by the peripherals interface 108. The memory 102 may include high speed random access 65 In some embodiments, the audio circuitry 114 also includes a memory and may also include non-volatile memory, such as headset jack (not shown). The headset jack provides an interone or more magnetic disk storage devices, flash memory face between the audio circuitry 114 and removable audio FIG. 2 is a flow diagram illustrating a process for transitioning a device to a user-interface unlock state, according to some embodiments of the invention. FIG. 3 is a flow diagram illustrating a process for transitioning a device to a user-interface unlock state, according to 5 some embodiments of the invention. FIGS. 4A-4B illustrate the GUI display of a device in a user-interface lock state, according to some embodiments of the invention. FIGS. 5A-5D illustrate the GUI display of a device at 10 various points of the performance of an unlock action gesture, according to some embodiments of the invention. FIG. 6 is a flow diagram illustrating a process for indicating progress towards satisfaction of a user input condition according to some embodiments of the invention. 15 FIGS. 7A-7D illustrate the GUI display of a device that is transitioning the optical intensity of user-interface objects, according to some embodiments of the invention. FIGS. 8A-8C are graphs illustrating optical intensity as a function of the completion of the user input condition, 20 according to some embodiments of the invention. FIG. 9 is a flow diagram illustrating a process for transitioning a device to a user interface active state, according to some embodiments of the invention. FIG. 10 illustrates the GUI of a device in a user-interface 25 lock state that displays a plurality of unlock images, according to some embodiments of the invention. FIGS. 11A-11F illustrate the GUI display of a device at various points in the performance of an unlock action gesture, according to some embodiments of the invention. 30 EXHIBIT 1 PAGE 19 5 US 7,657,849 B2 6 input/output peripherals, such as output-only headphones or a system, a power failure detection circuit, a power converter or headset with both output (headphone for one or both ears) and inverter, a power status indicator (e.g., a light-emitting diode input (microphone). (LED)) and any other components associated with the gen The I/O subsystem 120 provides the interface between eration, management and distribution of power in portable input/output peripherals on the device 100, such as the touch 5 devices. screen 126 and other input/control devices 128, and the In some embodiments, the software components include peripherals interface 108. The I/O subsystem 120 includes a an operating system 132, a communication module (or set of touch-screen controller 122 and one or more input controllers instructions) 134, a contact/motion module (or set of instruc 124 for other input or control devices. The one or more input tions) 138, a graphics module (or set of instructions) 140, a controllers 124 receive/send electrical signals from/to other 10 user interface state module (or set of instructions) 144, and input or control devices 128. The other input/control devices one or more applications (or set of instructions) 146. 
128 may include physical buttons (e.g., push buttons, rocker The operating system 132 (e.g., Darwin, RTXC, LINUX, buttons, etc.), dials, slider switches, sticks, and so forth. UNIX, OS X, WINDOWS, or an embedded operating system The touch screen 126 provides both an output interface and such as VxWorks) includes various software components an input interface between the device and a user. The touch- 15 and/or drivers for controlling and managing general system screen controller 122 receives/sends electrical signals from/ tasks (e.g., memory management, storage device control, to the touch screen 126. The touch screen 126 displays visual power management, etc.) and facilitates communication output to the user. The visual output may include text, graphbetween various hardware and software components. ics, video, and any combination thereof. Some or all of the The communication module 134 facilitates communicavisual output may correspond to user-interface objects, fur- 2o tion with other devices over one or more external ports 148 ther details of which are described below. and also includes various software components for handling The touch screen 126 also accepts input from the user data received by the RF circuitry 112 and/or the external port based on haptic and/or tactile contact. The touch screen 126 148. The external port 148 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other forms a touch-sensitive surface that accepts user input. The touch screen 126 and the touch screen controller 122 (along 25 devices or indirectly over a network (e.g., the Internet, wire with any associated modules and/or sets of instructions in the less LAN, etc.). memory 102) detects contact (and any movement or break of The contact/motion module 138 detects contact with the the contact) on the touch screen 126 and converts the detected touch screen 126, in conjunction with the touch-screen con contact into interaction with user-interface objects, such as troller 122. The contact/motion module 138 includes various one or more soft keys, that are displayed on the touch screen. 30 software components for performing various operations In an exemplary embodiment, a point of contact between the related to detection of contact with the touch screen 122, such touch screen 126 and the user corresponds to one or more as determining if contact has occurred, determining if there is digits of the user. The touch screen 126 may use LCD (liquid movement of the contact and tracking the movement across crystal display) technology, or LPD (light emitting polymer the touch screen, and determining if the contact has been display) technology, although other display technologies may 35 broken (i.e., if the contact has ceased). Determining move be used in other embodiments. The touch screen 126 and ment of the point of contact may include determining speed touch screen controller 122 may detect contact and any move(magnitude), velocity (magnitude and direction), and/or an ment or break thereof using any of a plurality of touch sensi- acceleration (including magnitude and/or direction) of the tivity technologies, including but not limited to capacitive, point of contact. In some embodiments, the contact/motion resistive, infrared, and surface acoustic wave technologies, as 40 module 126 and the touch screen controller 122 also detects well as other proximity sensor arrays or other elements for contact on the touchpad. 
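The contact/motion module described above is said to determine whether contact has occurred, whether the point of contact is moving (including its speed, velocity, and acceleration), and whether contact has been broken. The patent gives no code for this; the sketch below shows one way such quantities could be derived from timestamped touch samples, with every name and the 0.1-second break timeout invented for illustration.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TouchSample:
    t: float  # time in seconds
    x: float
    y: float

def velocity(a: TouchSample, b: TouchSample) -> Tuple[float, float]:
    """Average velocity (vx, vy) between two consecutive samples."""
    dt = b.t - a.t
    if dt <= 0:
        return (0.0, 0.0)
    return ((b.x - a.x) / dt, (b.y - a.y) / dt)

def speed(a: TouchSample, b: TouchSample) -> float:
    """Speed is the magnitude of the velocity."""
    vx, vy = velocity(a, b)
    return (vx * vx + vy * vy) ** 0.5

def acceleration(a: TouchSample, b: TouchSample, c: TouchSample) -> Tuple[float, float]:
    """Change in average velocity across three consecutive samples."""
    v1 = velocity(a, b)
    v2 = velocity(b, c)
    dt = c.t - a.t
    if dt <= 0:
        return (0.0, 0.0)
    return ((v2[0] - v1[0]) / dt, (v2[1] - v1[1]) / dt)

def contact_broken(samples: List[TouchSample], now: float, timeout: float = 0.1) -> bool:
    """Treat the contact as broken if no sample has arrived within `timeout` seconds."""
    return not samples or (now - samples[-1].t) > timeout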
determining one or more points of contact with the touch The graphics module 140 includes various known software screen 126. The touch-sensitive display may be analogous to components for rendering and displaying graphics on the the multi-touch sensitive tablets described in the following touch screen 126. Note that the term "graphics" includes any U.S. Pat. Nos. 6,323,846 (Westerman et al.), 6,570,557 (Wes- 45 object that can be displayed to a user, including without terman et al.), and/or 6,677,932 (Westerman), and/or U.S. limitation text, web pages, icons (such as user-interface Patent Publication 2002/0015024A1, each of which is hereby objects including soft keys), digital images, videos, anima incorporated by reference. However, the touch screen 126 tions and the like. displays visual output from the portable device, whereas In some embodiments, the graphics module 140 includes touch sensitive tablets do not provide visual output. The touch so an optical intensity module 142. The optical intensity module screen 126 may have a resolution in excess of 100 dpi. In an 142 controls the optical intensity of graphical objects, such as exemplary embodiment, the touch screen 126 may have a user-interface objects, displayed on the touch screen 126. resolution of approximately 168 dpi. The user may make Controlling the optical intensity may include increasing or contact with the touch screen 126 using any suitable object or decreasing the optical intensity of a graphical object. In some appendage, such as a stylus, finger, and so forth. 55 embodiments, the increase or decrease may follow predefined In some embodiments, in addition to the touch screen, the functions. device 100 may include a touchpad (not shown) for activating The user interface state module 144 controls the user interor deactivating particular functions. In some embodiments, face state of the device 100. The user interface state module 144 may include a lock module 150 and an unlock module the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The 60 152. The lock module detects satisfaction of any of one or touchpad may be a touch-sensitive surface that is separate more conditions to transition the device 100 to a user-inter from the touch screen 126 or an extension of the touchface lock state and to transition the device 100 to the lock sensitive surface formed by the touch screen 126. state. The unlock module detects satisfaction of any of one or The device 100 also includes a power system 130 for pow- more conditions to transition the device to a user-interface ering the various components. The power system 130 may 65 unlock state and to transition the device 100 to the unlock include a power management system, one or more power state. Further details regarding the user interface states are sources (e.g., battery, alternating current (AC)), a recharging described below. EXHIBIT 1 PAGE 20 US 7,657,849 B2 7 8 The one or more applications 130 can include any applications installed on the device 100, including without limitation, a browser, address book, contact list, email, instant messaging, word processing, keyboard emulation, widgets, JAVA-enabled applications, encryption, digital rights man- 5 agement, voice recognition, voice replication, location determination capability (such as that provided by the global positioning system (GPS)), a music player (which plays back recorded music stored in one or more files, such as MP3 or AAC files), etc. 
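The user interface state module described above owns a lock module and an unlock module, each of which watches for conditions that should move the device into the lock state or the unlock state. The patent does not prescribe a structure for this; the following is a minimal sketch of that division of responsibility, with all class and attribute names assumed for the example.

from enum import Enum, auto
from typing import Callable, List

class UIState(Enum):
    LOCKED = auto()
    UNLOCKED = auto()

class UserInterfaceStateModule:
    """Illustrative stand-in for a user interface state module whose lock and
    unlock modules are modeled as lists of condition callbacks."""

    def __init__(self) -> None:
        self.state = UIState.UNLOCKED
        self.lock_conditions: List[Callable[[], bool]] = []    # e.g. idle timeout fired
        self.unlock_conditions: List[Callable[[], bool]] = []  # e.g. unlock gesture completed

    def poll(self) -> UIState:
        """Transition whenever any registered condition for the other state is met."""
        if self.state is UIState.UNLOCKED and any(c() for c in self.lock_conditions):
            self.state = UIState.LOCKED
        elif self.state is UIState.LOCKED and any(c() for c in self.unlock_conditions):
            self.state = UIState.UNLOCKED
        return self.state

if __name__ == "__main__":
    ui = UserInterfaceStateModule()
    ui.lock_conditions.append(lambda: True)  # pretend an idle timeout fired
    print(ui.poll())                         # UIState.LOCKED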
10 In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.). The device 100 may, therefore, include a 3 6-pin connector that is compatible with the iPod. In some embodiments, the device 100 may include one or more 15 optional optical sensors (not shown), such as CMOS or CCD image sensors, for use in imaging applications. In some embodiments, the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through the touch screen 126 and, if 20 included on the device 100, the touchpad. By using the touch screen and touchpad as the primary input/control device for operation of the device 100, the number of physical input/ control devices (such as push buttons, dials, and the like) on the device 100 may be reduced. In one embodiment, the 25 device 100 includes the touch screen 126, the touchpad, a push button for powering the device on/off and locking the device, a volume adjustment rocker button and a slider switch for toggling ringer profiles. The push button may be used to turn the power on/off on the device by depressing the button 30 and holding the button in the depressed state for a predefined time interval, or may be used to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed. In an alternative embodiment, the device 100 also may accept verbal input for activation or deactiva- 35 tion of some functions through the microphone 118. The predefined set of functions that are performed exclusively through the touch screen and the touchpad include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device 40 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a "menu button." In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a 45 touchpad. device 100 may be said to be locked. In some embodiments, the device 100 in the lock state may respond to a limited set of user inputs, including input that corresponds to an attempt to transition the device 100 to the user-interface unlock state or input that corresponds to powering the device 100 off. In other words, the locked device 100 responds to user input corresponding to attempts to transition the device 100 to the user-interface unlock state or powering the device 100 off, but does not respond to user input corresponding to attempts to navigate between user interfaces. It should be appreciated that even if the device 100 ignores a user input, the device 100 may still provide sensory feedback (such as visual, audio, or vibration feedback) to the user upon detection of the input to indicate that the input will be ignored. In embodiments where the device 100 includes the touch screen 126, while the device 100 is locked, a predefined set of operations, such as navigation between user interfaces, is prevented from being performed in response to contact on the touch screen 126 when the device 100 is locked. In other words, when the contact is being ignored by the locked device 100, the touch screen may be said to be locked. A locked device 100, however, may still respond to a limited class of contact on the touch screen 126. 
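The passage above explains that a locked device responds only to a limited set of inputs, such as an attempt to unlock or a request to power off, while other input is ignored but may still produce sensory feedback. A small, purely illustrative sketch of that filtering (the input categories and return strings are invented for the example):

from enum import Enum, auto

class InputKind(Enum):
    UNLOCK_ATTEMPT = auto()
    POWER_OFF = auto()
    NAVIGATE = auto()
    DATA_ENTRY = auto()

# Inputs a locked device still acts on, per the passage above (illustrative only).
ALLOWED_WHILE_LOCKED = {InputKind.UNLOCK_ATTEMPT, InputKind.POWER_OFF}

def handle_while_locked(kind: InputKind) -> str:
    """Act only on the limited class of inputs; acknowledge the rest with feedback."""
    if kind in ALLOWED_WHILE_LOCKED:
        return f"process {kind.name.lower()}"
    # Ignored input may still trigger sensory feedback (sound or vibration) so the
    # user knows the device detected the input but will not act on it.
    return "ignore (optionally vibrate or beep)"

if __name__ == "__main__":
    for k in InputKind:
        print(k.name, "->", handle_while_locked(k))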
The limited class includes contact that is determined by the device 100 to correspond to an attempt to transition the device 100 to the user-interface unlock state. In the user-interface unlock state (hereinafter the "unlock state"), the device 100 is in its normal operating state, detecting and responding to user input corresponding to interaction with the user interface. A device 100 that is in the unlock state may be described as an unlocked device 100. An unlocked device 100 detects and responds to user input for navigating between user interfaces, entry of data and activation or deactivation of functions. In embodiments where the device 100 includes the touch screen 126, the unlocked device 100 detects and responds to contact corresponding to navigation between user interfaces, entry of data and activation or deactivation of functions through the touch screen 126. Unlocking a Device via Gestures FIG. 2 is a flow diagram illustrating a process 200 for transitioning a device to a user-interface unlock state, according to some embodiments of the invention. As used herein, transitioning from one state to another refers to the process of going from one state to another. The process may be, as perceived by the user, instantaneous, near-instantaneous, User Interface States gradual or at any suitable rate. The progression of the process The device 100 may have a plurality of user interface states. so may be controlled automatically by the device, such as the device 100 (FIG. 1), independent of the user, once the process A user interface state is a state in which the device 100 is activated; or it may be controlled by the user. While the responds in a predefined manner to user input. In some process flow 200 described below includes a number of embodiments, the plurality of user interface states includes a operations that appear to occur in a specific order, it should be user-interface lock state and a user-interface unlock state. In some embodiments, the plurality of user interface states 55 apparent that these processes may include more or fewer operations, which may be executed serially or in parallel (e.g., includes states for a plurality of applications. using parallel processors or a multi-threading environment). In the user-interface lock state (hereinafter the "lock state"), the device 100 is powered on and operational but A device is set to the lock state (202). The device may be set ignores most, if not all, user input. That is, the device 100 (that is, transition completely to the lock state from any other takes no action in response to user input and/or the device 100 60 state) to the locked state upon satisfaction of any of one or is prevented from performing a predefined set of operations in more lock conditions. The lock conditions may include events response to the user input. The predefined set of operations such as the elapsing of a predefined time of inactivity, entry may include navigation between user interfaces and activainto an active call, or powering on the device. The lock contion or deactivation of a predefined set of functions. The lock ditions may also include user intervention, namely the user state may be used to prevent unintentional or unauthorized 65 locking the device by a predefined user input. In some use of the device 100 or activation or deactivation of functions embodiments, the user may be allowed to specify the events on the device 100. When the device 100 is in the lock state, the that serve as lock conditions. 
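The lock conditions listed above include the elapsing of a predefined time of inactivity, entry into an active call, powering on the device, and a manual lock by the user, and the text notes that the user may be allowed to choose which events count; the example that continues below configures exactly such a choice. As a hedged sketch of how a device might evaluate a user-selected policy (the field names and the 60-second default are assumptions, not values from the patent):

from dataclasses import dataclass

@dataclass
class DeviceStatus:
    idle_seconds: float
    in_active_call: bool
    just_powered_on: bool
    user_requested_lock: bool

@dataclass
class LockPolicy:
    """User-selectable lock conditions (field names are illustrative only)."""
    idle_timeout: float = 60.0
    lock_on_call: bool = True
    lock_on_power_on: bool = True

def should_lock(status: DeviceStatus, policy: LockPolicy) -> bool:
    """Return True if any enabled lock condition is satisfied."""
    if status.user_requested_lock:
        return True
    if status.idle_seconds >= policy.idle_timeout:
        return True
    if policy.lock_on_call and status.in_active_call:
        return True
    if policy.lock_on_power_on and status.just_powered_on:
        return True
    return False

if __name__ == "__main__":
    status = DeviceStatus(idle_seconds=75.0, in_active_call=False,
                          just_powered_on=False, user_requested_lock=False)
    print(should_lock(status, LockPolicy(lock_on_power_on=False)))  # True (idle timeout)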
For example, the user may EXHIBIT 1 PAGE 21 9 US 7,657,849 B2 configure the device to transition to the lock state upon the elapsing of a predefined time of inactivity but not upon powering on the device. 10 on the touch screen and aborts the transition as soon as the device determines that the contact does not correspond to an unlock action or is a failed/aborted unlock action. For In some embodiments, the locked device displays on the example, if the unlock action is a predefined gesture, the touch screen one or more visual cues of an unlock action that 5 device may begin the process of transitioning to the unlock the user may perform to unlock the device (204). The visual state as soon as it detects the initial contact of the gesture and cue(s) provide hints or reminders of the unlock action to the continues the progression of the transition as the gesture is user. The visual cues may be textual, graphical or any com- performed. If the user aborts the gesture before it is com bination thereof. In some embodiments, the visual cues are pleted, the device aborts the transition and remains in the lock displayed upon particular events occurring while the device is 10 state. If the gesture is completed, the device completes the locked. The particular events that trigger display of the visual transition to the unlock state and becomes unlocked. As cues may include an incoming call, incoming message, or another example, if the unlock action is a horizontal move some other event that may require the user's attention. In ment of the point of contact across the touch screen while some embodiments, the visual cues may also be displayed maintaining continuous contact with the touch screen, and the upon particular user inputs, such as the user interacting with 15 user taps the touch screen once, the device begins the process the menu button, the user making contact with the locked of the state transition as soon as it detects the tap but also touch screen and/or the user interacting with any other input/ aborts the process soon after because it realizes that the tap is control device. The locked device, when not displaying the just a tap and does not correspond to the unlock action. visual cues, may power down the touch screen (which helps to While the device is unlocked, the device may display on the conserve power) or display other objects on the touch screen, 20 touch screen user-interface objects corresponding to one or such as a screen saver or information that may be of interest to more functions of the device and/or information that may be the user (e.g., battery charge remaining, date and time, net- of interest to the user. The user-interface objects are objects work strength, etc.). that make up the user interface of the device and may include, The unlock action includes contact with the touch screen. without limitation, text, images, icons, soft keys (or "virtual In some embodiments, the unlock action is a predefined ges- 25 buttons"), pull-down menus, radio buttons, check boxes, ture performed on the touch screen. As used herein, a gesture selectable lists, and so forth. The displayed user-interface is a motion of the object/appendage making contact with the objects may include non-interactive objects that convey infor touch screen. 
For example, the predefined gesture may mation or contribute to the look and feel of the user interface, include a contact of the touch screen on the left edge (to interactive objects with which the user may interact, or any initialize the gesture), a horizontal movement of the point of 30 combination thereof. The user may interact with the user contact to the opposite edge while maintaining continuous interface objects by making contact with the touch screen at contact with the touch screen, and a breaking of the contact at one or more touch screen locations corresponding to the the opposite edge (to complete the gesture). interactive objects with which she wishes to interact. The While the touch screen is locked, the user may initiate device detects the contact and responds to the detected concontact with the touch screen, i.e., touch the touch screen 35 tact by performing the operation(s) corresponding to the (206). For convenience of explanation, contact on the touch interaction with the interactive object(s). screen in the process 200 and in other embodiments described While the device is locked, the user may still make contact below will be described as performed by the user using at on the touch screen. However, the locked device is prevented least one hand using one or more fingers. However, it should from performing a predefined set of actions in response to any be appreciated that the contact may be made using any suit- 40 detected contact until the device is unlocked. The prevented able object or appendage, such as a stylus, finger, etc. The predefined set of action may include navigating between user contact may include one or more taps on the touch screen, interfaces and entry of data by the user. maintaining continuous contact with the touch screen, moveWhile the device is locked, the device may display one or ment of the point of contact while maintaining continuous more visual cues of the unlock action, as described above. In contact, a breaking of the contact, or any combination thereof. 45 some embodiments, the device may also display, along with The device detects the contact on the touch screen (208). If the visual cues, an unlock image. The unlock image is a the contact does not correspond to an attempt to perform the graphical, interactive user-interface object with which the unlock action, or if the contact corresponds to a failed or user interacts in order to unlock the device. In other words, the aborted attempt by the user to perform the unlock action unlock action is performed with respect to the unlock image. (210 no), then the device remains locked (212). For so In some embodiments, performing the unlock action with example, if the unlock action is a horizontal movement of the respect to the image includes dragging the unlock image in a point of contact across the touch screen while maintaining predefined manner, which moves the unlock image across the continuous contact with the touch screen, and the detected touch screen. In some embodiments, if the unlock action is contact is a series of random taps on the touch screen, then the not completed, the GUI display can show reverse progress device will remain locked because the contact does not cor- 55 towards the locked state by gradually returning the unlock respond to the unlock action. 
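The example unlock action discussed in this passage is a horizontal movement of the point of contact across the touch screen while maintaining continuous contact, beginning at one edge and ending at the opposite edge, with stray taps rejected. The following sketch tests a recorded contact trace against that shape; the margins, drift limit, and backtrack tolerance are invented thresholds, not values from the patent.

from typing import List, Tuple

Point = Tuple[float, float]

def is_horizontal_slide(trace: List[Point],
                        screen_width: float,
                        start_margin: float = 30.0,
                        min_fraction: float = 0.9,
                        max_vertical_drift: float = 40.0) -> bool:
    """True if a continuously sampled contact starts near the left edge, stays
    roughly level, and ends near the opposite edge of the screen."""
    if len(trace) < 2:
        return False  # a single tap is not a slide
    xs = [p[0] for p in trace]
    ys = [p[1] for p in trace]
    starts_at_left = xs[0] <= start_margin
    reaches_right = xs[-1] >= screen_width * min_fraction
    stays_level = (max(ys) - min(ys)) <= max_vertical_drift
    moves_forward = all(b >= a - 5.0 for a, b in zip(xs, xs[1:]))  # tolerate tiny backtracks
    return starts_at_left and reaches_right and stays_level and moves_forward

if __name__ == "__main__":
    slide = [(5.0, 100.0), (120.0, 104.0), (310.0, 101.0)]
    taps = [(200.0, 150.0)]
    print(is_horizontal_slide(slide, screen_width=320))  # True  -> unlock
    print(is_horizontal_slide(taps, screen_width=320))   # False -> remain locked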
image to its position in the locked state If the contact corresponds to a successful performance of In some embodiments, in addition to visual feedback, the the unlock action, i.e., the user performed the unlock action electronic device supplies non-visual feedback to indicate successfully (210 yes), the device transitions to the unlock progress towards completion of the unlock action. In some state (214). For example, if the unlock action is a horizontal 60 embodiments, in addition to visual feedback, the electronic movement of the point of contact across the touch screen device supplies non-visual feedback to indicate completion while maintaining continuous contact with the touch screen, of the unlock action. The additional feedback may include and the detected contact is the horizontal movement with the audible feedback (e.g., sound(s)) or physical feedback (e.g., continuous contact, then the device transitions to the unlock vibration(s)). 65 FIG. 3 is a flow diagram illustrating a process 300 for state. In some embodiments, the device begins the process of transitioning a device to a user-interface unlock state using an transitioning to the unlock state upon detection of any contact unlock image, according to some embodiments of the inven- EXHIBIT 1 PAGE 22 US 7,657,849 B2 11 12 tion. The process 300 is similar to the process 200 (FIG. 2) broadly. For example, the unlock action may be to drag the with the addition of an unlock image that is displayed with the unlock image from its initial location, along the predefined visual cues. The unlock action in the process 300 is performed path, to any spot within a predefined region on the touch with respect to the unlock image, i.e., the unlock action screen. The predefined path may include one or more straight includes interaction with the unlock image. While the process 5 lines or lines with twists and turns. flow 300 described below includes a number of operations The user makes contact with the touch screen (306), similar that appear to occur in a specific order, it should be apparent to the operation 206 (FIG. 2). The device detects the contact that these processes can include more or fewer operations, with the touch screen (308), similar to the operation 208 (FIG. which can be executed serially or in parallel (e.g., using 2). If the contact does not correspond to successful perfor10 parallel processors or a multi-threading environment). mance of the unlock action with respect to the image (310 The device is locked upon satisfaction of a lock condition no), the device remains locked. If the contact does correspond (302), similar to the operation 202 (FIG. 2). An unlock image to successful performance of the unlock action with respect to and visual cues of the unlock action using the unlock image the image (310 yes), the device is unlocked (314). are displayed (304). The operation 304 is the same as the operation 204 (FIG. 2), except that in the operation 304 an 15 FIGS. 4A-4B illustrate the GUI display of a device in a unlock image is displayed in addition to the visual cues. user-interface lock state, according to some embodiments of As described above, the unlock action includes interaction the invention. In FIG. 4A, device 400 includes a touch screen with the unlock image. In some embodiments, the unlock 408 and a menu button 410. The device 400 is locked and the action includes the user performing a predefined gesture with touch screen 408 is displaying an unlock image 402 and respect to the unlock image. 
In some embodiments, the ges- 20 visual cues. The visual cues shown include a channel 404 ture includes dragging the unlock image to a location on the indicating the path of the gesture/movement along which the touch screen that meets one or more predefined unlock criteunlock image 402 is to be dragged, similar to a groove along ria. In other words, the user makes contact with the touch which a slider switch moves; and one or more arrows 406 screen at a location corresponding to the unlock image and indicating the direction of the gesture/movement. The end of then performs the predefined gesture while maintaining con- 25 the channel 404 (in FIGS. 4A-4B and 5A-5D, the "end" of the tinuous contact with the touch screen, dragging the image to channel is the right end) also serves as a predefined location to the location that meets the predefined unlock criteria. In some which the unlock image 402 is to be dragged. The unlock embodiments, the unlock action is completed by breaking the image 402 may also include an arrow to further remind the contact with the touch screen (thus releasing the unlock user the direction of the gesture/movement. As described 30 image) upon completion of the predefined gesture. above, the visual cues and the unlock image may be displayed A location meeting one or more predefined unlock criteria by the device 400 upon an event that may require the user's is simply a location on the touch screen that is predefined as attention (e.g., incoming call or message) or upon user intera location to which the unlock image is to be dragged in order vention (e.g., the user pressing the menu button 410 while the to unlock the device. The location(s) may be defined narrowly device is locked). or broadly and may be one or more particular locations on the 3 5 In some embodiments, the arrows 406 and the arrow on the touch screen, one or more regions on the touch screen, or any unlock image 402 may be animated. For example, the arrow combination thereof. For example, the location may be on the unlock image 402 may appear and disappear in a defined as a particular marked location, areas at each of the pulse-like manner and the arrows 406 may emanate from one four corners of the touch screen, or a quadrant of the touch end of the channel 406 in sync with the pulsing of the arrow 40 screen, etc. on the unlock image 402. As shown in FIG. 4B, the arrow 406 In some embodiments, the interaction includes dragging may move along the channel 404 and disappear when it the unlock image to a predefined location on the touch screen. moves to the end of the channel 404. For example, the unlock action may include dragging the The visual cues illustrated in FIGS. 4A and 4B remind the unlock image from one corner of the touch screen to another corner of the touch screen. As another example, the unlock 45 user that the unlock action is a predefined gesture that includes a horizontal movement of the finger (and thus movaction may include dragging the unlock image from one edge ing the point of contact) along the channel 404, from the of the touch screen to the opposite edge. The emphasis here is beginning of the channel 404, where the unlock image is on the final destination of the unlock image (and of the finger) initially located, to the end of the channel 404. It should be . Thus, the user can drag the unlock image from its initial location along any desired path. As long as the unlock image 50 appreciated, however, that the visual cues shown in FIGS. 
4A-4B are merely exemplary and that more or fewer visual reaches the predefined location and is released at that locacues, or alternative visual cues may be used. The content of tion, the device is unlocked. It should be appreciated that the the visual cues may be based on the particulars of the unlock predefined location may be, as described above, defined naraction. rowly or broadly and may be one or more particular locations on the touch screen, one or more regions on the touch screen, 55 FIGS. 5A-5D illustrate the GUI display of a device at various points of the performance of an unlock action gesture, or any combination thereof. according to some embodiments of the invention. In FIG. 5A, In some other embodiments, the unlock action includes the user, represented by the hand and finger 502 (not drawn to dragging the unlock image along a predefined path. For scale), begins the unlock action by touching the touch screen example, the unlock action may include dragging the unlock image clockwise along the perimeter of the touch screen (the 60 408 of device 400 with her finger 502. In some embodiments, the touch screen 408 is initially in sleep mode and/or dark, path being the perimeter of the touch screen), from one of the and the screen 408 displays the unlock image 402 when corners and back. As another example, the unlock action may touched. The user touches the touch screen 408 at the location include dragging the unlock image from one edge of the touch corresponding to the unlock image 402, which is located screen to the opposite edge in a linear path. The emphasis here is on the path along which the unlock image (and the finger) 65 initially at the left end of the channel 404. The contact, either overlapping with the unlock image 402 or in proximity to the moves. Because of the emphasis on the path, the final location unlock image 402, is detected by the device 400 and is deterto which the unlock image is to be moved may be defined EXHIBIT 1 PAGE 23 US 7,657,849 B2 13 14 mined to be an attempt to unlock the touch screen, based on the fact that the user 502 is interacting with the unlock image 402. Indication of Progress Towards Satisfaction of a User Input Condition FIG. 6 is a flow diagram illustrating a process 600 for In FIG. 5B, the user is in the process of performing the 5 indicating progress towards satisfaction of a user input congesture by moving her finger, which is in continuous contact dition according to some embodiments of the invention. with the touch screen 408, in the direction of movement 504. While the process flow 600 described below includes a numThe unlock image 402 is dragged along the channel 404 as a ber of operations that appear to occur in a specific order, it result of the gesture. The channel 404 reminds the user that should be apparent that these processes can include more or the unlock gesture is a horizontal motion. In some embodi- 10 fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environments, the channel 404 indicates the predefined location (in ment). FIGS. 5A-5D, the right end of the channel) to which the user While an electronic device is in a first user-interface state, drags the unlock image 402 to complete the unlock action progress is detected (602) towards satisfaction of a user input and/or the predefined path along which the user drags the 15 condition needed to transition to a second user-interface state. unlock image 402 to complete the unlock action. 
In FIG. 5C, the user has dragged the unlock image 402 to the right end of the channel 404. Once the user releases the unlock image 402 at the right end of the channel 404, the unlock action is complete. Upon completion of the unlock gesture, the device unlocks and displays on the touch screen 408 user-interface objects associated with normal operation of the device 400. FIG. 5D illustrates an example of user-interface objects that may be displayed when the device 400 is unlocked. In FIG. 5D, the device 400 displays a menu 506. The menu 506 includes interactive user-interface objects corresponding to various applications or operations. A user may interact with the user-interface objects to activate an application or perform an operation. It should be appreciated, however, that the device 400, upon being unlocked, may display additional or alternative user-interface objects.

In some embodiments, the unlock image 402 may also be used to indicate failure of performance of the unlock action. For example, if the user breaks the contact with the touch screen before the unlock image reaches the right end of the channel 404, the unlock action has failed. The device 400 may display the unlock image 402 returning to its initial position on the left end of the channel 404, allowing the user to attempt the unlock action again, if she so chooses. In some embodiments, the device goes back to sleep if no gesture is applied in a predetermined period of time.

In some embodiments, the user may unlock the device 400 by contacting the touch screen 408 and moving the point of contact horizontally along a fraction of the channel 404, i.e., the user need not move all the way to the right end of the channel. In some embodiments, the user may unlock the device 400 by making contact anywhere on the touch screen 408 and moving the point of contact horizontally as if he or she were following the channel 404.

In some embodiments, the lock/unlock feature may apply to specific applications that are executing on the device 400 as opposed to the device 400 as a whole. In some embodiments, an unlock gesture transitions from one application to another, for example, from a telephone application to a music player or vice versa. The lock/unlock feature may include a hold or pause feature. In some embodiments, as the user transitions from a first application and to a second application, a user interface for the second application may fade in (i.e., increase in intensity) and a user interface for the first application may fade out (i.e., decrease in intensity). The fade in and fade out may occur smoothly over a pre-determined time interval, such as 0.2 s, 1 s or 2 s. The pre-determined time interval may be in accordance with the unlock gesture, such as the time it takes the user to perform the gesture.
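A sketch of the release handling just described: releasing the unlock image at the right end of the channel completes the unlock action, releasing it early fails and returns the image to its initial position, and an idle lock screen goes back to sleep after a predetermined period. The enum cases, the threshold check, and the timeout value are all hypothetical, chosen only to illustrate the behavior in the text.

```swift
// Hypothetical release handling for a channel-style unlock image (FIGS. 5C-5D).
enum ReleaseOutcome {
    case unlocked              // released at the right end: the unlock completes
    case snappedBack           // released early: the image returns to its start
}

struct LockScreen {
    var imageX: Double
    let channelStartX: Double
    let channelEndX: Double
    let sleepTimeout: Double          // seconds of inactivity before sleeping
    var idleTime: Double = 0
    var asleep = false

    // Called when the user breaks contact with the touch screen.
    mutating func release() -> ReleaseOutcome {
        if imageX >= channelEndX {
            return .unlocked
        } else {
            imageX = channelStartX    // show the image returning to the left end
            return .snappedBack
        }
    }

    // Called periodically; the device goes back to sleep if no gesture is
    // applied within the predetermined period of time.
    mutating func tick(elapsed: Double) {
        idleTime += elapsed
        if idleTime >= sleepTimeout { asleep = true }
    }
}

var screen = LockScreen(imageX: 140, channelStartX: 20, channelEndX: 300,
                        sleepTimeout: 5)
print(screen.release())   // snappedBack: the image was released mid-channel
screen.tick(elapsed: 6)
print(screen.asleep)      // true
```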
Indication of Progress Towards Satisfaction of a User Input Condition

FIG. 6 is a flow diagram illustrating a process 600 for indicating progress towards satisfaction of a user input condition according to some embodiments of the invention. While the process flow 600 described below includes a number of operations that appear to occur in a specific order, it should be apparent that these processes can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment).

While an electronic device is in a first user-interface state, progress is detected (602) towards satisfaction of a user input condition needed to transition to a second user-interface state. In some embodiments, the first user-interface state is for a first application and the second user-interface state is for a second application. In some embodiments, the first user-interface state is a lock state and the second user-interface state is an unlock state.

While the device is in the first user-interface state, progress is indicated (604) towards satisfaction of the condition by transitioning an optical intensity of one or more user-interface objects associated with the second user-interface state. The change in optical intensity of the user-interface objects provides a user with sensory feedback of the progress in transitioning between user interface states.

In some embodiments, in addition to visual feedback, the device supplies non-visual feedback to indicate progress towards satisfaction of the user input condition. The additional feedback may include audible feedback (e.g., sound(s)) or physical feedback (e.g., vibration(s)).

The device transitions (606) to the second user-interface state if the condition is satisfied. In some embodiments, in addition to visual feedback, the device supplies non-visual feedback to indicate satisfaction of the user input condition. The additional feedback may include audible feedback (e.g., sound(s)) or physical feedback (e.g., vibration(s)).

The optical intensity of a user-interface object, as used herein, is the object's degree of visual materialization. The optical intensity may be measured along a scale between a predefined minimum and a predefined maximum. In some embodiments, the optical intensity may be measured along a logarithmic scale. In some embodiments, the optical intensity may be perceived by users as a transparency effect (or lack thereof) applied to the user-interface object. In some embodiments, the minimum optical intensity means that the object is not displayed at all (i.e., the object is not perceptible to the user), and the maximum optical intensity means that the object is displayed without any transparency effect (i.e., the object has completely materialized visually and is perceptible to the user). In some other embodiments, the optical intensity may be the visual differentiation between the user-interface object and the background, based on color, hue, color saturation, brightness, contrast, transparency, and any combination thereof.

In some embodiments, the optical intensity of the user-interface objects to be displayed in the second user-interface state is increased smoothly. Smoothly may include a transition time that is greater than a pre-defined threshold, for example, 0.2 s, 1 s or 2 s. The rate of the transition of the optical intensity may be any predefined rate.

In some embodiments, the indication of progress towards completion of the user input condition is a function of the user's satisfaction of the condition. For example, for a transition to an unlock state, the indication of progress towards completion is a function of the user's performance of an unlock action. For a linear function, the indication of progress is 10% complete when the unlock action is 10% complete; the indication of progress is 50% complete when the unlock action is 50% complete, and so forth, up to 100% completion of the unlock action, at which point the transition to the unlock state occurs. Correspondingly, for a linear function, the transition of the optical intensity from an initial value to a final value is 10% complete when the unlock action is 10% complete; the transition is 50% complete when the unlock action is 50% complete, and so forth, up to 100% completion of the unlock action, at which point the optical intensity is at its final value. In some embodiments, the user may perceive the optical intensity transition as a fading in of the user-interface objects as the unlock action is performed. It should be appreciated that the function need not be linear and alternative functions may be used, further details of which are described below, in relation to FIGS. 8A-8C.
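The sketch below ties steps 604 and 606 together under the linear mapping just described: as completion of the user input condition goes from 0 to 1, optical intensity is interpolated between a predefined minimum and maximum, and when the condition is satisfied the device transitions, optionally with audible or physical feedback. It is a schematic illustration under assumed names (ProgressIndicator, playNonVisualFeedback, and so on), not the patent's implementation.

```swift
// Schematic sketch of steps 604 and 606, assuming a linear intensity function.
struct ProgressIndicator {
    let minIntensity: Double      // object not displayed at all
    let maxIntensity: Double      // object fully materialized (no transparency)
    let playNonVisualFeedback: () -> Void   // e.g. a sound or a vibration

    // Step 604: indicate progress by transitioning optical intensity.
    // With a linear function, intensity tracks completion directly.
    func intensity(forCompletion completion: Double) -> Double {
        let c = min(max(completion, 0), 1)
        return minIntensity + (maxIntensity - minIntensity) * c
    }

    // Step 606: transition to the second user-interface state once the
    // condition is satisfied, optionally with non-visual feedback.
    func conditionSatisfied(transition: () -> Void) {
        playNonVisualFeedback()
        transition()
    }
}

let indicator = ProgressIndicator(minIntensity: 0, maxIntensity: 1,
                                  playNonVisualFeedback: { print("buzz") })
print(indicator.intensity(forCompletion: 0.5))   // 0.5: halfway faded in
indicator.conditionSatisfied { print("unlocked") }
```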
If the user input condition includes a predefined gesture then the indication of progress of the gesture may be defined in terms of how much of the gesture is completed and how much of the gesture is remaining. For example, if the gesture includes moving the finger from one edge of the screen to the opposite edge horizontally, then the indication of progress may be defined in terms of the distance between the two edges because the distance remaining objectively measures how much further the user has to move her finger to complete the gesture.

If the user input condition includes dragging an image to a predefined location, then the indication of progress may be defined in terms of the distance between the initial location of the image and the predefined location to which the image is to be dragged in order to complete the input condition.

If the user input condition includes dragging an image along a predefined path, then the indication of progress may be defined in terms of the length of the predefined path.
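Each of the three progress measures just described (remaining horizontal distance for an edge-to-edge gesture, distance from the image's initial location to the predefined location, and the length of a predefined path) can be reduced to a completion fraction between 0 and 1, as in the sketch below. The formulas are straightforward readings of the text; the function names and sample values are invented.

```swift
// Hypothetical completion metrics for the three kinds of user input condition.

// Gesture from one edge of the screen to the opposite edge: progress is the
// horizontal distance covered relative to the distance between the two edges.
func gestureCompletion(fingerX: Double, startEdgeX: Double, endEdgeX: Double) -> Double {
    let total = endEdgeX - startEdgeX
    guard total != 0 else { return 1 }
    return min(max((fingerX - startEdgeX) / total, 0), 1)
}

// Dragging an image to a predefined location: progress is measured against the
// distance between the image's initial location and the predefined location.
func dragToLocationCompletion(traveled: Double, initialToTargetDistance: Double) -> Double {
    guard initialToTargetDistance > 0 else { return 1 }
    return min(max(traveled / initialToTargetDistance, 0), 1)
}

// Dragging an image along a predefined path: progress is measured against the
// length of the predefined path.
func dragAlongPathCompletion(pathLengthTraversed: Double, totalPathLength: Double) -> Double {
    guard totalPathLength > 0 else { return 1 }
    return min(max(pathLengthTraversed / totalPathLength, 0), 1)
}

print(gestureCompletion(fingerX: 160, startEdgeX: 0, endEdgeX: 320))            // 0.5
print(dragToLocationCompletion(traveled: 70, initialToTargetDistance: 280))     // 0.25
print(dragAlongPathCompletion(pathLengthTraversed: 300, totalPathLength: 400))  // 0.75
```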
FIGS. 7A-7D illustrate the GUI display of a device that is transitioning the optical intensity of user-interface objects concurrent with a transition from a first user interface state to a second user interface state, according to some embodiments of the invention. In FIG. 7A, the device 700 is locked and has received an incoming call. The device 700 is displaying a prompt 706 to the user, informing the user of the incoming call, on the touch screen 714. The device is also displaying the unlock image 702 and channel 704 so that the user can unlock the device 700 in order to accept or decline the incoming call. The user begins the unlock action by making contact on the touch screen with her finger 710 on the unlock image 702.

In FIG. 7B, the user is in the process of dragging the unlock image 702 along the channel 704 in the direction of movement 712. As the user drags the unlock image, a set of virtual buttons 708 appears and increases in optical intensity. The virtual buttons 708 are shown with dotted outlines to indicate that they are not yet at their final optical intensity levels. The virtual buttons 708 are associated with the prompt 706; the virtual buttons shown in FIGS. 7B-7D allow the user to decline or accept the incoming call. However, the user cannot interact with the virtual buttons 708 until the device is unlocked and the virtual buttons have reached their final optical intensity. In FIG. 7C, the user drags the unlock image 702 further along the channel 704 in the direction of movement 712. The virtual buttons 708 have increased further in optical intensity relative to their optical intensity in FIG. 7B, as illustrated by their different style of dotted outlines. The increases in optical intensity indicate to the user progress towards completion of the unlock action.

In FIG. 7D, the user completes the unlock action by dragging the unlock image to the right end of the channel 704 and releasing the unlock image 702. The device 700 transitions to the unlock state. The unlock image 702 and the channel 704 disappear from the display and the virtual buttons 708 are at their final optical intensity levels, as illustrated by their solid outlines. At this point the user may interact with the virtual buttons 708 and accept or decline the incoming call.

As described above in relation to FIGS. 5A-5D, if the unlock action fails because the user releases the unlock image prematurely, the unlock image may return to its initial location. In some embodiments, the optical intensity of the virtual buttons 708 or other user-interface objects that were increasing in optical intensity as the unlock action was performed may, concurrent with the return of the unlock image to its initial location, have their optical intensity decreased smoothly, back to their initial levels.

FIGS. 8A-8C are graphs illustrating optical intensity as a function of the completion of the user input condition, according to some embodiments of the invention. In FIG. 8A, the optical intensity is a linear function of the completion of the user input condition. At 0% completion, the optical intensity is at an initial value (in this case, the initial value is 0). As the completion percentage increases, the optical intensity increases linearly with the completion percentage, until it reaches the final value at 100% completion.

In FIG. 8B, the optical intensity is a nonlinear function of the completion of the user input condition. At 0% completion, the optical intensity is at an initial value (in this case, the initial value is 0). As the completion percentage increases, the optical intensity increases gradually at first, but the increase becomes steeper as the completion percentage increases, until it reaches the final value at 100% completion.

In FIG. 8C, the optical intensity is another nonlinear function of the completion of the user input condition. At 0% completion, the optical intensity is at an initial value (in this case, the initial value is 0). As the completion percentage increases, the optical intensity increases steeply at first, but the increase becomes more gradual as the completion percentage increases, until it reaches the final value at 100% completion. In some embodiments, the optical intensity may increase according to a logarithmic scale.

In some embodiments, the optical intensity may reach its final value prior to 100% completion of the user input condition (e.g., at 90% completion).
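The curve shapes of FIGS. 8A-8C, plus the variant in which intensity saturates before 100% completion, can be written as simple functions mapping completion in [0, 1] to intensity in [0, 1]. The particular exponents and the 90% saturation point are illustrative choices consistent with the qualitative description, not values taken from the patent figures.

```swift
// Illustrative intensity curves over completion c in [0, 1] (cf. FIGS. 8A-8C).
func clamp01(_ x: Double) -> Double { min(max(x, 0), 1) }

// FIG. 8A: linear; intensity rises in lockstep with completion.
func linearIntensity(_ c: Double) -> Double { clamp01(c) }

// FIG. 8B: gradual at first, steeper near the end (an ease-in shape).
func easeInIntensity(_ c: Double) -> Double {
    let x = clamp01(c)
    return x * x
}

// FIG. 8C: steep at first, more gradual near the end (an ease-out shape).
func easeOutIntensity(_ c: Double) -> Double {
    let x = clamp01(c)
    return 1 - (1 - x) * (1 - x)
}

// Variant: intensity reaches its final value before 100% completion,
// here at an assumed 90%.
func earlySaturationIntensity(_ c: Double, saturatesAt: Double = 0.9) -> Double {
    clamp01(clamp01(c) / saturatesAt)
}

for c in [0.0, 0.25, 0.5, 0.75, 1.0] {
    print(c, linearIntensity(c), easeInIntensity(c),
          easeOutIntensity(c), earlySaturationIntensity(c))
}
```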
User Interface Active States Corresponding to Events or Applications

FIG. 9 is a flow diagram illustrating a process 900 for transitioning a device to a user interface active state corresponding to one of a plurality of unlock images, according to some embodiments of the invention. In some embodiments, the device may have one or more active applications running when the device becomes locked. Additionally, while locked, the device may continue to receive events, such as incoming calls, messages, voicemail notifications, and so forth. The device may display multiple unlock images on the touch screen, each unlock image corresponding to an active application or incoming event. Performing the unlock action using one of the multiple unlock images unlocks the device and displays the application and/or event corresponding to the unlock image. The user interface active state, as used herein, means that the device is unlocked and a corresponding application or event is displayed on the touch screen to the user. While the process flow 900 described below includes a number of operations that appear to occur in a specific order, it should be apparent that these processes can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment).

The device is locked upon satisfaction of a predefined lock condition (902). The device may have active applications running when it is locked and the active applications may continue running while the device is locked. Additionally, while the device is locked, the device may receive events, such as incoming calls, messages, and voicemail notifications.

The device displays a plurality of unlock images, each displayed unlock image corresponding to an active application running or an event received while the device is locked (904). In some embodiments, the device also displays visual cues of the unlock action with respect to each unlock image. The device may display additional unlock images and visual cues as additional events are received.

The user makes contact with the touch screen (906). The device detects the contact gesture (908). If the detected contact gesture does not correspond to successful performance of the unlock action with respect to any one of the displayed unlock images (e.g., because the contact is not an attempt to perform the unlock action or the unlock action failed/was aborted) (910 no), the device remains locked (912). If the detected contact gesture does correspond to successful performance of the unlock action with respect to one of the displayed unlock images (910 yes), the touch screen is unlocked and the running application or event corresponding to the one of the unlock images is displayed on the touch screen (914).
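The dispatch at the heart of process 900, several unlock images each tied to a running application or received event, with the image on which the gesture is performed determining the active state the device transitions to, can be sketched as below. Identifiers such as UnlockTarget and resolve are invented for the illustration, and the gesture check is reduced to an already-resolved image identifier.

```swift
// Hypothetical sketch of process 900: multiple unlock images, each tied to an
// application or event; unlocking with one of them surfaces that item.
enum UnlockTarget {
    case application(name: String)        // an application that was running
    case event(description: String)       // an event received while locked
}

struct DisplayedUnlockImage {
    let id: Int
    let target: UnlockTarget
}

enum DeviceState {
    case locked
    case active(UnlockTarget)             // unlocked, with the target displayed
}

// Steps 906-914: given the unlock image (if any) on which the detected contact
// successfully performed the unlock action, either unlock into the
// corresponding active state or remain locked.
func resolve(gestureCompletedOn imageID: Int?,
             displayed: [DisplayedUnlockImage]) -> DeviceState {
    guard let id = imageID,
          let image = displayed.first(where: { $0.id == id }) else {
        return .locked                    // 910 no: the device remains locked (912)
    }
    return .active(image.target)          // 910 yes: display the item (914)
}

let displayedImages = [
    DisplayedUnlockImage(id: 1, target: .application(name: "music player")),
    DisplayedUnlockImage(id: 2, target: .event(description: "incoming text message")),
]
print(resolve(gestureCompletedOn: 2, displayed: displayedImages))    // active(event ...)
print(resolve(gestureCompletedOn: nil, displayed: displayedImages))  // locked
```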
In other words, the device transitions to a first active state corresponding to the first image if the detected contact corresponds to a predefined gesture with respect to the first image; the device transitions to a second active state distinct from the first active state and corresponding to the second image if the detected contact corresponds to a predefined gesture with respect to the second image; and so on.

The device becomes unlocked and makes the corresponding event or application visible to the user, active, or running in the foreground, as opposed to running in the background, upon performance of the unlock action with respect to the particular unlock image. The user-interface active state includes the running application or incoming event corresponding to the particular unlock image with which the user interacted being displayed prominently on the touch screen, in addition to the device being unlocked. Thus, unlocking using a first unlock image (if multiple unlock images are displayed) transitions the device to a first user-interface active state, in which the device is unlocked and the application/event corresponding to the first unlock image is displayed prominently. Unlocking using a second image transitions the device to a second user-interface active state, in which the device is unlocked and the application/event corresponding to the second unlock image is displayed prominently.

In some embodiments, the device may prioritize which unlock images to display. The device may display a subset of the corresponding unlock images on the touch screen at one time. The device may decide which subset to display based on one or more predefined criteria. For example, the device may display only unlock images corresponding to the most recent events and/or running applications. As another example, the device may display only unlock images corresponding to incoming events.
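The prioritization just described, showing only a subset of the candidate unlock images chosen by a predefined criterion such as recency or event-only filtering, might look like the following. The Candidate type, the timestamps, and the display limit are assumptions made for the example, not details from the patent.

```swift
// Hypothetical selection of which unlock images to display at one time.
struct Candidate {
    let label: String
    let isIncomingEvent: Bool      // true for received events, false for running apps
    let receivedAt: Double         // larger means more recent (assumed clock)
}

// Criterion 1: only the most recent events and/or running applications,
// up to a fixed number of unlock images shown at once.
func mostRecent(_ candidates: [Candidate], limit: Int) -> [Candidate] {
    Array(candidates.sorted { $0.receivedAt > $1.receivedAt }.prefix(limit))
}

// Criterion 2: only unlock images corresponding to incoming events.
func incomingEventsOnly(_ candidates: [Candidate]) -> [Candidate] {
    candidates.filter { $0.isIncomingEvent }
}

let candidates = [
    Candidate(label: "music player", isIncomingEvent: false, receivedAt: 10),
    Candidate(label: "incoming call", isIncomingEvent: true, receivedAt: 42),
    Candidate(label: "voicemail notification", isIncomingEvent: true, receivedAt: 35),
]
print(mostRecent(candidates, limit: 2).map { $0.label })   // ["incoming call", "voicemail notification"]
print(incomingEventsOnly(candidates).map { $0.label })     // ["incoming call", "voicemail notification"]
```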
FIG. 10 illustrates the GUI of a device 1000 in a user-interface lock state that displays a plurality of unlock images, according to some embodiments of the invention. In FIG. 10, the touch screen 1014 of the device 1000 is locked. A first unlock image 1002 is displayed with corresponding visual cues, such as the first channel 1004 and arrow 1006. A second unlock image 1008 is displayed with corresponding visual cues, such as the second channel 1010 and arrow 1012. The touch screen 1014 may display additional unlock images and visual cues. The first unlock image 1002 corresponds to a first running application or received event. The second unlock image 1008 corresponds to a second running application or received event. The first and second unlock images and visual cues are similar to the unlock image and visual cues described above, in relation to FIGS. 4A and 4B. The arrows 1006 and 1012 may be animated to move from one end of the channels 1004 and/or 1010 to the other end, in order to indicate the proper direction of the predefined gesture or movement of the unlock image.

FIGS. 11A-11F illustrate the GUI display of a device at various points in the performance of an unlock action gesture corresponding to one of a plurality of unlock images, according to some embodiments of the invention. In FIG. 11A, the user makes contact with the touch screen 1014 using her finger 1102 (not shown to scale), at the location corresponding to the second unlock image 1008. The user performs the unlock action gesture by moving the point of contact, dragging the second unlock image 1008. FIG. 11B shows a snapshot of the device 1000 during the pendency of the unlock action. The second unlock image 1008 is moved along in the channel 1010 in the direction of movement 1104.

FIG. 11C shows the second unlock image 1008 moved to the end of the channel 1010, where the unlock action with respect to the second unlock image 1008 will be completed once the user breaks the contact (and releases the second unlock image 1008). In some embodiments, the unlock action is completed when the unlock image 1008 is moved to the end of the channel 1010, with or without the user breaking contact, and the second unlock image 1008 disappears. As shown in FIG. 11D, upon completion of the unlock action with respect to the second unlock image 1008, the device displays on the touch screen the user-interface objects 1106 associated with the application or event corresponding to the second unlock image 1008. In FIG. 11D, the event corresponding to the second unlock image is an incoming text message event and a prompt for the user to read it.

The user, instead of performing the unlock action with respect to the second unlock image 1108, may instead perform the unlock action gesture with respect to the first unlock image 1002. In FIG. 11E, the user does so and performs the unlock action with respect to the first unlock image 1002 by dragging the first unlock image, in the direction 1104, to the right end of the channel 1004. Upon completion of the unlock action, the device 1000 displays the user-interface objects 1108 associated with the application or event corresponding to the first unlock image 1002. In FIG. 11F, the application corresponding to the first unlock image is a music player application.

In some embodiments, the transition to a user interface active state, as described in FIGS. 9 and 11A-11E, may also include a concurrent transition in the optical intensity of user-interface objects, similar to that described above in relation to FIGS. 6, 7A-7D, and 8A-8C. Concurrent with the transition to a user interface active state, the user-interface objects associated with the application or event corresponding to the unlock image with which the user interacted to unlock the device increase in intensity. For example, the optical intensity of the user-interface objects 1106 associated with the text message prompt in FIG. 11D may be increased smoothly, as a function of the progress towards completion of the unlock action with respect to the second unlock image 1008. As another example, the optical intensity of the user-interface objects 1108 associated with music player application in FIG. 11F may be increased smoothly, as a function of the progress towards completion of the unlock action with respect to the first unlock image 1002.

The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

What is claimed is:

1. A method of controlling an electronic device with a touch-sensitive display, comprising: detecting contact with the touch-sensitive display while the device is in a user-interface lock state; moving an unlock image along a predefined displayed path on the touch-sensitive display in accordance with the contact, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; transitioning the device to a user-interface unlock state if the detected contact corresponds to a predefined gesture; and maintaining the device in the user-interface lock state if the detected contact does not correspond to the predefined gesture.

2. The method of claim 1, further comprising, while the device is in the user-interface lock state, preventing the device from performing a predefined set of actions in response to detecting any contact with the touch-sensitive display that does not correspond to the predefined gesture.

3. The method of claim 1, wherein the predefined displayed path is a channel.

4. The method of claim 1, wherein the detected contact is a movement of a point of contact across the touch-sensitive display while maintaining continuous contact with the touch-sensitive display.

5. The method of claim 4, wherein the movement of the point of contact across the touch-sensitive display while maintaining continuous contact with the touch-sensitive display is a horizontal movement.
6. A method of controlling a device comprising a touch-sensitive display, comprising: displaying an unlock image on the touch-sensitive display while the device is in a user-interface lock state, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; detecting contact with the touch-sensitive display; transitioning the device to a user-interface unlock state if the detected contact corresponds to moving the unlock image along a predefined displayed path on the touch-sensitive display to a predefined location on the touch-sensitive display; and maintaining the device in the user-interface lock state if the detected contact does not correspond to moving the unlock image along the predefined displayed path on the touch-sensitive display to the predefined location.

7. The method of claim 6, further comprising, while the device is in the user-interface lock state, preventing the device from performing a predefined set of actions in response to detecting any contact with the touch-sensitive display that does not correspond to moving the unlock image along the predefined displayed path on the touch-sensitive display to the predefined location.

8. A method of controlling a device comprising a touch-sensitive display, comprising: displaying an unlock image on the touch-sensitive display while the device is in a user-interface lock state, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; detecting contact with the touch-sensitive display; and transitioning the device to a user-interface unlock state if the detected contact corresponds to moving the unlock image across the touch-sensitive display according to a predefined displayed path on the touch-sensitive display; and maintaining the device in the user-interface lock state if the detected contact does not correspond to moving the unlock image across the touch-sensitive display according to the predefined displayed path.

9. The method of claim 8, further comprising, while the device is in a user-interface lock state, preventing the device from performing a predefined set of actions in response to detecting any contact with the touch-sensitive display that does not correspond to moving the unlock image across the touch-sensitive display according to the predefined displayed path.

10. The method of claim 8, wherein the detected contact corresponding to moving the unlock image across the touch-sensitive display according to the predefined displayed path comprises contact corresponding to moving the unlock image across the touch-sensitive display to an endpoint of the predefined displayed path.
11. A method of controlling a device comprising a touch-sensitive display, comprising: displaying a first unlock image and a second unlock image on the touch-sensitive display while the device is in a user-interface lock state; detecting contact with the touch-sensitive display; transitioning the device to a first active state corresponding to the first unlock image if the detected contact corresponds to a predefined gesture with respect to the first unlock image that moves the first unlock image along a first predefined displayed path on the touch-sensitive display; and transitioning the device to a second active state distinct from the first active state if the detected contact corresponds to a predefined gesture with respect to the second unlock image that moves the second unlock image along a second predefined displayed path on the touch-sensitive display.

12. A portable electronic device, comprising: a touch-sensitive display; memory; one or more processors; and one or more modules stored in the memory and configured for execution by the one or more processors, the one or more modules including instructions: to set the device to a user-interface lock state; to display an unlock image on the touch-sensitive display while the device is in the user-interface lock state, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; to detect contact with the touch-sensitive display; to move the unlock image along a predefined displayed path on the touch-sensitive display in accordance with the contact; to transition the device to a user-interface unlock state if the detected contact corresponds to a predefined gesture; and to maintain the device in the user-interface lock state if the detected contact does not correspond to the predefined gesture.

13. A portable electronic device, comprising: a touch-sensitive display; memory; one or more processors; and one or more modules stored in the memory and configured for execution by the one or more processors, the one or more modules including instructions: to set the device to a user-interface lock state; to display an unlock image on the touch-sensitive display while the device is in the user-interface lock state, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; to detect contact with the touch-sensitive display; to transition the device to a user-interface unlock state if the detected contact corresponds to moving the unlock image along a predefined displayed path on the touch-sensitive display to a predefined location on the touch-sensitive display; and to maintain the device in the user-interface lock state if the detected contact does not correspond to moving the unlock image along the predefined displayed path on the touch-sensitive display to the predefined location.
14. A portable electronic device, comprising: a touch-sensitive display; memory; one or more processors; and one or more modules stored in the memory and configured for execution by the one or more processors, the one or more modules including instructions: to set the device to a user-interface lock state; to display an unlock image on the touch-sensitive display while the device is in the user-interface lock state; to detect contact with the touch-sensitive display; to transition the device to a user-interface unlock state if the detected contact corresponds to moving the unlock image across the touch-sensitive display according to a predefined displayed path on the touch-sensitive display; and to maintain the device in the user-interface lock state if the detected contact does not correspond to moving the unlock image across the touch-sensitive display according to the predefined displayed path.

15. A portable electronic device, comprising: a touch-sensitive display; memory; one or more processors; one or more modules stored in the memory and configured for execution by the one or more processors, the one or more processors including instructions: to set the device to a user-interface lock state; to display a first unlock image and a second unlock image on the touch-sensitive display; to detect contact with the touch-sensitive display; to transition the device to a first active state corresponding to the first unlock image if the detected contact corresponds to a predefined gesture with respect to the first unlock image that moves the first unlock image along a first predefined displayed path on the touch-sensitive display; and to transition the device to a second active state distinct from the first active state if the detected contact corresponds to a predefined gesture with respect to the second unlock image that moves the second unlock image along a second predefined displayed path on the touch-sensitive display.

16. A portable electronic device, comprising: a touch-sensitive display; means for detecting contact with the touch-sensitive display while the device is in a user-interface lock state; means for moving an unlock image along a predefined displayed path on the touch-sensitive display in accordance with the contact, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; means for transitioning the device to the user-interface unlock state if the detected contact corresponds to a predefined gesture; and means for maintaining the device in the user-interface lock state if the detected contact does not correspond to the predefined gesture.
17. A portable electronic device, comprising: a touch-sensitive display; means for displaying an unlock image on the touch-sensitive display while the device is in a user-interface lock state, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; means for detecting contact with the touch-sensitive display; means for transitioning the device to a user-interface unlock state if the detected contact corresponds to moving the unlock image along a predefined displayed path on the touch-sensitive display to a predefined location on the touch-sensitive display; and means for maintaining the device in the user-interface lock state if the detected contact does not correspond to moving the unlock image along the predefined displayed path on the touch-sensitive display to the predefined location.

18. A portable electronic device, comprising: a touch-sensitive display; means for displaying an unlock image on the touch-sensitive display while the device is in a user-interface lock state, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; means for detecting contact with the touch-sensitive display; and means for transitioning the device to a user-interface unlock state if the detected contact corresponds to moving the unlock image across the touch-sensitive display according to a predefined displayed path on the touch-sensitive display; and means for maintaining the device in the user-interface lock state if the detected contact does not correspond to moving the unlock image across the touch-sensitive display according to the predefined displayed path.

19. A portable electronic device, comprising: a touch-sensitive display; means for displaying a first unlock image and a second unlock image on the touch-sensitive display while the device is in a user-interface lock state; means for detecting contact with the touch-sensitive display; means for transitioning the device to a first active state corresponding to the first unlock image if the detected contact corresponds to a predefined gesture with respect to the first unlock image that moves the first unlock image along a first predefined displayed path on the touch-sensitive display; and means for transitioning the device to a second active state distinct from the first active state if the detected contact corresponds to a predefined gesture with respect to the second unlock image that moves the second unlock image along a second predefined displayed path on the touch-sensitive display.
20. A computer program product for use in conjunction with a portable electronic device comprising a touch-sensitive display, the computer program product comprising a computer readable storage medium and an executable computer program mechanism embedded therein, the executable computer program mechanism comprising instructions for: detecting contact with the touch-sensitive display while the device is in a user-interface lock state; moving an unlock image along a predefined displayed path on the touch-sensitive display in accordance with the contact, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; transitioning the device to the user-interface unlock state if the detected contact corresponds to a predefined gesture; and maintaining the device in the user-interface lock state if the detected contact does not correspond to the predefined gesture.

21. A computer program product for use in conjunction with a portable electronic device comprising a touch-sensitive display, the computer program product comprising a computer readable storage medium and an executable computer program mechanism embedded therein, the executable computer program mechanism comprising instructions for: displaying an unlock image on the touch-sensitive display while the device is in a user-interface lock state, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; detecting contact with the touch-sensitive display; transitioning the device to a user-interface unlock state if the detected contact corresponds to moving the unlock image along a predefined displayed path on the touch-sensitive display to a predefined location on the touch-sensitive display; and maintaining the device in the user-interface lock state if the detected contact does not correspond to moving the unlock image along the predefined displayed path on the touch-sensitive display to the predefined location.

22. A computer program product for use in conjunction with a portable electronic device comprising a touch-sensitive display, the computer program product comprising a computer readable storage medium and an executable computer program mechanism embedded therein, the executable computer program mechanism comprising instructions for: displaying an unlock image on the touch-sensitive display while the device is in a user-interface lock state, wherein the unlock image is a graphical, interactive user-interface object with which a user interacts in order to unlock the device; detecting contact with the touch-sensitive display; and transitioning the device to a user-interface unlock state if the detected contact corresponds to moving the unlock image across the touch-sensitive display according to a predefined displayed path on the touch-sensitive display; and maintaining the device in the user-interface lock state if the detected contact does not correspond to moving the unlock image across the touch-sensitive display according to the predefined displayed path.

23. A computer program product for use in conjunction with a portable electronic device comprising a touch-sensitive display, the computer program product comprising a computer readable storage medium and an executable computer program mechanism embedded therein, the executable computer program mechanism comprising instructions for: displaying a first unlock image and a second unlock image on the touch-sensitive display while the device is in a user-interface lock state; detecting contact with the touch-sensitive display; transitioning the device to a first active state corresponding to the first unlock image if the detected contact corresponds to a predefined gesture with respect to the first unlock image that moves the first unlock image along a first predefined displayed path on the touch-sensitive display; and transitioning the device to a second active state distinct from the first active state if the detected contact corresponds to a predefined gesture with respect to the second unlock image that moves the second unlock image along a second predefined displayed path on the touch-sensitive display.
